
It's totally ok that you wouldn't have had the same gut feeling — but I absolutely did. I wrote that because it seemed like the binder was unnecessary information posed specifically in the photo to try to gain my trust.

Bonus trivia I didn't mention: if you look at the invoice preceding the one in the binder, it has the logo for the 1992 (!) Barcelona Olympics at the top. A real weird thing to put in your 'Bills 2000 — 2010' binder! I do think this binder is totally real, but should actually be labelled 'My Forged Invoices' ;)


> I wrote that because it seemed like the binder was unnecessary information posed specifically in the photo to try to gain my trust

As opposed to just being something he grabbed to use as a quick kind of table? I mean, between quickly throwing something on the couch to use as a platform for the photo and purposely setting up a staged scene to look 'casual', I think Occam would say the first is significantly more likely.

Nothing wrong with having suspicions or gut feelings, but this just seemed a stretch too far to me personally. Nice article though!


"Not only that but Emails Sent seems like a useless vanity metric."

We use it on our board to track the volume of support e-mails going out the door for each support agent. It's very useful!


If you measure something, people will adapt their behavior to what you're measuring. If you measure productivity by volume of mails, people will try to increase the volume of mails at the expense of what is in those mails.

From that perspective, I think dashboards are useful as a warning signal for things going wrong, but not as indicators of whether things are going right. So the right way to approach it, in my opinion, is to watch whether mails are being sent at all, because a sudden drop to zero would indicate a major issue.


You're assuming it's a productivity measure and not a "sheesh, Bob looks like he could use a hand today" sort of thing.


In this case, AirPlay is our easier-to-understand way of saying "the results you're getting are exactly the same as when you stream iOS video using AirPlay". (Which is really weird for a video-out dongle! Especially since the previous, non-Lightning one did proper video out.)

I've tweaked the post to make it clearer that AirPlay isn't _necessarily_ the _exact_ mechanism being used! It could just be H.264 or MPEG or whatever.

Mostly I'd love to know exactly what this chip/system does, so if anyone here with far more advanced hardware knowledge than any of us have feels up to hacking around, that'd be amazing! :)

The key takeaways from the post are:

1. It feels unusual that an AV adapter would have a full ARM-based SoC with a CPU, RAM, etc., not just a video encoder/decoder chip. Is it unusual? Let me know. :)

2. It's a bummer that the video out isn't very good: it's not true 1080p, and it has MPEG artifacts.

(Thanks for sharing the link.)


> Mostly I'd love to know exactly what this chip/system does, so if anyone here with far more advanced hardware knowledge than any of us have feels up to hacking around, that'd be amazing! :)

It's converting an encoded stream to HDMI output. There aren't enough pins on a Lightning connector to directly output HDMI, and even if there were, you still need a transceiver somewhere (not particularly trivial in terms of space or power to stuff in the phone).

The SoC is most likely a little ARM core to manage things (a Cortex-M0/M3) along with a video decoder and an HDMI transceiver; the 256MB of RAM is there primarily for the decoder to use. It also needs to mux the audio stream into the HDMI encoding.
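
If it helps to picture it, here's a rough sketch in C of what that data path might look like. To be clear, this is a guess, not Apple's actual firmware, and every name and function in it is made up:

    /* Hypothetical sketch of the adapter's data path, not Apple's actual
       firmware; all names below are invented. The phone sends an already
       encoded (e.g. H.264) stream over the Lightning link; the SoC decodes
       it, muxes in audio, and hands raw frames to the HDMI transceiver. */
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    typedef struct { uint8_t data[4096]; size_t len; } EncodedChunk;
    typedef struct { uint8_t y, u, v; } Pixel;   /* placeholder frame type */

    /* Stubs standing in for the serial link, the decoder, and the transceiver. */
    static int  lightning_read(EncodedChunk *c) { c->len = 0; return 0; }
    static int  decode_video(const EncodedChunk *c, Pixel *f) { (void)c; (void)f; return 1; }
    static void mux_audio(Pixel *f) { (void)f; }
    static void hdmi_output(const Pixel *f) { (void)f; }

    int main(void) {
        EncodedChunk chunk;
        Pixel frame;
        while (lightning_read(&chunk)) {       /* encoded video+audio from the phone  */
            if (decode_video(&chunk, &frame)) {/* the RAM-hungry step (hence 256MB)   */
                mux_audio(&frame);             /* fold the audio into the HDMI signal */
                hdmi_output(&frame);           /* push raw pixels to the transceiver  */
            }
        }
        puts("link closed");
        return 0;
    }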

As for 'it runs iOS', you're really splitting hairs; it's pretty normal for the master device to load a slave device with its firmware when plugged in (rather than storing the firmware in flash on the slave). The fact that they would use some cut-down version of iOS isn't terribly surprising. Embedded versions of more powerful OSs are used all the time; there is a non-zero chance your stove and microwave are 'running Linux'.


> There aren't enough pins on a Lightning connector to directly output HDMI, and even if there were, you still need a transceiver somewhere (not particularly trivial in terms of space or power to stuff in the phone).

MicroUSB+MHL accomplishes the same thing quite well on the current crop of Android phones. It's still quite odd that Apple chose the method they did.


MHL is an option, but it only fixes the number-of-pins issue. It has the same problem as HDMI out: you need a transceiver capable of multi-Gbit data rates, and you have to get the video output to the transceiver. It's not the easiest thing to do inside a phone, and it's really not the easiest thing to do when your next phone release is focused on 'thinner and lighter'.

Apple already made a strong commitment to AirPlay, so they already had a focus on building a fast, smooth, low-power encoder that could encode the entire screen. Once encoded, the stream is probably only a few Mbit/s, a data rate that can easily be transmitted with single-ended protocols like SPI. Almost all SoCs already have multiple SPI buses, so there's no need to change any hardware.
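
To put rough numbers on that (the 8 Mbit/s stream and the 50 MHz SPI clock below are my assumptions, not measured figures):

    /* Back-of-envelope numbers for the "a few Mbit/s over SPI" argument.
       The encoded bit rate and SPI clock are illustrative guesses. */
    #include <stdio.h>

    int main(void) {
        double raw_bps = 1920.0 * 1080 * 24 * 60; /* uncompressed 1080p60, 24 bpp */
        double enc_bps = 8e6;                     /* assumed H.264 mirror stream  */
        double spi_bps = 50e6;                    /* one SPI bus clocked at 50 MHz */

        printf("uncompressed 1080p60: %.2f Gbit/s\n", raw_bps / 1e9);
        printf("encoded stream:       %.0f Mbit/s (about %.0fx smaller)\n",
               enc_bps / 1e6, raw_bps / enc_bps);
        printf("fits on one SPI bus?  %s (%.0f%% of the link)\n",
               enc_bps < spi_bps ? "yes" : "no", 100.0 * enc_bps / spi_bps);
        return 0;
    }

Uncompressed 1080p60 works out to roughly 3 Gbit/s, which is why you'd need a multi-Gbit transceiver for raw video; a single-digit-Mbit/s encoded stream uses only a fraction of one ordinary SPI bus.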

Their solution may seem inelegant to some, but I think it is great. They managed to support a feature with almost zero hardware cost on the core device (depending on how you account for the Lightning connector), a feature that I'm willing to bet only a very small percentage of users will ever use. That is a big win: not having to dump extra hardware into a device that only 5% of users will ever activate does great things for your margins and design flexibility.


> There is a non-zero chance your stove and microwave are 'running Linux'.

Unless Linux comes with a BSD license, there is, in fact, zero chance. Apple is known to run a NetBSD variant on the AirPort routers - I'd say that's what's more likely, or whatever the hell a "stub version of iOS" means.


What does BSD vs GPL have to do with my stove or microwave?


There is a false assumption that manufacturers don't use Linux because of the GPL. Most products don't use Linux because they don't need it, and they use something like FreeRTOS or another commercial RTOS. Plenty of companies use Linux in all sorts of home appliances, mostly TV products, but an increasing number of fridges from companies like Samsung do have Linux and they do release the code. What manufacturers don't do is release the applications that they run on the OS (despite the false assumption that they should).


The GPL requires that any changes (made to code used in released products) be released under the GPL. I'm unaware of any home appliance manufacturer that freely releases the OS of their products under the GPL, as they would have to if they were using Linux.


If they did, they would probably do it the way Amazon does with the Kindle or Samsung does with Android: a giant tarball you can download from somewhere on their site, but not necessarily easy to just know about (no prominent link on the homepage).

If you go here: http://opensource.samsung.com you can see that they have the OS for some TVs. This is the typical way they comply (with unhelpful, unannotated giant dumps).

LG actually has some appliances:

http://www.lg.com/global/support/opensource/index


I'm sorry, that was a brain fart - I thought you meant the connector and the host device, not quite literally a random microwave and oven :)


It's not that weird at all, in a sense. With this mystery cleared up, all the evidence is basically consistent with "Lightning" being just USB 2.0 over a proprietary connector plus some proprietary authentication chips in the cables. There is literally nothing it does that could not be done with a standard micro USB 2.0 connector.


Lightning is much, much, much more sophisticated than the brain-damage that is USB [1].

[1] http://brockerhoff.net/blog/2012/09/23/boom-pins/


Knowing what we know now, that blog post is kind of amusing. Apple's Lightning appears to be less sophisticated than the MHL video out over micro-USB connectors it complains about. It's like Apple said "screw trying to get a proper video out, we'll just cram it all down USB 2.0 and lossily compress the hell out of it to get it to fit, then stick an entire ARM SoC in the adapter to decompress it again" - it's the most kludgy solution imaginable. (I have no idea whether Lightning actually uses USB 2.0, but it seems to be in the same ballpark bandwidth-wise.)


Yeah, someone mentioned something similar elsewhere in the thread and it certainly makes sense to me. It fits in with the comparison to Miracast and the need for hardware support for full-screen mirroring over AirPlay. I can see the adapter processing a stream in a format similar to AirPlay's (without being a classic "AirPlay receiver", which is what I was hung up on).

For your two points:

1. I think that comes with the territory of being a Lightning accessory and makes sense if they're doing some sort of compression or something beforehand. It certainly fits with Apple's persona: control the content, don't let any old body create a video adapter. They have to deal with Apple's handshake/stream/whatever because Apple doesn't pin out HDMI on the Lightning connector...

2. That is a major bummer and seems like one of those "details" that Apple would get right. I'm certainly curious the more I think about it as well.

