
You're certainly entitled to that opinion, as is Apple, but it basically flies in the face of how the term is used everywhere else in photography. "Straight Out Of Camera" is about taking the time and effort to nail something so that when you hit the shutter button, exposure, focus, and composition are all "as the camera saw it", not "as the camera saw it, and then image manipulation software found the subject and applied a mask to burn out the non-subject areas while keeping the exposure of the subject as-is, if not enhanced".

Edit: Hell, even Apple's own iPhone X page, as it stands right now, says this:

"A new feature in Portrait mode, Portrait Lighting produces impressive studio‑quality lighting effects."

"Create beautiful selfies with sharp foregrounds and artfully blurred backgrounds."

Somehow, these are "effects" which don't fall under the umbrella of post-processing.

But this is a nitpick, admittedly. Nothing wrong with the feature or whatever, but it amused me to hear "No processing. Just [applied post processing]."



I wouldn't consider it post-processing because it's using data that's only available live. Making a live 3D map of a face and applying lighting effects to it will give you much better quality than applying lighting effects after the image has been saved.


"Create beautiful selfies with sharp foregrounds and artfully blurred backgrounds."

This occurs optically, by choosing the proper depth of field and an appropriate focus point. The blurring of the background is called "bokeh" and is a staple of portrait work, none of which is considered "post-processing."

Clearly the iPhone's guts aren't nearly as capable as my 5D2, but my opinion is that if software can produce a reasonable approximation of something that can be done optically (or, in the case of studio lighting, with a pair of strobes), it's fine not to be pedantic.
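
To put rough numbers on why the optics matter, here's a quick thin-lens sketch in Python. The lens figures and circles of confusion are my own illustrative assumptions, not Apple's or Canon's specs; it just shows why a full-frame portrait lens can isolate a subject at 2 m while a phone-sized module basically can't:

    # Rough thin-lens depth-of-field sketch (illustrative numbers only).
    def dof_limits(focal_mm, f_number, subject_mm, coc_mm):
        """Return (near, far) limits of acceptable focus, in mm."""
        hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
        if subject_mm >= hyperfocal:
            return near, float("inf")   # beyond the hyperfocal distance, everything to infinity is sharp
        far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
        return near, far

    subject = 2000.0  # subject 2 m away

    # Full-frame 85mm f/1.8 portrait lens, CoC ~0.03 mm:
    print(dof_limits(85.0, 1.8, subject, 0.03))   # ~(1972, 2029): only ~6 cm in focus

    # Phone-style ~4mm lens at f/1.8, tiny sensor, CoC ~0.005 mm:
    print(dof_limits(4.0, 1.8, subject, 0.005))   # ~(942, inf): everything past ~1 m is sharp

With those assumed numbers the big camera gives a few centimetres of focus around the subject, while the phone keeps essentially the whole scene sharp, so any strong background blur has to come from software.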


Unless I misread you, this is not what is happening here. That tiny camera/sensor combo can't produce that shallow a depth of field. There's some blur, but nowhere near that extent.

The “bokeh” here is completely software generated, and made quite the buzz last year. I remember reading a technical post by an Apple engineer explaining how they reinvented a way to do lens blur instantaneously on the phone.
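
I don't know Apple's actual pipeline, but the general idea is easy to sketch: estimate a depth map (dual cameras, the Face ID dot projector, or ML), pick a focus plane, and blur each pixel more the further its depth sits from that plane. A toy Python version follows; everything in it is an assumption for illustration, and a box blur stands in for a real disc-shaped lens kernel:

    # Toy depth-based synthetic "bokeh" (not Apple's actual method).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def fake_lens_blur(image, depth, focus_depth, max_radius=8, layers=4):
        """image: HxWx3 float array, depth: HxW float array in the same units as focus_depth."""
        defocus = np.abs(depth - focus_depth)
        defocus = defocus / (defocus.max() + 1e-6)        # 0 = in focus, 1 = most defocused
        out = np.zeros_like(image)
        edges = np.linspace(0.0, 1.0, layers + 1)
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (defocus >= lo) & (defocus < hi + 1e-6)
            radius = int(round(lo * max_radius))           # blur grows with defocus
            if radius < 1:
                out[mask] = image[mask]                    # in-focus layer stays sharp
            else:
                blurred = uniform_filter(image, size=(radius, radius, 1))
                out[mask] = blurred[mask]
        return out

The real thing obviously has to handle hair, depth-map noise, and occlusion edges far more carefully, which is presumably what that engineering post was about.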


How is it post-processing when it happens automatically when you click? It's not something YOU have to do or make a creative call on.




