Sunday, August 13, 2017

Depth Sensing with Apple's Dual Cameras

Apple's developer conference videos are always worth digging through, even if you are not a programmer. Some give insight into Apple's technology and a few clues as to where Apple is headed. The dual cameras on the iPhone 7 Plus do a neat parlor trick, mainly a simulated bokeh effect. Watch this video:
https://developer.apple.com/videos/play/wwdc2017/507/

I have read that Samsung had this blurred background effect "two years ago", which I am not sure is true. It is a "selective focus" mode where you choose which area of the photo is in focus. It is not really bokeh, but it's something.

The difference with Apple's implementation is that they are using the dual cameras to measure depth, which lets them apply effects based on that information. This is an entirely different way of working from Samsung's approach. For one thing, since the depth information is recorded with the image, the bokeh effect can be applied after the photo has been taken. In the demos in the video, the programmers show other effects that can be applied after the fact. And most important of all, everything is available in a newly minted API that any programmer can use for their imaging or photography application. On top of that, they have now implemented dual camera capture, and there is ARKit (more on this later).
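
To give a flavor of what that new API looks like, here is a rough sketch (mine, not Apple's sample code) of asking for depth with each photo on iOS 11. The depth-delivery properties on AVCapturePhotoOutput and AVCapturePhotoSettings are the real ones; the session setup around them is a simplified assumption, and the capture delegate is left as a placeholder.

import AVFoundation

// Simplified sketch: set up the back dual camera and request depth with each photo.
let session = AVCaptureSession()
session.sessionPreset = .photo

// The two cameras are exposed to apps as a single "virtual" dual-camera device.
guard let dualCamera = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
      let input = try? AVCaptureDeviceInput(device: dualCamera),
      session.canAddInput(input) else {
    fatalError("Dual camera not available on this device")
}
session.addInput(input)

let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}

// Depth delivery has to be enabled on the output...
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

// ...and requested again in the per-shot settings.
let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

// photoOutput.capturePhoto(with: settings, delegate: myDelegate)
// The delegate's callback then receives an AVDepthData alongside the image.

The point is that the depth map comes back with the photo itself, so any app, not just Apple's Camera, can build effects on top of it.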

There is a companion video where Apple's programmers show how to work with images that have depth data. It's interesting as well, but not necessarily as eye-opening for me as the video I have linked to.
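
Since that video covers working with images that already carry depth data, here is a minimal sketch, again on my own assumptions rather than anything straight from Apple's sample code, of reading the stored disparity map back out of a saved photo with ImageIO; the file URL is hypothetical.

import AVFoundation
import ImageIO

// Sketch: pull the stored depth (disparity) map back out of a saved photo.
func loadDepthData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }

    // The depth map is stored as auxiliary disparity data alongside the main image.
    guard let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
            source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any] else {
        return nil
    }
    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}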

I hope that Apple manages to squeeze dual cameras into the "non-plus" version of their next phone. I'm there.
