How Portrait Mode Works on the iPhone 7+

Apple is pushing Portrait Mode on iPhone users. Here's how it works.

How does Apple get that snazzy blur on photos from the iPhone 7+? You’ve probably seen it — the product of the newly finalized “portrait mode” the company has been pushing in a recent string of ads.

The image below could almost be a photo from a DSLR or another high-end camera with a wide-aperture lens attached. “Wide-aperture” means the lens has a bigger opening for light to flow through, which creates a narrow plane of focus. That’s how you get the tight, sharp focus on a subject like the guy in the red hat below.

Here’s what a portrait mode image looks like at its best, in an Apple press image:

[Image: Apple shows off Portrait Mode in a press image. Credit: Apple]

And here are images shot with a Nikon D800 with an 85mm f/1.8 Nikon lens:

[Image: a wedding photo from the D800, its green background dissolved entirely into soft blur]
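Just how narrow is that plane of focus? Here's a quick back-of-the-envelope sketch in Python using the standard hyperfocal-distance formulas (the 0.030mm circle of confusion is a common full-frame assumption, not a Nikon spec):

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.030):
    """Near/far limits of acceptable sharpness via the standard
    hyperfocal-distance formulas (valid while the subject is
    closer than the hyperfocal distance)."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm  # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

# An 85mm f/1.8 lens focused on a subject 2 meters away:
near, far = depth_of_field(focal_mm=85, f_number=1.8, subject_mm=2000)
print(f"sharp from {near/1000:.2f} m to {far/1000:.2f} m "
      f"({(far - near)/10:.1f} cm deep)")
# -> only a roughly 6 cm deep slice of the world is sharp;
#    everything in front of or behind it melts into blur
```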

To take Apple’s marketing at its word, Portrait Mode is a way to make your subject beautifully stand out from the world around them, even in a busy scene, as illustrated in an Apple promotional video for the feature released on May 1.

That your subject can stand out from the world behind them is kinda true … up to a point.

What Apple Portrait Mode really does

The iPhone 7+ in Portrait Mode mimics the look of bokeh, or the orb-filled, out-of-focus texture of a wide-aperture lens image. (In the wedding photo above, all that green background is bokeh.)

But Portrait Mode bokeh is not a result of the iPhone’s lens.

High-end smartphones like the iPhone 7+ actually do have fairly wide-aperture lenses, but they’re so tiny that bokeh only becomes noticeable when the subject is very close to the camera.

Move your smartphone farther away, and its natural bokeh will significantly recede, until everything is in focus.
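For scale, plug some iPhone-class numbers into the same depth_of_field helper from the sketch above. The focal length, aperture, and circle of confusion here are rough guesses for a tiny-sensor telephoto module, not Apple's published specs:

```python
# Tiny lens, tiny sensor: assumed ~6.6mm f/2.8 with a ~0.004mm circle
# of confusion (illustrative values, not Apple's spec sheet).
for subject_mm in (500, 2000):
    near, far = depth_of_field(focal_mm=6.6, f_number=2.8,
                               subject_mm=subject_mm, coc_mm=0.004)
    print(f"subject at {subject_mm/1000:.1f} m: "
          f"sharp from {near/1000:.2f} m to {far/1000:.2f} m")
# subject at 0.5 m: only a ~13 cm slice is sharp -> visible bokeh
# subject at 2.0 m: several meters are sharp -> the blur all but vanishes
```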

The iPhone 7+ Portrait Mode uses some fancy computing to extend the range of noticeable bokeh to about eight feet.

[Image: Trying out Portrait Mode in the Apple Store. Credit: Rafi Letzter/Inverse]

How iPhone 7+ Portrait Mode works

The iPhone 7+ uses its twin rear cameras to see depth — just like you likely use your two eyes to see in 3D (unless you’re like me, and only have one half-decent eyeball in your head). It builds a rough depth map of the world it sees, and figures out which parts of the frame are closer to or farther away from its lenses.

[Image: Shooting an iPhone with an iPhone. Credit: Rafi Letzter/Inverse]
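Apple hasn't published its pipeline, but the underlying idea is classic stereo matching. Here's a minimal sketch using OpenCV's block matcher, assuming two rectified grayscale frames (left.png and right.png are hypothetical files, and the focal length and baseline are made-up numbers):

```python
import cv2
import numpy as np

# Two hypothetical, rectified views from the twin rear cameras.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching: find how far each patch shifts between the two views.
# Nearby objects shift more (larger disparity) than distant ones.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Triangulate: depth falls off as 1/disparity.
FOCAL_PX = 700.0    # assumed focal length, in pixels
BASELINE_M = 0.01   # assumed ~1 cm gap between the two lenses
depth_m = FOCAL_PX * BASELINE_M / np.clip(disparity, 0.1, None)  # rough depth map
```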

At that point, it’s fairly straightforward for the iPhone to fuzz out the parts of the image that sit outside its chosen plane of focus.
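A crude version of that fuzzing-out is just a depth-masked blend between the sharp frame and a blurred copy. The sketch below is illustrative (the function name and thresholds are invented); Apple's actual rendering shapes the blur far more carefully:

```python
import cv2
import numpy as np

def fake_portrait(image, depth_m, focus_m, tolerance_m=0.3):
    """Keep pixels near the focal distance sharp; swap in a
    Gaussian-blurred copy everywhere else. Real bokeh rendering
    is far subtler, but this is the core trick."""
    blurred = cv2.GaussianBlur(image, (31, 31), 0)
    in_focus = np.abs(depth_m - focus_m) < tolerance_m   # per-pixel boolean mask
    mask = in_focus.astype(np.float32)[..., None]        # H x W x 1, broadcasts over color
    out = image * mask + blurred * (1.0 - mask)
    return out.astype(image.dtype)

# e.g. portrait = fake_portrait(color_frame, depth_m, focus_m=1.5)
```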

Most of the time, you’ll want to avoid it

You’ll notice that the effect isn’t perfect — the transition into the plane of focus tends to be abrupt, unlike the smooth falloff of real-world bokeh. And the edges of objects can warp and blur when the iPhone misjudges their depth.

Apple is smart enough not to let the iPhone even try to use Portrait Mode beyond about eight feet. Try to shoot this way at a distance, and the depth effect simply won’t activate.

[Image: Portrait Mode doesn't do anything if your subject is too far away. Credit: Rafi Letzter/Inverse]
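That refusal amounts to a simple range gate. Something like the check below, where the threshold and names are guesses at the behavior described here, not Apple's code:

```python
MAX_PORTRAIT_RANGE_M = 2.4  # about eight feet

def depth_effect_allowed(subject_depth_m):
    # Beyond this range the disparity between the two lenses is too
    # small to build a trustworthy depth map, so don't even try.
    return subject_depth_m <= MAX_PORTRAIT_RANGE_M
```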

In general, phones with Portrait Mode-style effects, such as the earlier Huawei P9, seem to work best very close to their subjects, with immobile subjects set against well-defined backgrounds in broad daylight. That’s the kind of situation where a phone can nearly mimic a true DSLR — though you’ll still spot the telltale signs of computer intervention.

[Image: A really successful example of the depth effect from a Huawei P9. Credit: Rafi Letzter/Inverse]

On balance, Portrait Mode is a pretty cool gimmick, and a nice preview of the potential of computational photography. But most of the time, you’re probably better off not using it at all.
