
iPhone 11: Mysterious ‘Deep Fusion’ Will Use A13 Chip to Fine-Tune Photos

Apple calls it 'mad science.'

by Sarah Wells

Get your online ordering fingers ready. Apple’s newest iPhone 11 goes up for preorder Friday, and it’s set to pack an even more powerful camera.

The new phones will feature improved photo, video, and editing quality as well as a power-boosted processing chip, the A13 Bionic, with the Pro models adding a trifecta of rear-facing cameras.

While there are plenty of features for users to get excited about, the company’s September 10 announcement still left some of them unclear. Among those was a sneak peek at a new advanced photography system called Deep Fusion.

“It is computational photography mad science,” Phil Schiller, Apple’s senior vice president of worldwide marketing, said onstage during the announcement.

The name alone suggests a certain Doc Brown vibe, but the actual science behind it is not quite so futuristic. Schiller explained that Deep Fusion will use the phone’s newly boosted Neural Engine to help users take the best photo possible without any extra legwork on their part. In total, Deep Fusion captures nine images of varying exposure lengths for every single press of the shutter. Eight of these are taken before the user even taps the shutter button, followed by a ninth, longer exposure at the moment of the tap.
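In code, that buffered-capture strategy might look something like the minimal Python sketch below. The callback names and buffer size here are illustrative assumptions, not Apple’s actual camera API; the point is only that eight frames already exist before the tap.

```python
from collections import deque

# A minimal sketch of pre-shutter buffering (hypothetical names, not Apple's
# API): keep the eight most recent preview frames at all times, so they are
# already captured the instant the shutter is tapped.
BUFFER_SIZE = 8

frame_buffer = deque(maxlen=BUFFER_SIZE)  # oldest frames fall off automatically

def on_preview_frame(frame):
    """Called for every frame the sensor streams before the shutter tap."""
    frame_buffer.append(frame)

def on_shutter_tap(capture_long_exposure):
    """Combine the eight buffered frames with one fresh long exposure: nine total."""
    long_exposure = capture_long_exposure()
    return list(frame_buffer) + [long_exposure]
```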

Then, in one second, Deep Fusion will comb through the 24 million captured pixels, picking out the most detailed and least noisy ones to stitch together a perfect Franken-photo.
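Apple has not published how that selection works, but a toy version of the “pick the best pixels from a stack” idea can be sketched in a few lines of Python. Here, closeness to the cross-frame average stands in for “lowest noise”; the real system is a learned model running on the Neural Engine, and it weighs detail as well.

```python
import numpy as np

def fuse_frames(frames):
    """Toy per-pixel fusion over a stack of aligned grayscale exposures.

    For each pixel, keep the value from whichever frame lies closest to the
    cross-frame mean -- a crude proxy for 'lowest noise'. This illustrates
    only the selection idea, not Apple's actual algorithm.
    """
    stack = np.stack(frames).astype(np.float64)  # shape: (9, H, W)
    mean = stack.mean(axis=0, keepdims=True)     # per-pixel average across frames
    deviation = np.abs(stack - mean)             # distance from that average
    best = deviation.argmin(axis=0)              # index of the cleanest frame per pixel
    h, w = best.shape
    return stack[best, np.arange(h)[:, None], np.arange(w)]
```

Fed nine simulated exposures of the same scene, this returns a single fused frame, with one pixel chosen per location.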

“This is the first time a Neural Engine is responsible for generating the output image,” Schiller said onstage.

While the new feature may sound shiny and exciting at first glance, some users on Reddit were not quite so taken with it, criticizing Apple for not showing side-by-side comparisons of photos taken with and without Deep Fusion. Others compared the feature to a daytime Night Mode.

The latter comparison is certainly apt: Night Mode, like the Portrait Lighting and Smart HDR features, fuses A.I. and photography in a similar manner to take the guesswork out of a good shot.

This is a philosophy Schiller made clear years earlier when unveiling the iPhone 5S.

“It used to be the way you take better pictures is you learn to be a better photographer,” Schiller said on stage in 2013. “For most of us, we just want to take a better picture, and have the iPhone take a better picture for us.”

Yet, despite all the pomp and circumstance, Deep Fusion is not a feature users will find on the iPhone 11 come delivery day on September 20. Instead, the company says the feature will arrive later this fall in a software update, timing which Computerworld has speculated may line up with Google’s announcement of the Pixel 4.

While competing with the Pixel 4 may be part of the reason Apple is releasing Deep Fusion later, it’s not the first time the company has teased a feature onstage only to ship it in a software update. Portrait Mode arrived in iOS 10.1 in October 2016, the month after the iPhone 7 Plus that supported it hit store shelves.
