Apple’s Behind-the-Scenes iPhone X Video Shows Feature Development
The company revealed how a key feature was born.
Apple has lifted the lid on the creation process behind one of the iPhone X’s most distinctive features. Portrait Lighting, also available on the iPhone 8 Plus, uses depth information to adjust the lighting on a subject’s face. On Wednesday, the company uploaded a video that shows the expert advice it received during production.
“We studied the art of portraiture from paintings to photographs,” the company said in its video. “We worked with global image makers and the world’s best photographers, combining timeless lighting principles with advanced machine learning to create an iPhone that takes studio-quality portraits without the studio.”
The feature takes advantage of the A11 Bionic chip, a new component unveiled at the company’s September 2017 iPhone event at the Steve Jobs Theater at Apple’s new Cupertino campus. Touted as a breakthrough in on-device machine learning, the chip powers advanced artificial intelligence capable of taking depth information and using it to create new effects. Apps like Focos have revealed the phone captures a high degree of this information.
Apple’s offering packs a number of built-in effects:
- Natural is the simplest effect, retaining the same blurred background from the Portrait Mode found in the iPhone 7 Plus that launched in 2016.
- Studio is similar to Portrait Mode, but the subject’s face is illuminated.
- Contour aims to create a more striking contrast in light and shadows.
- Stage Light uses the depth information to darken the background entirely, giving the effect of the subject standing on a stage.
- Stage Light Mono is the same as Stage Light, but it uses a monochrome filter.
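The Stage Light effects boil down to segmenting the subject by depth and blacking out everything behind it. A minimal sketch of that idea, in Python with NumPy, is below; the function name, the fixed depth threshold, and the array conventions are all illustrative assumptions, not Apple’s actual implementation.

```python
import numpy as np

def stage_light(image, depth, threshold=1.0, mono=False):
    """Illustrative Stage Light: keep pixels closer than `threshold`
    (in arbitrary depth units), black out everything behind them.
    `image` is an HxWx3 array, `depth` is an HxW depth map.
    This is a toy sketch, not Apple's algorithm."""
    mask = depth < threshold                      # True where the subject is
    out = np.where(mask[..., None], image, 0.0)   # zero out the background
    if mono:                                      # Stage Light Mono variant:
        gray = out.mean(axis=-1, keepdims=True)   # average channels to gray
        out = np.repeat(gray, 3, axis=-1)
    return out
```

A real implementation would refine the depth-based mask around hair and edges, but the core idea — a per-pixel depth test driving the effect — is the same.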
The depth information is captured differently depending on the camera. On the rear of the iPhone 8 Plus and iPhone X, the phone uses the disparity between its two rear lenses to calculate depth. On the iPhone X, the front camera can instead use the same TrueDepth sensors that power Face ID to scan the scene in layers.
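Calculating depth from two lenses relies on classic stereo triangulation: a point appears shifted (the disparity) between the two views, and depth is inversely proportional to that shift. A minimal worked example, with hypothetical numbers rather than the iPhone’s actual focal length or lens spacing:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo triangulation: depth = focal_length * baseline / disparity.
    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two lenses in metres
    disparity_px -- horizontal shift of a point between the views, in pixels
    Values here are illustrative, not the iPhone's real camera parameters."""
    return focal_px * baseline_m / disparity_px

# A point shifted 10 px between views, with a 1000 px focal length
# and a 1 cm baseline, sits 1 metre away.
d = depth_from_disparity(1000, 0.01, 10)  # -> 1.0
```

Nearby subjects produce large disparities and distant backgrounds small ones, which is exactly the signal Portrait Lighting needs to separate the two.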
The company is not stopping there, though. In an interview earlier this week, CEO Tim Cook hinted that the company is already working on products well into the 2020s.