How Apple Plans to Use a Third Rear iPhone Camera to Auto-Repair Photos

Your photos are about to get a lot better.

Apple plans to use a third rear camera on its next iPhone to improve users’ photos, a report explained Wednesday. The new camera, which has been heavily rumored for nearly a year, is expected to make it easier for amateur photographers to take high-quality pictures in an instant.

The new lens would join the dual-lens system first introduced with the iPhone 7 Plus, which lets users switch between a wide-angle and a telephoto view. The Bloomberg report explains that a third camera would offer an even wider field of view, similar to the triple-camera setup on last year’s Samsung Galaxy A7. This would enable the device’s software to repair photos and videos automatically, pulling in subjects that fall just outside the shot’s frame.

The company is also considering doubling the length of Live Photos — the short videos captured automatically with every photo — to six seconds. These improvements come alongside a faster Face ID sensor, a faster processor, and a possible switch to USB-C. The triple-lens setup, however, is planned only for the most expensive of the company’s three upcoming phones.

The iPhone uses a dual-lens camera on high-end devices.


The move appears to be a continuation of Apple’s goal of making photography simpler. Apple’s senior vice president of worldwide marketing, Phil Schiller, best summarized this when introducing the iPhone 5S in 2013, comparing a cumbersome equipment bag to a tiny smartphone and adding: “For most of us, we just want to take a picture and have the iPhone take a better picture for us.” In December 2017, photography site Flickr said the iPhone was the most popular camera among devices used for uploads to the site.

Users may want to hold out for the 2020 iPhone for the best photos. That device, the report explains, will come with a laser-powered 3D sensor that can detect objects from up to 15 feet away, compared to the dot-projection sensor used for Face ID, which works only from around 50 centimeters. This would enable new augmented reality features and could pave the way for a set of augmented reality glasses, which are expected to launch the same year as the laser-equipped iPhone.

Apple tends to announce its new flagship iPhones in September, meaning the triple-camera design could arrive in just eight months’ time.

Before that, Apple is expected to unveil a new version of iOS in the summer — which could provide more clues about the company’s direction.