Apple’s next iPhone will reportedly include a “time of flight” sensor that would enable 3D image capture and improved portrait photography. It will also feature an improved version of Face ID for facial recognition.
Both details were included in a Barclays analyst note spotted by CNBC.
What is time of flight? — Time of flight sensors essentially work by blanketing a scene with light and measuring how long it takes for that light to be reflected back into the camera. Portrait photographs, where a subject is in focus but the background is blurred, sometimes display weird artifacts around the subject when the camera struggles to capture its outline properly — a ToF sensor could help the camera better understand the exact shape and depth of objects in the scene.
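The math behind the principle is simple: light travels out and back, so the distance to an object is half the round-trip time multiplied by the speed of light. A minimal sketch (the pulse timing below is a made-up example value, not real sensor data):

```python
# Time-of-flight principle: a light pulse travels to an object and back,
# so distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance in meters to an object, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * seconds / 2

# A pulse returning after roughly 6.67 nanoseconds implies an object
# about one meter away -- the kind of timing a ToF sensor resolves
# for every pixel to build a depth map of the scene.
print(distance_from_round_trip(6.67e-9))
```

Doing this per pixel yields a depth map, which is what lets the camera cleanly separate a portrait subject from its background.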
There are AR applications here too — The sensor could also be used to build 3D models for augmented reality applications, a field Apple has shown a lot of interest in through the launch of developer tools like ARKit. Apple CEO Tim Cook has been very optimistic about AR, going so far as to say that it "is the future of the iPhone."
Face ID on the front of the phone already uses a time of flight sensor to create a scan of your face.
It’s unclear how Face ID would change — The research note doesn’t indicate how Face ID would be improved, but Apple has steadily refined the feature, making it faster and more reliable over time. At this point your iPhone will unlock almost as soon as you look at it. Maybe it'll finally be able to unlock with my face in my pillow? One can dream.
Well-regarded Apple analyst Ming-Chi Kuo previously reported on the addition of the time of flight sensor, and has also said the high-end models will support faster mmWave 5G networks.