The brain of the 2019 iPhones has been put into production, and it is set to be the most cutting-edge processor Apple has ever shipped in a smartphone.
This so-called A13 processor will succeed the A12 Bionic chips that shipped with the iPhone XS, XS Max, and XR in 2018, and the A13 could imbue its next-generation handsets with an arsenal of new camera, augmented reality, and artificial intelligence capabilities.
The company has reportedly finished the processor’s design and passed it to Taiwan Semiconductor Manufacturing Co. (TSMC). The components could enter mass production as early as May, according to a Saturday Bloomberg report by Mark Gurman and Debby Wu. The duo cited anonymous sources at the Taiwanese company who said Apple is getting the ball rolling on the A13 ahead of its usual September iPhone keynote. Only days later, more evidence emerged to fan the flames of what could be the next iPhone’s biggest features.
On Monday, Gurman tweeted a leaked image of three iPhone case moldings, which allegedly showcase the back panels of the three upcoming phones. The two high-end models (codenamed D43 and D44) appear with three rear cameras, while the successor to the XR (codenamed N104) carries two lenses. In every case, the cameras sit in a square bump that has been teased in past leaks and now seems all but confirmed for the new iPhones.
The camera arrangement’s shape has been a point of contention for Apple fans online. Some loathe the design change, while others think customers just need some time to get used to it. But what’s undeniable are the features that the new camera array and A13 chip could bring to the handset.
Here’s everything we know about how Apple will pair its upcoming processor with an all-new camera setup to take iPhones to new heights.
Apple A13 Chip: Release Date
The A13 chip is expected to debut on Apple’s 2019 iPhones. Ever since the release of the iPhone 5 in 2012, Apple has hosted an event to launch its flagship smartphones in September. So this fall is when we’ll likely catch a first glimpse at what the processor will enable the new iPhones to do.
Last year’s iPhone keynote was hosted on September 12, and the most recent iPhone lineup hit shelves on September 21. Expect a similar series of events this time around, with potentially a week or two of variation, barring any major announcements leading up to the fall.
A13 Chip: Image “Auto-Correction” Feature
The most compelling new perk that could come to the phones is a camera feature Gurman described as “an auto-correction feature to fit people back into a photo who may have been accidentally cut out.” This will likely be enabled by the third lens being installed on the D43 and D44 models.
They’re both getting an “ultra-wide-angle” sensor able to capture images with a much larger field of view. This new addition could passively capture wider images to make sure users don’t miss anything in their shot.
For example, taking a photo of a large group of people might take a couple of tries to make sure no one is cut out of the frame. But the new iPhones could automatically take a wide-angle shot even when users are on a different zoom so they don’t have to retake the picture.
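The mechanics of such a feature can be pictured as simple rectangle math: if the phone quietly keeps a wide-angle frame alongside the user's chosen framing, a crop that cut someone off can be grown just enough to bring them back in. The sketch below is purely illustrative; the function names, the (left, top, right, bottom) box format, and the idea that subjects arrive as detected bounding boxes are all assumptions, not Apple's actual implementation.

```python
def expand_crop(crop, subjects, wide_frame):
    """Grow `crop` just enough to contain every subject box,
    clamped to the bounds of the wide-angle frame.
    All boxes are (left, top, right, bottom) tuples in pixels."""
    left, top, right, bottom = crop
    for s_left, s_top, s_right, s_bottom in subjects:
        left = min(left, s_left)
        top = min(top, s_top)
        right = max(right, s_right)
        bottom = max(bottom, s_bottom)
    # Never grow past what the ultra-wide sensor actually captured.
    w_left, w_top, w_right, w_bottom = wide_frame
    return (max(left, w_left), max(top, w_top),
            min(right, w_right), min(bottom, w_bottom))

# A subject near the right edge was cut off by the original crop:
wide = (0, 0, 4000, 3000)
crop = (500, 400, 2500, 2000)
faces = [(800, 600, 1000, 900), (2600, 700, 2900, 1100)]
print(expand_crop(crop, faces, wide))  # (500, 400, 2900, 2000)
```

Because the wide frame is captured at the same instant, the "fix" is just a re-crop of data the phone already has, rather than a retake.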
A13 Chip: Artificial Intelligence-Powered Image Editing
Last year, Apple highlighted the “Neural Engine” that comes built into the 7-nanometer architecture of its A12 Bionic processor. This enables iPhone photographers to adjust the background blur of Portrait Mode images — an effect known as bokeh — after an image has already been taken. We could see improvements to these capabilities this year.
Apple’s neural engine advancements essentially build a machine learning model straight into iPhones to accomplish tasks, like editing background blur after a photo has been taken. Before the A12 Bionic, users would have needed to send their images to Apple’s servers, edit them there, and download them back to their phones. That process is slower and requires an internet connection, both cumbersome hurdles for a simple photo tweak.
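To make the on-device idea concrete: once the phone holds a per-pixel depth map for a photo, re-adjusting bokeh after capture amounts to re-blending the sharp image with a blurred copy, weighted by depth, with no server round-trip. This is a minimal grayscale sketch under that assumption; the crude box filter and the function names are illustrative, not Apple's pipeline.

```python
import numpy as np

def box_blur(img, radius=2):
    """Naive box blur: average over a (2r+1) x (2r+1) sliding window."""
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size**2

def adjust_bokeh(img, depth, strength):
    """Blend sharp and blurred pixels: farther pixels get more blur.
    `depth` is normalized 0 (near) .. 1 (far); `strength` is 0 .. 1."""
    weight = np.clip(depth * strength, 0.0, 1.0)
    return (1 - weight) * img + weight * box_blur(img)

rng = np.random.default_rng(0)
img = rng.random((8, 8))                       # toy grayscale "photo"
depth = np.linspace(0, 1, 64).reshape(8, 8)    # top rows near, bottom far
edited = adjust_bokeh(img, depth, strength=0.8)
```

Re-running `adjust_bokeh` with a different `strength` is cheap, which is why a slider in the Photos app can feel instantaneous once the depth data lives on the device.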
Now users can make Photoshop-like edits right from the iOS Photos app, and this year could bring further upgrades to what they’re able to edit. Users might be able to retroactively adjust the field of view of non-Portrait Mode pictures, giving iPhone photographers even more powerful editing tools.
A13 Chip: A.I. Improvements to Siri
An improvement to the A13’s neural engine could also result in serious Siri upgrades. Currently, Apple’s voice assistant needs to route voice commands through Apple’s machine learning algorithms, housed in its servers. But figuring out a way to house those A.I. models inside the phone would be a Siri game changer.
The voice assistant could, say, set a timer, turn on your flashlight, set a reminder, and open apps while an iPhone is offline or in Airplane Mode. A previous Apple patent described this capability, and the company poached Google’s former chief of search and artificial intelligence, John Giannandrea, in early 2018 to boost its A.I. initiatives. The executive could have the secret sauce to improve Siri in this way.
Google recently debuted the “Next-Generation Google Assistant” during its annual I/O developers conference on May 7. This update compressed a 100GB machine learning model into just half a gigabyte so it can fit on future Pixel phones. It’s possible that Giannandrea is trying to replicate this feat at Apple, as the patent suggests.
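A 100GB-to-0.5GB reduction implies a stack of tricks (pruning, distillation, smarter encodings), but the workhorse step in shrinking on-device models is weight quantization: storing each 32-bit float weight as an 8-bit integer plus one shared scale factor, an immediate 4x saving. The sketch below shows only that step; the function names are illustrative, not Google's or Apple's actual method.

```python
import numpy as np

def quantize_int8(weights):
    """Map a float32 weight tensor to int8 plus one float scale."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Approximate the original weights at inference time."""
    return q.astype(np.float32) * scale

w = np.random.default_rng(1).standard_normal(1000).astype(np.float32)
q, scale = quantize_int8(w)
print(w.nbytes, q.nbytes)  # 4000 1000 -- a 4x size reduction
err = np.abs(dequantize(q, scale) - w).max()
```

The per-weight error is bounded by half the scale factor, which is why well-quantized models typically lose little accuracy while fitting into a phone's storage and memory budget.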
Now the question is whether the company can pull it off by September or whether an improved Siri will need to wait until 2020.
A13 Chip: Augmented Reality Levels Up
It’s no secret that Apple has made AR core to its iPhone strategy. Last year, it touted that its A12 Bionic chip can detect flat surfaces in images to better overlay virtual objects. This lays the groundwork for future AR apps and games using its ARKit 2.0 development tools and, eventually, its AR glasses, which are expected to be an iPhone accessory.
Apple employees have been quietly meeting with AR firms and filing a flurry of AR headset patents that describe how they could work. Plus, a third lens on the high-end iPhone will give the processor more data to work with, which could lead to better holograms.
AR was a big focus for Apple in 2018, and with its headset predicted to launch as early as 2020, it will more than likely remain that way this year.
A13 Chip: Reverse Wireless Charging and Battery Life
Apple’s phone processors are also in charge of managing which apps draw the most battery power. Each new processor has brought boosts in battery life, but this year the introduction of reverse wireless charging will require smarter power management to ensure users aren’t left at zero percent.
Just like Samsung’s Galaxy S10 series, Apple is expected to let users wirelessly charge their AirPods or Apple Watch on the new iPhones’ back panel. This feature can come in extra handy when users are traveling but could sap an iPhone’s juice completely.
Samsung’s S10+ comes with a whopping 4,100 mAh battery, while the iPhone XS Max comes in at 3,174 mAh. Apple will either need to seriously boost its batteries’ capacity or make more effective use of power with its A13 chip.
A13 Chip: What Isn’t Coming
Don’t get your hopes up for 5G support on the 2019 iPhones. Apple recently settled a major legal battle with Qualcomm, securing the chipmaker as the supplier of its future 5G modems. But analysts previously told Inverse that it’s too late in the game for the 2019 iPhones to roll out with 5G.
Gurman didn’t mention anything about 5G support, and Apple CEO Tim Cook side-stepped questions about the future of broadband connectivity during the company’s most recent earnings call. A majority of forecasts see Apple introducing the feature in 2020.
In the meantime, users could receive camera, AR, and Siri upgrades to hold them over.