The iPhone X’s new artificial intelligence tools will transform phones over the next 10 years by moving A.I. out of the cloud and bringing the full power of machine learning offline.
CoreML, a new framework included in the iOS 11 software update released in September, lets developers access the on-phone processor for machine learning functions, instead of sending the data back to a server somewhere for analysis. This lets apps provide answers in an instant, according to one of the first developers to take advantage of Apple’s offerings.
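For a sense of what this looks like in practice, here is a minimal sketch of on-device image classification using CoreML together with Apple’s Vision framework. The model name (FlowerClassifier) and the input image (someImage) are hypothetical stand-ins for a developer’s own bundled model and image; everything runs locally, with no network call.

```swift
import CoreML
import Vision

// Wrap a bundled, compiled Core ML model (hypothetical "FlowerClassifier")
// so Vision can drive it.
let model = try VNCoreMLModel(for: FlowerClassifier().model)

// Build a classification request; the completion handler fires once
// inference finishes on the device itself.
let request = VNCoreMLRequest(model: model) { request, _ in
    guard let results = request.results as? [VNClassificationObservation],
          let top = results.first else { return }
    print("Top label: \(top.identifier), confidence: \(top.confidence)")
}

// Run the request against an image already in memory (someImage is a
// placeholder CGImage) — no data leaves the phone.
let handler = VNImageRequestHandler(cgImage: someImage, options: [:])
try handler.perform([request])
```

Because the model ships inside the app and inference happens on the phone’s own silicon, the answer comes back without the latency or privacy cost of a server round trip.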
“Offline A.I. I think is going to be the new buzzword for the next decade,” Borui Wang, the CEO of Polarr and developer of the app Album Plus, tells Inverse. “It will be almost at the same importance as ‘cloud computing’.”
Companies like Google and Amazon have gone for a largely cloud-based approach to their artificial intelligence offerings. The idea is that devices will send off data to a server somewhere for analysis, providing suggestions like how to respond to an email or what the weather’s like outside. Apple, with software designed to leverage its high-end mobile processors, wants to work those things out without transmitting data over the internet.
“That’s a very different proposition from all the other companies, and I think people should be aware of it,” Wang says.
CoreML, introduced at the company’s annual June developer conference this year, builds on Apple’s Metal graphics tools and applies them to a broader range of tasks. Apple is utilizing the graphics processors in the iPhone 6 and above to provide these machine learning tools. It’s not the first company to repurpose the chips in this way: Nvidia has started developing computers for autonomous cars built on its expertise in this field.
“CoreML lets developers incorporate machine learning technologies into their apps, with all the processing done right on device, so it respects our customers’ data and privacy,” CEO Tim Cook told investors during the company’s earnings call in August.
The results are impressive. Inception V3 benchmarks show an iPhone 7 running CoreML recognizes six times more images per minute than the Google Pixel and Samsung Galaxy S8. With the iPhone 8 and iPhone X, CoreML should run even faster thanks to the A11 Bionic chip, which is specifically designed for machine learning tasks.
A number of developers are already putting CoreML to work in their apps. Pinterest is using it to provide a visual search, while PadMapper is analyzing photos to help users rent out their home. VisualDX can aid doctors by using the camera to identify skin conditions.
Album Plus is aimed at organizing a user’s collection of photos. It can automatically enhance and edit photos, identify people, categorize receipts, rank similar photos based on aesthetics, and more. All of this is done offline, using Apple’s tools to make sure data doesn’t leave the device.
Polarr believes offline A.I. will be a defining technology of the coming years:
No one wants to upload their photos to a server that might leak their behavioral patterns to advertising companies, but most people still need the computing services provided by the cloud, such as image classification and search categorizations. The solution? Move the A.I. services offline to user devices.
Apple’s approach has an advantage for developers like Wang, who only need to support a handful of phone models to reach an audience of around 15 percent of global smartphone users. On Android, where developers have far more components and configurations to account for, it’s trickier to guarantee a smooth experience for offline A.I. apps.
“As a developer, it’s a pain in the ass to figure out how to apply your model to 1,000 phones,” Wang says.
Offline solutions also mean users won’t need an internet connection to get smart responses to queries, and developers won’t need to maintain a server to provide answers. It’s early days, but CoreML could show a different path forward in the drive to more intelligent devices.
“I really believe in this,” Wang says. “Offline A.I. will definitely be the next big thing in mobile development.”