Apple’s recently unveiled iOS 13 software is inching toward its final form, which is expected to ship to iPhones worldwide in mid-September.
The tech giant seeded the second public beta of the software on Monday, following the initial beta’s launch on June 25.
Both developers and curious Apple users now have the opportunity to test-drive the prerelease software and its smorgasbord of new features. Many of the additions refine how users naturally interact with their iPhones and make it easier to communicate with others on the device.
Apple’s mobile device is now closer to being a pocket-sized electronic assistant than ever before. iOS 13 reminds users to treat themselves on holiday weekends, and even uses augmented reality to enhance FaceTime calls.
Apple fans on Reddit and Twitter have scoured the betas on the hunt for tweaks, updates, and changes from iOS 12. Anyone interested in diving in themselves should enroll in Apple’s Beta Software Program.
Here are five of the most cutting-edge features we’ve spotted so far in the early versions of iOS 13. They all make the iPhone feel more futuristic than ever, pushing the envelope on what your phone can and should do for you.
5. iOS 13: Siri Suggests When You Should Relax
Siri Suggestions, introduced in iOS 12, let the iPhone’s virtual assistant try to predict what action a user wants to take next. One example: It suggests lowering your screen brightness when you walk into a dark room. But the feature has gotten even smarter in iOS 13.
Redditor /u/SaltiMoPho received a notification recommending they turn off their morning alarm before a holiday. (In this case, it was Canada Day, but it’s safe to assume it works for other holidays, too, based on your location.)
4. iOS 13: iPhones Understand Scribbles
Notes is as straightforward an app as they get. It’s the iPhone’s built-in text editor and comes in handy for grocery lists and quick class notes. But iOS 13 is making it easier than ever to actually jot down notes.
Users can now scribble down digital notes with a finger or stylus. To top it off, Redditor /u/siberium noticed that the app could even understand the cursive he wrote out and transcribe it into text. Struggling to read your own bad handwriting could be a thing of the past; iPhones promise to do it for you.
3. iOS 13: iPhones Guide Electric Car Owners to EV Chargers
Any iPhone user with an electric car now gets a specialized notification whenever their EV is low on battery. As long as the car supports CarPlay, the system will direct users to the nearest charging station.
Redditor /u/SlendyTheMan said they had 10 miles of range left when both CarPlay and their iPhone pushed a notification to open the Maps app with directions to a list of EV chargers in the area around them. Both CarPlay and iOS are doing their part to eliminate range anxiety, it appears.
2. iOS 13: AirPods Can Read Your Messages With Siri
iPhone users won’t have to unlock their phones to read and respond to messages anymore; Siri can do it for them. The only catch is that they’ll need a pair of AirPods to access iOS 13’s Announce Messages with Siri feature.
When Redditor /u/qiuChuck connected his AirPods, he received a pop-up notifying him when he’d be able to use the earbuds to text people back, hands-free. Google has offered this capability on its smartphones since the release of the Pixel 3 in 2018. Apple might be late to the party, but it’s finally stepped into the present with this futuristic feature.
1. iOS 13: FaceTime Uses AR to Fake Eye Contact
OK, this is the slightly creepy one: iOS 13 will let users pretend they’re paying attention to the person on the other end using AR. Multiple iPhone users on Twitter noticed the phone slightly alters their eyes using ARKit to make it seem like they’re looking straight into the camera when they’re actually looking at the screen.
The change is incredibly subtle, but it will make it appear that users are making eye contact with whoever they’re FaceTiming. Users will need to go into the FaceTime settings menu and toggle on the “FaceTime Attention Correction” setting if they want to try it out.
Software engineer Dave Schukin explained what’s going on under the hood to bring this feature to life.
“It simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly,” he writes in a tweet with an accompanying video. “Notice the warping of the line across both the eyes and nose.”
Users typically look at the screen when they’re FaceTiming to see the other person. So adding a slight filter to create artificial eye contact is useful for people who want to seem totally wrapped up in a conversation.
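For the curious, the face and eye data Schukin describes is the same information ARKit already exposes to third-party developers. Here’s a minimal Swift sketch, not Apple’s actual implementation, showing how an app could read the per-frame eye transforms and gaze point that a warping effect like this would rely on:

```swift
import ARKit

// Hypothetical example: reading ARKit face-tracking data.
// ARFaceAnchor provides a depth-mapped face mesh plus eye
// transforms, the raw material for an eye-correction effect.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Where each eye sits relative to the face anchor...
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            // ...and the point in face space the user is looking at.
            let gaze = face.lookAtPoint
            print(leftEye, rightEye, gaze)
        }
    }
}
```

Apple’s feature presumably goes a step further, warping the eye region of the video feed based on this data, which is the distortion visible in Schukin’s video.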