This week Apple previewed a number of fresh accessibility features for its lineup of smart devices, as well as a new option for communicating with AppleCare via sign language. One standout feature, called AssistiveTouch, allows those with limited mobility to use hand and arm motions to navigate the Apple Watch without ever actually touching the display.
By leveraging hardware that already exists inside the Apple Watch — like the gyroscope and accelerometer — developers at the company have created a series of navigation gestures. The two primary motions Apple has shown off thus far are clenching the whole fist and pinching the thumb and index finger together.
The feature is still in preview stages and will likely include more gestures in the future. It’s impressive even at this point, though, with uses ranging from the simplest of actions (answering a call with a double clench) to the complex (navigating an action menu via wrist twists).
And AssistiveTouch is only one of many new accessibility features coming to the Apple ecosystem.
The full set of new features is pretty remarkable. Apple has put some serious resources behind making its devices more accessible.
Eye-pad — Third-party eye-tracking devices will soon be supported by iPadOS, making it possible to control an iPad with just your eyes. Accessories certified by the Made for iPhone (MFi) program will be able to move a cursor around the iPad display with eye movement alone, as well as complete actions like tapping.
Better VoiceOver — Apple is updating its VoiceOver screen-reading feature to provide better image descriptions for those who are blind or have low vision. The company says this update will allow VoiceOver to describe photos in better detail, down to a person’s positioning relative to other objects in the photo.
FaceTime with hearing aids — Apple devices already support connectivity to hearing aids through the MFi program, but the company is now getting ready to add support for bi-directional hearing aids. These hearing aids have built-in microphones, so they’ll be usable for both audio and video calls.
Along with this support, Apple is adding audiogram recognition to its operating systems, to adjust audio output frequencies based on a user’s hearing.
Sound machine — Soon the full suite of Apple devices will include a “Background Sounds” feature, which will essentially turn your iPhone into a white noise machine. Apple says it will include “balanced, bright, or dark noise, as well as ocean, rain or stream sounds.” The background noise will automatically mix with system notifications and other audio.
Apple has a few other — more minor — accessibility tricks up its sleeve as well, like the ability to customize display and text size settings for each individual app. The features will be rolling out sometime this year.