You’ll soon be able to control your iPhone and iPad with your eyes

Controlling iPad with eye movement. Image: Apple

Apple has announced a bunch of new accessibility features that will arrive later this year for iPhone and iPad owners. Notable among them is the ability to interact with iOS and iPadOS interfaces using eye movement, similar to a system already available on Mac hardware.

The company calls it Eye Tracking, and it builds on the foundations of Dwell Control. So far, Dwell Control has been available as part of the Accessibility Keyboard on macOS, allowing users to execute mouse actions using eye and head gestures.

On the iPhone and iPad, Eye Tracking will take only a few seconds to calibrate and will work using the front camera. Once enabled, it will let users with physical disabilities perform swipes, button presses, and other gestures with their eye movements alone.

Dwell actions are also available on the Vision Pro headset, where they are bundled into the AssistiveTouch system under accessibility settings. On the Mac, eye and head gestures enable mouse clicks, drag and drop, and other core UI interactions.

Music Haptics on iPhone. Image: Apple

For users with hearing challenges, Apple is adding an iPhone feature called Music Haptics. Once activated, the iPhone's built-in Taptic Engine will produce vibrations in sync with music playback, using a mix of rhythmic taps, smooth vibrations, and textures.

The feature already works across millions of songs in the Apple Music library. Developers will also be able to use an application programming interface (API) to add vibration-based accessibility feedback, making their apps more inclusive for people with hearing issues.
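Apple hasn't detailed the new API in this announcement, but as a rough idea of what vibration-based feedback looks like in app code today, the existing Core Haptics framework can already play patterns that mix sharp transient taps with smoother sustained buzzes. The snippet below is an illustrative sketch using Core Haptics only; it is not the Music Haptics API itself.

```swift
import CoreHaptics

/// Illustrative sketch: plays a sharp tap followed by a softer buzz
/// using the existing Core Haptics framework (not the new Music Haptics API).
func playSampleHaptics() {
    // Haptic playback requires supported hardware.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    do {
        let engine = try CHHapticEngine()
        try engine.start()

        // A sharp, rhythmic tap at t = 0.
        let tap = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: 0
        )

        // A smoother, sustained vibration starting at t = 0.25 s.
        let buzz = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.6),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
            ],
            relativeTime: 0.25,
            duration: 0.5
        )

        let pattern = try CHHapticPattern(events: [tap, buzz], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    } catch {
        print("Haptics failed: \(error)")
    }
}
```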

Vocal Shortcuts on iPhone. Image: Apple

For people living with speech-related difficulties, Apple is adding a couple of new features to its phones and tablets. The first is Atypical Speech, which relies on machine learning to recognize a person's unique speech patterns so they can perform tasks using voice commands.

Next in line is Vocal Shortcuts, which lets users record custom audio cues and assign them as shortcuts for various on-device tasks, whether single-step or multi-step. Apple says these features have been “designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.”

Vehicle Motion Cues on iPhone. Image: Apple

Another related upcoming feature involves Personal Voice: people who find it hard to say or read long sentences will be able to create a Personal Voice using shorter phrases.

Apple has also developed a wellness feature aimed at motion sickness when using a device in a moving vehicle. Called Vehicle Motion Cues, it shows animated dots on the screen that dynamically align with the vehicle's movement once enabled. The idea is to reduce the sensory conflict between what the eyes see and what the body feels, making it easier for users to read on-screen content.