Apple unveils new accessibility features for iPhone, iPad and Vision Pro


Accessibility is a subject close to Apple’s heart, and at an event hosted in part by Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, the company previewed what it is preparing in this area. The program is ambitious, as Herrlinger’s words make clear: “These new features will impact the lives of a wide range of users, providing new ways to communicate, control their devices and move through the world.” Let’s go through them in detail.


Eye Tracking

Eye Tracking is a function powered by artificial intelligence that uses the front camera of the iPhone or iPad. It relies on the Neural Engine and requires an iPhone 12 or later. The idea is to let people with physical disabilities control their smartphone or tablet with eye movement alone. The function will be built into iOS, so it will not require any additional apps or peripherals. Setup is simple, with very quick calibration and configuration, after which the iPhone or iPad automatically studies the user’s patterns to refine the feature. Eye Tracking works in concert with Dwell Control, an accessibility option already present in iOS, making it possible to press virtual buttons or swipe through the interface with a simple glance. It works with almost all apps, since the underlying accessibility APIs have been available to developers since iOS 15, and its precision in low light is reportedly good.
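Because Eye Tracking drives the same accessibility element tree as VoiceOver or Switch Control, standard iOS controls should be targetable out of the box; only fully custom views need to opt in. As a minimal illustrative sketch (the class name and behavior below are hypothetical, not an Apple sample), here is how a hand-drawn UIKit view could expose itself to assistive features:

```swift
import UIKit

// Illustrative sketch only: Eye Tracking and Dwell Control operate on the same
// accessibility tree as other assistive technologies. Standard UIKit controls
// are exposed automatically; a custom view must declare itself explicitly.
final class DrawnButton: UIView {
    var onTap: (() -> Void)?

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Expose this custom view as a tappable element to assistive tech.
        isAccessibilityElement = true
        accessibilityTraits = .button
        accessibilityLabel = "Play"   // The name assistive features target.
        addGestureRecognizer(UITapGestureRecognizer(target: self,
                                                    action: #selector(didTap)))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func didTap() { onTap?() }

    // Assistive features can also activate the element without a gesture.
    override func accessibilityActivate() -> Bool {
        onTap?()
        return true
    }
}
```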


Music Haptics

Deaf and hard-of-hearing people will now be able to experience music thanks to this new function, which uses the Taptic Engine to generate vibrations that follow the rhythm and nuances of the track being played. Apple says that a large part of the Apple Music catalog, several million titles, is already compatible. Third-party music apps will be able to access the feature through an API.
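Apple has not detailed that third-party API here, but the underlying idea of mapping beats to Taptic Engine pulses can be approximated with the existing Core Haptics framework. The sketch below is only an illustration of the technique, with made-up timings, not Music Haptics itself:

```swift
import CoreHaptics

// Illustrative sketch only: approximates rhythm-driven haptics with Core
// Haptics. The beat timings and intensities below are arbitrary examples.
func playBeatPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One transient "tap" per beat; intensity loosely models dynamics.
    let events: [CHHapticEvent] = stride(from: 0.0, to: 2.0, by: 0.5).map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.9),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
            ],
            relativeTime: t
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```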

Vocal Shortcuts

Vocal Shortcuts is a new feature that lets users create custom voice commands for Siri. The user will be able to launch complex, fully personalized tasks and shortcuts by voice; to set one up, they simply repeat the chosen phrase three times. The function relies on a technology Apple calls Listen for Atypical Speech, which uses AI to improve speech recognition for people with speech difficulties or conditions that affect speech, such as the aftereffects of a stroke.
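Since a custom phrase can trigger a shortcut, the natural way for an app to benefit is to expose its actions as shortcuts via the App Intents framework. A minimal sketch follows; the intent name and its behavior are hypothetical, used here only to show the shape of the API:

```swift
import AppIntents

// Illustrative sketch only: "StartWorkoutIntent" is a hypothetical action an
// app could expose as a shortcut, which a user could then bind to a custom
// phrase in Vocal Shortcuts.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"
    static var description = IntentDescription("Begins a workout session.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here.
        return .result(dialog: "Workout started.")
    }
}
```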

Vehicle Motion Cues

Motion sickness can make life miserable for those who suffer from it, and here too Apple has a remedy. This new feature aims to make using an iPhone or iPad in a moving vehicle more comfortable: when vehicle motion is detected, animated dots appear on the screen to reduce the sensory conflict between what the user sees and what their body feels. The function can be set to activate automatically.
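Apple has not said how the automatic mode decides a vehicle is moving, but one plausible building block is the activity classification already available in Core Motion. As a hedged sketch of that general technique, not Apple's implementation:

```swift
import CoreMotion

// Illustrative sketch only: CMMotionActivityManager reports when the device
// appears to be in a moving vehicle, one plausible trigger for motion cues.
let activityManager = CMMotionActivityManager()

func startVehicleDetection(onChange: @escaping (Bool) -> Void) {
    guard CMMotionActivityManager.isActivityAvailable() else { return }
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity else { return }
        // `automotive` is true when the system infers car travel.
        onChange(activity.automotive)
    }
}
```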


CarPlay

The interface dedicated to in-car screens gains new accessibility features. It will be possible to navigate through menus and apps by voice alone. Sound Recognition arrives to support deaf and hard-of-hearing users: if the system hears a siren, for example, a message is displayed directly on the screen. The CarPlay interface can also be configured to accommodate colorblind and visually impaired users.
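Sound Recognition here is a system feature, but Apple's public SoundAnalysis framework shows the general technique: its built-in classifier can label environmental sounds, including sirens. The sketch below is an illustration under that assumption (the exact label string and threshold are assumptions, not documented CarPlay behavior):

```swift
import SoundAnalysis
import AVFoundation

// Illustrative sketch only: classify an audio stream and react when the
// built-in sound classifier reports a siren with high confidence.
final class SirenObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        // "siren" label and 0.8 threshold are assumptions for illustration.
        if top.identifier == "siren", top.confidence > 0.8 {
            print("Siren detected — show an on-screen alert")
        }
    }
}

func makeAnalyzer(format: AVAudioFormat, observer: SirenObserver) throws -> SNAudioStreamAnalyzer {
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    try analyzer.add(request, withObserver: observer)
    // Microphone buffers would be fed via analyzer.analyze(_:atAudioFramePosition:).
    return analyzer
}
```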

visionOS

The operating system of the Vision Pro will also make strides in accessibility in the coming months with the arrival of Live Captions for FaceTime. Subtitles can be generated automatically during a live conversation, as well as in the built-in apps. visionOS will support several types of Made for iPhone hearing aids, and the interface can be adjusted for users who are visually impaired or sensitive to flashing lights. The accessibility potential of Vision Pro already seems to have convinced Ryan Hudson-Peralta, consultant and co-founder of Equal Accessibility LLC, who declares: “Apple Vision Pro is without a doubt the most accessible technology I have ever used. As someone born without hands and unable to walk, I know the world wasn’t designed with me in mind, so it’s amazing to see that visionOS just works. This speaks to the power and importance of accessible and inclusive design.”
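Live Captions itself is a system-level feature, but the underlying idea, streaming speech-to-text, can be illustrated with Apple's Speech framework. A minimal sketch, with an arbitrary locale and no claim that this is how visionOS does it:

```swift
import Speech

// Illustrative sketch only: streaming transcription with the Speech
// framework, the general technique behind live captioning.
func startCaptioning(onCaption: @escaping (String) -> Void) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else { return }

    let request = SFSpeechAudioBufferRecognitionRequest()
    // Prefer on-device recognition when the locale supports it.
    request.requiresOnDeviceRecognition = recognizer.supportsOnDeviceRecognition

    // A real app would retain the task and feed it microphone buffers
    // via request.append(_:) from an AVAudioEngine tap.
    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result {
            onCaption(result.bestTranscription.formattedString)  // Rolling caption text.
        }
    }
}
```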

Improvements across the board

Apple is also updating nearly all of the accessibility features already present in iOS and iPadOS. VoiceOver gains new voices and a more customizable rotor (its control wheel). Magnifier benefits from a new Reader mode, and its Detection mode can be launched via the Action button on the iPhone 15 Pro. Braille support continues to expand, as do the capabilities of Hover Typing, which makes text entry easier for visually impaired users. Many other improvements are planned to make daily life easier for many people.
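The rotor customization mentioned above is a user-facing setting, but apps can already contribute their own rotor entries through UIKit. A simplified sketch, where the "Headings" entry and the search logic are placeholders for illustration:

```swift
import UIKit

// Illustrative sketch only: expose a custom VoiceOver rotor entry that lets
// the user jump between heading views in a screen.
func makeHeadingsRotor(for headings: [UIView]) -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "Headings") { predicate in
        let current = predicate.currentItem.targetElement as? UIView
        let index = current.flatMap { headings.firstIndex(of: $0) }

        // Step forward or backward through the heading views.
        let next: Int
        switch predicate.searchDirection {
        case .next:     next = (index ?? -1) + 1
        case .previous: next = (index ?? headings.count) - 1
        @unknown default: return nil
        }
        guard headings.indices.contains(next) else { return nil }
        return UIAccessibilityCustomRotorItemResult(targetElement: headings[next],
                                                    targetRange: nil)
    }
}
// Attach with: view.accessibilityCustomRotors = [makeHeadingsRotor(for: headingViews)]
```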
