Apple recently released several new accessibility features, a few of which hint at its future artificial intelligence initiatives. These developments not only give people with disabilities more control over their devices, but also offer a glimpse of Apple's upcoming AI projects. Among them are advanced voice recognition capabilities that improve the accuracy of Voice Control and dictation, demonstrating the potential of AI-driven accessibility solutions. Machine learning also lets users with mobility limitations interact with tools like AssistiveTouch in a more personal and natural way.
Eye Tracking for People with Physical Limitations
One such feature, Eye Tracking, lets users operate their iPhone and iPad with just their eyes, powered by artificial intelligence. Aimed primarily at people with physical limitations, it offers a fascinating look at Apple's progress in AI. Setup is quick and easy: users spend only a few seconds calibrating Eye Tracking using the front-facing camera. Apple underlines that all setup and control data stays securely on the device, and none of it is shared with the company.
According to Apple, Eye Tracking works across apps on both iPadOS and iOS, and no additional hardware or accessories are required. With Dwell Control, users can activate on-screen elements and navigate through apps with their eyes alone, performing actions such as button presses, swipes, and other gestures.
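Apple has not described Eye Tracking's internals, but the dwell mechanic at its heart is easy to illustrate. The Swift sketch below is a hypothetical dwell detector, not Apple's implementation: it fires a selection when the gaze point holds within a small radius for a set duration. The `DwellDetector` type, its default thresholds, and the `onDwell` callback are all illustrative assumptions.

```swift
import Foundation
import CoreGraphics

/// Hypothetical dwell detector: fires a selection when the gaze point
/// stays within `radius` points of where it settled for `dwellTime` seconds.
/// An illustrative sketch, not Apple's Eye Tracking implementation.
final class DwellDetector {
    private let radius: CGFloat          // max gaze drift allowed, in points
    private let dwellTime: TimeInterval  // how long the gaze must hold
    private var anchor: CGPoint?         // where the current dwell started
    private var anchorTime: Date?        // when the current dwell started
    private var fired = false            // avoid firing twice for one dwell

    /// Called when a dwell completes; the argument is the dwell location.
    var onDwell: ((CGPoint) -> Void)?

    init(radius: CGFloat = 30, dwellTime: TimeInterval = 1.0) {
        self.radius = radius
        self.dwellTime = dwellTime
    }

    /// Feed this with gaze samples from an eye-tracking pipeline.
    func update(gaze: CGPoint, at time: Date = Date()) {
        guard let anchor, let anchorTime else {
            startDwell(at: gaze, time: time)
            return
        }
        // Gaze wandered too far: restart the dwell at the new position.
        if hypot(gaze.x - anchor.x, gaze.y - anchor.y) > radius {
            startDwell(at: gaze, time: time)
            return
        }
        // Gaze has held steady long enough: trigger the selection once.
        if !fired, time.timeIntervalSince(anchorTime) >= dwellTime {
            fired = true
            onDwell?(anchor)
        }
    }

    private func startDwell(at point: CGPoint, time: Date) {
        anchor = point
        anchorTime = time
        fired = false
    }
}
```

In practice, a caller would feed `update(gaze:)` with screen-space gaze samples and, when `onDwell` fires, synthesize the corresponding tap or gesture at that point.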
What Other Features Has Apple Added?
Apple has also added a new feature called Listen for Atypical Speech, which uses on-device machine learning to improve Siri's ability to understand a broader range of speech patterns. Vocal Shortcuts, meanwhile, lets users teach Siri custom words or phrases that trigger actions such as running shortcuts or launching apps. These improvements are aimed at people whose speech is affected by conditions such as cerebral palsy or amyotrophic lateral sclerosis (ALS).
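Because Vocal Shortcuts can trigger shortcuts, the developer-facing side of the story is the familiar one: an app exposes an action through Apple's App Intents framework, and the user can then attach a custom spoken phrase to it. The framework types below (`AppIntent`, `IntentDescription`, `IntentResult`) are real; the `StartWorkoutIntent` action itself is a hypothetical example.

```swift
import AppIntents

/// A hypothetical app action exposed to the system via App Intents.
/// Once an app ships an intent like this, a user can bind a custom
/// spoken phrase to it with Vocal Shortcuts and run it by voice alone.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"
    static var description = IntentDescription("Begins a new workout session.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would start its workout session here; this sketch
        // only acknowledges the request.
        return .result(dialog: "Workout started.")
    }
}
```

Once an app ships an intent like this, it appears as an action in the Shortcuts app, and a user could attach any phrase of their choosing to it.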
Among the other new accessibility features is Music Haptics, which lets people who are deaf or hard of hearing feel the beat of music through vibrations on their iPhone. Vehicle Motion Cues, meanwhile, is designed to make using an iPhone or iPad in a moving vehicle more comfortable for passengers who are prone to motion sickness. Apple is also updating visionOS, the software that powers Apple Vision Pro, to add system-wide Live Captions.
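The announcement doesn't detail how Music Haptics produces its vibrations, but the building block for this kind of feedback on iPhone is the Core Haptics framework, and a beat-like pulse train can be sketched with it today. The calls below are real Core Haptics APIs; the 120 BPM pattern and parameter values are arbitrary choices for illustration.

```swift
import CoreHaptics

/// Sketch: play a steady 120 BPM pulse train with Core Haptics, the
/// kind of beat-synced vibration Music Haptics surfaces system-wide.
func playBeatPattern() throws {
    // Not all devices have a haptic engine (e.g. most iPads).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp transient "thump" every half second for four seconds.
    var events: [CHHapticEvent] = []
    for beat in 0..<8 {
        events.append(CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
            ],
            relativeTime: Double(beat) * 0.5  // 120 BPM = 0.5 s per beat
        ))
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Transient events like these produce short, percussive taps, which is why they suit beats; sustained textures would instead use continuous events with longer durations.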
In a press statement, Apple said these features "harness Apple silicon, artificial intelligence, and machine learning to further Apple's decades-long commitment to designing products for everyone," combining the power of Apple hardware and software. Apple says they will arrive "later this year," which likely means they will ship with the iOS 18 and iPadOS 18 updates expected this fall.