Apple Showcases Software Updates For People With Disabilities

Apple announced that, later this year, powerful software updates will arrive across all of its operating systems to support people with disabilities.

Recently, Apple announced powerful software features developed for people with mobility, vision, hearing, and cognitive disabilities.

The updates, coming later this year, span all of Apple’s operating systems. Users with limb differences will be able to navigate Apple Watch using AssistiveTouch; iPad will support third-party eye-tracking hardware for easier control; and the VoiceOver screen reader will get even smarter, using on-device intelligence to explore objects within images, which will help blind and low-vision users.

Source: Apple

In support of neurodiversity, the company is introducing new background sounds to help minimize distractions. For the deaf and hard-of-hearing communities, Made for iPhone (MFi) will soon add support for new bi-directional hearing aids.

A new service called SignTime, launching on Thursday, May 20, 2021, enables users to communicate with AppleCare and Retail Customer Care using American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, or French Sign Language (LSF) in France, right in their web browsers. When visiting Apple Store locations, customers can also use SignTime to remotely access a sign language interpreter without booking ahead of time.

AssistiveTouch for watchOS enables customers with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls. Using built-in motion sensors like the gyroscope and accelerometer, in combination with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, letting users navigate a cursor on the display through a series of hand gestures, such as a pinch or a clench.
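To give a rough sense of the kind of raw sensor data involved, the Swift sketch below is a hypothetical illustration using the public CoreMotion framework; it is not Apple’s AssistiveTouch pipeline, and the sampling rate and handler logic are assumptions made purely for the example.

```swift
import CoreMotion

// Hypothetical sketch, not Apple's AssistiveTouch implementation: reads the
// gyroscope and accelerometer signals that a wrist-gesture classifier could consume.
let motionManager = CMMotionManager()

func startMotionSampling() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // sample at roughly 50 Hz (assumed rate)
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // These values would be inputs to an on-device gesture classifier,
        // e.g. one distinguishing a pinch from a clench.
        let rotation = motion.rotationRate          // gyroscope
        let acceleration = motion.userAcceleration  // accelerometer, gravity removed
        print("rotation: \(rotation), acceleration: \(acceleration)")
    }
}
```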

iPadOS will add support for third-party eye-tracking devices, making it possible for people to control iPad using just their eyes. Compatible MFi devices track where a user is looking onscreen, and the pointer moves to follow the person’s gaze, while extended eye contact performs an action, such as a tap.

Source: Apple

The company is also introducing new features for VoiceOver, its screen reader for blind and low-vision users. Customers will be able to explore even more details about the people, text, and other objects within images. For example, a photo of a receipt can be navigated like a table: by row and column, complete with table headers. VoiceOver will also be able to describe a person’s position along with other objects within images.
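VoiceOver’s new image exploration happens automatically on-device, but app developers can also supply their own descriptions for images using the standard UIAccessibility properties. The short Swift sketch below is a hypothetical example (the image name and label text are made up) showing how such a description is exposed so VoiceOver can read it aloud.

```swift
import UIKit

// Hypothetical example: an app-supplied description that VoiceOver reads aloud
// when the user focuses this image. "receipt" and the label text are made up.
let receiptImageView = UIImageView(image: UIImage(named: "receipt"))
receiptImageView.isAccessibilityElement = true
receiptImageView.accessibilityLabel = "Photo of a grocery receipt dated May 20"
```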

For additional details, you can visit Apple’s official announcement.