Taking the stance that accessibility is a human right, Apple recently unveiled new software features for people with mobility, vision, hearing, and cognitive impairments. The new SignTime service, which connects Apple Store and Apple Support customers with on-demand sign language interpreters, launched on Thursday, May 20.
Apple Accessibility Software Updates
On Global Accessibility Awareness Day, Apple CEO Tim Cook shared the news of the accessibility software updates on Twitter:
We believe everyone should have the tools they need to change the world. Accessibility is a fundamental right, and we’re always pushing the boundaries of innovation so that everyone can learn, create and connect in new ways. #GAAD https://t.co/oZwQNG7p5x
— Tim Cook (@tim_cook) May 19, 2021
The new SignTime service gives customers remote access to interpreters for American Sign Language, British Sign Language, and French Sign Language. It is launching first in the United States, the United Kingdom, and France. Customers visiting Apple Store locations can also use SignTime to reach a sign language interpreter remotely, without booking ahead of time.
AssistiveTouch for Apple Watch
For watchOS users, Apple is introducing AssistiveTouch, which will allow people with upper-body limb differences to use Apple Watch without ever touching the display or controls. The company says the watch will detect subtle changes in muscle movement and tendon activity using built-in motion sensors such as the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning.
Eye-Tracking Support for iPad
iPadOS will support third-party eye-tracking devices, allowing users to control an iPad with their eyes. According to Apple, compatible MFi devices will track where a person is looking on the screen and move the pointer to follow their gaze.
Explore Images with VoiceOver
Apple’s built-in screen reader, VoiceOver, is being improved to offer more visual information to blind and low-vision users. According to Apple, it will let people explore images containing text and data tables by row and column, and will describe the people and objects in photographs. With Markup, users will also be able to add their own image descriptions.
Made for iPhone Hearing Aids and Audiogram Support
Apple is also adding support for new bi-directional hearing aids to its MFi hearing devices program. This is a significant improvement: the microphones in these new hearing aids allow people who are deaf or hard of hearing to hold hands-free phone and FaceTime conversations.
With the introduction of audiograms, users can upload their hearing test results to Headphone Accommodations, making it easier to amplify quiet sounds and adjust certain frequencies to suit their hearing.
In support of neurodiversity, Apple is also introducing new background sounds to help minimize distractions and promote calm and rest. These include “balanced, bright, or dark noise, as well as ocean, rain, or stream sounds,” and they can be set to play continuously to mask distracting external noise.
Later this year, Apple also intends to let users trigger actions with mouth sounds such as clicks or pops instead of physical buttons, to offer customizable display and font size settings for individual apps, and to add new Memoji options featuring cochlear implants, oxygen tubes, and soft helmets.
Apple believes that everyone should have access to the tools they need to make a positive difference in the world, and the company continues to push the limits of innovation so that everyone can learn, create, and connect in new ways.