Sector News

Apple shares its newest accessibility features

Apple celebrated Global Accessibility Awareness Day by sharing new accessibility features, showcasing app developers innovating in augmentative and alternative communication software, partnering with creators to explore how their disabilities translate into powerful creative opportunities, and more.

One highlight is the announcement of a new interface for iPhone and iPad called “Assistive Access”, designed to support individuals with cognitive disabilities and older adults. This simple, easy-to-understand interface distils apps and experiences to their essential features in order to lighten cognitive load. It will allow users to customise native apps, and third-party developers will be able to leverage it as well.

The Shortcuts Accessibility Gallery collection has been refreshed, with the Accessibility Assistant Shortcuts updated to reflect the latest details. A newly added Shortcut is 'Remember This', which helps users with cognitive disabilities create a visual diary in Notes for easy reference and reflection. It can be added to an iPad or iPhone.

Some of the other new features announced are:

  • Live Speech - With Live Speech on iPhone, iPad, and Mac, users can type what they want to say to have it be spoken out loud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support millions of people globally who are unable to speak or who have lost their speech over time.
  • Personal Voice - Personal Voice is a simple and secure way for users to create a voice that sounds like them. Users can create a Personal Voice by reading along with a randomised set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
  • Point and Speak - Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance such as a microwave, Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad. It is built into the Magnifier app on iPhone and iPad, works well with VoiceOver, and can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.

For more detail about these announcements, visit Apple's Newsroom.