Apple Unveils New Accessibility Features: Live Speech, Personal Voice and More


Apple has unveiled new accessibility features to assist iPhone users with disabilities and impairments. The tools, slated to arrive on iPhone, iPad, and Mac later this year, include Assistive Access, Live Speech, and Personal Voice, among others. Assistive Access is meant for people with cognitive disabilities and offers a customised experience for Phone and FaceTime, with a distinct interface built around high-contrast buttons and large text labels. Live Speech, in contrast, is designed to help nonspeaking people communicate. For people at risk of losing their ability to speak due to conditions like ALS, Apple is introducing Personal Voice, which uses machine learning to generate a unique personal voice for each user.

To mark Global Accessibility Awareness Day on May 18, the Cupertino-based company announced a slew of new accessibility features on Wednesday (May 17) meant to help users with speech, vision, and cognitive disabilities use Apple devices more effectively. Assistive Access, designed for iPhone and iPad, is aimed at users with cognitive disabilities. It offers a customised experience by merging apps and provides a distinct interface with high-contrast buttons and large text labels. Phone and FaceTime have been combined into a single Calls app, and Messages, Camera, Photos, and Music get the same simplified treatment. This should make it easier for people with accessibility needs to talk to loved ones, share photos, and listen to music. Apple states that Assistive Access was designed with feedback from “people with cognitive disabilities and their trusted supporters”.


With Live Speech, nonspeaking iPhone, iPad, and Mac users can type to speak during calls and conversations. The feature is designed for people who are unable to speak or who have lost their speech over time. During phone calls, FaceTime calls, and in-person conversations, users can type what they want to say and have it spoken out loud.
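Apple has not shared implementation details for Live Speech, but the type-to-speak idea maps onto the long-standing AVSpeechSynthesizer API. The snippet below is a minimal sketch of that general mechanism, not Apple's Live Speech code; the TypeToSpeak class and the hard-coded voice are illustrative assumptions.

```swift
import AVFoundation

// Minimal type-to-speak sketch: speaks typed text aloud, roughly the
// mechanism Live Speech exposes as a system-wide accessibility feature.
// This uses the public AVSpeechSynthesizer API and is not Apple's
// Live Speech implementation.
final class TypeToSpeak {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // A system voice is used here; Live Speech also lets users pick a Personal Voice.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Usage: speak whatever the user typed.
let speaker = TypeToSpeak()
speaker.speak("I'll be there in ten minutes.")
```

Live Speech itself is a system-level feature that can speak into phone and FaceTime calls, which an ordinary app-level synthesizer like this cannot do.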

Other features previewed by Apple include Personal Voice, which allows users to generate an automated voice that sounds like them. The machine learning-powered tool is designed for people who are at risk of losing their ability to speak due to conditions like ALS (amyotrophic lateral sclerosis). iPhone and iPad users can create a Personal Voice for connecting with family and friends by reading a randomised set of text prompts aloud to record 15 minutes of audio. Personal Voice will initially be available for English speakers and can only be created on devices with Apple silicon.
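The article does not cover developer access, and Apple had not detailed it at announcement time. The sketch below is a hedged illustration that assumes the personal-voice additions to the AVSpeechSynthesis framework (requestPersonalVoiceAuthorization and the isPersonalVoice voice trait) and shows how an app might speak with a user's Personal Voice once one exists; voice creation itself happens in system Settings, not in app code.

```swift
import AVFoundation

// Sketch: ask permission to use the user's Personal Voice, then speak with it.
// Assumes the personal-voice APIs on AVSpeechSynthesizer/AVSpeechSynthesisVoice;
// falls back to a standard system voice if no Personal Voice is available.
func speakWithPersonalVoice(_ text: String, using synthesizer: AVSpeechSynthesizer) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Pick the user's Personal Voice if one has been created and shared with apps.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}
```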

Another new feature is Point and Speak, which is meant for users who are blind or have low vision. Point and Speak within the Magnifier app uses the camera and LiDAR scanner on iPhone and iPad to let users interact with physical objects that carry multiple text labels, such as home appliances. The functionality works with VoiceOver and can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment. To begin with, the feature will be available in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.

Additional features coming later in the year include the ability to pair Made for iPhone hearing devices with a Mac, and a Voice Control guide with phonetic suggestions. Voice Control adds phonetic suggestions for text editing so users can correct errors and choose the right word from several that sound similar. These suggestions will be available in English, Spanish, French, and German to start with.

Deaf or hard-of-hearing users will be able to pair these hearing devices directly to a Mac and customise them for their hearing comfort. Users will also be able to turn any switch into a virtual game controller using Switch Control.

Apple is also making things easier for users with low vision by letting them adjust text size across Mac apps such as Finder, Messages, Mail, Calendar, and Notes. Additionally, users who are sensitive to rapid animations will be able to automatically pause images with moving elements, such as GIFs, in Messages and Safari.

Apple’s latest announcement comes as the tech giant gears up for its Worldwide Developers Conference (WWDC). The iPhone maker is expected to showcase iOS 17 and iPadOS 17 at the annual event, which kicks off on June 5.

Separately, Apple says Apple BKC and Apple Saket, its retail stores in India, have been designed with people with disabilities in mind. The company claims that the distance between display tables allows wheelchairs to navigate easily, and that staircases in the stores have Braille on the side rails for visually impaired visitors. Apple also offers portable hearing loops in its stores.


