Apple unveils some of the new features in iOS 17 ahead of WWDC23

oriXone


Ahead of WWDC 2023, Apple has just announced accessibility features coming to devices running iOS 17.

New accessibility features in iOS 17

Who would have thought that Apple itself would unveil some features of its next operating system before its official presentation? Specifically, the Cupertino company has previewed accessibility functions: an interface called “Assistive Access”, along with other voice-related features. Last year it likewise announced a number of accessibility features ahead of WWDC.

Accessibility Features in iOS 17 for iPhone and iPad Revealed by Apple

Assistive Access is designed for users with cognitive disabilities. It distills apps and experiences to their essential features in order to lighten cognitive load, Apple explains in its announcement. It focuses on the activities people use most: connecting with family and loved ones, taking and viewing photos, and listening to music.

The experience is streamlined to be simpler and more intuitive. Phone and FaceTime audio and video calls are brought together in a single application, and buttons are high-contrast with larger text. There are also tools for “trusted supporters”, so that someone can help set up the iPhone or iPad for a family member.

Another feature, called “Live Speech”, will let users type what they want to say and have it spoken aloud during phone and FaceTime calls, as well as in in-person conversations. Frequently used phrases can be saved and recalled quickly mid-conversation.

Apple also thought of users at risk of losing their speech and of visually impaired users

Create a voice for people who are starting to lose their speech

“Personal Voice”, another function introduced today, is designed for users at risk of losing the ability to speak. It lets them create a synthesized voice that sounds like their own. For users diagnosed with ALS (amyotrophic lateral sclerosis) or another condition that affects speech, Personal Voice will be there when they need it.

The voice is created by reading a randomized set of short text prompts, recording approximately 15 minutes of audio. Machine learning is used in a way that keeps the user’s information private and secure. The resulting voice can then be used in any phone or FaceTime call.

Another important function is “Point and Speak” in Magnifier, which will let visually impaired users interact with physical objects that carry text labels. For example, when using a household appliance, the feature combines the camera, the LiDAR scanner, and machine learning to announce the text the user points to with their finger. It works with VoiceOver and complements previously released features such as Door Detection, People Detection, and Image Descriptions.

Point and Speak, new accessibility feature in iOS 17

If you want to see all the news in detail, you can consult the press release that the Cupertino company published a few hours ago. Apple has always been at the forefront of accessibility, and its devices show it every day.
