Apple previews iOS 17 accessibility features: iPhones will be able to speak in your voice after 15 minutes of recording

Personal Voice feature

Personal Voice is aimed at helping people with conditions such as ALS (amyotrophic lateral sclerosis) that put them at risk of losing their ability to speak. The feature lets them create a personalised synthetic voice that sounds close to their natural voice: they read aloud a randomised set of text prompts on their iPhone or iPad, amounting to about 15 minutes of recording. Apple says the feature uses on-device machine learning, so the user's information stays private and secure.
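
Apple has not detailed the developer-facing side of Personal Voice in this preview, so the sketch below is only a hedged guess at how an app might discover such a voice. It assumes the voice is exposed through AVFoundation's existing AVSpeechSynthesisVoice class behind a per-app authorization prompt; the names `requestPersonalVoiceAuthorization` and `.isPersonalVoice` are assumptions, not API confirmed by the article.

```swift
import AVFoundation

// Hedged sketch: assumes a Personal Voice created in Settings is surfaced to
// apps through AVSpeechSynthesisVoice behind an authorization prompt. The
// authorization call and the `.isPersonalVoice` trait are assumptions, not
// names confirmed by this preview.
func findPersonalVoice(completion: @escaping (AVSpeechSynthesisVoice?) -> Void) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else {
            completion(nil) // user declined, or no Personal Voice has been created
            return
        }
        // Return the first installed voice flagged as a personal voice, if any.
        let personal = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }
        completion(personal)
    }
}
```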

How Personal Voice works

So how does Personal Voice work? It works alongside Live Speech, another new accessibility feature that lets users type a message and have it spoken aloud during phone calls, FaceTime calls and in-person conversations. Users can also save commonly used phrases so they can drop them into a conversation quickly. Live Speech is designed for people who are unable to speak or who are losing the ability to speak, and because Personal Voice is integrated with it, the spoken messages can go out in the user's own synthesised voice.
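
Developers cannot ship Live Speech itself, but the core idea of typing text, keeping a small bank of saved phrases and hearing it all spoken aloud can be illustrated with the long-standing AVSpeechSynthesizer API. The snippet below is a minimal sketch of that idea, not Apple's implementation (which also routes the audio into phone and FaceTime calls); the `PhraseSpeaker` type is a hypothetical helper invented for illustration.

```swift
import AVFoundation

// Minimal sketch of the Live Speech idea: speak typed text with the system
// synthesizer and keep a list of commonly used phrases ready to reuse.
final class PhraseSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    // Commonly used phrases the user can drop into a conversation quickly.
    var savedPhrases: [String] = ["Thank you", "I need a few more minutes"]

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // A Personal Voice could be substituted here (see the earlier sketch);
        // a stock system voice is used as a placeholder.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}

// Usage: speak a typed message or a saved phrase.
let speaker = PhraseSpeaker()
speaker.speak("I'll call you back in ten minutes.")
speaker.speak(speaker.savedPhrases[0])
```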

More iOS 17 accessibility features

  • Apple’s new set of features also includes Assistive Access, which creates a simpler interface with high-contrast buttons and large text labels. It combines Phone and FaceTime into a single calling app and keeps only widely used apps such as Messages, Camera, Photos and Music. Users can choose a visual, grid-based layout or a row-based layout if they prefer text.
  • To help users who prefer communicating visually, the Messages app has an emoji-only keyboard and the option to record a video message.
  • Apple has also added Point and Speak to the Magnifier app on iPhone and iPad, making it easier for people with vision disabilities to interact with physical objects that carry text. It combines input from the camera, the LiDAR Scanner and on-device machine learning to read out the text on physical objects, a microwave for example: as users move their finger across each label, the iPhone or iPad reads out what is written on it (a rough sketch of the same recognise-then-speak idea appears after this list).
  • A few more accessibility features round out the set: Made for iPhone hearing devices can now be paired directly with a Mac and customised for hearing comfort, and Voice Control adds phonetic suggestions so users who type with their voice can pick the right word among ones that sound alike.
  • Users can also now pause GIFs in Safari and Messages.
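
As a rough illustration of the recognise-then-speak idea behind Point and Speak (referenced in the list above), the sketch below runs Vision's text recogniser on a single camera frame and hands the result to the speech synthesizer. It is not Apple's implementation, which also uses the LiDAR Scanner to track where the user is pointing; capturing the frame itself is assumed to happen elsewhere.

```swift
import Vision
import AVFoundation

// Keep the synthesizer alive while it speaks.
let synthesizer = AVSpeechSynthesizer()

// Recognise any text in a captured frame and read it aloud.
func readTextAloud(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        // Collect the best candidate string from each detected text region.
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```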
