Tuesday, May 16, 2023

Apple previews a slew of new accessibility features ahead of WWDC


On Tuesday, Apple previewed a handful of new features for iPhone, iPad, and Mac designed to enhance cognitive, visual, auditory, and mobility accessibility, ahead of Global Accessibility Awareness Day. The features are scheduled to roll out later this year. The preview comes as Apple prepares for its Worldwide Developers Conference, which kicks off on June 5.

One feature, called Live Speech, is geared toward users who are nonspeaking or who have speech disabilities. Live Speech lets someone type what they want to say and have the device speak it out loud. The feature can be used for in-person conversations as well as phone and FaceTime calls. It works on iPhone, iPad, and Mac, and uses the same voices built into the device for Siri. You could type, “Nice to meet you, I…” to introduce yourself, for example, and you can also save frequently used phrases like, “Can I have a black coffee, please?”

Building on this is Personal Voice, which lets users at risk of losing their ability to speak create a voice that sounds like them and then have their typed phrases spoken aloud in that voice. Personal Voice uses on-device machine learning. To train the feature, a person spends about 15 minutes reading a series of text prompts aloud on an iPhone or iPad.


The Magnifier app on iPhone is also getting a new feature called Point and Speak, which lets users with visual impairments point at objects with text labels and have their device read that text out loud. For example, someone could use it to identify the buttons on a microwave. Point and Speak uses the phone’s camera, lidar scanner, and on-device machine learning to detect and read text as you move your finger across different objects. Point and Speak can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions, which help blind and low-vision users navigate and learn about their surroundings.

Designed for people with cognitive disabilities, Assistive Access provides a more focused device interface to reduce cognitive load. That includes large text labels and high-contrast buttons on the iPhone’s home screen and across Calls, Messages, Camera, Photos, and Music. The experience can be tailored to different preferences. For example, someone who prefers communicating visually can use an emoji-only keyboard in Messages or record a video message to send.

The features were announced in a statement by Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives.

Other accessibility updates coming this year include the ability to pair Made for iPhone hearing aids directly to a Mac, and easier text-size adjustment across Mac apps like Finder, Messages, Mail, Calendar, and Notes. Voice Control is also adding phonetic suggestions, so users who type with their voice can choose the correct word among ones that sound alike, such as “do,” “due,” and “dew.”

Apple is also launching SignTime in Germany, Italy, Spain and South Korea on Thursday, allowing Apple Store customers to communicate with employees via sign language interpreters. The service is already available in the US, UK, Canada, France, Australia and Japan.

Apple is one of many companies boosting its accessibility offerings. Other tech giants like Google have rolled out features such as Lookout, which helps blind and low-vision users recognize objects and read documents using their phone’s camera. Last year, Google added a feature called Guided Frame to Pixel phones, which uses audio and haptic cues to give users precise instructions for framing their selfies.


