Apple Shows Off New Accessibility Capabilities That Combine Hardware, Software, and Machine Learning

    Later this year, software updates will bring new features for navigation, health, communication, and more to users with disabilities.

    Apple showed off new software features that make it easier for people with disabilities to navigate, communicate, and get the most out of their Apple devices. These enhancements combine the company’s newest technologies to give users unique and customized tools, and they continue Apple’s long-standing commitment to making products that work for everyone.

    People who are blind or have low vision can use Door Detection on iPhone and iPad to navigate the final few feet to their destination; users with physical and motor disabilities who rely on assistive features such as Voice Control and Switch Control can fully control Apple Watch from their iPhone with Apple Watch Mirroring; and the Deaf and hard of hearing community can follow Live Captions on iPhone, iPad, and Mac, thanks to advancements in hardware, software, and machine learning. Apple is also adding more than 20 new languages and locales to VoiceOver, its industry-leading screen reader. These enhancements will be released as part of Apple platform software updates later this year.

    “Apple embeds accessibility into every facet of our work, and we are committed to providing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of Accessibility Policy and Initiatives. “We’re thrilled to introduce these new capabilities, which bring together the best of Apple’s intelligence and creativity to give users more options for using our devices in ways that best suit their needs and lives.”

    Door Detection for People Who Are Blind or Have Low Vision

    Apple has introduced Door Detection, a cutting-edge navigation feature for people who are blind or have low vision. When users arrive at a new location, Door Detection can help them locate a door, judge how far away they are from it, and describe the door’s attributes, including whether it is open or closed and, if closed, whether it can be opened by pushing, turning a knob, or pulling a handle. Door Detection can also read nearby signs and symbols, such as a room number in an office or the presence of an accessible entrance symbol. This new feature will be available on iPhone and iPad models with the LiDAR Scanner, combining the power of LiDAR, the camera, and on-device machine learning.

    Door Detection will be part of a new Detection Mode in Magnifier, Apple’s built-in app for users who are blind or have low vision. In Detection Mode, users can turn to Door Detection, People Detection, and Image Descriptions individually or together, giving them a one-stop shop of customized tools to help them navigate and get detailed descriptions of their surroundings. In addition to the navigation tools inside Magnifier, Apple Maps will offer sound and haptic feedback for VoiceOver users to identify the starting point for walking directions.

    Increasing Physical and Motor Accessibility for Apple Watch

    Apple Watch becomes more accessible than ever for people with physical and motor disabilities with Apple Watch Mirroring, which lets users control Apple Watch remotely from their paired iPhone. With Apple Watch Mirroring, users can control Apple Watch using iPhone’s assistive features, such as Voice Control and Switch Control, and can use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring combines hardware and software, including advances built on AirPlay, so that users who rely on these mobility features can benefit from Apple Watch apps such as Blood Oxygen, Heart Rate, and Mindfulness.

    In addition, users can control Apple Watch with simple hand gestures. With a double-pinch gesture, users can answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout, among other actions. This builds on Apple Watch’s breakthrough AssistiveTouch technology, which lets users with upper-body limb differences control Apple Watch with gestures such as a pinch or a clench, without having to tap the display.

    Live Captions for Deaf and Hard of Hearing Users on iPhone, iPad, and Mac

    Apple is introducing Live Captions on iPhone, iPad, and Mac for the Deaf and hard of hearing community.

    Users can follow along with any audio content more easily, whether they are on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone nearby. Users can also adjust the font size to make the captions easier to read. Live Captions in FaceTime attribute auto-transcribed dialogue to call participants, making group video calls even more accessible for users with hearing disabilities. When Live Captions are used for calls on Mac, users can type a response and have it spoken aloud in real time to others on the call. Because Live Captions are generated on the device, user information stays private and secure.

    New Languages and More for VoiceOver

    VoiceOver, Apple’s industry-leading screen reader for blind and low-vision users, is adding support for more than 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese.

    Users will also be able to select from hundreds of additional voices, optimized for assistive features, across a range of languages. These new languages, locales, and voices will be available for the Speak Selection and Speak Screen accessibility features as well. And on Mac, VoiceOver users can use the new Text Checker tool to find common formatting issues such as duplicate spaces or missing capital letters, making it easier to proofread documents or emails.