Google Launches Accessibility-Focused Project Gameface on Android to Offer Hands-Free Navigation

Google has expanded Project Gameface, an open-source project aimed at making tech devices more accessible, to Android, where it can now be used to control the smartphone interface. The project was first introduced during Google I/O 2023 as a hands-free gaming mouse that can be controlled using head movements and facial expressions. It was designed for people with physical disabilities who cannot use their hands or voice to control devices. The core functionality remains the same, but the Android version adds a virtual cursor that lets users control their device without touching it.

In a post on its developer-focused blog, Google said, “We’re open-sourcing more code for Project Gameface to help developers build Android applications to make every Android device more accessible. Through the device’s camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalised control.” The company also asked developers to use the tools to add accessibility features to their own apps.

Project Gameface collaborated with the Indian organisation Incluzza, which supports people with disabilities. Through the collaboration, the project learned how its technology could be extended to other use cases, such as typing a message, looking for jobs, and more. The team used MediaPipe’s Face Landmarks Detection API and Android’s accessibility service to create a new virtual cursor for Android devices. The cursor follows the user’s head movement, which is tracked via the front camera.
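As an illustration of how such a pipeline fits together, here is a minimal sketch of head-tracked cursor control using the MediaPipe Tasks Face Landmarker in Python. This is not Project Gameface's Android implementation, which drives its cursor through the accessibility service; the model file, camera index, screen size, gain value, and nose-tip landmark index below are illustrative assumptions.

```python
# Minimal sketch: move a virtual cursor with head movement, using the
# MediaPipe Tasks Face Landmarker. Screen size, gain, and landmark index
# are assumptions for illustration, not Project Gameface's actual values.
import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

SCREEN_W, SCREEN_H = 1080, 2400   # assumed phone display resolution
GAIN = 3.0                        # cursor sensitivity (assumption)
NOSE_TIP = 1                      # nose-tip index in the face mesh

options = vision.FaceLandmarkerOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # also needed for the gesture sketch below
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

cap = cv2.VideoCapture(0)          # stand-in for the front-facing camera
neutral = None                     # neutral head pose, captured on first frame
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = landmarker.detect(mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb))
    if not result.face_landmarks:
        continue
    nose = result.face_landmarks[0][NOSE_TIP]   # normalised (0..1) coordinates
    if neutral is None:
        neutral = (nose.x, nose.y)
    # Amplified offset from the neutral pose becomes the cursor position.
    cx = (0.5 + GAIN * (nose.x - neutral[0])) * SCREEN_W
    cy = (0.5 + GAIN * (nose.y - neutral[1])) * SCREEN_H
    cursor = (min(max(cx, 0), SCREEN_W), min(max(cy, 0), SCREEN_H))
    print(f"cursor at ({cursor[0]:.0f}, {cursor[1]:.0f})")
```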

The API recognises 52 facial gestures, including raising an eyebrow, opening the mouth, and moving the lips. These movements can be mapped to a wide range of functions on an Android device. One interesting feature is dragging, which users can employ to swipe the home screen. To create a drag, users define a start and an end point: for example, opening the mouth to mark the start, moving the head, and then closing the mouth to mark the end.
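Continuing the sketch above, a drag like the one described could be implemented as a small state machine over the landmarker's blendshape scores. The "jawOpen" category name comes from the standard 52-blendshape set; the 0.5 threshold and the dispatch_swipe hook are hypothetical choices, not values taken from Project Gameface.

```python
# Sketch of the open-mouth drag gesture: mouth opens -> record start point,
# head movement drags the cursor, mouth closes -> record end point and emit
# the drag. The 0.5 threshold is an assumed trigger level.
JAW_OPEN_THRESHOLD = 0.5

class DragGesture:
    def __init__(self):
        self.start = None   # cursor position where the mouth opened

    def update(self, blendshapes, cursor):
        scores = {b.category_name: b.score for b in blendshapes}
        jaw_open = scores.get("jawOpen", 0.0) > JAW_OPEN_THRESHOLD
        if jaw_open and self.start is None:
            self.start = cursor                # drag begins
        elif not jaw_open and self.start is not None:
            start, self.start = self.start, None
            return start, cursor               # drag completed: start -> end
        return None                            # no completed drag this frame

# Inside the per-frame loop of the previous sketch:
#   drag = gesture.update(result.face_blendshapes[0], cursor)
#   if drag:
#       dispatch_swipe(*drag)  # hypothetical hook into the platform input API
```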

Notably, while this technology has been made available on GitHub, it is now up to developers to build apps that use it, bringing the capability to more users. Apple also recently introduced a feature that uses eye tracking to control the iPhone.


Apple Watch Series 9, Watch Ultra 2 ‘Double Tap’ Gesture Also Works on Some Galaxy Watch Models: Details

Apple Watch Series 9 and Apple Watch Ultra 2 were launched alongside the iPhone 15 series of smartphones at the company’s September event on Tuesday. While Apple did not introduce major health-measurement improvements this year, it teased support for a new Double Tap gesture that lets users perform certain actions on the latest smartwatch models, such as stopping a timer, snoozing an alarm, or pausing music. However, a similar gesture is already available on some Samsung Galaxy Watch models and on older Apple Watch models.

Both Apple and Samsung offer features that let users perform certain tasks with a few hand gestures. On compatible Galaxy Watch models the feature is called Universal Gestures, as 9to5Google points out, while on the Apple Watch it is called AssistiveTouch. Both work on recent smartwatch models from the two companies and require updated software on the wearables.

You can use these finger and wrist-based gestures on an Apple Watch running watchOS 8 or later; per Apple, AssistiveTouch works on the Apple Watch SE and the Apple Watch Series 6 and newer models. Open the Settings app on your Apple Watch and go to Accessibility > AssistiveTouch > Hand Gestures. You can then select from four options (Clench, Double Clench, Pinch, and Double Pinch) and customise the actions performed when you use these gestures.

Similarly, if you have a smartwatch from the Samsung Galaxy Watch 6, Galaxy Watch 5, or Galaxy Watch 4 series, you will have access to the Universal Gestures feature. On a Galaxy Watch running One UI Watch 5, open Settings and go to Accessibility > Interaction and dexterity > Universal gestures. Once the feature is enabled, you can choose from four gestures (pinch, double pinch, make a fist, and make a fist twice) to perform certain tasks. You can also shake your wrist to enable the universal gestures feature.

Apple Watch owners can further customise these actions to perform tasks such as launching Siri or Apple Pay, showing all installed apps, switching to the previous app, and even holding the side button, which can be used to turn off the watch. We will learn more about how the new Double Tap gesture works alongside the existing accessibility features on the Apple Watch Series 9 and Apple Watch Ultra 2 when the devices reach consumers later this month.


Apple Introduces New Accessibility Features Including Door Detection, Live Captions

Apple on Tuesday announced a list of accessibility features aimed at helping users with disabilities. The new features, which are coming to the iPhone, Apple Watch, and Mac later this year, are claimed to use advancements in hardware, software, and machine learning to help people who are visually impaired or have low vision, as well as those with physical or motor disabilities. The features include Door Detection for iPhone and iPad users, Apple Watch Mirroring, and Live Captions. Apple also announced updates to VoiceOver, with 20 additional locales and languages.

One of the most useful accessibility features in the latest updates is Door Detection, which uses the LiDAR sensor on the latest iPhone and iPad models to help users locate a door. The feature uses a combination of LiDAR, the camera, and on-device machine learning to understand how far users are from a door and to describe the door’s attributes, including whether it is open or closed, the company said.

If the door is closed, Door Detection can tell people how to open it: by pushing, turning a knob, or pulling a handle. It is also claimed to read signs and symbols around the door, such as the room number, and even recognise the presence of an accessible entrance symbol.
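Apple has not published how Door Detection is implemented. Purely as an illustration of the underlying idea of fusing a depth sensor with an object detector, the sketch below estimates the distance to a detected door by taking the median depth inside its bounding box; the array shapes and the detector supplying the box are hypothetical.

```python
# Conceptual sketch only: estimate distance to a detected door from a
# LiDAR-style depth map by taking the median depth inside the door's
# bounding box. Not Apple's implementation; all values are illustrative.
import numpy as np

def distance_to_door(depth_map: np.ndarray, bbox: tuple) -> float:
    """Median depth in metres inside bbox = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = bbox
    region = depth_map[y0:y1, x0:x1]
    valid = region[np.isfinite(region) & (region > 0)]  # drop invalid pixels
    return float(np.median(valid))

# Synthetic example: a 4 m scene with a door-sized patch about 1.5 m away.
depth = np.full((480, 640), 4.0)
depth[100:400, 200:320] = 1.5
print(f"Door is roughly {distance_to_door(depth, (200, 100, 320, 400)):.1f} m away")
```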

The Door Detection feature, which will work with the iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 11-inch (2020), iPad Pro 11-inch (2021), and the iPad Pro 12.9-inch (2020) and iPad Pro 12.9-inch (2021), will be available through the pre-installed Magnifier app.

Apple’s Magnifier app will gain a new Detection Mode that provides access to Door Detection. It will also include People Detection and Image Descriptions, two features that can work alone or together with Door Detection to support people who are visually impaired or have low vision.

Alongside the updates to Magnifier, Apple Maps will also get sound and haptic feedback for users who have enabled VoiceOver, helping them identify the starting point for walking directions, the company announced.

The Apple Watch will also get dedicated Apple Watch Mirroring support to let users control the smartwatch remotely from their paired iPhone. The new offering will let users control the Apple Watch using the iPhone’s assistive features, including Voice Control and Switch Control. Users can rely on inputs such as voice commands, sound actions, head tracking, and even external Made for iPhone switches as alternatives to tapping the Apple Watch display.

All this will assist people with physical and motor disabilities.

Apple said that Apple Watch Mirroring uses hardware and software integration, including AirPlay advancements, to give users access to features such as Blood Oxygen, Heart Rate tracking, and the Mindfulness app. The mirroring feature will work with the Apple Watch Series 6 and later models.

Apple Watch users will also get double-pinch gesture support. This will help users answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout — all by using the double-pinch gesture. It will work with AssistiveTouch on Apple Watch.

For deaf users or those with hearing impairments, Apple announced Live Captions on the iPhone, iPad, and Mac. It will be available later this year in beta in English for users in the US and Canada on the iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon.

Live Captions will work with any audio content, including phone and FaceTime calls, video conferencing and social media apps, and streaming media, and even when users are having a conversation with someone next to them, the company said.

(Image: Apple is bringing Live Captions to iPhone, iPad, and Mac users. Photo Credit: Apple)

Users can adjust the font size for ease of reading. In FaceTime, the feature will also attribute auto-transcribed dialogue to call participants, making it more convenient for users with hearing disabilities to communicate with each other over video calls.

On Mac, Live Captions will come with the option to type a response and have it spoken aloud in real time to others in the conversation, Apple said. The company also said that Live Captions will be generated on-device, keeping user privacy and safety in mind.

Apple’s native screen reader — VoiceOver — is also getting 20 additional locales and languages, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. There will also be dozens of new voices that are touted to be optimised for assistive features across all supported languages.

The new languages, locales, and voices will also be available for the Speak Selection and Speak Screen features. Further, VoiceOver on Mac will work with the new Text Checker tool to fix formatting issues such as duplicated spaces or misplaced capital letters.

Apple also introduced additional accessibility features to celebrate Global Accessibility Awareness Day this week. These include Siri Pause Time, which lets users adjust how long the voice assistant waits before responding to a request; Buddy Controller, which lets users ask a care provider or friend to help them play a game; and customisable Sound Recognition, which can be tuned to recognise sounds specific to a person’s environment, such as their home’s unique alarm, doorbell, or appliances.

The preloaded Apple Books app will also include new themes and customisation options, such as bolding text and adjusting line, character, and word spacing, to deliver a more accessible reading experience. Further, starting this week, the Shortcuts app on Mac and Apple Watch will recommend accessibility features based on user preferences via a new Accessibility Assistant shortcut.

Apple Maps will also get a new guide from the National Park Foundation, Park Access for All, to help users discover accessible features, programmes, and services to explore in parks across the US. Additionally, guides from Gallaudet University will highlight businesses and organisations that value, embrace, and prioritise the Deaf community and signed languages.

Users will also get accessibility-focussed apps and stories from developers in the App Store, as well as the Transforming Our World collection in Apple Books, featuring stories by and about people with disabilities. Apple Music will also highlight Saylists playlists, each of which focuses on a different sound.

Similarly, the Apple TV app will feature the latest hit movies and shows featuring authentic representation of people with disabilities.

Users will also get the ability to explore guest-curated collections from the accessibility community’s standout actors, including Marlee Matlin (“CODA”), Lauren Ridloff (“Eternals”), Selma Blair (“Introducing, Selma Blair”), and Ali Stroker (“Christmas Ever After”), among others.

Apple Fitness+ this week will also bring trainer Bakari Williams, who will use American Sign Language (ASL) to highlight features including Audio Hints, short descriptive verbal cues that support users who are visually impaired or have low vision. In addition, Time to Walk and Time to Run episodes will become “Time to Walk or Push” and “Time to Run or Push” for wheelchair users.

ASL will also be a part of every workout and meditation on Apple Fitness+, and all videos will include closed captioning in six languages. Trainers will also demonstrate modifications in each workout to help people requiring accessibility assistance join in.

Apple is additionally launching SignTime to connect Apple Store and Apple Support customers with on-demand ASL interpreters. SignTime is already available for customers in the US using ASL, the UK using British Sign Language (BSL), and France using French Sign Language (LSF). Furthermore, Apple Store locations around the world have already started offering live sessions throughout the week to help customers discover accessibility features on iPhone, and Apple Support social channels are showcasing how-to content, the company said.
