Google Launches Accessibility-Focused Project Gameface on Android to Offer Hands-Free Navigation

Google has expanded Project Gameface, an open-source project aimed at making technology more accessible, to Android, where it can now be used to control the smartphone interface. The project was first introduced at Google I/O 2023 as a hands-free gaming mouse controlled with head movements and facial expressions, designed for people with physical disabilities who cannot use their hands or voice to control devices. Keeping the core functionality the same, the Android version adds a virtual cursor that lets users control their device without touching it.

In an announcement on its developer-focused blog, Google said, “We’re open-sourcing more code for Project Gameface to help developers build Android applications to make every Android device more accessible. Through the device’s camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalised control.” The company also encouraged developers to use the tools to add accessibility features to their own apps.

The Project Gameface team collaborated with Incluzza, an Indian organisation that supports people with disabilities. Through the collaboration, the project learned how its technology could be extended to use cases such as typing a message, searching for jobs, and more. The team used MediaPipe’s Face Landmarks Detection API and Android’s accessibility service to create a new virtual cursor for Android devices. The cursor follows the user’s head movement, which is tracked via the front camera.
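As a rough illustration of the cursor mechanic described above, the sketch below maps a normalized face-landmark position (trackers such as MediaPipe's Face Landmarker report coordinates in the 0–1 range) to a virtual cursor position on a phone-sized screen. The class name, gain, smoothing values, and screen resolution are all illustrative assumptions, not Google's actual implementation.

```python
# Illustrative sketch: translate a normalized head/landmark position
# (x and y each in [0, 1], as a face tracker might report) into a
# virtual cursor position. All names and constants are hypothetical.

SCREEN_W, SCREEN_H = 1080, 2400  # example portrait phone resolution


class VirtualCursor:
    def __init__(self, gain=2.0, smoothing=0.5):
        self.gain = gain            # amplifies small head movements
        self.smoothing = smoothing  # 0 = no smoothing, 1 = frozen
        self.x, self.y = SCREEN_W / 2, SCREEN_H / 2

    def update(self, lm_x, lm_y):
        """Advance the cursor toward the tracked head position for one frame."""
        # Centre the normalized landmark around 0.5 and apply gain.
        target_x = SCREEN_W * (0.5 + self.gain * (lm_x - 0.5))
        target_y = SCREEN_H * (0.5 + self.gain * (lm_y - 0.5))
        # Exponential smoothing suppresses frame-to-frame jitter.
        self.x += (1 - self.smoothing) * (target_x - self.x)
        self.y += (1 - self.smoothing) * (target_y - self.y)
        # Clamp to the screen bounds.
        self.x = min(max(self.x, 0), SCREEN_W)
        self.y = min(max(self.y, 0), SCREEN_H)
        return round(self.x), round(self.y)


cursor = VirtualCursor()
pos = cursor.update(0.6, 0.5)  # head nudged slightly right of centre
```

The gain setting lets small, comfortable head motions cover the whole screen, while smoothing trades responsiveness for stability, a balance the real project exposes as per-user personalisation.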

The API recognises 52 facial gestures, including raising an eyebrow, opening the mouth, and moving the lips. These gestures can be mapped to a wide range of functions on an Android device. One notable feature is dragging, which users can employ to swipe across the home screen. To create a drag, users define a start and an end point, for example by opening the mouth, moving the head, and then closing the mouth once the endpoint is reached.
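The drag interaction above can be sketched as a small state machine: an open mouth starts the drag, head movement continues it, and closing the mouth ends it. The gesture name, 0.5 trigger threshold, and event tuples below are illustrative assumptions; the real API simply reports per-frame scores for each of the 52 gestures.

```python
# Illustrative drag-gesture state machine. We assume only that a face
# tracker reports a per-frame "mouth open" score in [0, 1] alongside the
# current cursor position. Threshold and event names are hypothetical.

OPEN_THRESHOLD = 0.5  # mouth counts as "open" above this score


class DragController:
    def __init__(self):
        self.dragging = False
        self.start = None

    def on_frame(self, mouth_open_score, cursor_pos):
        """Return a drag event for this camera frame, or None."""
        mouth_open = mouth_open_score > OPEN_THRESHOLD
        if mouth_open and not self.dragging:
            self.dragging = True          # mouth just opened: drag begins
            self.start = cursor_pos
            return ("drag_start", cursor_pos)
        if mouth_open and self.dragging:  # mouth held open: drag continues
            return ("drag_move", cursor_pos)
        if not mouth_open and self.dragging:
            self.dragging = False         # mouth closed: drag ends
            return ("drag_end", self.start, cursor_pos)
        return None


drag = DragController()
drag.on_frame(0.8, (100, 500))  # open mouth: drag starts at (100, 500)
drag.on_frame(0.9, (300, 500))  # head moves: drag continues
drag.on_frame(0.1, (500, 500))  # mouth closes: drag ends at (500, 500)
```

Feeding the end event to Android's accessibility APIs as a swipe from the start point to the end point would then produce the home-screen swipe the article describes.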

Notably, while the technology is available on GitHub, it is now up to developers to build apps with it to bring these accessibility options to users. Apple also recently introduced a feature that uses eye tracking to control the iPhone.


Affiliate links may be automatically generated – see our ethics statement for details.


Indus Gets Accessibility Features With Support From Google’s AI-Powered Project Gameface

Pune-based developer SuperGaming has partnered with Google to add accessibility features to its upcoming Indo-futuristic battle royale game, Indus. We’ve yet to see direct gameplay from the PC version, but the studio has confirmed custom support for Project Gameface, letting players control in-game actions using head movements and facial expressions. The feature is aimed primarily at gamers with cognitive or motor challenges and can be set up via a simple face scan through the Gameface app. SuperGaming recently showcased the technology at the Google I/O event in Bengaluru, letting attendees try it first-hand.

Project Gameface is an open-source hands-free technology that uses a standard webcam to scan and read your face, letting you map facial gestures to mouse and keyboard actions. Gesture sensitivity can also be adjusted so the software doesn’t mistake involuntary reflexes for in-game actions. In essence, the technology tries to make use of any expression that could trigger an in-game action, from eyebrow and mouth movements to grins, or even head tilts, which are best configured to move the camera. Google’s tool is still in development, but from the short teaser SuperGaming released, omnidirectional camera movement appears to be present, albeit a bit laggy. Then again, that could simply be down to an unoptimised, in-development PC build, which still hasn’t received a gameplay trailer.

Earlier this month, Bandai Namco Entertainment, best known for publishing Elden Ring and Dark Souls, invested in SuperGaming to build out its ‘IP metaverse.’ The developer is currently working on its battle royale shooter Indus, set on a floating island called Virlok. You play as a Mythwalker, a hired gun working for the COVEN, on the prowl for Cosmium, a rare mineral that can alter space and time. As in other battle royales, players drop onto the map to scavenge for supplies, survive, and eliminate anyone in their way to be the last one standing. At certain intervals, however, Cosmium spawns at a random spot on the map, and collecting it grants the player victory regardless of who else is still alive.

Indus can be played in both first- and third-person modes, with the latter briefly switching to first-person when aiming down sights rather than simply tightening the hip-fire reticle. The game features uniquely designed operators to choose from, though special, tailored abilities will only be added post-launch. As a mobile title, it will be free-to-play from launch day, with monetisation similar to other games in the category: cosmetics and other in-game items purchasable from the store, with no pay-to-win mechanics confirmed yet.

Pre-registration for Indus is now live on the Google Play Store. Further details on the PC and console versions are expected in the future.


