The latest Android 12 beta from Google includes an interesting feature.
Users can now navigate their Android phones with their eyes. Yes, that’s right: a quick look from side to side can open your home screen, settings and apps.
Even more interesting is the fact that you can teach your Android phone to open specific apps and panels with certain facial gestures, say, opening your mouth or raising your eyebrows.
The beta also adds several new options to Camera Switches, the system behind this feature. These include enhanced visual feedback to show how long you have held a gesture, enhanced audio feedback that plays a sound when something on-screen changes in response to a gesture, and the option to keep the screen on while Camera Switches is enabled.
Whilst it is a feature that may get you a few looks in public, it works surprisingly well, which raises the question of why Google has not publicised it nearly as much as it arguably should.
This new face-gesture system is part of Android’s suite of accessibility services. It could be invaluable for people who lack full movement and fine motor control of their arms, hands and fingers, such as those with Parkinson’s disease.
The system allows users to configure up to six face and eye gestures.
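Google has not published the internals of how Camera Switches recognises these gestures, but its own ML Kit Face Detection library exposes exactly the kind of signals such a system needs: head rotation angles and per-face probabilities for smiling and open eyes. The Kotlin sketch below is purely illustrative, not Google’s implementation; the FaceGesture labels, the classify helper and all thresholds are hypothetical, while the ML Kit calls themselves are real API.

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.Face
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Hypothetical gesture labels for illustration; not the names Camera Switches uses.
enum class FaceGesture { LOOK_LEFT, LOOK_RIGHT, SMILE, BLINK, NONE }

// Configure the detector to report classification probabilities (smiling, eyes open).
val detector = FaceDetection.getClient(
    FaceDetectorOptions.Builder()
        .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
        .build()
)

// Map a detected face to a coarse gesture. Thresholds here are illustrative guesses.
fun classify(face: Face): FaceGesture {
    // Head rotation about the vertical axis, in degrees; per ML Kit's convention,
    // positive values mean the face is turned toward the right side of the image.
    val yaw = face.headEulerAngleY
    val smile = face.smilingProbability ?: 0f
    val leftOpen = face.leftEyeOpenProbability ?: 1f
    val rightOpen = face.rightEyeOpenProbability ?: 1f
    return when {
        yaw > 20f -> FaceGesture.LOOK_RIGHT
        yaw < -20f -> FaceGesture.LOOK_LEFT
        smile > 0.8f -> FaceGesture.SMILE
        leftOpen < 0.2f && rightOpen < 0.2f -> FaceGesture.BLINK
        else -> FaceGesture.NONE
    }
}

// Feed camera frames into the detector and react to the first detected face.
fun onFrame(image: InputImage) {
    detector.process(image)
        .addOnSuccessListener { faces ->
            faces.firstOrNull()?.let { face ->
                when (classify(face)) {
                    FaceGesture.LOOK_RIGHT -> { /* e.g. move focus to the next item */ }
                    FaceGesture.SMILE -> { /* e.g. select the focused item */ }
                    else -> { /* no action */ }
                }
            }
        }
}
```

In a real assistive system the thresholds and the gesture-to-action mapping would be user-configurable, which is what the six-gesture setup screen described above amounts to.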