New Android accessibility features let users control phones with facial movements
The push to make smartphones easier to use for people with limited motor capabilities has produced some truly cool, and hopefully helpful, new features.
Thursday, Google announced an expansion of its accessibility settings as well as a new app that will let people navigate their phones with facial gestures.
The feature, part of the Android Accessibility Suite, is called Camera Switches. Previously, Google let users who couldn’t navigate phones by touchscreen connect a manual switch device to scroll and select.
Now, the new “switch” is an Android phone’s camera and a person’s face. Users can assign facial gestures such as looking left or right, smiling, or raising their eyebrows to functions like click, back, scroll, and more. The phone’s camera recognizes these gestures and translates them into the corresponding actions.
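For a concrete sense of what that assignment looks like, here is a minimal Kotlin sketch of gestures mapped to phone functions. It is not Google’s implementation; every name in it (FacialGesture, PhoneAction, switchConfig) is invented purely for illustration.

```kotlin
// Hypothetical sketch only; not Google's Camera Switches code.
// All names (FacialGesture, PhoneAction, switchConfig) are invented for illustration.

enum class FacialGesture { LOOK_LEFT, LOOK_RIGHT, SMILE, RAISE_EYEBROWS }

enum class PhoneAction { CLICK, BACK, SCROLL_FORWARD, SCROLL_BACKWARD }

// A user-configurable mapping: each gesture is assigned to a phone function.
val switchConfig: Map<FacialGesture, PhoneAction> = mapOf(
    FacialGesture.SMILE to PhoneAction.CLICK,
    FacialGesture.LOOK_LEFT to PhoneAction.BACK,
    FacialGesture.RAISE_EYEBROWS to PhoneAction.SCROLL_FORWARD,
)

// Called whenever the camera-based recognizer reports a gesture.
fun onGestureDetected(gesture: FacialGesture) {
    when (switchConfig[gesture]) {
        PhoneAction.CLICK -> println("perform click")
        PhoneAction.BACK -> println("navigate back")
        PhoneAction.SCROLL_FORWARD -> println("scroll forward")
        PhoneAction.SCROLL_BACKWARD -> println("scroll backward")
        null -> Unit // gesture not assigned to anything; ignore it
    }
}

fun main() {
    onGestureDetected(FacialGesture.SMILE)      // prints "perform click"
    onGestureDetected(FacialGesture.LOOK_RIGHT) // unassigned, so nothing happens
}
```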
Camera Switches will be rolling out to Android phones (not just Pixels!) in the coming weeks.
The second feature requires downloading an app but takes the idea of gesture control further. The app, Project Activate, will let people customize gestures to send common messages, perform actions, or even speak on their behalf. One example Google gives: the app can be programmed so that a gesture like looking left sends a text to a caretaker that says, “Please come here.”
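A similarly hypothetical Kotlin sketch of that kind of customization: a gesture bound to a user-defined action, such as texting a caretaker or speaking a phrase aloud. The types and the println stand-ins are assumptions for illustration only; a real app would route these through Android’s messaging and text-to-speech services.

```kotlin
// Hypothetical model of a customizable gesture binding; not Project Activate's real code.
sealed class ActivateAction {
    data class SendText(val to: String, val message: String) : ActivateAction()
    data class Speak(val phrase: String) : ActivateAction()
}

// The user pairs a gesture (identified here by a simple name) with the action it triggers.
data class GestureBinding(val gesture: String, val action: ActivateAction)

val bindings = listOf(
    GestureBinding("look_left", ActivateAction.SendText(to = "caretaker", message = "Please come here.")),
    GestureBinding("smile", ActivateAction.Speak("Thank you!")),
)

fun onGesture(gesture: String) {
    val binding = bindings.firstOrNull { it.gesture == gesture } ?: return
    when (val action = binding.action) {
        // Stand-ins: a real app would send an SMS or hand the phrase to text-to-speech.
        is ActivateAction.SendText -> println("text ${action.to}: ${action.message}")
        is ActivateAction.Speak -> println("speak aloud: ${action.phrase}")
    }
}

fun main() {
    onGesture("look_left") // prints: text caretaker: Please come here.
}
```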
Google notes that, based on feedback from users and advocates, giving people the ability to customize both the gestures and the actions the phone takes was crucial for these features to be useful. As gesture control shows, it’s interesting to see how helping people with disabilities navigate the world with their smartphones pushes engineers to create innovative and, frankly, super cool products in their own right.