Assistive technology has come a long way. Today, there are apps like Live Transcribe, which transcribes speech to text on a phone’s screen in real time. Then there is Signily, an American Sign Language keyboard app that comes with both left- and right-handed gestures, a QWERTY layout and numbers from 1 to 30. There are also plenty of object-recognition apps out there, like iDentifi, which are useful for people with low vision or other visual impairments.
Now, a new app lets people use the power of their gaze to build conversations and express themselves. Look to Speak is a recently launched app by Google that allows people to use their eyes to select pre-written phrases and have them spoken aloud from their Android device. The app also lets users edit and customise phrases and toggle gaze settings.
Compatible with Android 9.0 and above, including Android One, Look to Speak is the result of a ‘Start with One’ project on the Experiments with Google platform. “Start with One, Invent for Many” is essentially a collection of experiments, each of which began by working with one person to make impactful tools for them and their community, the Experiments with Google website explains.
The key person behind Look to Speak, apart from a small team at Google, is artist Sarah Ezekiel, who was diagnosed with motor neurone disease in 2000. Earlier this year, Ezekiel and her speech and language therapist Richard Cave collaborated with Google to explore how machine learning on smaller devices could make eye-gaze communication technology more accessible to people with different types of temporary, permanent or situational disabilities.
“Throughout the design process, we reached out to a small group of people who might benefit from a communication tool like this. What was amazing to see was how Look To Speak could work where other communication devices couldn’t easily go—for example, outdoors, in transit, in the shower and in urgent situations,” Cave, a speech and language therapist for more than a decade, writes in a post on Google’s official blog The Keyword.
Once inside the app, a user looks left, right or up to seamlessly choose what they want to convey from a list of pre-written phrases. Each time they look off screen, an option is selected, a tutorial video explains. From the app’s main screen, a user looks left or right to select the list that contains the phrase they want the device to speak aloud. With each selection, the list of phrases narrows, and the process continues until the desired phrase is reached. Looking up at any point lets a user cancel and start the selection over. Looking up also allows a user to snooze the app; once in snooze mode, the app can be activated again through a sequence of gazes, or by tapping anywhere on the screen.
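The narrowing process described above works much like a halving search: each left or right gaze discards half of the remaining phrases until only one is left. The sketch below is a hypothetical illustration of that idea, not Google’s actual implementation; the phrase list and function name are invented for the example.

```python
# Illustrative sketch (not Google's code) of gaze-driven phrase
# selection: each "left"/"right" gaze keeps one half of the remaining
# phrases; "up" cancels, mirroring the app's restart gesture.

def select_phrase(phrases, gazes):
    """Narrow a phrase list with a sequence of gaze directions.

    phrases: list of candidate phrases (hypothetical examples)
    gazes:   iterable of "left", "right" or "up"
    Returns the selected phrase, or None if cancelled/unfinished.
    """
    candidates = list(phrases)
    for gaze in gazes:
        if gaze == "up":  # looking up cancels the selection
            return None
        mid = len(candidates) // 2
        # a left gaze keeps the first half, a right gaze the second
        candidates = candidates[:mid] if gaze == "left" else candidates[mid:]
        if len(candidates) == 1:  # narrowed down to a single phrase
            return candidates[0]
    return None

phrases = ["Hello", "Thank you", "I'm thirsty", "Please help"]
print(select_phrase(phrases, ["left", "right"]))  # prints "Thank you"
```

With four phrases, two gazes are enough to reach any phrase, which is why the app can feel fast despite using only three gestures.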
As Cave explains in the blog post, eye-gaze technology already helps people type messages on a communication device and share them using just eye movement. In that sense, the Look to Speak app is a useful step up.