Genius hacker uses AI to help Amazon Alexa respond to sign language
As smart as voice assistants like Siri, Alexa and the Google Assistant are, they risk leaving deaf people behind, because most smart speakers lack a screen and rely entirely on spoken interaction.
Given their ability to help around the home - switching on lights, adjusting the heating, locking doors and controlling other devices - it would be a shame if smart speakers were unable to assist consumers who might benefit more from their abilities than the rest of us.
- How to get started with your new Amazon Echo with Alexa
- How to organize your life with Alexa
- 9 useful ways Alexa can help out in the kitchen
Spotting this shortfall, software developer Abhishek Singh has come up with a solution: a program that uses a webcam to read sign language, then speaks the signed words aloud to a nearby Alexa device.
When the device replies, its spoken messages are heard by the program and written on a computer screen for the deaf user to read.
A video uploaded by Singh to YouTube shows how responsive the application is, recognizing his signing and turning the gestures into spoken words in a matter of seconds - and certainly quick enough for Alexa to understand what's going on.
The video demonstrates how a deaf person can use sign language to ask Alexa to read the weather forecast, convert a measurement from feet into meters, say the current time, and add items to the user's shopping list.
Singh says he used the TensorFlow.js deep learning library to teach the app to recognize sign language; the recognized signs are read aloud by a laptop using Google's text-to-speech system, and the nearby Amazon Echo speaker hears and responds to them.
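The bridge between the two devices can be sketched in a few lines. This is a minimal illustration, not Singh's actual code: the gesture labels, the `SIGN_TO_WORD` table and the `buildUtterance` function are all hypothetical, standing in for the TensorFlow.js classifier's output and the phrase it assembles before handing it to text-to-speech.

```javascript
// Hypothetical mapping from gesture labels (as a sign-language
// classifier might emit them) to spoken words. In the real project,
// a TensorFlow.js model trained on webcam frames produces the labels.
const SIGN_TO_WORD = {
  what: "what's",
  the: "the",
  weather: "weather",
};

// Turn a sequence of classified gesture labels into an utterance
// the laptop can speak aloud for a nearby Alexa device to hear.
function buildUtterance(labels) {
  const words = labels.map((label) => SIGN_TO_WORD[label] ?? label);
  return "Alexa, " + words.join(" ");
}

// In a browser, the utterance would then be spoken with the Web
// Speech API, e.g.:
//   speechSynthesis.speak(new SpeechSynthesisUtterance(utterance));
// and Alexa's spoken reply could be transcribed with the browser's
// SpeechRecognition interface so it can be displayed on screen.
console.log(buildUtterance(["what", "the", "weather"]));
// "Alexa, what's the weather"
```

The key design point is that neither device needs modifying: the laptop simply translates between the visual channel the user can use and the audio channel Alexa expects.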
"The project was a thought experiment inspired by observing a trend among companies of pushing voice-based assistants as a way to create instant, seamless interactions," Singh told Fast Company. "If these devices are to become a central way we interact with our homes or perform tasks, then some thought needs to be given to those who cannot hear or speak. Seamless design needs to be inclusive in nature," he added.
Although Singh's solution uses a laptop placed near an Echo speaker, it doesn't take much imagination to see how such a system could be added to the Amazon Echo Show and Echo Look smart speakers, which include a webcam and a display. Sign language could also be added to the upcoming Google Smart Displays, which are due on sale later this month and give the Google Assistant a screen and camera.