Researchers discover lasers can communicate with microphones
Smart speakers made and sold by Amazon, Google and Facebook can be hijacked by shining a laser pointer at their microphones.
Exploiting this bizarre vulnerability, a hacker could potentially shine a laser at the microphone from outside the target's home, then ask the smart speaker to open the garage door, unlock and start the owner's car, or control a wide range of connected smart home devices.
The researchers published their findings this week in an extensive white paper on a website named after the hack, Light Commands, and documented the attack in a series of YouTube videos.
The flaw is shared by a wide range of devices which use microphones to communicate with a voice assistant, including smartphones and tablets, as well as thermostats, security cameras, smart speakers and displays. Researchers found the hack worked over large distances and through glass windows. They demonstrated the hack working along a 110 meter hallway, aimed using a telephoto camera lens and a tripod.
The researchers said: "The implications of injecting unauthorized voice commands vary in severity based on the type of commands that can be executed through voice...We show how an attacker can use light-injected voice commands to unlock the victim's smart lock-protected home doors, or even locate, unlock and start various vehicles."
Four of the researchers are from the University of Michigan; the fifth is from the University of Electro-Communications in Tokyo.
As well as controlling smart home devices like lights, plugs and a thermostat, the vulnerability could also allow hackers to make online purchases by issuing a buying command to Amazon Alexa, or make phone calls and send messages using Google Assistant - so long as they knew the names of contacts in the victim's address book.
Fortunately, the most serious application of this hack - asking a voice assistant to open or unlock a door - in most cases also requires the attacker to know the victim's PIN, which the voice assistant requests before performing an unlock action. The researchers suggest the PIN could be brute-forced (guessed automatically and repeatedly), but this would take time.
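To see why brute-forcing takes time, consider the arithmetic: a four-digit PIN has 10,000 possible combinations, and each guess has to be spoken aloud to the assistant. A back-of-the-envelope estimate in Python, assuming a hypothetical five seconds per spoken attempt (the per-attempt time is an assumption, not a figure from the paper):

```python
# Rough estimate of the worst-case time to brute-force a 4-digit voice PIN.
# The 5-seconds-per-attempt figure is a hypothetical assumption.
pin_space = 10 ** 4                # 0000 through 9999
seconds_per_attempt = 5            # assumed time to speak one guess
worst_case_seconds = pin_space * seconds_per_attempt
worst_case_hours = worst_case_seconds / 3600

print(f"Worst case: {worst_case_hours:.1f} hours")  # prints "Worst case: 13.9 hours"
```

Even before accounting for lockouts after repeated wrong guesses, the attacker would need to hold the laser on the microphone for many hours, which is why the researchers describe brute-forcing as possible but slow.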
Explaining why microphones can be controlled with light, the researchers say on their website: "The main discovery behind light commands is that in addition to sound, microphones also react to light aimed directly at them. Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio."
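The mechanism the researchers describe is essentially amplitude modulation: the audio waveform is encoded in the brightness of the laser beam, and the microphone responds as if it had heard sound. A minimal sketch of that mapping in Python, where the bias and modulation-depth values are illustrative assumptions rather than figures from the paper:

```python
import math

def intensity_modulate(audio_samples, bias=0.5, depth=0.4):
    """Map audio samples in [-1, 1] to a laser intensity in [0, 1].

    A DC bias keeps the beam on at all times (intensity cannot be
    negative); the audio waveform rides on top as amplitude modulation.
    The bias and depth values here are illustrative only.
    """
    return [bias + depth * s for s in audio_samples]

# One cycle of a 1 kHz test tone sampled at 16 kHz (hypothetical input).
tone = [math.sin(2 * math.pi * 1000 * n / 16000) for n in range(16)]
drive = intensity_modulate(tone)
```

A microphone illuminated by a beam whose brightness varies like `drive` produces an electrical signal shaped like `tone`, which the voice assistant then processes as ordinary speech.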
Smart speaker owners can protect themselves against such a hack by setting up their assistant's voice match system, where the speaker only responds to their voice. That way, hackers would have to impersonate the target's voice to get the device to do what they ask.
Owners could also physically block the microphone of the device, or move it away from any window. The researchers suggest manufacturers could address the issue by ensuring smart speakers only reply when at least two of their microphones hear a voice command, or by fitting a component which blocks light from reaching the microphones.
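The two-microphone defence the researchers suggest rests on a simple asymmetry: a laser spot illuminates a single microphone, while genuine speech reaches all of a device's microphones at comparable levels. A hypothetical check along those lines (the function name and threshold are assumptions for illustration, not from the paper):

```python
def command_is_acoustic(mic_levels, ratio_threshold=0.5):
    """Accept a command only if at least two microphones heard it.

    mic_levels: per-microphone signal levels for the detected command.
    A laser excites one microphone strongly while the others stay near
    silence, so requiring two mics at comparable levels rejects it.
    The 0.5 ratio threshold is an illustrative assumption.
    """
    loudest = max(mic_levels)
    heard = sum(1 for lvl in mic_levels if lvl >= ratio_threshold * loudest)
    return heard >= 2

# Genuine speech: all mics register similar levels.
speech = command_is_acoustic([0.9, 0.8, 0.7])    # True
# Laser injection: only one mic registers a signal.
laser = command_is_acoustic([0.9, 0.02, 0.01])   # False
```

In practice a manufacturer would compare the actual decoded audio across channels rather than raw levels, but the principle is the same.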
Although a hack like this is relatively complex, the researchers list the equipment required to carry it out. Many of the components can be bought from Amazon, though building the entire system would cost over $500.