
10 times we forgave voice assistants for messing up

Alexa, Siri and Google Assistant keep making mistakes, and we keep on buying them.


Alexa and Siri not only have human-like names, they also have what we might call a human-like tendency to make a mistake, or ten. Those familiar names may also help us forgive these friendly household assistants, which chat with us in their chipper voices, in a way we might not forgive Facebook, for example.

Still, voice assistants are vulnerable not only to mistakes but to hacks, like the recent demonstration of how a smart speaker can be hijacked by shining a laser pointer at its microphone. Although admittedly complex and time-consuming to pull off, the attack requires little more than someone aiming a laser through your window and sending a silent instruction for Alexa to open the garage door and unlock the car, or for Google Assistant to play havoc with the central heating. Unlikely? True. Possible? Yes.

While that hack is probably not going to happen to the vast majority of smart speaker owners, there are a number of mishaps that have already occurred with Siri, Alexa and Google Assistant — all worth keeping in mind as we use these voice assistants more often in our daily lives.

1. A Google Home Mini in someone's bathroom was caught listening almost constantly

Google Home Mini smart speaker. The Home Mini had a feature permanently disabled to fix the spying bug (Image: GearBrain)

Before Google's Home Mini went on sale, a journalist using a review sample discovered its microphone was mistakenly open almost all of the time. Placed in the bathroom (apparently the reporter had run out of space elsewhere), the hockey puck-sized speaker was recording almost everything going on nearby. That data was then uploaded to Google's servers, which we now know Google contractors have, in the past, tapped to listen to some recordings in a bid to understand where the Assistant makes mistakes and how to improve it.

Google was quick to fix this specific microphone issue, but in doing so the company had to permanently disable a feature that let the Assistant be summoned with a tap, as this action was somehow causing the microphone to stay live when it shouldn't have.


2. Home alone, Alexa plays music so loud the police break in to switch her off

Amazon Echo Dot smart speaker. The Amazon smart speaker heard a command to play music from outside (Image: Amazon)

Just a month after the Google Home bathroom spy incident, police kicked down the door of an apartment due to complaints from neighbors over loud music. But instead of finding a loud party, cops were greeted by an Amazon Echo smart speaker, whose Alexa voice assistant was having a 3 am rave. Alone.

The speaker had mistakenly heard something outside which sounded like a command to play music, and dutifully did as it was told. Perhaps a real sign of intelligence would be for Alexa to question whether its owner really did want to turn it up to 11 at 3 am. Sure, this isn't exactly a crippling data breach, but Amazon Echo owner Oliver Haberstroh was still left with a bill for a replacement lock when he returned home.

3. Alexa says: "I have been hacked. Take me to your leader"

Harmless but frankly terrifying, this message was spoken by an Amazon Echo speaker which had been compromised in late 2017 using a Bluetooth vulnerability called BlueBorne. Believed to have affected up to five billion devices globally, BlueBorne was discovered by security researchers who alerted Apple, Microsoft, Google, Amazon and others about their findings before going public.

Devices were quickly patched, and while other products could have been hacked through BlueBorne to spread malware and steal sensitive information — as in, do some real damage — seeing an Echo light up out of nowhere is unsettling. Not to mention Alexa saying something like, "Take me to your leader."

4. Alexa starts laughing for no reason

While 2018 was another bumper year for smart speakers and their voice assistants, it also provided a big dose of creepy when Alexa started laughing for no reason.

The culprit was multiple Amazon Echo speakers mistakenly hearing the command "Alexa, laugh" and dutifully starting to chuckle. Although they (and Google's Home devices) are now rarely set off by accident, at the time they would often hear the wrong thing, then try to reply or take action on what was, or wasn't, said.

Sensing a PR disaster, Amazon quickly changed the command to, "Alexa, can you laugh?" thus reducing the chance of the assistant mistakenly hearing and acting on the instructions.

5. Alexa flaw turned Echo speaker into eavesdropping device

Photo of the Amazon Echo Dot. An Alexa Skill turned the speaker into an eavesdropping device (Image: Amazon)

In April 2018, cybersecurity researchers created an Alexa Skill which looked like an innocent calculator, but instead turned Echo speakers into eavesdropping devices.

The speaker would first work normally, answering aloud the math problem its owner asked. But then, instead of closing the skill and stopping any recording, it kept listening. The recording was then uploaded to the researchers' server, where it was transcribed into a text document. A similar hack, again performed safely by researchers, was reported in August 2018.

Some hacks, including these, are only ever carried out by well-intentioned researchers looking for vulnerabilities to patch. Luckily, the average Echo owner asking Alexa to play music or time a boiled egg is unlikely to need to worry about whether their Skills are a danger to their privacy.

6. Alexa accidentally recorded and shared a private conversation

In May 2018, however, a couple's private conversation was accidentally recorded by their Amazon Echo, then sent to a friend. Alexa (and Google Assistant, too) has been prone in the past to mistakenly hearing its own name, then listening until it hears something which sounds a bit like a command, and acting on it.

In the 2018 case, improbable as it may seem, the Echo mistakenly heard "Alexa," then "send message," then a contact's name, followed by a confirmation word like "right."

7. Amazon sent 1,700 private Alexa recordings to the wrong person

Later that year, in December 2018, Amazon gathered up a huge trove of Alexa recordings requested by one customer, and mistakenly sent them to someone else — private and potentially identifiable recordings delivered to a stranger.

The recordings included all interactions the person had had with their Alexa smart speaker, which turned out to be enough for journalists to identify them and their partner, based on personal details they had said to the voice assistant.

The error happened after a German Amazon user exercised their right to request all data the retailer holds on them. Amazon blamed human error and apologized for the data mishandling.

8. Amazon, Apple, Google and Microsoft admit to listening to your assistant conversations

Apple HomePod. Apple hired contractors to listen to Siri conversations (Image: Apple)

Jumping to 2019, customers learned that tech companies were employing human contractors to listen to a small proportion of the conversations recorded by the major voice assistants, in a bid to improve how they work.

The companies apologized, pledging to be more open about how they monitor and improve their artificial intelligence in the future, but the incidents ripped the curtain back — these weren't just artificially intelligent programs listening to us, these were people. We can forgive Alexa for playing music by accident, but when humans are involved it's a different story.

9. Alexa and Google Assistant hacked to spy on users and steal passwords

Fast-forward to the fall of 2019, and consumers learned that cybersecurity researchers had built skills that could be used to conduct phishing scams and ask for account usernames and passwords, all of which passed through the companies' vetting unnoticed. The skills also left the microphone open and recorded for longer than they should, before transcribing everything and sending it to a would-be hacker.

10. Your smart speakers can be hijacked with a laser pointer

Finally, there's the most recent report of researchers being able to control smart speakers by shining a laser pointer at their microphones. As with most other vulnerabilities outlined in this article, it's unlikely that your speaker will be hacked in this way, given the effort it would take — and even with that investment of time, the hacker might only gain the ability to turn your lights on or off. But it remains a reminder that these connected, always-listening devices have had a turbulent few years.

Voice matching has helped mitigate the risk of hacking, with speakers only giving out sensitive information belonging to the person who made the request, and we expect Amazon, Google and Apple will continue to look for more ways to lock down how these smart assistants operate. Consumers, and their wallets, will demand it.

Check out The GearBrain, our smart home compatibility checker, to see the other compatible products that work with Google Assistant and Amazon Alexa-enabled devices.

