Amazon
Amazon sent 1,700 private Alexa recordings to the wrong person
One customer asked for his user data from Amazon, then was given someone else's private Alexa chats
Amazon has blamed human error for accidentally sending 1,700 private Alexa recordings to the wrong person.
The recordings included all interactions the person had had with their Alexa smart speaker, which was enough for journalists to identify them and their partner based on personal details they had said to the voice assistant.
The error happened after a German Amazon user exercised their right to request all data the retailer holds on them. This right was established by the European Union's General Data Protection Regulation, known as the GDPR, which allows all EU residents to demand a copy of all data companies have collected on them.
For Amazon, this would include their purchasing history and interactions with the company's Alexa voice assistant. All Alexa users can access these recordings via the Alexa smartphone app—GearBrain explains how to do this here. These recordings can be heard but also deleted, both from the app and from Amazon's servers, although the company warns that doing so reduces Alexa's ability to better understand you in the future.
This case was first reported by the German magazine c't, which was given the recordings by the customer who received them by mistake. The recordings included instructions for Alexa to control smart home devices like a thermostat, play music from Spotify, and interact with a Fire TV streaming device.
The c't report states: "Using these files, it was fairly easy to identify the person involved and his female companion; weather queries, first names, and even someone's last name enabled us to quickly zero in on his circle of friends. Public data from Facebook and Twitter rounded out the picture."
The person whose data was mistakenly shared was described as "audibly shocked" when the magazine got in touch to say what had happened and how they had identified him. "He started going through everything he and his friends had asked Alexa and wondered what secrets they might have revealed. He also confirmed that we had correctly identified his girlfriend," the report adds.
While some readers may not be concerned by Amazon mishandling their requests to adjust the thermostat, Alexa-equipped devices have a habit of listening when they shouldn't because they hear words that sound similar to 'Alexa'. These recordings, which could include personal information never intended for the assistant, are still stored on Amazon's servers.
The retailer said, "This was an unfortunate case of human error and an isolated incident. We have resolved the issue with the two customers involved and have taken steps to improve our processes further. We were also in touch with the relevant regulatory authorities on a precautionary basis."
This embarrassing failure for Amazon comes after it was revealed in May that a conversation between a couple in Portland, Oregon, was accidentally sent to a random person in their contacts list by an Alexa device which mistakenly thought it was being asked to do so.
In 2017, shortly after it went on sale, a faulty Google Home Mini - which uses the Google Assistant instead of Alexa - was found to have been recording almost constantly since being switched on. Like Alexa devices, the Home Mini had sent all of these erroneous recordings to Google's servers.
Check out The GearBrain, our smart home compatibility checker, to see which other compatible products work with Amazon Alexa-enabled devices.