Amazon sent 1,700 private Alexa recordings to the wrong person

One customer asked for his user data from Amazon, then was given someone else's private Alexa chats

Amazon has blamed human error for accidentally sending 1,700 private Alexa recordings to the wrong person.

The recordings included all interactions the person had had with their Alexa smart speaker, which turned out to be enough for journalists to identify them and their partner, based on personal details they had said to the voice assistant.

The error happened after a German Amazon customer exercised their right to request all data the retailer holds on them. This right comes from the European Union's General Data Protection Regulation (GDPR), which lets all EU residents demand a copy of the data a company has collected on them.

For Amazon, this includes not only a customer's purchase history but also their interactions with the company's Alexa voice assistant. All Alexa users can access these recordings via the Alexa smartphone app - GearBrain explains how to do this here. The recordings can be listened to and deleted, both from the app and from Amazon's servers, although the company warns that deleting them reduces Alexa's ability to better understand you in the future.

This case was first reported by the German magazine c't, which was given the recordings by the customer who received them by mistake. The recordings included instructions for Alexa to control smart home devices like a thermostat, play music from Spotify, and interact with a Fire TV streaming device.

A report by c't states: "Using these files, it was fairly easy to identify the person involved and his female companion; weather queries, first names, and even someone's last name enabled us to quickly zero in on his circle of friends. Public data from Facebook and Twitter rounded out the picture."

The person whose data was mistakenly shared was described as being "audibly shocked" when the magazine got in touch to say what had happened, and how they had identified him. "He started going through everything he and his friends had asked Alexa and wondered what secrets they might have revealed. He also confirmed that we had correctly identified his girlfriend," the report adds.

While some readers may not be concerned by Amazon mishandling their requests to adjust the thermostat, Alexa-equipped devices have a habit of listening when they shouldn't, after hearing words which merely sound similar to 'Alexa'. These recordings, which could include personal information never intended for the assistant, are still stored on Amazon's servers.

The retailer said in a statement: "This was an unfortunate case of human error and an isolated incident. We have resolved the issue with the two customers involved and have taken steps to further improve our processes. We were also in touch on a precautionary basis with the relevant regulatory authorities."

This embarrassing failure for Amazon comes after it was revealed in May that a conversation between a couple in Portland, Oregon was accidentally sent to a random person in their contacts list by an Alexa device which mistakenly thought it was being asked to do so.

In 2017, just as it went on sale, a faulty Google Home Mini - which uses the Google Assistant instead of Alexa - was found to have recorded almost constantly since being switched on. Like Alexa devices, the Home Mini sent all of these erroneous recordings to Google's servers.

Check out The GearBrain, our smart home compatibility checker, to see other compatible products that work with Amazon Alexa-enabled devices.

