Amazon Blames Human Error For Sharing Recordings With Another User

An Alexa user in Germany got a lot more than he bargained for when he asked to listen to his personal recordings. Instead of sending him a link to his data, Amazon sent him 1,700 personal recordings from a complete stranger and then said it was a “human error”! Read on for the full story.

Alexa’s Mistake Caused by a ‘Human Error’ – The Full Story

Amazon offers Alexa users the option of checking up on the recordings the smart speaker stores. In the spirit of "ask and you shall receive," you can access your data simply by asking Alexa to send it to you. Normally, you'll soon receive a link where you can download the recorded audio files.

A German Alexa user did exactly that, wanting to check up on the audio Alexa consistently records. After getting his link, he accessed the stored data only to find that it wasn't actually his. Amazon had sent him a link containing his own files along with 1,700 audio files that belonged to a completely different user.

The German user swiftly contacted Amazon and reported the incident. According to Reuters, Amazon did not respond immediately but did quickly remove the data from the link. However, by then, the man had already downloaded everything.

Amazon’s Response

In a statement made to The Washington Post, an Amazon representative said,

“This was an unfortunate case of human error and an isolated incident. We have resolved the issue with the two customers involved and have taken steps to further improve our processes. We are also in touch on a precautionary basis with the relevant regulatory authorities.”

Is This Amazon’s First Mistake?

Well…no.

Earlier this year, a Portland family had a private conversation recorded by their Alexa, which then sent the recording to the phone of one of the family's contacts. Alexa did not inform the family that it was going to send the recording at all.

Back then, Amazon's excuse was that Alexa had heard the background conversation incorrectly and interpreted it as a command to record and send. This is how an Amazon spokesperson excused that mistake:

“Echo woke up due to a word in the background conversation sounding like ‘Alexa’. Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’. As unlikely as this string of events is, we are evaluating options to make this case even less likely.”

What Does All That Mean?

So… Amazon is saying that

  • a) the background conversation somehow perfectly matched what Alexa needed to hear to progress through the command.
  • b) no one in the room heard Alexa asking anything out loud (if you’ve ever heard an Alexa asking for something, you know how far-fetched this excuse is).
  • c) only a situation this ridiculously unlikely could possibly cause Alexa to compromise user privacy.

I don’t know how much I’m willing to ignore the rules of probability in order to believe this one.

Amazon Alexa’s Human Error – My Final Thoughts

Big Tech has been incredibly disappointing this year, especially when it comes to our personal data. That’s why I find it very hard to swallow parts of this story. For one, why was there even any need for a human to be involved in a data retrieval process? It’s one thing to believe that an algorithm can sort through and collect your data; it’s another thing entirely when you add a human element to it. The second thing I can’t bring myself to trust is the idea that this is an “isolated incident”.

This kind of data breach has happened before, and Amazon’s knowledge of this incident rests entirely on the German user’s sense of morality (I must say, bravo). Logic suggests that this might not be an isolated incident, just an under-reported one. Hopefully, Amazon will come out with a detailed and transparent statement addressing the exact procedures it will take to stop this from ever happening again. Until then, though, I’m unplugging my Alexa for good.
