Amazon employees may be listening to your recorded Alexa conversations


A spokesperson for the company released the following statement: "We take the security and privacy of our customers' personal information seriously".

While Amazon says that the manual listening is done only for "quality control" purposes, to improve its systems' understanding of varying pronunciations, it has emerged that staff are told to ignore distressing audio, such as possible assaults or cries for help.

And when listeners hear you singing in the shower, they don't just laugh about it to themselves, the sources told Bloomberg: they share amusing recordings in an internal chat room otherwise used to seek help in deciphering unclear speech. Privacy fears are likely to be compounded by the report that Amazon has thousands of workers listening in on Echo owners' conversations. Their task is to identify the human speech Alexa doesn't fully understand and add the extra information required for Alexa to respond more adeptly in future.

Amazon staff listen to recordings of customer interactions with voice-based assistant Alexa to help train the artificial intelligence's responses, it has been reported. Human workers manually annotate voice recordings fed into machine learning algorithms to help Alexa guess the best possible answer for a particular situation.

As for how the entire workflow operates, in a broad sense, reviewers listen to a voice recording, transcribe it, and annotate it, noting whether Alexa's understanding of it was accurate or off the mark.
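The review workflow described above can be sketched as a simple data structure. This is only an illustrative guess at the pattern, not Amazon's actual internal tooling; all field and class names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AnnotatedClip:
    """Hypothetical annotation record: a reviewer's transcript of a
    voice clip alongside what the assistant thought was said."""
    clip_id: str
    transcript: str            # what the human reviewer heard
    assistant_interpretation: str  # what the assistant transcribed

    def is_accurate(self) -> bool:
        # The annotation boils down to: did the assistant's
        # interpretation match the reviewer's transcript?
        return (self.transcript.strip().lower()
                == self.assistant_interpretation.strip().lower())

clip = AnnotatedClip("c-001", "play jazz", "play jas")
print(clip.is_accurate())  # False: the interpretation was off the mark
```

Records like this, labelled accurate or not, are the kind of supervised training signal the article says gets fed back into the speech-recognition models.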

The Daily Mail noted that "concerns have been raised by some in the past that smart speaker systems could be used to [listen in on] user conversations, often with the aim of targeting users with advertising". According to the company, "employees do not have direct access to information that can identify the person or account" during the annotation process, but Bloomberg's sources provided screenshots showing that listeners actually receive an account number, device serial number, and first name of the user associated with an audio clip.


Amazon insists it has a zero tolerance policy for "abuse of our system" and claims to use multi-factor authentication and encryption to protect customer recordings during the annotation process.

What about Google Assistant and Siri?

All companies say the clips lack personally identifiable information.

The popularity of smart speakers has grown rapidly since they were first introduced to the United Kingdom in 2016. One of the workers spoken to said up to 100 recordings a day were being transcribed in cases where Alexa had been triggered by something other than the wake word.

Amazon says that if the wake word is not heard, the audio is discarded. You can also listen to and delete previous voice recordings via Amazon's external Alexa privacy page.

"Whether it's Microsoft, whether it's Google, whether it's Apple - you're agreeing to store your data on their servers", said Dimitrelos.

To delete voice recordings created by Siri on an iOS device, go to the Siri & Search menu in Settings and switch Siri off.