Amazon employees review thousands of audio recordings made by Alexa every day – including excerpts of couples arguing and having sex – an investigation claims.
The clips were accidentally captured by the popular digital assistant – which mistook the sounds for the commands it should be listening for – and sent off for analysis.
Employees of the technology firm review one in every five hundred recordings made by Alexa, whether intentional commands to the assistant or unintended recordings.
According to one privacy expert, the disclosure reveals the extent of the personal information the technology giant holds about its users.
Amazon employees review thousands of audio recordings made by Alexa every day – including excerpts of couples arguing and having sex – a Sun investigation claims
Amazon has an English-speaking team in Bucharest, Romania, that monitors thousands of Alexa recordings every day, the Sun claims, along with similar operations in Boston, Costa Rica and India.
Members of this team are said to have overheard all manner of personal moments accidentally captured by the digital assistant – including arguments, money talks, frank discussions of medical issues and the sound of Alexa users having sex.
'Couples having sex and even what sounded like a sex attack have reportedly been heard by staff,' a former Amazon analyst told the Sun.
'There were times when I heard couples arguing at home, and others where children tried to teach Alexa to swear.'
'We were told to focus on the Alexa commands, but it was impossible not to hear other things.'
'Amazon told us that everyone we listened to had consented, so I never felt I was spying.'
It is estimated that Amazon's Echo and Echo Dot smart speakers – on which Alexa runs – can be found in around 6.5 million homes in the UK alone.
'All Amazon Alexa owners are likely to be shocked by the news that Amazon employees can listen in on their intimate moments,' Kaspersky security researcher David Emm told MailOnline.
'While digital assistants such as the Amazon Echo are supposed to be activated by a specific "wake" word, these devices have repeatedly made headlines for intruding on people's privacy – so the question is, where does this end?'
'These devices are intended to improve our lives, not to spy on us!'
On Twitter, various Alexa users report that their smart speaker devices are accidentally triggered by random sounds – or even for no apparent reason.
As the number of smart speakers in homes and businesses continues to grow, Mr Emm added, consumers 'need to be made aware that what happens at home does not necessarily stay at home'.
'It is an alarming wake-up call for owners of all digital assistants that the vendor has access to a huge volume of your personal information,' he said.
This information is of potential value, he noted, not only to cyber criminals who might gain access to the data, but also to advertisers.
'Such information could be misused, even by trusted parties,' said Mr Emm.
'Amazon, Google and other manufacturers must take action to prevent continued invasions of privacy.'
'They must make clear what information is collected and how it is stored, and offer people the option to opt out of such storage.'
User audio clips were accidentally captured by the popular digital assistant – which mistook the sounds for commands it should be listening for – and sent off for analysis
'We take the security and privacy of our customers' personal information seriously,' an Amazon spokesperson told MailOnline.
'We label a fraction of one per cent (0.2 per cent) of customer interactions to improve the customer experience.'
This information, they explained, helps Amazon train its speech recognition and natural language understanding systems, so that Alexa can better understand users' requests.
'We have strict technical and operational safeguards in place to protect customer privacy, and have a zero-tolerance policy for the abuse of our system,' the spokesperson continued.
'Data associates do not receive information that can identify customers, access to internal tools is highly controlled, and customers can delete their voice recordings at any time.'
In addition, Amazon agents who listen to recordings are bound by strict confidentiality rules, the Sun reported.
WHY ARE PEOPLE CONCERNED ABOUT PRIVACY WITH AMAZON ALEXA DEVICES?
Amazon devices have previously been activated when they were not wanted – meaning the devices could be listening in.
Millions are hesitant to invite the devices and their powerful microphones into their homes, for fear that their conversations are being heard.
Amazon devices rely on microphones that listen out for a key word, and these can be triggered accidentally without the owner's knowledge.
The camera on the £119.99 ($129) Echo Spot, which also acts as a 'smart alarm', will also most likely be pointed directly at the user's bed.
The device has such advanced microphones that it can hear people talking from across the room, even when music is playing.
Last month, British hacker Mark Barnes demonstrated that 2015 and 2016 versions of the Echo could be turned into a live microphone.
Fraudsters could then use this live audio feed to collect sensitive information from the device.
The Sun's investigation comes after similar privacy issues were revealed with Apple's equivalent digital assistant, Siri – which is so sensitive that even the sound of a zip being pulled up can activate it.
'There have been countless recordings of private conversations between doctors and patients, business deals, criminal dealings, sexual encounters and so on,' an anonymous source reportedly said.
In particular, it has been suggested that Siri is especially prone to unintended activation on Apple Watch devices.
'The regularity of accidental triggers on the watch is incredibly high,' the whistleblower reportedly told the Sun.
'These recordings are accompanied by user data showing location and contact information.'
'You can hear a doctor and a patient talking, or people having sex.'
An Apple spokesperson told the Sun that in the UK fewer than one per cent of voice-activated recordings are listened to by staff.
'User requests are not associated with the user's Apple ID,' she added.