
Smart speakers listen in on users up to 19 times a day after mishearing random words from TV

A new study has found that smart speakers are accidentally activated up to 19 times a day, for up to 43 seconds at a time, by words they mishear from people talking in the same room or from televisions playing nearby.

That means that speakers running popular virtual assistants such as Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and the unnamed digital helper on Google Home and Nest devices are listening in on unsuspecting users.

The Northeastern University study found that the smart speakers were triggered by words they had misheard, sometimes responding to bizarre phrases, about two to 19 times a day.


Apple’s HomePod and the Google Home Mini smart speakers were included in a Northeastern University study, which found that such devices randomly record people after mishearing words, sometimes even words broadcast into the room by television programs

Amazon’s Echo devices were also included in Northeastern University’s research, which found that smart speakers randomly started recording people up to 19 times a day


A Northeastern University study of smart speakers, including the Harman Kardon Invoke speaker powered by Microsoft’s Cortana, showed that such devices can switch on at random and record people for up to 43 seconds each time after misinterpreting wake commands


The most easily confused, according to the February 2020 study, was Siri, the assistant that responds on Apple’s HomePod.

Although it is designed to respond to “Hey Siri,” the report says that on the Apple HomePod “activations took place with words that rhymed with or sounded similar to the term.”

Examples include ‘He clear’, ‘Very sorry’, ‘Hey sorry’, ‘Okay, yes’, ‘And seriously’, ‘Hello ma’am’, ‘Faith’s funeral’, ‘Historians’, ‘I see’, ‘I’m sorry’ and ‘They say’.

More alarmingly, the report found that the speakers switched on at random and stayed on for 20 to 43 seconds.

That means that a smart speaker can listen to users for almost a whole minute.

“Voice assistants such as Amazon’s Alexa, OK Google, Apple’s Siri and Microsoft’s Cortana are becoming increasingly ubiquitous in our homes, offices and public spaces,” the report says.


Apple’s Siri (left) and Amazon’s Alexa were among the virtual assistants examined in the Northeastern University study, which showed that they listened in on unsuspecting users


Google Home and Microsoft’s Cortana were also included in the Northeastern University study, which found the digital assistants eavesdropping after mishearing words they mistook for wake commands

“While useful, these systems also raise significant privacy concerns – namely, what exactly do these systems record from their environment, and does that include sensitive and personal conversations that were never meant to be shared with companies or their contractors?”

When Apple’s HomePod hears these words, it sometimes thinks it is hearing “Hey Siri”:

“He clear”
“Very bad”
“Hey sorry”
“OK, yes”
“And seriously”
“Hello Miss”
“Faith’s funeral”
“Historians”
“I get it”
“Sorry”
“They say”

The report notes that these are “not just hypothetical concerns of paranoid users.”

“There has been a whole series of recent reports of devices continuously recording audio and uploading it to cloud providers, and of contractors transcribing audio recordings of private and intimate interactions,” it explains.

The study highlights how Google admitted in 2017 that its then newly unveiled Google Home Mini speaker had been eavesdropping on users.

Recordings obtained by Belgian broadcaster VRT revealed two years later that the same speakers had been set off again after mishearing certain words.

To make matters worse, the devices were found to have been listening in on private, sometimes intimate conversations.

Recordings of pillow talk, arguing couples, confidential business phone calls and even conversations with children were transcribed by Google contractors in what the tech giant said was an effort to understand different spoken languages.

A team of researchers at Northeastern wanted to go beyond anecdotes, so they played 125 hours of Netflix content, including the streaming service’s original ‘Narcos’, NBC’s ‘The Office’, The WB’s ‘Gilmore Girls’ and other programs containing ‘reasonably large amounts of dialogue.’

The researchers also monitored video feeds of the devices to tell when they lit up, indicating that they had been triggered and were recording in response to a word they had heard.

Researchers at Northeastern University set up several smart speakers in a controlled environment (pictured) to see when they would mishear a wake command, in a study that found the devices eavesdropping on unsuspecting users


The researchers also monitored video feeds of the devices to tell when they were lit up, indicating that they had been triggered and were recording in response to a word they had heard

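A minimal sketch of that kind of light-up detection, assuming a fixed camera aimed at each speaker, a hand-picked region over its light ring and an illustrative brightness threshold (none of which come from the study itself), might look like this in Python:

```python
# Rough sketch, not the researchers' actual tooling: flag the moments in a
# camera recording when a smart speaker's light ring brightens, as a proxy
# for the device waking up. The video path, region of interest and
# brightness threshold below are illustrative assumptions.
import cv2

VIDEO_PATH = "speaker_camera.mp4"   # hypothetical recording of the device
ROI = (200, 150, 80, 40)            # x, y, width, height over the light ring
BRIGHTNESS_THRESHOLD = 120          # mean grey level that counts as "lit"


def find_activations(video_path, roi, threshold):
    """Return (start_s, end_s) spans where the ROI stays brighter than the threshold."""
    x, y, w, h = roi
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    spans, start, frame_idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        patch = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        lit = patch.mean() > threshold
        if lit and start is None:            # light just came on
            start = frame_idx / fps
        elif not lit and start is not None:  # light just went off
            spans.append((start, frame_idx / fps))
            start = None
        frame_idx += 1
    if start is not None:                    # still lit when the video ends
        spans.append((start, frame_idx / fps))
    cap.release()
    return spans


if __name__ == "__main__":
    for begin, end in find_activations(VIDEO_PATH, ROI, BRIGHTNESS_THRESHOLD):
        print(f"possible activation from {begin:.1f}s to {end:.1f}s "
              f"({end - begin:.1f}s long)")
```

Any span lasting more than a second or so would then be a candidate activation to line up against the dialogue being played at that moment.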

Specifically tested were the following:

– Google Home Mini, 1st generation, which uses the wake phrases “OK Google”, “Hey Google” or “Hi Google”

– Apple HomePod, 1st generation, which relies on the phrase “Hey Siri”

– Harman Kardon Invoke from Microsoft, which uses the wake word “Cortana”

– Two Amazon Echo Dots, 2nd generation, which use the wake words “Alexa”, “Amazon”, “Echo” or “Computer”

– Two Amazon Echo Dots, 3rd generation, which use the wake words “Alexa”, “Amazon”, “Echo” or “Computer”

None of the devices appeared to be recording conversations constantly. “The devices do wake up frequently, but often for short intervals (with a few exceptions),” the report says.

The Office and Gilmore Girls were found to be responsible for most activations.

“These two shows have more dialogue than the others, which means that the number of activations is at least partly related to the amount of dialogue,” the report says.

NBC’s ‘The Office’ was one of the shows that set off the smart speakers more often than others in the Northeastern University study, which found the devices listening in on unsuspecting users


Shows with more dialogue than others, such as The WB’s ‘Gilmore Girls’, were found in the Northeastern University study to activate smart speakers that recorded unsuspecting users

