Apple will make major changes to a program that reviewed audio recorded by its voice assistant Siri

Apple apologizes for allowing contractors to listen to Siri recordings without users' knowledge and says the program will now apply only to those who opt in

  • After a backlash, Apple is making major changes to its Siri listening program
  • Users are now excluded from the program by default and can choose to opt in
  • Apple will now use its own employees, not contractors, to listen to the audio clips
  • In a blog post, Apple apologized for not living up to its 'high ideals'

Apple said it is now excluding customers by default from a program in which humans listen to audio clips captured by Siri, the voice assistant.

The company announced the decision in a blog post this week, marking its most significant step since it suspended the program earlier this summer.

Apple has been identified as one of many companies collecting audio clips from users in an effort to improve the accuracy of its voice assistant. Unbeknownst to most users, those clips were then reviewed by human contractors.

'As a result of our review, we realize that we have not fully lived up to our high ideals, and for that we apologize,' Apple wrote.



According to the company, it will resume its program later this fall with some major changes.

In addition to opting users out by default, Apple said it will also use only its own employees to review the data. Previously, the company had hired external contractors to do the work.

Some of those contractors were the first to speak anonymously to The Guardian in July, when the program was first reported.

According to them, the deeply personal nature of some of the content was part of what motivated them to make the audio-harvesting practice public.

It is not uncommon for devices such as the Apple Watch or HomePod to inadvertently capture audio not intended for the device, including conversations about sex, personal matters or medical issues.


Although the contractors were allegedly encouraged to report unintended triggers, they say the process was purely technical and that they were given no formal procedure for handling sensitive information.

As part of the new changes, Apple said it will work to ensure that any information collected by accident is deleted.

In addition to opting users out by default, Apple said it will also use only its own employees to review the data. Previously, the company had hired external contractors to do the work (file photo)

Apple said that all the information collected in the program was anonymized, but whistleblowers said the deeply personal nature of some recordings put that anonymity at risk.


Apple is just one of many companies that have recently used human contractors to analyze voice assistant commands.

Among the other devices found to be recording users were Amazon's hugely popular Alexa smart speakers, the Google Home, and Facebook via its audio messaging feature.

Similarly, these devices have regularly collected data that most people would consider private, including conversations about sex, private business and even medical information.

IS YOUR SMARTPHONE LISTENING TO YOU TO TARGET ADS?

For years, smartphone users have been complaining about the creepy feeling that their gadget picks up every word, even when it's in their pocket.

Many share a similar story: they were chatting with friends about a niche product or vacation destination, and shortly afterwards an advertisement on the same theme appeared in their social media apps.


According to Dr. Peter Henway, a senior security advisor at cyber security company Asterisk, these strangely relevant ads are not just a coincidence: your phone regularly listens to what you say.

It is not known exactly what triggers the technology, but Dr. Henway claims it is completely legal and is even covered in the terms of the user agreements for your mobile apps.

Most modern smartphones are loaded with AI assistants, which are activated by voice commands, such as 'Hey Siri' or 'OK, Google'.

A cyber security researcher suggests that these strangely relevant ads are not just a coincidence and that your phone regularly listens to what you say. An expert said it is not known exactly what triggers the technology (stock image)

For years, smartphone users have been complaining about the creepy feeling that their gadget captures every word, even when it's in their pocket (stock image)


These smartphones constantly listen for the relevant keyword or phrase, discarding everything else.

However, keywords and phrases picked up by the gadget can be accessed by third-party apps, such as Instagram and Twitter, when the proper permissions are enabled, Dr. Henway told Vice.

This means that when you chat about needing new jeans or plans for a vacation, apps can plaster your timeline with advertisements for clothing and offers on flights.

Facebook categorically denies that it uses smartphone microphones to collect information for targeted advertising.

The company said earlier that the creepy feeling that your phone is listening to you is just an example of heightened perception, or the phenomenon where people notice things they have recently talked about.

A number of other companies, including WhatsApp, also deny intercepting private conversations and describe the anecdotal evidence as pure coincidence.
