Google is returning to having humans analyze and assess anonymized audio clips from its users. But it has also taken the big step of automatically opting every user out of the setting that allows Google to save their audio. That’s why you might get an email from Google today: it wants you to opt back in to the program, and it’s trying to provide clearer information about what that actually entails.
These are big moves that affect a large number of people, though Google says the exact number of users receiving the email is confidential. It should land in the inbox of anyone who has interacted with a product that uses Google’s voice AI, including apps like Google Maps and services like Google Assistant.
Here is a PDF of the email, which is being sent to pretty much anyone who has spoken into a microphone with a Google logo next to it. It reads in part:
To give you control over your audio recording setting, we have disabled it for you until you can review the updated information. Go to your Google Account to review and enable the audio recording setting if you wish.
It links to this URL (which I mention because you should never just click a link to an account setting without double-checking it): https://myactivity.google.com/consent/assistant/vaa
It’s hard to remember now, but last summer one of the biggest tech stories was the revelation that every major company was using humans to assess the quality of its AI transcriptions. When some of those audio recordings started to leak, Google, Amazon, Apple, Microsoft, and Facebook all came under fire.
The summer of scandals in 2019 was thus characterized by technical explanations of how machine learning works, apologies, outrage, and walkbacks, until eventually each company started making it easier for users to know what data was stored and how to delete it. I’ll put some stories in a sidebar to give you an idea of how intense it was.
All of those companies got significantly better at providing real disclosures about how audio data is used, and they made it easier to delete that data or withhold it entirely. Most of those big tech companies also went back to using human reviewers to improve their services, with disclosures and/or after asking users for permission again.
But Google did not bring back human reviewers after it paused the practice worldwide in September of last year. When it did pause, it promised: “We will not include your audio in the human review process unless you reconfirm your [Voice & Audio Activity] VAA setting.” Today’s email is that promise being fulfilled, albeit much later than everyone else.
When you click the link in the email, you’re taken to a very short webpage with a YouTube video explaining Google’s policies. You can also click a link for more detailed information about how Google stores and uses audio.
If you choose to allow Google to save your audio, it will be used in two ways. First, there is a period when it is linked to your account. Google uses that data to improve voice matching, and you can go there to view or delete all of it. As of June 2020, the default timeline for automatic data deletion is 18 months.
Your audio is then cut into snippets and “anonymized” before being sent to human reviewers, who check it for transcription accuracy. And since it has been a point of contention, I’ll add that some of those reviewers work for third-party contractors. Only anonymized data is sent to humans, Google says.
A strange caveat to all of this: while Google is switching the audio-saving setting off for everyone, it is not deleting audio that has already been uploaded. If you want that removed, you have to do it yourself. If you don’t bother, however, Google tells me that humans won’t be reviewing audio uploaded during the pause.
If you’d like to opt out of or delete your data from any of these major companies, here are a few links to get you started: