Microsoft's new privacy policy admits humans may listen to some Skype and Cortana recordings

Just like every other technology company caught with its hand in the cookie jar this year (hello there, Amazon, Apple, Google, and Facebook), Microsoft, we recently learned, had quietly let human contractors listen to your Skype translations and Cortana voice recordings. That's right: it's not just AI.

But unlike Apple and Google, which each suspended some of this human review after the revelations, Microsoft appears to be merely updating its privacy policy to acknowledge that people do indeed review some of these recordings. One caveat: Microsoft only does this for Skype's translation feature, not for ordinary Skype calls. The company does, however, analyze snippets of Cortana requests and conversations, presumably on every platform, including PC, where it's easier to run more sensitive queries through a web search.

Motherboard spotted the changes, which you can also read for yourself here, here, and here. Here are the key passages to point to:

Our processing of personal data for these purposes includes both automated and manual (human) processing methods. Our automated methods are often related to and supported by our manual methods.

And:

To build, train, and improve the accuracy of our automated processing methods (including AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which the predictions and inferences were made. For example, we manually review short snippets of a small sampling of voice data that we have taken steps to de-identify, to improve our speech services, such as recognition and translation.

And:

When you talk to Cortana or other apps that use Microsoft speech services, Microsoft stores a copy of your audio recordings (i.e., voice data) (…) This may include transcription of audio recordings by Microsoft employees and vendors, subject to procedures designed to prioritize users' privacy, including taking steps to de-identify data, requiring non-disclosure agreements with vendors and their employees, and requiring that vendors meet the high privacy standards set out in European law and elsewhere.

It's true that systems built with machine learning, like most modern speech recognition and natural language processing systems, generally need human oversight to improve: it's not clear how a machine would recognize a false positive unless a person flags it, annotates the data, and feeds it back into the system. And to Microsoft's credit, it does offer a privacy dashboard where you can retroactively delete your voice data.
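That review-and-retrain loop can be sketched in a few lines. To be clear, this is a generic illustration of human-in-the-loop annotation, not Microsoft's actual pipeline; every name and structure here is hypothetical.

```python
# Minimal sketch of a human-in-the-loop correction cycle: an automated model
# makes a prediction, a human reviewer flags the false positive and supplies
# the correct transcript, and the corrected pair is queued for retraining.
# All names (Sample, ReviewQueue, etc.) are illustrative, not a real API.

from dataclasses import dataclass, field
from typing import Optional, List, Tuple


@dataclass
class Sample:
    audio_id: str
    predicted_text: str                    # what the automated model produced
    corrected_text: Optional[str] = None   # filled in by a human reviewer


@dataclass
class ReviewQueue:
    samples: List[Sample] = field(default_factory=list)

    def flag(self, sample: Sample, corrected_text: str) -> None:
        """A human reviewer marks a misrecognition and supplies the right text."""
        sample.corrected_text = corrected_text
        self.samples.append(sample)

    def training_pairs(self) -> List[Tuple[str, str]]:
        """Corrected samples get fed back into the system as new training data."""
        return [(s.audio_id, s.corrected_text) for s in self.samples]


queue = ReviewQueue()
clip = Sample("clip-001", predicted_text="call mom")
queue.flag(clip, corrected_text="call Tom")  # the model misheard; only a human caught it
print(queue.training_pairs())  # [('clip-001', 'call Tom')]
```

The point of the sketch is simply that the correction originates with a person; the automated system has no way of knowing its transcript was wrong until the annotated pair re-enters training.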

(Cortana also appears to be opt-in.)

But the scandal with all of these technology companies is that they didn't see fit to clearly disclose that humans (read: outsourced contractors) would be listening to extremely personal data, like people speaking their exact street address or confidential medical information, or the sounds of sex, into a voice assistant's microphone, and to let us proactively opt out if we decide that's not something we want to invite into our homes.

Apple says a future update will let its customers opt out. Will the other companies follow suit?