Just like every other technology company caught with its hand in the cookie jar this year – hello there, Amazon, Apple, Google and Facebook – Microsoft, we recently discovered, had quietly been letting human contractors listen to your Skype translations and Cortana voice recordings. That's right: it's not just AI doing the listening.
Our processing of personal data for these purposes includes both automated and manual (human) processing methods. Our automated methods are often related to and supported by our manual methods.
To build, train, and improve the accuracy of our automated processing methods (including AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which those predictions and inferences were made. For example, we manually review short snippets of a small sample of voice data that we have taken steps to de-identify, to improve our speech services, such as recognition and translation.
When you talk to Cortana or other apps that use Microsoft voice services, Microsoft saves a copy of your audio recordings (i.e., voice data) (…) This may include transcription of audio recordings by Microsoft employees and vendors, subject to procedures designed to prioritize user privacy, including taking steps to de-identify data, requiring confidentiality agreements with vendors and their employees, and requiring vendors to meet the high privacy standards set out in European law and elsewhere.
It is true that systems built with machine learning, such as most modern speech recognition and natural language processing systems, generally need human oversight to improve – it is not clear how a machine would recognize a false positive unless a person flags it, annotates the data, and feeds it back into the system. And to Microsoft's credit, it offers a privacy dashboard where you can retroactively delete your voice data.
(Cortana data also appears to be covered there.)
But the scandal with all these technology companies is that they never made it clear that humans (read: outsourced contractors) would be listening to extremely personal data – whether that's someone speaking their exact street address, confidential medical information, or sex sounds into a voice assistant's microphone – and that they never let us proactively opt out if we decide this is not something we want in our homes.
Apple says a future update will let its customers opt out. Will the other companies follow suit?