Apple apologized Wednesday for its program that allowed contractors to listen to recordings of people talking to Siri, the company’s digital voice assistant.
The tech giant’s grading program sampled fewer than 0.2% of Siri audio requests and their transcripts “to measure how well Siri was responding and to improve its reliability,” according to Apple.
An anonymous whistleblower told The Guardian in July that there had been “countless” instances in which contractors could hear conversations involving doctors and patients, business dealings, apparent criminal activity and sexual partners, and that those recordings were accompanied by location, contact details and app data.
“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading,” the apology stated. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”
While Apple acknowledges privacy as “a human right” at the beginning of its statement, it goes on to say that users’ personal data “makes Siri better.”
“In order for Siri to more accurately complete personalized tasks, it collects and stores certain information from your device. … Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that ‘trains’ Siri to improve,” the statement reads.
Apple plans to stop using contractors to review Siri recordings and will instead rely on computer-generated transcripts to improve the voice assistant’s accuracy. Audio review will also become opt-in: users must give consent before their audio samples are used, and for those who do, only Apple employees will be permitted to listen.
“Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve,” the statement concludes.
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact firstname.lastname@example.org.