Apple apologizes for Siri privacy issue and stops keeping audio recordings
Apple Inc. apologized Wednesday for privacy mishaps surrounding its Siri voice assistant and said it would no longer retain audio recordings of Siri interactions, among other changes.
The announcement follows criticism of the iPhone maker and other technology giants for employing humans to listen to recordings of user interactions with voice assistants in a bid to improve the product. Apple had hundreds of contractors listening to recordings of Siri users in a process called “grading,” but the company suspended the program a few weeks ago after some consumers raised concerns. It plans to reinstate the practice after making a few changes in software updates this fall that will give users more control over their privacy.
“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said in a statement Wednesday.
Bloomberg News reported this year that Amazon.com Inc. and Apple had teams analyzing recordings. The Guardian reported in July that some of the people reviewing the Siri requests heard private personal details and possibly criminal activity. Amazon, which still has teams auditing voice commands for its Alexa digital assistant, said this month that it was letting users opt out of human review. Google has agreed to stop transcribing voice recordings in the European Union amid a German investigation.
The tech giants’ use of human reviewers has spurred examinations by lawmakers and regulators in the United States and Europe. Privacy advocates have expressed concern that the companies’ practices could violate users’ rights, particularly in cases in which devices begin recording unintentionally or without the user’s knowledge. Apple faces a class-action lawsuit over privacy violations related to human reviewers listening to recordings.
“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process,” Apple said. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies.”
At the CES technology trade show in Las Vegas this year, Apple posted billboards that proclaimed, “What happens on your iPhone, stays on your iPhone.”
At Apple, less than 0.2% of Siri commands were analyzed, the company said. The recordings that were reviewed also didn’t contain personally identifiable information and were stored for six months under a random identifier, not linked to a user’s Apple ID or phone number.
As part of the changes Apple said it’s implementing, users will be able to opt in to let the company listen to a limited set of anonymized audio samples in order to improve Siri, and can opt out of the program later if they wish. Although it will no longer store audio recordings, computer-generated transcriptions will be held anonymously for up to six months, Apple said.
In another change, Apple said only its own employees — not outside contractors — would listen to audio samples. The Guardian reported Wednesday that at least 300 contractors in Europe lost their jobs as a result of Apple suspending its grading program. Apple also said it’s making changes to the review process to reduce the data about customers that reviewers can see.
Users had also raised concerns that Apple could be retaining Siri recordings picked up accidentally, whether through a mistaken button press or the system wrongly detecting the “Hey Siri” wake phrase. Apple said Wednesday that it would work to delete such inadvertent recordings.
Apple has often sought to distinguish itself as having tighter privacy controls than other tech companies. But this isn’t the first time it has had to apologize for lapses. Earlier this year, Apple issued a mea culpa for a bug in its FaceTime video chat service that enabled users to listen in on people before they had accepted or rejected a call.