Apple is to suspend a program in which it sent anonymized recordings of Siri interactions to third-party contractors, following privacy concerns raised by a report in The Guardian.
Anonymous sources stated that Apple was sharing anonymized Siri conversations with teams who would use the data to better train the AI assistant, particularly when it had been triggered by accident.
But The Guardian's source pointed out privacy flaws in the program, which regularly let contractors listen in on private situations, including sexual activity and criminal dealings. The data given to the contractors also made it relatively easy to figure out who the anonymized recordings belonged to.
Apple will now suspend the practice while it conducts a "thorough" review of the system it refers to as "grading", according to a statement sent to TechCrunch.
"Protecting user privacy"
In a statement, Apple said:
"We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
Siri can be accessed on many types of Apple product, from the HomePod smart speaker to the iPhone range, the Apple Watch and Macs.
It's the wearable that was thought to be most prone to triggering false-positive Siri activations, as its activation is tied not only to the wake word but also to the movements of an owner's hand, as well as (worryingly) the sound of a zip being pulled.
Apple publicly prides itself on its privacy track record, which is what makes this particular infringement such a troubling one. On the one hand, it mocks competitors whose privacy controls seem lax; on the other, it has been sending data and audio clips back to human reviewers.
- How to turn off Siri in iOS on your iPhone and iPad