By John Gruber
Three weeks ago, writing for The Guardian, Alex Hern reported:
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.
Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.
Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.
But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.
I pooh-poohed this story at first, mostly on the grounds that I thought we knew about this, and that the recordings were only saved from users who had consented to it. I was mistaken. This is a privacy fiasco, and a betrayal of Siri users’ trust.
A week later, Apple issued statements to TechCrunch and The Verge stating that it was suspending this “grading” program. From Matthew Panzarino’s report at TechCrunch:
Apple says it will review the process that it uses, called grading, to determine whether Siri is hearing queries correctly, or being invoked by mistake.
In addition, it will be issuing a software update in the future that will let Siri users choose whether they participate in the grading process or not.
My reading of this is that until last week, if you used Siri in any way, your recordings might have been used in this “grading” process. If I graded Apple on the privacy and trust implications of this, I’d give them an F. I don’t think it’s debatable: no user of any voice assistant should have their recordings listened to, or even reviewed in text form, by human employees without their express consent. But especially users of Siri, given Apple’s prominent position as a privacy-focused company. Apple literally advertises on the basis of its user-focused privacy policies — but apparently the billboards should have read “What happens on your iPhone stays on your iPhone, except for some of your Siri recordings, which we listen to.”
From Sam Byford’s report for The Verge:
Apple did not comment on whether, in addition to pausing the program where contractors listen to Siri voice recordings, it would also stop actually saving those recordings on its servers. Currently the company says it keeps recordings for six months before removing identifying information from a copy that it could keep for two years or more.
Until the opt-in process is crystal clear, Apple should delete all existing recordings and confirm that it is no longer saving them. I don’t even know where to start with the fact that until this story broke, they were keeping copies with identifying information for six months. This defies everyone’s expectations of privacy for a voice assistant.
We should expect Apple to lead the industry on this front, but in fact, they’re far behind. Amazon has a FAQ written in plain language that explains how Alexa works, and how to view your voice recordings from Alexa-powered devices. You can review them in the Alexa app under Settings → Alexa Privacy (a pretty obvious location) or on the web. That settings page also has an option: “Use Voice Recordings to Improve Amazon Services and to Develop New Features”. I think Amazon should make clear that with this turned on, some of your recordings may be listened to by Amazon employees, but it’s not too hard to surmise that’s what’s going on.
Apple offers no such setting, and offers absolutely no way to know which, if any, of our Siri recordings have been saved for review by employees. This is something we should have explicit, precise control over, but instead it’s a complete black box we have no control over and no insight into whatsoever.
From a privacy perspective, there are two fundamental types of Siri interactions: purposeful and accidental. Purposeful interactions are when you press the side button or say “Hey Siri” with the intention of invoking Siri. Accidental interactions occur when the button is inadvertently pressed too long, or when a device incorrectly hears “Hey Siri” even though you said no such thing. All recorded Siri interactions should be treated by Apple with extraordinary care, but accidental invocations, when identified, should be deleted immediately unless the user has expressly agreed to allow it — each and every time. Having Apple contractors listen to random conversations or audio is the nightmare scenario for an always-listening voice assistant.
Compare and contrast with iOS’s transcript feature for voicemail. At the bottom of each transcription, iOS asks whether the transcription was “useful” or “not useful”. Tap on either of those and you get a very explicit prompt:
Help Improve Transcriptions?
Would you like to submit this voicemail to Apple to improve transcription accuracy?
Recordings will only be used to improve the quality of speech recognition in Apple products.
Do not submit recordings if you believe the speaker would be uncomfortable with you submitting the content to Apple.
The two buttons at the bottom of the prompt: Cancel and Submit. You must address this same prompt every single time you flag a transcription as useful or not useful. Every time. That’s how you do it.
In addition to being properly respectful of privacy, the voicemail transcription feature also puts the user in control. When a voicemail is transcribed poorly, you can flag it and submit it to Apple. That would be a great feature for Siri — when an interaction goes poorly, and we know the interaction was innocuous in terms of revealing anything private, we should be able to flag it and submit it to Apple. I firmly believe that Siri has gotten far more useful and far more accurate in the last few years, but clearly it’s still very far from perfect. I’d be happy to help Apple by submitting failed interactions on a per-interaction basis. Apple needs to stop pretending Siri is perfect.
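To make the per-interaction consent pattern concrete, here is a minimal sketch in Swift of what a prompt along those lines could look like, assuming a UIKit app. The alert copy is paraphrased from the voicemail prompt above, and submitRecordingForGrading is a hypothetical stand-in for whatever upload mechanism would actually be used; none of this is Apple’s implementation.

```swift
import UIKit

// Hypothetical sketch of a per-interaction consent prompt, modeled on the
// voicemail transcription flow described above. Not Apple's code;
// `submitRecordingForGrading(_:)` is an invented stand-in.
func askToSubmitRecording(from viewController: UIViewController, recordingID: UUID) {
    let alert = UIAlertController(
        title: "Help Improve Siri?",
        message: "Would you like to submit this recording to improve accuracy? " +
                 "Recordings will only be used to improve the quality of speech recognition.",
        preferredStyle: .alert
    )

    // Cancel is the default: nothing leaves the device unless the user opts in.
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))

    // Submit applies to this one recording only, with explicit consent, every time.
    alert.addAction(UIAlertAction(title: "Submit", style: .default) { _ in
        submitRecordingForGrading(recordingID)
    })

    viewController.present(alert, animated: true)
}

// Invented stub standing in for whatever upload mechanism an app would use.
func submitRecordingForGrading(_ id: UUID) {
    print("Submitting recording \(id) with the user's explicit, per-interaction consent")
}
```

The design point, per the voicemail prompt above, is that consent is asked for each individual recording, and declining is the default.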
I’ll give the final word to Steve Jobs, speaking about privacy back in 2010 at Kara Swisher and Walt Mossberg’s D8 conference:
“Privacy means people know what they’re signing up for, in plain English and repeatedly. I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data.”
I can’t say it any better than that.