By John Gruber
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:
First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
This is all good. Opt-in only, with an easy way to opt back out. Inadvertent recordings will be deleted as soon as they’re identified. This bit, too, is interesting:
Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device’s data is disassociated from the random identifier.
The Verge had previously reported the following, which made it sound like the recordings were tied to Apple IDs for the first six months:
Currently the company says it keeps recordings for six months before removing identifying information from a copy that it could keep for two years or more.
The Verge wasn’t wrong there, but it’s an important clarification that the “identifying information” was a random per-device identifier that, by design, cannot be tied back to an Apple ID or phone number.
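To make that concrete, here’s a rough sketch in Swift of what a scheme like this looks like in principle: a random per-device token stands in for your identity while requests are processed, and after six months even that token is stripped. This is my own illustration of the idea, not Apple’s actual code — the type names and six-month window here are just stand-ins for what the statement describes.

```swift
import Foundation

// Hypothetical sketch, not Apple's implementation: a per-device random
// identifier groups Siri requests during processing, and the linkage is
// dropped after roughly six months.
struct SiriRequestRecord {
    let deviceIdentifier: String?   // random token; never an Apple ID or phone number
    let transcript: String
    let receivedAt: Date
}

enum GradingStore {
    // A fresh random token per device; nothing about it derives from the user's identity.
    static func makeDeviceIdentifier() -> String {
        UUID().uuidString
    }

    // After six months, strip the random identifier so the data can no longer
    // be grouped back to the originating device.
    static func disassociateIfExpired(_ record: SiriRequestRecord, now: Date = Date()) -> SiriRequestRecord {
        guard let cutoff = Calendar.current.date(byAdding: DateComponents(month: 6),
                                                 to: record.receivedAt),
              now >= cutoff else {
            return record
        }
        return SiriRequestRecord(deviceIdentifier: nil,
                                 transcript: record.transcript,
                                 receivedAt: record.receivedAt)
    }
}
```

The point of the design, as Apple describes it, is that even during the retention window there is no Apple ID or phone number to leak; the worst case is a cluster of requests tied to an opaque token, and after six months not even that.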
Apple also has a “Siri Privacy and Grading” FAQ, written in very clear language. Basically, Apple is admitting they fucked up on this grading thing, they’re owning up to it, and are committed to doing everything they should have been doing all along to protect users’ privacy and make everything as clear as possible to users.
My take on this saga was severely critical, but I am convinced this was a mistake — really, a series of mistakes — on Apple’s part, not an indication that the company’s privacy stance is hypocritical or merely bullshit marketing hype. It is therefore not surprising, but satisfying nonetheless, to see Apple address it head-on like this.