By John Gruber
Russell Brandom, writing for The Verge, speculating about the possibility that U.S. law enforcement agencies could force Amazon and Google to help them identify people through archived voice data the companies retain:
The most ominous sign is how much data personal assistants are still retaining. There’s no technical reason to store audio of every request by default, particularly if it poses a privacy risk. If Google and Amazon wanted to decrease the threat, they could stop logging requests under specific users, tying them instead to an anonymous identifier as Siri does. Failing that, they could retain text instead of audio, or even process the speech-to-text conversion on the device itself.
But the Echo and the Home weren’t made with the NSA in mind. Google and Amazon were trying to build useful assistants, and they likely didn’t consider that it could also be a tool of surveillance. Even more, they didn’t consider that a person’s voice might be something they would have to protect.
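To make Brandom's suggestion concrete, here's a minimal sketch of what "tying requests to an anonymous identifier" and retaining text instead of audio might look like. This is my own illustration, not a description of anything Amazon, Google, or Apple actually ships; the HMAC-based scheme, the device-held secret, and the function names are all assumptions for the sake of the example:

```python
import hashlib
import hmac
import secrets

# Hypothetical per-device secret, generated once and kept only on the device.
DEVICE_SECRET = secrets.token_bytes(32)

def anonymous_id(account_id: str) -> str:
    """Derive an opaque identifier from the account. Without the
    device-held secret, the server can't map it back to a specific user."""
    return hmac.new(DEVICE_SECRET, account_id.encode(), hashlib.sha256).hexdigest()

def log_request(account_id: str, transcript: str) -> dict:
    """Retain the transcribed text, not the audio, keyed to the anonymous ID."""
    return {"speaker": anonymous_id(account_id), "transcript": transcript}

print(log_request("alice@example.com", "what's the weather tomorrow"))
```

The point of the sketch is only that none of this is technically exotic; how much of it a company does comes down to how it weighs the trade-offs Brandom raises.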
This is an interesting piece, but I think Brandom errs by framing this as binary. Were Alexa and Google Home designed with the NSA in mind, yes or no? Did they consider that these products could be tools of surveillance, yes or no? That's the wrong way to think about it. Of course people at Amazon and Google thought about these things. I would wager heavily that they care about privacy in this regard, too.
The issue isn’t about whether they care. It’s about how much they care, relative to other factors like the potential for saved audio data to improve the product. It’s not that Apple cares about privacy and that Amazon and Google don’t; it’s that Apple cares more than they do. It’s a trade-off.
★ Monday, 22 January 2018