By John Gruber
Katie Benner and Adam Goldman, reporting for The New York Times, “FBI Finds Links Between Pensacola Gunman and Al Qaeda”:
The F.B.I. recently bypassed the security features on at least one of Mr. Alshamrani’s two iPhones to discover his Qaeda links. Christopher A. Wray, the director of the F.B.I., said the bureau had “effectively no help from Apple,” but he would not say how investigators obtained access to the phone.
That would certainly be interesting to know, but I don’t expect the FBI to reveal how they got in. Privacy advocates, though, should not succumb to the argument that because the FBI did get into one of these iPhones, it all worked out fine in the end. The problem with this argument is that it implicitly assumes it would not be fine if a phone were so secure that the FBI could not get into it. Strong encryption is, on the whole, a good thing, and should remain legal, regardless of whether there are known ways to circumvent it.
The investigation has served as the latest skirmish in a fight between the Justice Department and Apple pitting personal privacy against public safety. Apple stopped routinely allowing law enforcement officials into phones in 2014 as it beefed up encryption.
This framing is entirely wrong. It suggests that Apple has the ability to “just unlock” an iPhone encrypted with a passcode or passphrase. They don’t. The difference between 2014 and today isn’t that Apple was previously cooperative with law enforcement requests and now is not; the difference is that modern iPhones can’t be “unlocked” the way older ones could, because their security is so much better.
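One way to see why, as a simplified sketch rather than Apple’s actual implementation: on modern devices the encryption key is derived from the user’s passcode entangled with a secret fused into the hardware, so every passcode guess has to run on the device itself, rate-limited by the device. All names below are hypothetical:

```python
import hashlib
import os

# Hypothetical hardware-fused secret: burned in at manufacture and usable
# only by the device's own key-derivation hardware, never readable by software.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    """Entangle the passcode with the device secret via a slow KDF.

    Because device_uid never leaves the hardware, every guess must be
    tried on the device itself, and the iteration count makes each
    guess deliberately slow.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        salt=device_uid,      # hardware-bound: unavailable off-device
        iterations=200_000,   # tuned so each attempt costs real time
    )

key = derive_key("1234", DEVICE_UID)

# An attacker who copies the encrypted storage but not DEVICE_UID cannot
# derive the key, no matter how much off-device compute they throw at it.
assert derive_key("1234", os.urandom(32)) != key
```

There is no “unlock” operation for Apple to perform in this design: without the passcode, Apple is in the same position as everyone else.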
It has argued that data privacy is a human rights issue and that if it were to develop a way to allow the American government into its phones, hackers or foreign governments like China could exploit the same tool.
But law enforcement officials have said that Apple is creating a haven for criminals. The company’s defiance in the Pensacola shooting allowed any possible co-conspirators to fabricate and compare stories, destroy evidence and disappear, Mr. Wray said.
Apple did not defy anyone here. They chose, years ago, to design secure systems that have no backdoors to unlock. Not for tech support (“I forgot my passcode”), not for law enforcement. Wray knows this. Badmouthing Apple’s intentions in this case is just another attempt to scare people into supporting legislation that would make secure encryption illegal. The implicit message from Barr and Wray to Apple: if you won’t add backdoors to your devices, we’re going to keep saying you’re aiding terrorists and deviant criminals.
Mr. Barr has maintained one of the department’s “highest priorities” is to find a way to get technology companies to help law enforcement gain lawful access to encrypted technology.
“Privacy and public safety are not mutually exclusive,” he said. “We are confident that technology companies are capable of building secure products that protect user information and, at the same time, allow for law enforcement access when permitted by a judge.”
This is not mathematically possible, and newsrooms should stop publishing these claims from law enforcement officials without comment from encryption experts. Saying you want technology companies to make a backdoor that only “good guys” can use is like saying you want guns that only “good guys” can fire. It’s not possible, and no credible cryptographer would say that it is. You might as well say that you want Apple to come up with a way for 1 + 1 to equal 3.
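To make the point concrete, here is a minimal sketch of what “lawful access” actually means in code, using a deliberately toy cipher (this is an illustration of key escrow in general, not any real proposal): the session key is wrapped once for the user and once under an escrow key. Nothing in the math knows whether the escrow key’s holder is a “good guy.” Whoever obtains it, a court or a thief, decrypts everything.

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream via repeated hashing. Illustration only -- not a real cipher.
    out = b""
    block = key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext against the key-derived stream.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream: the same operation both ways

user_key = os.urandom(32)
escrow_key = os.urandom(32)   # the "lawful access" key, held by some authority

message = b"meet at noon"

# "Key escrow": the session key is stored wrapped under BOTH keys.
session_key = os.urandom(32)
wrapped_for_user = encrypt(user_key, session_key)
wrapped_for_escrow = encrypt(escrow_key, session_key)
ciphertext = encrypt(session_key, message)

# The math cannot tell a judge's warrant from a stolen key: ANY holder
# of escrow_key recovers the session key, and with it the message.
stolen = decrypt(escrow_key, wrapped_for_escrow)
assert decrypt(stolen, ciphertext) == message
```

The scheme is simplified, but the structural point holds for real systems: an escrow key is just a key, and its compromise exposes every message it ever protected.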
If law enforcement officials choose to wage a campaign to make strong encryption illegal on the pretense that only “good guys” would have the circumvention keys, that’s on them, but news media need to get their shit together on the fact that what law enforcement claims to be asking for is impossible, and what is possible (adding backdoors) would be a security disaster.
Apple issued a statement responding to Barr and Wray (via The Verge):
The terrorist attack on members of the US armed services at the Naval Air Station in Pensacola, Florida was a devastating and heinous act. Apple responded to the FBI’s first requests for information just hours after the attack on December 6, 2019 and continued to support law enforcement during their investigation. We provided every piece of information available to us, including iCloud backups, account information and transactional data for multiple accounts, and we lent continuous and ongoing technical and investigative support to FBI offices in Jacksonville, Pensacola, and New York over the months since. […]
We sell the same iPhone everywhere, we don’t store customers’ passcodes and we don’t have the capacity to unlock passcode-protected devices.
Apple cooperated in every way they technically could. The DOJ is not asking for Apple’s cooperation in unlocking existing iPhones; they’re asking Apple to make future iPhones insecure.