Apple Fights for Precedent in iPhone Unlocking Case ★
Matthew Panzarino, writing at TechCrunch:
And herein lies the rub. There has been some chatter about whether
these kinds of changes would even be possible with Apple’s newer
devices. Those devices come equipped with Apple’s proprietary
Secure Enclave, a portion of the core processing chip where
private encryption keys are stored and used to secure data and to
enable features like TouchID. Apple says that the things that the
FBI is asking for are also possible on newer devices with the Secure Enclave.
The technical solutions to the asks would be different (no
specifics were provided) than they are on the iPhone 5c (and other
older iPhones), but not impossible.
If I had to bet, Apple is probably working double time to lock it
down even tighter. Its reply to the next order of this type is
likely to be two words long. You pick the two.
The point is that the FBI is asking Apple to crack its own safe;
it doesn’t matter how good the locks are if you modify them to be
weak after installing them. And once the precedent is set then the
opportunity is there for similar requests to be made of all
billion or so active iOS devices. Hence the importance of this
fight for Apple.
So now we know why Apple is drawing the line with this case: it really is a slippery slope that would affect all current devices, not just the ones prior to the A7 CPU and the Secure Enclave.
Sundar Pichai on the Apple/FBI Encryption Fight ★
Sundar Pichai, in a series of tweets:
Important post by @tim_cook. Forcing companies to enable hacking
could compromise users’ privacy.
We know that law enforcement and intelligence agencies face
significant challenges in protecting the public against crime and
terrorism.
We build secure products to keep your information safe and we give
law enforcement access to data based on valid legal orders.
But that’s wholly different than requiring companies to enable
hacking of customer devices & data. Could be a troubling precedent.
Looking forward to a thoughtful and open discussion on this important issue.
Could Pichai’s response be any more lukewarm? He’s not really taking a stand, and the things he’s posing as questions aren’t actually in question. I’m glad he chimed in at all, and that he seems to be leaning toward Apple’s side, but this could be a lot stronger.
Apple’s iOS Security Guide on Passcodes and the Secure Enclave (PDF) ★
From page 12 of Apple’s most recent iOS security whitepaper:
By setting up a device passcode, the user automatically enables
Data Protection. iOS supports six-digit, four-digit, and
arbitrary-length alphanumeric passcodes. In addition to unlocking
the device, a passcode provides entropy for certain encryption
keys. This means an attacker in possession of a device can’t get
access to data in specific protection classes without the passcode.
The passcode is entangled with the device’s UID, so brute-force
attempts must be performed on the device under attack. A large
iteration count is used to make each attempt slower. The iteration
count is calibrated so that one attempt takes approximately 80
milliseconds. This means it would take more than 5.5 years to try
all combinations of a six-character alphanumeric passcode with
lowercase letters and numbers.
The stronger the user passcode is, the stronger the encryption key
becomes. Touch ID can be used to enhance this equation by enabling
the user to establish a much stronger passcode than would
otherwise be practical. This increases the effective amount of
entropy protecting the encryption keys used for Data Protection,
without adversely affecting the user experience of unlocking an
iOS device multiple times throughout the day.
To further discourage brute-force passcode attacks, there are
escalating time delays after the entry of an invalid passcode at
the Lock screen. If Settings → Touch ID & Passcode → Erase Data is
turned on, the device will automatically wipe after 10 consecutive
incorrect attempts to enter the passcode. This setting is also
available as an administrative policy through mobile device
management (MDM) and Exchange ActiveSync, and can be set to a
lower threshold.
On devices with an A7 or later A-series processor, the delays are
enforced by the Secure Enclave. If the device is restarted during
a timed delay, the delay is still enforced, with the timer
starting over for the current period.
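Apple's 5.5-year figure is easy to verify. A back-of-envelope sketch, using only numbers from the whitepaper itself (the ~80 ms per attempt and the six-character lowercase-plus-digits alphabet):

```python
# Back-of-envelope check of Apple's brute-force estimate: a six-character
# passcode drawn from lowercase letters and digits, at ~80 ms per attempt
# (each attempt must run on-device because the key is entangled with the UID).
ATTEMPT_SECONDS = 0.080          # iteration count calibrated to ~80 ms
SECONDS_PER_YEAR = 365 * 24 * 3600

alphabet_size = 26 + 10          # lowercase letters + digits
combinations = alphabet_size ** 6
years_to_try_all = combinations * ATTEMPT_SECONDS / SECONDS_PER_YEAR

print(f"{combinations:,} possible passcodes")
print(f"~{years_to_try_all:.2f} years to try them all")
```

That works out to roughly 5.5 years, matching Apple's figure. A four-digit numeric PIN, by contrast, falls to the same on-device attack in under 15 minutes (10,000 × 80 ms ≈ 13 minutes) — which is why the escalating delays and the 10-attempt wipe matter so much.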
The question of the day is whether the code on the Secure Enclave that enforces these brute force countermeasures can be flash-updated (by Apple) to circumvent them. With the iPhone 5C in the current debate, the FBI wants Apple to update iOS itself to circumvent the brute force countermeasures. With an iPhone 5S or any of the 6-series iPhones, iOS is not involved. But if Apple can technically update the code that executes on the Secure Enclave, then the point is moot. The same kind of court order that requires Apple to provide the FBI with a custom (insecure) version of iOS could compel them to provide the FBI with a custom (insecure) ROM for the Secure Enclave.
Update: Rich Mogull, on Twitter, responding to my question here:
@gruber It is my understanding, from background sources, that all
devices are vulnerable.
And Farhad Manjoo:
By the way according to Apple it is not true that an iOS rewrite
of the sort the FBI is asking for here wouldn’t work on newer
iPhones.
In other words, a flash update to the Secure Enclave could make new iPhones more susceptible to brute force passcode cracking.
Edward Snowden on Google’s Silence ★
Edward Snowden, responding to a call for Google to publicly side with Apple:
This is the most important tech case in a decade. Silence means
@google picked a side, but it’s not the public’s.
Update: Sundar Pichai has chimed in.
Apple Court Order Heats Up Encryption Battle on Capitol Hill ★
Hamza Shaban, reporting for BuzzFeed:
“Apple chose to protect a dead ISIS terrorist’s privacy over the
security of the American people,” Sen. Tom Cotton says, while Sen.
Dianne Feinstein vows to introduce a bill to force Apple to comply
with a court order giving the FBI access to the San Bernardino
shooter’s phone.
Expect this sort of rhetoric to heat up. The emotional component of the San Bernardino attack is explosive.
As for Feinstein, I think any such bill would make for a terrible law — but I’d rather see an actual law passed than see the All Writs Act of 1789 abused by the FBI in this way. The more I think about it, though, the more I think that this is actually the FBI’s goal here — to create a political controversy driven by fear of terrorism committed by Muslims, and get egregious new anti-encryption legislation passed. I think the FBI knew Apple would fight this, and that the laws currently on the books are on Apple’s side. They want to get a new law on the books.
Apple Versus the FBI ★
Ben Thompson, writing at Stratechery:
This is why I’m just a tiny bit worried about Tim Cook drawing
such a stark line in the sand with this case: the PR optics could
not possibly be worse for Apple. It’s a case of domestic terrorism
with a clear cut bad guy and a warrant that no one could object
to, and Apple is capable of fulfilling the request. Would it
perhaps be better to cooperate in this case secure in the
knowledge that the loophole the FBI is exploiting (the
software-based security measures) has already been closed, and
then save the rhetorical gunpowder for the inevitable request to
insert the sort of narrow backdoor into the disk encryption itself
I just described?
Then again, I can see the other side: a backdoor is a backdoor,
and it is absolutely the case that the FBI is demanding Apple
deliberately weaken security. Perhaps there is a slippery slope
argument here, and I can respect the idea that government
intrusion on security must be fought at every step. I just hope
that this San Bernardino case doesn’t become a rallying cry for
(helping to) break into not only an iPhone 5C but, in the long
run, all iPhones.
I am convinced that Apple is doing the morally correct thing here, by fighting the court order. I’ll bet most of you reading this agree. But like Thompson, I’m not sure at all Apple is doing the right thing politically. The FBI chose this case carefully, because the San Bernardino attack is incendiary. Do not be mistaken: Apple is sticking its neck out, politically, and they risk alienating potential customers who believe — as many national political figures do — that Apple should comply with this order and do whatever the FBI wants.
By fighting this, Apple is doing something risky and difficult. It would be easier, and far less risky, if they just quietly complied with the FBI. That’s what makes their very public stance on this so commendable.
WhatsApp CEO Jan Koum Supports Apple and Tim Cook in Encryption Fight ★
WhatsApp CEO Jan Koum:
I have always admired Tim Cook for his stance on privacy and
Apple’s efforts to protect user data and couldn’t agree more with
everything said in their Customer Letter today. We must not allow
this dangerous precedent to be set. Today our freedom and our
liberty is at stake.
Good for him. Where are the leaders of other tech companies on this? I hear crickets chirping in Mountain View and Redmond.
Uber’s Atomic Meltdown ★
Eli Schiff on Uber’s incoherent new branding:
The team admitted that it took them eighteen grueling months to
come up with the brand’s core values. That should have been a
warning sign. But for Kalanick, the time flew by. Kalanick
reminisced about the experience, “This change didn’t happen
overnight, but it sure feels like it did.” One can be sure that
Uber’s Design Director, Shalin Amin, and the team would
disagree with Kalanick on that. Indeed, Amin explained that he
“basically gave up understanding what your [Kalanick’s]
personal preference was.”
It remains unclear why Uber allowed Wired to publish this
statement, but it is telling: “Truth be told, Amin and Kalanick
didn’t fully understand what they were trying to do.”
In general, it is not a great idea to put the brand of a company
valued in the tens of billions of dollars in the hands of people
who readily admit they don’t know what their own intentions are.
ACLU Comment on FBI Effort to Force Apple to Unlock iPhone ★
Alex Abdo, staff attorney with the ACLU Speech, Privacy, and Technology Project:
This is an unprecedented, unwise, and unlawful move by the
government. The Constitution does not permit the government to
force companies to hack into their customers’ devices. Apple is
free to offer a phone that stores information securely, and it
must remain so if consumers are to retain any control over their
private data.
The government’s request also risks setting a dangerous precedent.
If the FBI can force Apple to hack into its customers’ devices,
then so too can every repressive regime in the rest of the world.
Apple deserves praise for standing up for its right to offer
secure devices to all of its customers.
The EFF on the Legality of Using the All Writs Act of 1789 to Compel Apple to Engineer a Back Door ★
Andrew Crocker, writing for the EFF blog back in October:
Reengineering iOS and breaking any number of Apple’s promises to
its customers is the definition of an unreasonable burden. As the
Ninth Circuit put it in a case interpreting technical assistance
in a different context, private companies’ obligations to
assist the government have “not extended to circumstances in which
there is a complete disruption of a service they offer to a
customer as part of their business.” What’s more, such an order
would be unconstitutional. Code is speech, and forcing Apple
to push backdoored updates would constitute “compelled speech” in
violation of the First Amendment. It would raise Fourth and Fifth
Amendment issues as well. Most important, Apple’s choice to offer
device encryption controlled entirely by the user is both entirely
legal and in line with the expert consensus on security best
practices. It would be extremely wrong-headed for Congress to
require third-party access to encrypted devices, but unless it
does, Apple can’t be forced to do so under the All Writs Act.
Unsurprisingly, the EFF today announced it is supporting Apple.
Apple Can Comply With the FBI Court Order ★
Dan Guido has a good piece on the technical aspects of what the FBI wants Apple to do:
Again in plain English, the FBI wants Apple to create a special
version of iOS that only works on the one iPhone they have
recovered. This customized version of iOS (*ahem* FBiOS) will
ignore passcode entry delays, will not erase the device after any
number of incorrect attempts, and will allow the FBI to hook up an
external device to facilitate guessing the passcode. The FBI will
send Apple the recovered iPhone so that this customized version of
iOS never physically leaves the Apple campus.
As many jailbreakers are familiar, firmware can be loaded via
Device Firmware Upgrade (DFU) Mode. Once an iPhone enters
DFU mode, it will accept a new firmware image over a USB cable.
Before any firmware image is loaded by an iPhone, the device first
checks whether the firmware has a valid signature from Apple. This
signature check is why the FBI cannot load new software onto an
iPhone on their own — the FBI does not have the secret keys that
Apple uses to sign firmware.
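The gatekeeping Guido describes can be sketched as follows. Real iPhones verify an asymmetric signature chained to Apple's root of trust in the boot ROM; this toy model stands in an HMAC key for "a secret only Apple holds," purely to illustrate why the FBI cannot load firmware of its own:

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's firmware-signing key. On real hardware
# this is an asymmetric key pair, with only the public half in the boot
# ROM; the shared HMAC key here just models "only Apple can sign".
APPLE_SIGNING_KEY = b"known-only-to-apple"

def sign_firmware(image: bytes) -> bytes:
    """What Apple does before shipping a firmware image."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def boot_rom_accepts(image: bytes, signature: bytes) -> bool:
    """What the device does in DFU mode before flashing anything."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)
```

An unsigned "FBiOS" image fails the check, which is exactly why the court order targets Apple rather than the phone: only Apple can produce a signature the boot ROM will accept.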
Guido thinks the situation would be very different if the iPhone were newer than a 5C:
At this point it is very important to mention that the recovered
iPhone is a 5C. The 5C model iPhone lacks TouchID and, therefore,
lacks the single most important security feature produced by
Apple: the Secure Enclave.
If the San Bernardino gunmen had used an iPhone with the Secure
Enclave, then there is little to nothing that Apple or the FBI
could have done to guess the passcode. However, since the iPhone
5C lacks a Secure Enclave, nearly all of the passcode protections
are implemented in software by the iOS operating system and,
therefore, replaceable by a firmware update.
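Because those protections are "implemented in software," they amount to ordinary code of the sort sketched below, which a signed firmware update could simply omit. A minimal model, assuming the escalating-delay schedule published in Apple's iOS Security Guide (`PasscodeGate` and its methods are illustrative names, not Apple code):

```python
# Toy model of the software passcode protections on a pre-Secure Enclave
# iPhone: escalating delays after failed attempts, plus the optional wipe
# after 10 failures. Names are hypothetical, for illustration only.
DELAY_AFTER_FAILURE = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds

class PasscodeGate:
    def __init__(self, correct_passcode: str, erase_data: bool = True):
        self.correct_passcode = correct_passcode
        self.erase_data = erase_data  # the "Erase Data" setting
        self.failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> tuple[bool, int]:
        """Returns (unlocked, seconds to wait before the next attempt)."""
        if self.wiped:
            raise RuntimeError("device wiped; data is unrecoverable")
        if guess == self.correct_passcode:
            self.failures = 0
            return True, 0
        self.failures += 1
        if self.erase_data and self.failures >= 10:
            self.wiped = True
        return False, DELAY_AFTER_FAILURE.get(self.failures, 0)
```

A replacement firmware image that ships without the delay table and the wipe check — which is what the FBI is asking Apple to build — leaves only the ~80 ms key-derivation time standing between an attacker and exhaustive search.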
Why the FBI’s Request to Apple Will Affect Civil Rights for a Generation ★
Rich Mogull, writing at Macworld:
Make no mistake: This is unprecedented, and the situation was
deliberately engineered by the FBI and Department of Justice to
force a showdown that could define the limits of our civil rights
for generations to come. This is an issue with far-reaching
generations to come. This is an issue with far-reaching
implications well beyond a single phone, a single case, or even
Apple itself.
As a career security professional, this case has chilling
implications.
Apple does not have the existing capability to assist the FBI. The
FBI engineered a case where the perpetrators are already dead, but
emotions are charged. And the law cited is under active legal
debate within the federal courts.
The crux of the issue is should companies be required to build
security circumvention technologies to expose their own
customers? Not “assist law enforcement with existing tools,” but
“build new tools.”
Really good take on just how high the stakes are in this case. It is not about one single iPhone 5C.
Vox Hires Choire Sicha to Oversee Partnerships With Facebook, Snapchat ★
Lukas Alpert, reporting for the WSJ:
Vox Media has long counted its own content platform as a key to
its success. But now it says the future lies in platforms run by
others, so it’s bringing in a digital media stalwart to help
strengthen those ties.
The company has hired veteran Choire Sicha, co-founder of the Awl
Network and a well-known figure in digital media, to become its
director of partner platforms.
Not sure what this means for The Awl, but it seems like a clear win for Vox.
The Entire First Episode of ‘Making a Murderer’ ★
If you don’t have Netflix but want a taste of what everyone has
been talking about for the past two months, the entire first
episode of Making a Murderer is up on YouTube.
(I do wonder how many DF readers don’t have Netflix.)
Apple Posts Open Letter to Customers on Encryption ★
Blockbuster letter, signed by Tim Cook:
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.