By John Gruber
From a company-wide memo sent by Magic Leap founder Rony Abovitz Thursday:
As we’ve shared over the last several weeks, in order to set Magic Leap on a course for success, we have pivoted to focus on delivering a spatial computing platform for enterprise.
As nearly everyone has finally realized, our actual technology is nothing at all like what we promised, lied about for years, and sold gullible deep-pocketed investors on. Our con is falling apart at the seams, so we’ll milk the last few dollars out of the only investors dumb enough to give us even more money, by repeating the word “enterprise” and doing that thing with our fingers like Obi-Wan Kenobi.
We have closed significant new funding and have very positive momentum towards closing key strategic enterprise partnerships.
You’re not going to believe this but we somehow raised another $350 million. I know, right?
As the board and I planned the changes we made and what Magic Leap needs for this next focused phase, it became clear to us that a change in my role was a natural next step.
Everyone agrees the jig is up.
I discussed this with the board and we have agreed that now is the time to bring in a new CEO who can help us to commercialize our focused plan for spatial computing in enterprise. We have been actively recruiting candidates for this role and I look forward to sharing more soon.
Our Craigslist ad: “Florida company seeks Bernie Madoff type.”
I have been leading Magic Leap since 2011 (starting in my garage). We have created a new field. A new medium. And together we have defined the future of computing.
No one will remember us or anything we’ve done — unless Netflix makes one of those documentaries like the Fyre Festival one. I love that movie. Which makes me think maybe we should change our Craigslist ad to “Billy McFarland type”. Actually, when does he get out of prison?
I am amazed at everything we have built and look forward to everything Magic Leap will create in the decades to come.
I am amazed that we raised $2.4 billion and have managed to stretch this con out for 9 years and counting. We even convinced Google to invest. Google! Those guys are smart!
I will remain our CEO through the transition and am in discussions with the board with regards to how I will continue to provide strategy and vision from a board level. I remain super excited about Magic Leap’s future and believe deeply in our team and all of their incredible talent and capabilities.
I guess I should be ashamed of myself but I’m not. ★
Nilay Patel asked Siri on his Apple Watch what time it is in London. After too long of a wait, he got the correct answer — for London Canada. I tried on my iPhone and got the same result. Stupid and slow is a heck of a combination.
You can argue that giving the time in London Ontario isn’t wrong per se, but that’s nonsense. The right answer is the common sense answer. If you had a human assistant and asked them “What’s the time in London?” and they honestly thought the best way to answer that question was to give you the time for the nearest London, which happened to be in Ontario or Kentucky, you’d fire that assistant. You wouldn’t fire them for getting that one answer wrong, you’d fire them because that one wrong answer is emblematic of a serious cognitive deficiency that permeates everything they try to do. You’d never have hired them in the first place, really, because there’s no way a person this lacking in common sense would get through a job interview. You don’t have to be particularly smart or knowledgeable to assume that “London” means “London England”, you just have to not be stupid.
Worse, I tried on my HomePod and Siri gave me the correct answer: the time in London England. I say this is worse because it exemplifies how inconsistent Siri is. Why in the world would you get a completely different answer to a very simple question based solely on which device answers your question? At least when most computer systems are wrong they’re consistently wrong.
I tried the same question on every other system I know where it should work: “What time is it in London?”
Every one of them got it right. Only Siri gets it wrong. ★
Josh Marshall, writing at Talking Points Memo, “Unpacking the Mask Debate”:
Here’s an article that is very current among mask skeptics. It’s a review by two bona fide experts, Dr. Lisa M. Brosseau and Dr. Margaret Sietsema, writing back on April 1st, a veritable lifetime ago in COVID-19 terms. It was published by the Center for Infectious Disease Research and Policy at the University of Minnesota.
The gist is that there’s little to no scientific evidence that masks are effective for the population at large and that what protection there might be is minimal at best. Additionally, they argue that mask-wearing may create a false sense of security that leads people to relax more effective mitigation strategies like distancing and hand washing. So the net effect of mask-wearing may actually be more infections rather than fewer.
If you read the report closely, however, a few points emerge.
First, it’s not evidence that masks are not effective — few studies really show this or demonstrate it in any clear way — but a lack of evidence for their efficacy. Second, they focus heavily on health care workers, both for available studies about what works and doesn’t and for the standards we should apply for efficacy. Finally, they take a very binary approach to efficacy. They work or they don’t.
As a vocal face mask proponent, I’ve heard something like the above counterargument from a small number of mask skeptics. Basically, the pro-mask argument is that there seems to be a lot of upside to widespread mask-wearing, and effectively no downside whatsoever beyond the initial “this feels weird” social awkwardness and mild physical discomfort. (Pro tip: Keep a tin of Altoids next to your masks.)
We’re waiting for peer-reviewed studies. In the meantime, early studies and anecdotal evidence from countries with established mask-wearing social norms suggest quite strongly that mask wearing is effective. And so if there are no downsides, there really is no argument against universal face mask wearing in public, especially indoors.
One segment of anti-mask crusaders are those who insist that the whole pandemic has been so profoundly overblown that it’s effectively a hoax. This is lunacy — there’s no point arguing with them. No surprise, some of them are flat-earthers too. But the lunatics aren’t the only ones opposed to face masks.
The in-touch-with-reality mask skeptics seem to have latched onto the idea that maybe there are downsides, that wearing a mask might somehow make it more likely that you’ll get infected — the “false sense of security” argument proposed in the article Marshall cites. That’s a plausible-sounding hypothesis, and the world is full of counterintuitive truths. E.g. the fact that one typically stays drier running, rather than walking, to shelter in a rainstorm — even though running sweeps you into more droplets each second, it so greatly reduces your time in the rain that you wind up drier. Maybe wearing a face mask in a pandemic is counterintuitive in the same way, the thinking goes, a seemingly protective measure that actually makes things worse.
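To see why you can’t settle a question like this from the armchair, it helps to actually run the rain numbers. Here’s a minimal sketch of the usual back-of-the-envelope model; every constant below is invented purely for illustration:

```python
# Toy model: rain of uniform droplet density falls straight down. Droplets
# hitting your top scale with time spent in the rain; droplets hitting your
# front scale with the volume of air you sweep through, which depends only
# on the distance. All constants are invented for illustration.

DENSITY = 100.0    # droplets per cubic meter of air
FALL_SPEED = 5.0   # meters per second, straight down
DISTANCE = 100.0   # meters to shelter
TOP_AREA = 0.1     # square meters (head and shoulders)
FRONT_AREA = 0.7   # square meters (torso and legs)

def droplets_collected(speed: float) -> float:
    time_in_rain = DISTANCE / speed
    on_top = DENSITY * FALL_SPEED * TOP_AREA * time_in_rain  # shrinks as speed rises
    on_front = DENSITY * FRONT_AREA * DISTANCE               # the same at any speed
    return on_top + on_front

for label, speed in [("walking", 1.5), ("running", 6.0)]:
    print(f"{label}: {droplets_collected(speed):,.0f} droplets")
```

The front-sweep term is fixed and the top term only shrinks as you speed up, so running wins no matter how you tune the constants. The point for the mask debate is the method, not the rain: a plausible-sounding hypothesis has to be checked against arithmetic or data before you act on it.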
The problem for mask skeptics is that there’s no data suggesting this might be the case. A plausible hypothesis is only the start of the scientific method. There is longstanding evidence in Asian countries with mask-wearing norms that, at the very least, face masks cause no harm. As Marshall notes, if anything, as evidence comes in, mask-wearing appears to be even more effective than proponents thought.
I’m old enough to recall when wearing seat belts became mandatory. Roughly speaking, these laws spread quickly from state to state, starting with New York in 1984 and becoming the rule rather than the exception within a decade. (“Live free or die” New Hampshire is the only remaining state that doesn’t require adults to wear a seat belt.)
I recall a similar sort of opposition to these laws as we see now with mandatory face masks. Opposition to compulsory seat belt laws always seemed crazy to me, because the evidence was so overwhelming that seat belts save lives and greatly reduce injuries that it was clearly worth making an exception to the principle, widely held in America, that the government generally shouldn’t tell people what to do. But crazy or not, opposition there was. “Fuck you, I don’t want to wear one, it’s a free country.” Word for word, the same sentiment then about seat belts as now about face masks.
One of the arguments against compulsory seat-belt-wearing was that sometimes wearing a seat belt makes things worse. “What if I’m in an accident and my seat belt gets jammed, trapping me in a burning car?” “I read about a guy who wasn’t wearing a seat belt and he walked away from a terrible accident because he was thrown out of the car before it was totaled.”
I don’t agree with it, but to some degree I get it: What right does a government that sells you lottery tickets have to tell you that your odds are better if you’re wearing a seat belt?
But there’s a fundamental difference between wearing a seat belt in a car and wearing a face mask in a store. A seat belt really only protects the wearer. There are tangential arguments that society as a whole benefits from fewer car crash deaths and injuries, but the primary reason we have laws requiring you to wear a seat belt is to protect you from harm. Face mask requirements aren’t like that. They’re more like laws banning smoking in restaurants and making drunk driving a serious crime — they protect us all from harm.
From earlier in my childhood, I recall ubiquitous signs at the entrances of stores and restaurants: “No shirt, no shoes, no service.” There were variants, but that exact phrasing was common. I always considered those signs so strange, as I couldn’t imagine why anyone would even want to go into a store or restaurant without a shirt or shoes, let alone need a sign telling them that doing so was not permitted, but I figured it must have been a problem with hippies or something. (There were a lot of old people complaining about hippies long after there were any hippies left to complain about.)
Basically, other than poolside or at a beach, anyone who wants to go into a public establishment barefoot or shirtless is an asshole. It seems pretty clear that the people today angrily objecting to mandatory face masks aren’t really concerned with the epidemiological efficacy of masks. They’re concerned with asserting their perceived entitlement to be an asshole. You don’t need to hang a “No assholes allowed” sign to enforce it as a rule. ★
Katie Benner and Adam Goldman, reporting for The New York Times, “FBI Finds Links Between Pensacola Gunman and Al Qaeda”:
The F.B.I. recently bypassed the security features on at least one of Mr. Alshamrani’s two iPhones to discover his Qaeda links. Christopher A. Wray, the director of the F.B.I., said the bureau had “effectively no help from Apple,” but he would not say how investigators obtained access to the phone.
That would certainly be interesting to know, but I don’t expect the FBI to reveal how they got in. Privacy advocates, though, should not succumb to the argument that because the FBI did get into one of these iPhones, it all worked out fine in the end. The problem with this argument is that it’s implicitly based on the assumption that it would not be fine if a phone were so secure that the FBI could not get into it. Strong encryption is, on the whole, a good thing, and should remain legal — regardless of whether there are known ways to circumvent it.
The investigation has served as the latest skirmish in a fight between the Justice Department and Apple pitting personal privacy against public safety. Apple stopped routinely allowing law enforcement officials into phones in 2014 as it beefed up encryption.
This framing is entirely wrong. It suggests that Apple has the ability to “just unlock” an iPhone encrypted with a passcode or passphrase. They don’t. The difference between 2014 and today isn’t that Apple previously was cooperative with law enforcement requests and now is not — the difference is that modern iPhones can’t be “unlocked” the way older ones could, because their security is so much better.
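To make that concrete, here’s a minimal sketch of passcode-entangled key derivation. This is an illustration of the general technique, not Apple’s actual design (the real system runs inside the Secure Enclave with dedicated hardware and a tuned key-derivation function), but it shows the basic shape: the data key exists only when the passcode is entered, on the device itself.

```python
import hashlib
import os

# Simplified sketch of passcode-entangled encryption at rest.
# Not Apple's actual design; an illustration of the general technique.

# A per-device secret fused into hardware at manufacture. It never leaves
# the device, so this derivation can only run on the device itself.
DEVICE_UID = os.urandom(32)

def derive_data_key(passcode: str) -> bytes:
    # Slow, salted key derivation: the device secret acts as the salt,
    # tying the key to this one device, and the high iteration count makes
    # every passcode guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 1_000_000)

# The key exists only transiently, when the user enters the passcode.
# There is no stored copy for the vendor, or anyone else, to turn over.
key = derive_data_key("123456")
```

Under a design like this, “cooperation” can’t produce the key, because the vendor never has it. The only way in is guessing passcodes, and the derivation is deliberately slow to make guessing expensive.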
It has argued that data privacy is a human rights issue and that if it were to develop a way to allow the American government into its phones, hackers or foreign governments like China could exploit the same tool.
But law enforcement officials have said that Apple is creating a haven for criminals. The company’s defiance in the Pensacola shooting allowed any possible co-conspirators to fabricate and compare stories, destroy evidence and disappear, Mr. Wray said.
Apple did not defy anyone here. They chose, years ago, to design secure systems that have no backdoors to unlock. Not for tech support (“I forgot my passcode”), not for law enforcement. Wray knows this. The badmouthing of Apple’s intentions in this case is just another example of the DOJ trying to scare people into supporting legislation to make secure encryption illegal. The message from Barr and Wray to Apple is implicitly this: If you won’t add backdoors to your devices, we’re going to keep saying you’re aiding terrorists and deviant criminals.
Mr. Barr has maintained one of the department’s “highest priorities” is to find a way to get technology companies to help law enforcement gain lawful access to encrypted technology.
“Privacy and public safety are not mutually exclusive,” he said. “We are confident that technology companies are capable of building secure products that protect user information and, at the same time, allow for law enforcement access when permitted by a judge.”
This is not mathematically possible, and newsrooms should stop publishing these claims from law enforcement officials without comment from encryption experts. Saying you want technology companies to make a backdoor that only “good guys” can use is like saying you want guns that only “good guys” can fire. It’s not possible, and no credible cryptographer would say that it is. You might as well say that you want Apple to come up with a way for 1 + 1 to equal 3.
If law enforcement officials choose to wage a campaign to make strong encryption illegal under the guise that only “good guys” would have the circumvention keys, that’s on them, but news media need to get their shit together on the fact that what law enforcement claims to be asking for is impossible, and what is possible — adding backdoors — would be a security disaster.
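It’s worth spelling out what a backdoor looks like in code, because the code makes the problem obvious. Here’s a minimal sketch of a hypothetical key-escrow scheme of my own invention (not any real proposal), in which a copy of every message key is also wrapped for a government-held escrow key:

```python
from cryptography.fernet import Fernet

# Hypothetical "lawful access" scheme, invented for illustration: every
# message key is also wrapped for a government-held escrow key.

recipient_key = Fernet.generate_key()
escrow_key = Fernet.generate_key()  # the mandated backdoor key

def encrypt_with_backdoor(plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    message_key = Fernet.generate_key()
    ciphertext = Fernet(message_key).encrypt(plaintext)
    wrapped_for_recipient = Fernet(recipient_key).encrypt(message_key)
    # The backdoor: a copy of the message key, readable by whoever holds
    # the escrow key. The math has no notion of "permitted by a judge";
    # possession of the key IS the permission.
    wrapped_for_escrow = Fernet(escrow_key).encrypt(message_key)
    return ciphertext, wrapped_for_recipient, wrapped_for_escrow

ciphertext, _, wrapped = encrypt_with_backdoor(b"meet at noon")

# Anyone who obtains escrow_key decrypts identically, whether that's the
# FBI with a warrant, a rogue employee, or a foreign service that stole it:
recovered_key = Fernet(escrow_key).decrypt(wrapped)
print(Fernet(recovered_key).decrypt(ciphertext))  # b'meet at noon'
```

Notice what’s missing: there is no place in the math for “when permitted by a judge”. Whoever holds the escrow key can decrypt, full stop. A single key that unlocks everything is exactly the kind of target attackers hunt for, and keys get stolen, which is why a mandated backdoor is better described as a mandated vulnerability.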
Apple issued a statement responding to Barr and Wray (via The Verge):
The terrorist attack on members of the US armed services at the Naval Air Station in Pensacola, Florida was a devastating and heinous act. Apple responded to the FBI’s first requests for information just hours after the attack on December 6, 2019 and continued to support law enforcement during their investigation. We provided every piece of information available to us, including iCloud backups, account information and transactional data for multiple accounts, and we lent continuous and ongoing technical and investigative support to FBI offices in Jacksonville, Pensacola, and New York over the months since. […]
We sell the same iPhone everywhere, we don’t store customers’ passcodes and we don’t have the capacity to unlock passcode-protected devices.
Apple cooperated in every way they technically could. The DOJ is not asking for Apple’s cooperation unlocking existing iPhones — they’re asking Apple to make future iPhones insecure. ★
A Washington Post story today on Apple and Google’s joint COVID-19 exposure notification project, from reporters Reed Albergotti and Drew Harwell, is the worst story I’ve seen in the Post in memory. It’s so atrociously bad — factually wrong and one-sided in opinion — that it should be retracted.
Start with the headline: “Apple and Google Are Building a Virus-Tracking System. Health Officials Say It Will Be Practically Useless.” It’s not a “virus-tracking system”, and the health officials the Post talked to don’t know what they’re talking about.
But as the tech giants have revealed more details, officials now say the software will be of little use. Due to strict rules imposed by the companies, the system will notify smartphone users if they’ve potentially come into contact with an infected person, but it won’t share any data with health officials or reveal where those meetings took place.
Notifying people when they’ve potentially come into contact with an infected person sounds useful to me. It’s true that by design, Apple and Google’s system does not track location. It’s true that location information would be potentially useful to health officials. But the exposure notifications alone are inherently useful, even without location data attached.
The gist of Apple and Google’s project is that it attempts to balance privacy with the usefulness of tracking potential exposure. It’s right there in the name of the project: “Privacy-Preserving Contact Tracing”. The Post’s sources for this story seemingly want a system with no regard for privacy at all. I wish that were an exaggeration.
But Apple and Google have refused, arguing that letting the apps collect location data or loosening other smartphone rules would undermine people’s privacy. The companies are also concerned that easing the restrictions around apps’ Bluetooth use would drain phone battery life, which could irritate customers. That unbending stance has led some health authorities to abandon hopes of building a fully functioning contact-tracing app.
“Unbending stance” is a rather harsh description of Apple and Google’s desire not to “undermine people’s privacy” or “drain phone battery life”. This isn’t an “unbending stance”. It’s table stakes for designing a system that people will actually install and use. Imagine trying to sell the public on a system that undermines their privacy or unduly drains their phone batteries — let alone a system that does both.
But Helen Nissenbaum, a professor of information science and director of the Digital Life Initiative at Cornell University, called Apple and Google’s use of privacy to defend their refusal to allow public health officials access to smartphone technology a “flamboyant smokescreen.” She said it was ironic that the two companies had for years tolerated the mass collection of people’s data but were now preventing its use for a purpose that is “critical to public health.”
“If it’s between Google and Apple having the data, I would far prefer my physician and the public health authorities to have the data about my health status,” she said. “At least they’re constrained by laws.”
Nissenbaum obviously has no idea whatsoever how this system is designed to work, despite the fact that Apple and Google have published a succinct 7-page FAQ that explains it in simple, easy-to-understand terms. It seems clear that neither the reporters from the Post nor Nissenbaum have read that FAQ, or if they did, that they don’t understand it. (Or willfully ignored it.)
Google and Apple will not “have the data”. It is stored entirely and only on each user’s own device. We, the users, will have the data, and we, the users, can share that data with our doctors.
And how in the world did “At least they’re constrained by laws” make it into this story? Nissenbaum believes Apple and Google are not constrained by laws? That will be news to both companies’ legal compliance departments, who I presume will soon be laid off.
The Apple-Google system uses the short-range Bluetooth antennas in people’s smartphones to log when two people come into contact for a short period of time, but not where that contact took place. An alert is sent if one of the people tests positive for a coronavirus infection, but that information is not shared with public health officials or contact-tracing teams.
That’s close to an accurate description — sort of, if you squint your eyes — but what the Post omits is essential. The information is not shared automatically with health officials, but if you opt into the system and get a notification that you’ve potentially been in contact with someone who has tested positive, you can then share that information with your doctor. Only doctors and registered health officials can confirm that a user in this system has tested positive for COVID-19 — otherwise, it would be open season for pranksters.
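For the curious, here’s a deliberately simplified sketch of how the scheme works under the hood. It’s loosely modeled on the published Apple/Google spec, but the names and constants below are my simplifications; the real protocol specifies exact key-derivation functions, 10-minute rotation intervals, and daily keys.

```python
import hashlib
import hmac
import os
import time

# Simplified sketch of privacy-preserving exposure notification, loosely
# modeled on the Apple/Google design. Names and constants are simplified.

INTERVAL_SECONDS = 600  # broadcast identifiers rotate every ~10 minutes

def new_daily_key() -> bytes:
    # Each phone generates a fresh random key every day. It stays on the
    # device unless the user tests positive and chooses to share it.
    return os.urandom(16)

def rolling_identifier(daily_key: bytes, timestamp: float) -> bytes:
    # The short-lived ID actually broadcast over Bluetooth. Without the
    # daily key it looks like random noise: no identity, no location.
    interval = int(timestamp // INTERVAL_SECONDS)
    mac = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:16]

now = time.time()

# Phone A broadcasts an identifier; phone B stores what it hears, locally.
a_key = new_daily_key()
heard_by_b = {rolling_identifier(a_key, now)}

# If A later tests positive (confirmed by a health authority), A uploads
# only its daily keys. B downloads them and re-derives identifiers on the
# device to check for a match; the real protocol checks every interval.
print("exposure detected:", rolling_identifier(a_key, now) in heard_by_b)
```

The crucial property is that the matching happens on the phone. Neither Apple, Google, nor any government ever learns who you met or where you were, which is exactly the design the Post’s sources are complaining about.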
The tension over virus-tracking apps reflects a major power imbalance between the tech giants and state and local health officials, who argue that Apple and Google’s technical decisions have undermined their response to a global health emergency. It also highlights the tech giants’ ability to exert unfettered control over how billions of smartphones work.
This is nonsense. Smartphones comply with a veritable mountain of regulations and laws around the world. If you use an iPhone, just look in Settings → General → Legal & Regulatory.
“They are exercising sovereign power. It’s just crazy,” said Matt Stoller, the director of research at the American Economic Liberties Project, a Washington think tank devoted to reducing the power of monopolies. Apple and Google have “decided for the whole world,” he added, “that it’s not a decision for the public to make. … You have a private government that is making choices over your society instead of democratic governments being able to make those choices.”
This quote is what’s crazy. Again, this guy Stoller clearly has no idea what he’s talking about. Apple and Google deciding how their operating systems work, in compliance with all existing laws, all around the world, is not “exercising sovereign power”. No one here is alleging that Apple or Google are doing anything even vaguely illegal. They’re not toeing some sort of line, they’re not taking advantage of any sort of loopholes.
And if Apple and Google did what Stoller and Nissenbaum seem to want them to do — track the location of every person you’re in contact with and report that data automatically to government health officials — they almost certainly would be breaking all sorts of laws around the world. The whole point of Europe’s well-intentioned but overzealous GDPR law — 88 dense pages in PDF — is, quoting from its preamble, that “natural persons should have control of their own personal data”. That’s exactly the point of Apple and Google’s system — and seemingly exactly the opposite of what every source in this Post story thinks Apple and Google should do.
Also, regarding Stoller’s advocacy for democracy, good luck finding public support for a system that turns phones into surveillance devices that report anything at all automatically to the government, let alone something as sensitive as who we’ve been in contact with and where we’ve been. I’ll grant that one can make a case that a system where government health officials have access to such data from our phones, automatically, could be useful in tracking COVID-19 infections. But try getting popular support for it. And no one I’ve seen has made the case that such a system is necessary for using phones in the aid of contact tracing.
There is not much overlap between (a) people who have thought long and hard about the very complicated ways smartphones can be used to abuse personal privacy with tracking and data collection; and (b) public health officials admirably trying to track COVID-19. None of the few people in the intersection of those two groups were quoted in this story.
The companies have argued that limiting the data the apps use could bolster their adoption rate, because people may not trust or use an app that logs their location for later use by public health authorities.
You think so?
But some parts of the U.S., including Apple and Google’s home state, say the restrictions have rendered the apps effectively useless.
None of these apps are out yet, because the APIs in iOS and Android aren’t out yet.
Contact tracers today use phone calls and interviews to track people’s movements, and rely almost entirely on people’s memory. Minute-by-minute location logs recorded by people’s phones, some officials have argued, could ease that burden by providing a more precise and automated way to track new outbreaks.
In what other context would the above paragraph pass the sniff test? “Some officials” — unnamed, unsourced — are arguing that the government should enjoy “minute-by-minute location logs recorded by people’s phones”, and this is given zero pushback in a news story. No pushback at all on an argument describing a scenario that is the very definition of a potential privacy fiasco.
“The limitations of those kind of apps are extensive,” said Mike Reid, an assistant professor of medicine at University of California, San Francisco, who’s leading the effort to train contact tracers in the state. “I don’t think they have an important role to play for most of the population.”
The contact tracers, he said, will be using software made by Salesforce and Accenture to help reach patients by phone and are trained on how to protect sensitive patient information.
“We go to pains to minimize the amount of data we take from people and we ask consent from people we’re talking to on the phone. We go to considerable lengths to ensure there are strong technical controls to ensure the anonymization of our platforms,” he said. “Can you say the same thing about these big tech companies? I’m not sure.”
Yeah, so it would be better if Apple and Google minimized the data and stored it only on the devices themselves, rather than collecting it on their servers, and explained in detail how their system protects privacy and ensures anonymity from start to finish. Which is to say, it would be better if they did exactly what they are doing.
Also — also! — we now have someone who will be training contact tracers in California, who voluntarily went on the record that Salesforce and Accenture are more worthy of trust for contact-tracing privacy protection (with detailed location data!) than the Apple/Google proposal. Goddamn.
With the Apple and Google approach, “we’ve overcompensated for privacy and still created other risks and not solved the problem,” said Ashkan Soltani, the former chief technologist of the Federal Trade Commission. “I’d personally be more comfortable if it were a health agency that I trusted and there were legal protections in place over the use of the data and I knew it was operated by a dedicated security team.”
It is legit amazing to see Ashkan Soltani, of all people, say “we’ve overcompensated for privacy.”
Tom Frieden, the former director of the Centers for Disease Control and Prevention now working with the health organization Vital Strategies, said the proximity-tracing system as proposed by Apple and Google has “been largely a distraction.”
“There are very serious questions about its feasibility and its ability to be done with adequate respect for privacy, and it has muddied the water for what actually needs to happen,” Frieden said in an interview Wednesday. “This was an approach that was done with not much understanding and a lot of overpromising.”
Here is Apple and Google’s joint announcement. What exactly did either company overpromise? Did a bunch of idiots who weren’t involved, didn’t read the specs, and don’t even understand the proposal jump to overpromise-y conclusions? Sure. But how is that Apple or Google’s fault?
The proximity-tracing systems are “a bright shiny object,” he said, “but right now they’re doing nothing to stop the pandemic.”
Maybe because they’re not fucking out yet? Hallelujah, holy shit — where’s the Tylenol? ★