By John Gruber
From a 2014 story by Maria Konnikova for The New Yorker (thanks to reader Dave Aton for the link):
But, as pilots were being freed of these responsibilities, they were becoming increasingly susceptible to boredom and complacency — problems that were all the more insidious for being difficult to identify and assess. As one pilot whom Wiener interviewed put it: “I know I’m not in the loop, but I’m not exactly out of the loop. It’s more like I’m flying alongside the loop.”
Here’s the PR statement issued by Uber after one of their self-driving cars was caught on video running a red light in San Francisco last week:
“This incident was due to human error. This is why we believe so much in making the roads safer by building self-driving Ubers,” spokesman Matt Wing said in a statement. “This vehicle was not part of the pilot and was not carrying customers. The driver involved has been suspended while we continue to investigate.”
At first read, it sounds like Uber is saying there was a human driving the car. But if you parse it closely, it could also be the case that the car was in autonomous mode, and the “human error” was that the human behind the wheel didn’t notice the car was going to sail through a red light, and failed to manually activate the brake. I think that’s what happened — otherwise the statement wouldn’t be ambiguous.
As Craig Hockenberry and I discussed on the latest episode of The Talk Show, this sort of thing seems inevitable. How can a human being maintain moment’s-notice attention for hours on end while riding in an autonomous car that drives safely for days and days? I don’t think it’s feasible.
★ Monday, 19 December 2016