Ken Klippenstein, reporting for The Intercept:
Highway surveillance footage from Thanksgiving Day shows a Tesla
Model S vehicle changing lanes and then abruptly braking in the
far-left lane of the San Francisco Bay Bridge, resulting in an
eight-vehicle crash. The crash injured nine people, including
a 2-year-old child, and blocked traffic on the bridge for over
an hour.
The video and new photographs of the crash, which were obtained by
The Intercept via a California Public Records Act request,
provide the first direct look at what happened on November 24,
confirming witness accounts at the time. The driver told police
that he had been using Tesla’s new “Full Self-Driving” feature,
the report notes, before the Tesla’s “left signal activated” and
its “brakes activated,” and it moved into the left lane, “slowing
to a stop directly in [the second vehicle’s] path of travel.” [...]
Since 2016, the federal agency has investigated a total of 35
crashes in which Tesla’s “Full Self-Driving” or “Autopilot”
systems were likely in use. Together, these accidents have killed
19 people. In recent months, a surge of reports have emerged in
which Tesla drivers complained of sudden “phantom braking,”
causing the vehicle to slam on its brakes at high speeds. More
than 100 such complaints were filed with NHTSA in a three-month
period, according to the Washington Post.
The footage of the crash is weird. The Tesla just slows to a stop in the passing lane. When AI systems make mistakes, the mistakes are often very different from the sort people make.
It ought to be enough for self-driving systems to be safer, mile for mile, than human-driven vehicles. That may even be true for Tesla’s “full self-driving” mode today; 35 crashes in six years isn’t a lot. But it’s not going to be enough. It’s not fair, but the bar for self-driving cars is much higher than merely having a lower accident rate than human drivers. Even though human-caused crashes are increasing in frequency, they’re so commonplace that they don’t register as shocking. (Same with gun deaths in the U.S.) An accident caused by an AI-driven car, though, is jarring because it’s novel. “Dog Bites Man” doesn’t make the front page. “Man Bites Dog” does.
As for Tesla’s system in particular, it strikes me as bizarre that it’s legal for Tesla to enable this feature while the company itself still describes it as “beta” software.
★ Wednesday, 11 January 2023