When Venetian merchants hauled the first shipments of a popular Ottoman drink called coffee into 17th-century Europe, leaders in the Catholic Church did not exult at the prospect of increased productivity at the bottom of a warm cuppa. Instead, they asked Pope Clement VIII to declare coffee “the bitter invention of Satan.” The pontiff, not one to jump to conclusions, had coffee brought before him, sipped, and made the call. “This Satan's drink is so delicious that it would be a pity to let the infidels have exclusive use of it,” he declared, or so the (perhaps apocryphal) story goes.
Which is all to say: Sometimes people are so scared of change that they get things very wrong.
Today that metathesiophobia has found a new target in cars that occasionally drive themselves. And the fearful murmuring only got louder this week, when the National Highway Traffic Safety Administration opened an investigation after a driver in Utah crashed into a stopped firetruck at 60 mph, reportedly while Tesla's Autopilot feature was engaged. Every time a Tesla with its semiautonomous Autopilot feature crashes—one hit a stopped firetruck in Southern California in January, another struck a highway barrier in Mountain View, California, in March, killing its driver—it makes headlines. (One could imagine the same thing happening with a car using Cadillac’s Super Cruise or Nissan’s Pro Pilot, but those newer, less popular features have had no reported crashes.)
So, many are fearful. The National Transportation Safety Board and the National Highway Traffic Safety Administration have launched investigations into these crashes, while consumer advocates lob criticisms at Tesla.
Human factors engineers who study the interactions between humans and machines question the wisdom of features that allow drivers to take their hands off the wheel but require that they remain alert and ready to retake control at any moment. Humans are so bad at that sort of thing that many robocar developers, including Waymo, Ford, and Volvo, are avoiding this kind of feature altogether.
Elon Musk, a leader who inspires quasi-religious devotion in his own right, spurns this hand-wringing. “It’s really incredibly irresponsible of any journalist with integrity to write an article that would lead people to believe that autonomy is less safe,” he said on an earnings call earlier this month. “People might turn it off and die.”
Musk and Tesla spokespeople have repeatedly said the feature can reduce crashes by 40 percent. But a recent clarification from the National Highway Traffic Safety Administration and a closer look at the number reveal that the claim doesn’t hold up.
Still, it’s plausible that Autopilot and its ilk save lives. More computer control should minimize the fallout when human drivers get distracted, sleepy, or drunk. “Elon’s probably right in that the number of crashes caused by this is going to be less than the ones that are going to be avoided,” says Costa Samaras, a civil engineer who studies electric and autonomous vehicles at Carnegie Mellon University.1 “But that doesn’t change how we interact with, regulate, and buy this technology right now.” In other words: It’s never too early to ask questions.
So how can carmakers like Musk’s prove that their tech makes roads safe enough to balance out the downsides? How can Autopilot follow in the path of the airbag, which killed some people, but saved many more, and is now ubiquitous?
Experts say it would take some statistics, helped along by a heavy dose of transparency.
“The first thing to keep in mind is, while it seems like a straightforward problem to compare the safety of one type of vehicle to another, it’s in fact a complicated process,” says David Zuby, who heads up vehicle research at the Insurance Institute for Highway Safety.
The natural starting point is looking at how many people die driving a given car as a function of miles driven, then comparing that rate to other models. Just a few problems. First, it’s hard to separate out semi-autonomous features from other advanced safety features. Is it Super Cruise doing the saving, or Cadillac’s automatic emergency braking, which steps in to avoid crashes even when the driver’s in full control of the car?
Second, we don’t have enough fatality data to draw statistically sound conclusions. While the lack of death and injury is nice, it means that independent researchers can’t definitively prove, based on police reports, if cars with these specific features are actually killing fewer people.
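To see why, here’s a minimal sketch of how wide the uncertainty on a deaths-per-mile rate gets when fatalities number in the single digits. The counts and the rate_interval helper are hypothetical illustrations, not Tesla’s data; the sketch just applies a standard exact Poisson interval via scipy:

```python
# Illustrative only: hypothetical fatality counts, not actual Tesla figures.
from scipy import stats

def rate_interval(deaths, miles, confidence=0.95):
    """Exact (Garwood) Poisson confidence interval for deaths per million miles."""
    alpha = 1 - confidence
    # Lower bound is 0 when no deaths have been observed.
    lower = stats.chi2.ppf(alpha / 2, 2 * deaths) / 2 if deaths > 0 else 0.0
    upper = stats.chi2.ppf(1 - alpha / 2, 2 * (deaths + 1)) / 2
    return lower / miles * 1e6, upper / miles * 1e6

# Hypothetical: 3 fatalities over 1 billion miles driven with the feature on.
low, high = rate_interval(deaths=3, miles=1e9)
print(f"95% interval: {low:.4f} to {high:.4f} deaths per million miles")
# The interval spans more than a factor of 10, so two fleets with very
# different underlying risks could produce indistinguishable numbers.
```

Only far more miles, or far more fatal crashes, would shrink that interval enough to separate a feature’s effect from statistical noise.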
Then, you have to make sure you’re comparing your apple to another apple. This week, Musk tweeted that Tesla has seen just one death per 320 million miles, compared to one death per 86 million miles for the average car. The problem: that latter figure includes all road deaths involving all vehicles—people killed on motorcycles (which are far more dangerous than cars), in clunkers built in the late ‘80s, and in tractor trailers, as well as those killed while biking or walking.
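For scale, here’s the back-of-envelope arithmetic on those two figures, taken at face value from the tweet rather than independently verified:

```python
# The two per-mile figures from the tweet, converted to a common scale.
tesla_rate = 1 / 320e6      # one death per 320 million miles (tweeted figure)
baseline_rate = 1 / 86e6    # one death per 86 million miles (all US road deaths)

per_100m_miles = 100e6
print(f"Tesla (tweeted):   {tesla_rate * per_100m_miles:.2f} deaths per 100M miles")
print(f"All-road baseline: {baseline_rate * per_100m_miles:.2f} deaths per 100M miles")
print(f"Ratio: {baseline_rate / tesla_rate:.1f}x")
# Roughly a 3.7x gap, but the baseline lumps in motorcyclists, decades-old
# cars, trucks, pedestrians, and cyclists, so it is not an apples-to-apples
# benchmark for a new luxury sedan.
```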
“A Tesla is not an average car—it’s a luxury car,” says David Friedman, a former NHTSA official who now directs car policy at Consumers Union. It’s heavier than the average car, and so safer in a crash. (Again, a good thing—but not helpful for evaluating Autopilot.) Tesla owners are likely richer, older, and spend less time on rural roads than the average driver. That’s important, because research indicates middle-aged people are the best drivers, and rural roads are the most dangerous kind, accounting for more than half of this country’s vehicle fatalities.
The Insurance Institute for Highway Safety has tried to track Autopilot safety through insurance claims. According to its very preliminary research, Teslas produced in the years after the company launched Autopilot were no more or less likely to see claims filed for property damage and bodily injury liability than Teslas produced before. But IIHS did find a 13 percent reduction in collision claim frequency, which could indicate that cars equipped with Autopilot are getting into fewer crashes. Still, IIHS doesn’t actually know if Autopilot was engaged during any of those incidents.
Which is all to say: It’s very, very difficult to separate out the effects of Autopilot from other variables. At least for folks who don’t work at Tesla.
Earlier this month, Musk announced that Tesla would begin to publish quarterly reports on Autopilot safety. That could be great for transparency, experts say, provided Tesla coughs up the right sorts of data. (Tesla did not respond to a request for comment.)
For one, it would be great if any and all safety data could be verified by a third-party source. “When any company or entity that’s trying to sell something publishes data about their product, you have to worry at the back of your head that they may have taken data out of what they’re publishing,” says Zuby, the IIHS researcher. “So you’d like to have an independent party say, ‘Yeah, we’ve looked at all the data, and Tesla is putting out all the data.’”
Beyond that, researchers and regulators would like to get really specific. The ideal would be a GPS-pinned list of all crashes, down to the date and time of the incident. That way, investigators could separate out incidents by weather, lighting conditions, and road type. (Crashes are way less likely on highways, the roads where these systems are designed to operate, so even the most effective Autopilot-like function would not be able to prevent all road deaths.) Were there other vehicles or pedestrians involved? Maybe semi-autonomous features are great at protecting their own drivers and not great at protecting others.
Friedman, with the Consumers Union, says he’d like to see reports of “disengagements”—when drivers see that Autopilot is doing something wrong, like merging into a lane when it shouldn’t, and take over control. This info could give safety researchers valuable clues about how real people are using this tech.
Whatever the truth of its tech, Tesla doesn’t have the kind of papal power or persuasion that gave us macchiatos and cafés crème. Neither does General Motors, or Nissan, or any other automaker pushing this sort of feature. But they do have more access to how people are using their nascent technology than your standard public health official—and nothing helps turn doubters into believers like a few words of truth.
1Post updated, 5/17/18, 1:45 PM EDT: This story has been updated to clarify the context of Samaras's comments.