Tesla's Wild Fight With the Feds Investigating Its Autopilot Death

Elon Musk's automaker has quit the investigation led by the National Transportation Safety Board, and the sniping is getting serious.

Tesla loves a good fight. CEO Elon Musk has battled car dealers, President Trump, and more than a few reporters. Now he has found a new opponent in the National Transportation Safety Board. The agency is investigating the crash of a Model X that was running with Autopilot engaged when it slammed into a highway divider in Northern California last month, killing the driver. Today, the NTSB announced it kicked Tesla off the team looking into what happened and how to stop it from recurring.

On the surface, the disagreement is about when and how to make information about the crash public. The NTSB, which investigates all major transportation accidents, plays its cards close to the vest. It often shares facts as it finds them but rarely draws conclusions about things like causality or remedies until it's ready to release a thorough, detailed, and considered report. That usually takes at least a year, sometimes two.


Tesla argues the safest thing to do is make whatever it knows public as soon as possible. A week after the March 23 crash, Tesla announced that Walter Huang, the driver of the Model X, had turned on Autopilot, putting the car's computer in charge of staying between the lane lines and a safe distance from other vehicles. Drivers using Autopilot are supposed to keep their eyes on the road and hands on the wheel to monitor the fallible system. Tesla said Huang’s hands were not detected on the wheel for the six seconds prior to the crash and that he should have had about five seconds of unobstructed view of the concrete lane divider he slammed into, but the vehicle logs show no action was taken.

In revealing those details—and effectively blaming the driver when the investigation had barely begun—Tesla violated its agreement with the NTSB, which requires all parties to keep quiet and let it do the talking. Last night, Tesla released a statement saying it’s pulling out of that agreement:

“Today, Tesla withdrew from the party agreement with the NTSB because it requires that we not release information about Autopilot to the public, a requirement which we believe fundamentally affects public safety negatively," the company said in a statement. "We believe in transparency, so an agreement that prevents public release of information for over a year is unacceptable."

This morning, the NTSB disputed that account, saying it called Musk last night to give Tesla the boot:


“Releases of incomplete information often lead to speculation and incorrect assumptions about the probable cause of a crash, which does a disservice to the investigative process and the traveling public,” it said in a statement.

Tesla, never one to let a good scrap go to waste, fired back this afternoon. It reiterated that it broke up with the NTSB, not the other way around. "It's been clear in our conversations with the NTSB that they're more concerned with press headlines than actually promoting safety," it said in a statement. "Among other things, they repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts." Tesla also said it plans to complain to Congress.

Musk has previously griped about the NTSB's involvement, saying it's up to the National Highway Traffic Safety Administration (NHTSA), not the NTSB, to regulate the auto industry. Indeed, the NTSB has no regulatory power. Its mission is to investigate accidents and make safety recommendations to the relevant government body. (The NHTSA is also looking into the crash and says it will "take action as appropriate.")

As far as this NTSB investigation goes, Tesla's departure is unlikely to change much. The automaker says it will still provide whatever technical help the NTSB needs to recover and interpret data from the vehicle’s sensors leading up to and during the crash. Even if it refuses, the NTSB can subpoena the info.

But to properly understand Tesla's seething anger at a government body widely seen as even-keeled and impartial, you need a quick dive into the past. In May 2016, a Tesla Model S running Autopilot crashed into a truck turning across its path, killing its driver, Josh Brown. The NHTSA pinned the crash on driver error, saying the system wasn't defective. A few months later, the NTSB issued its own report, saying Tesla bears some blame for Brown's death because its car didn't do enough to ensure he watched the road. “The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” NTSB chief Robert Sumwalt said at the time. It was the first substantive rebuke of one of Tesla's hallmark features, a serious blow to an automaker that trades on innovation.

After Brown's death, Tesla updated its software, escalating the warnings the car issues to inattentive drivers. But the basic premise of the system remains: The car works the steering and speed; the human monitors and intervenes as needed. And there's plenty of reason to think humans are just not good at that sort of thing. So it's easy to imagine that the NTSB will come to a similar conclusion once it's done investigating Huang's death, painting Tesla's innovative system in a damning light. It's also easy to guess that Musk and Tesla are trying to spin things in their favor before the Feds tell that kind of tale for the second time in two years.

Along with Tesla, automakers including Audi, Cadillac, Mercedes-Benz, and Nissan already offer, or soon will offer, this sort of semiautonomous system, which requires that the human behind the wheel remain attentive. Cadillac's Super Cruise is especially sophisticated. It allows hands-free driving, using a camera to track the driver's head to make sure they're looking at the road. It has bright green and red LEDs in the top of the steering wheel to grab the driver’s attention when needed and can vibrate the seat.

These systems make highway driving more pleasant, and likely safer. Most of the time, they work well and probably prevent many crashes inattentive humans would cause in regular cars. Tesla claims you are 3.7 times less likely to be involved in a fatal accident if you've got Autopilot (which it sells as a $5,000 option). “It unequivocally makes the world safer for the vehicle occupants, pedestrians, and cyclists,” the company said in a recent blog post. “The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe.”

That’s fair. Humans cause 40,000 deaths on US roads every year. But it’s also fair to say that Tesla’s Autopilot system isn’t perfect and could be made even safer. For the official word on how to do that, we’ll have to wait for the NTSB to finish its work—even without Tesla's help.
