Tesla’s Autopilot Contributed to a 2018 Crash, NTSB Says

Did early adopters adopt too early?

It is becoming too difficult to avoid the words "toldja so" in a follow-up column, particularly when the subject is automotive technology disruptor Tesla Motors and its equally disruptive CEO, Elon Musk. I'll try not to come off as smug, since the news in question involves something as touchy as a fatal crash: in this case, the 2018 accident in which a Tesla Model X struck a crash barrier in California that had been damaged in a previous accident.

The late Tesla operator (we hesitate to call him a "driver," because he wasn't quite driving), Wei Huang, probably would have survived had California repaired the barrier, according to The New York Times and other outlets reporting on a National Transportation Safety Board hearing held Tuesday in Washington.

Don't lay the blame solely on the California Department of Transportation, though. According to the Times' account of the hearing, Huang, an Apple employee, was behind the wheel of his 2017 Tesla Model X, traveling at 71 mph on U.S. Highway 101 in Mountain View, when the car struck a median barrier, spun, hit two other vehicles, and caught fire.

Huang had been playing a game on his smartphone, though it was not clear whether he was actively engaged with it at the moment of impact, when the Tesla, operating in Autopilot mode, hit the barrier, the NTSB said. The board called on Apple to ban non-emergency use of company-issued devices while driving, and urged other electronics companies to lock drivers out of their devices or limit what drivers can do behind the wheel. The NTSB also criticized Tesla for never responding to its 2017 recommendations to limit automated driving.

"It's been 881 days since these recommendations were sent to Tesla and we've heard nothing," NTSB chairman Robert L. Sumwalt said on Tuesday. "We're still waiting."

The NTSB told Tesla and five other automakers that they should limit the use of automated systems to the specific conditions for which they are deemed safe, and improve driver monitoring to make sure drivers keep their hands on the wheel and their attention on the road, according to the Times. Those recommendations came after reports of the first known fatality attributed to the use of an automated driving system.

In 2016, Joshua Brown died when his Tesla Model S, apparently operating under Autopilot, struck a semi-truck in Williston, Florida. Here's where my "toldja so" moment comes in.

Musk's hubris "touting his company's ability to quickly add such technology can take the blame for its inappropriate use in his cars," I wrote in July 2016, after details of Brown's crash emerged. I suggested that Tesla and Musk would set back the development of autonomous cars rather than advance it.

Perhaps to prove his company's disruptiveness, or perhaps to boost Tesla's already overvalued share price, Musk ignored any such warnings. In his April 2019 quarterly earnings call with stock analysts, he announced that his cars would have Level 5 fully autonomous technology by the end of that year. What's more, he told us Tesla would introduce Model 3 robotaxis sometime this year, in 2020: Model 3 owners could amortize the cost of buying their cars by loaning them out, automotive Airbnb-style, and the cars would drive themselves to the next car-share customer.

The good thing is that at worst, the Level 5 Model 3s are yet another much-delayed Musk project, and at best, they will never appear at all. Musk has said that it is irresponsible for an automaker to delay any system that could save lives in the way full autonomy promises. But by overselling the promises of his company's technological development, he's leaving an opening for traditional automakers to properly test Level 4 and 5 systems and beat Tesla to market with safer autonomous vehicles.