Tesla Inc. has again placed blame on the driver for a fatal crash involving a Model X last month, as the man’s family hires a law firm to explore legal options.
The March 23 death of Walter Huang happened on a clear day, with several hundred feet of visibility ahead, the electric-car maker said in an emailed statement last week. Tesla had already said Huang, 38, didn’t have
his hands on the steering wheel for six seconds before his vehicle collided with a highway barrier in Mountain View, California, and caught fire.
“The only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so,” the statement said. “The fundamental premise of both moral and legal liability is a broken promise, and there was none here.”
Tesla is ratcheting up the rhetoric it uses to defend Autopilot, a driver-assistance system that the company claims will eventually render its cars fully capable of driving themselves. U.S. National Transportation Safety Board investigators are looking into the crash that killed Huang, as well as a collision in January in which a Tesla Model S using Autopilot rear-ended a fire truck parked on a freeway near Los Angeles.
Tesla said it is “extremely clear” that Autopilot requires drivers to be alert and keep their hands on the steering wheel. The system reminds the driver of this every time it’s engaged, according to the company. Minami Tamaki LLP, the law firm that announced its hiring by Huang’s family earlier Wednesday, declined to comment on Tesla’s statement.
“Tesla’s response is reflective of its ongoing strategy of doubling down on the explicit warnings it has given to drivers on how to use, and not use, the system,” said Mike Ramsey, an analyst at Gartner Inc. “It’s not the first time Tesla has taken this stance.”
After partially faulting Autopilot for a May 2016 fatal crash, NTSB investigators last year called on carmakers to do more to ensure drivers stay engaged as next-generation cars start to steer themselves. Tesla has lagged behind automakers including General Motors Co. and Subaru Corp. in embracing driver-facing camera systems that can monitor head and eye movement and disengage partially autonomous systems when the driver isn’t paying attention.
An NTSB spokesman said this week that Tesla indicated to the agency that it had already made improvements to Autopilot and was working on additional ones, without being more specific.
No Defect, No Recall
The NTSB has no regulatory powers but makes safety-related recommendations to both the government and transportation companies. The National Highway Traffic Safety Administration, which does have the power to order recalls and fine manufacturers, also investigated the 2016 Tesla crash and closed its probe in January 2017, saying it didn’t find a defect.
According to data Tesla gave investigators, Autopilot reduces the rate of crashes per million miles driven by about 40 percent, a figure the company continues to cite in defending the system. The company has declined to say how long drivers can now use Autopilot between visual or audible warnings to keep a hand on the wheel, how many alerts can be ignored before the system disengages, what version of software was in Huang’s Model X, or when it was built.
“We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road,” Tesla said in last Wednesday’s statement. “NHTSA found that even the early version of Tesla Autopilot resulted in 40 percent fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.”