Tesla reveals details in driverless-vehicle death
(Bloomberg) – Tesla Inc. confirmed the Model X driver who died in a gruesome crash a week ago was using Autopilot and defended the safety record of its driver-assistance system that’s back under scrutiny following a fatality.
Computer logs recovered from the Tesla driven by Wei Huang, 38, show he didn’t have his hands on the steering wheel for six seconds before the sport utility vehicle collided with a highway barrier in California and caught fire on March 23, according to a blog post the company published late Friday.
‘No action was taken’
“The driver had received several visual and one audible hands-on warning earlier in the drive,” Tesla said in the post. The driver had “about five seconds and 150 meters of unobstructed view” of the concrete highway divider and an already-crushed crash cushion that his Model X collided with, according to the company. “But the vehicle logs show that no action was taken.”
The collision occurred days after an Uber Technologies Inc. self-driving test vehicle killed a pedestrian in Arizona, the most significant incident involving autonomous-driving technology since a Tesla driver’s death in May 2016 touched off months of finger-pointing and set back the company’s Autopilot program.
A U.S. transportation safety agency said Tuesday it would investigate the Model X crash, contributing to Tesla’s loss of more than $5 billion in market value this week.
‘Mushy middle’
“This is another potential illustration of the mushy middle of automation,” Bryant Walker Smith, a University of South Carolina law professor who studies self-driving cars, said in an email. Partial automation systems such as Tesla’s Autopilot “work unless and until they don’t,” and there will be speculation and research about their safety, he said.
Tesla defended Autopilot in the blog post, saying a vehicle equipped with the system is 3.7 times less likely to be involved in a fatal accident. U.S. statistics show one automotive fatality for every 86 million miles driven across all vehicles, compared with one for every 320 million miles driven in vehicles equipped with Autopilot hardware, according to the company.
Devastating event
“None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends,” Tesla wrote, pushing back against criticism that it has shown a lack of empathy by citing safety statistics in response to past scrutiny of Autopilot. “We must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.”
Tesla has introduced driver-assistance features through Autopilot that the company continuously improves via over-the-air software updates. While the company has said since October 2016 that it builds all of its cars with the hardware needed for full self-driving capability, it hasn’t said when its vehicles will clear the testing and regulatory hurdles necessary to drive without human involvement.
The U.S. National Transportation Safety Board (NTSB) sent investigators to look into the crash. The agency and the National Highway Traffic Safety Administration (NHTSA) also are examining a Jan. 22 collision in Los Angeles involving a Tesla Model S using Autopilot and a fire truck parked on the freeway.
NTSB findings
The NTSB concluded in September that Autopilot’s design was a contributing factor in the 2016 fatal crash in Florida involving a Model S driver who’d been using the system and collided with a semi-trailer truck. The agency criticized Autopilot for giving “far too much leeway to the driver to divert his attention to something other than driving.”
In the wake of that crash, Tesla updated Autopilot to stop allowing drivers to ignore repeated warnings to keep their hands on the wheel.
The NTSB also criticized partially automated driving systems that monitor only steering-wheel input and don’t measure whether drivers are watching the road; Tesla hasn’t adopted or enabled sensors that can track whether drivers’ eyes are on the road ahead.