Tesla settles wrongful death lawsuit one day before trial starts

Tesla settled with the family of Walter Huang for an undisclosed amount after he was killed in a crash while using its Autopilot software.

Tesla may skirt liability through its labeling of the Full Self-Driving and Autopilot features as "beta" systems and through arbitration clauses found in purchase agreements. (Credit: Diego M. Radzinschi/ALM)

Walter Huang, a 38-year-old father of two, was killed in a crash shortly after dropping his son off at preschool on March 23, 2018. Huang was relying on the Autopilot feature of his Tesla Model X at the time of his death. It’s believed he was playing a game on his iPhone when the semiautonomous vehicle veered out of its lane on a busy San Francisco Bay Area highway, accelerated, and smashed into a concrete highway barrier at over 70 miles per hour. The negligence and wrongful death lawsuit filed by Huang’s family was settled earlier this month (April 2024) for an undisclosed amount, just one day before the trial was set to start.

The lawsuit sought to hold Tesla and its CEO, Elon Musk, liable for exaggerated statements about the self-driving capabilities of its cars, claiming the branded Autopilot feature was promoted in a way that led drivers to believe they did not need to be as vigilant behind the wheel. This case and others suggest Musk’s remarks about the Autopilot and Full Self-Driving (FSD) technologies foster unfounded faith in the vehicles’ abilities and bring concerns about FSD to the forefront.

Beta models may skirt liability 

Tesla often refers to its FSD technology as a “beta,” which could safeguard the company (and insurers) from liability because the software is, by definition, not a finished product. The FSD Beta puts the vehicle in control of its own driving, but Tesla does not take responsibility for its operation. The Beta driver is expected to stay vigilant and remain responsible for the vehicle, ready to intervene if the software fails. One Beta driver, Electrek writer Fred Lambert, claims the FSD Beta software saved him once but nearly killed him twice.

“In itself, the system is impressive, but it is not the robotaxi Tesla promised. It is able to render its environment to an impressive level of accuracy, and it can navigate difficult intersections, but it also often fails in dangerous ways,” writes Lambert. “As I was going through a sharp right turn, FSD Beta decided to stop turning halfway through the turn and brought the steering wheel back straight. If I didn’t instantly grabbed [sic] the wheel and applied the brakes, I would have driven us right off the cliff side.”

Mounting legal battles

Tesla has faced several lawsuits and probes in recent years. In May 2021, the California Department of Motor Vehicles launched an investigation into Tesla; it then filed an administrative complaint in July 2022 and a motion on Nov. 20, 2023, accusing Tesla of misleading customers with exaggerated claims about Autopilot’s and FSD’s capabilities, including their supposed ability to complete “short and long distance trips with no action required by the person in the driver’s seat.” Neither technology can yet perform these feats without driver interaction.

In its response to the DMV investigation, Tesla said the company “relied upon [the DMV’s] implicit approval of these brand names,” noting in its response notice that the DMV had never taken action against Tesla or otherwise communicated that there was a problem with Tesla’s advertising using the brand names. Tesla also claims the state’s false advertising rules on autonomous vehicles “restrict constitutionally protected speech that is truthful and nonmisleading.”

Musk is known for boasting about Tesla’s Autopilot and FSD features, using hedged language such as “expect” and “hope” to skirt liability while lulling customers into what could be a false sense of security, given the crashes, injuries, and deaths linked to the semiautonomous vehicles. Should the DMV win its case, Tesla could lose its California manufacturer’s license and be forced to pay restitution for financial losses and damages.

The DMV motion follows earlier investigations by the National Highway Traffic Safety Administration (NHTSA) into Autopilot and FSD safety concerns. The NHTSA investigation led Tesla to recall over two million cars in December 2023, covering most of the vehicles it has manufactured in the U.S. since 2012.

Previous legal outcomes 

Tesla settled a 2017 class action lawsuit, which alleged the Autopilot feature was “essentially useless and demonstrably dangerous,” for $5 million. Another class action lawsuit, filed in September 2022, was dropped after it was revealed that four of the five plaintiffs had signed arbitration agreements when purchasing their Tesla vehicles online and that the fifth plaintiff’s claim was past the statute of limitations. The arbitration clause limits the legal options of Tesla owners injured or killed in a crash. Though the agreement includes a 30-day opt-out option, most Tesla owners are unlikely to complete that paperwork so soon after buying their vehicle.

Other individual lawsuits against Tesla that went to trial ended with verdicts finding the company not liable, including one over a 2019 crash that injured Justine Hsu and another over a 2019 crash that killed Micah Lee. However, more cases are on the horizon. In November 2023, a Florida judge allowed a case against Tesla to move forward based on evidence that Musk and some Tesla managers were aware the Autopilot software was defective. That case involves a fatal crash that killed Stephen Banner and could proceed to trial. Several other lawsuits are working their way through the U.S. court system and could reach trial as individual cases.
