(Bloomberg) – Telling Tesla drivers its Autopilot feature doesn't mean their cars can drive themselves may not be enough to keep Elon Musk off the hot seat if the technology comes up short.
This month, two Teslas equipped with Autopilot veered into barriers following disclosure of the first fatal wreck, in which a Model S slammed into an 18-wheeler crossing a Florida highway after the semi-autonomous car failed to distinguish the truck's white trailer from the sky.
Tesla Motors Inc. warns drivers they must still pay attention and be ready to grab back control of the car, but there's a lot in a name.
“The moment I saw Tesla calling it Autopilot, I thought it was a bad move,” said Lynn Shumway, a lawyer who specializes in product liability cases against carmakers. “Just by the name, aren't you telling people not to pay attention?”
Joshua Brown's death in Florida was the first involving Tesla's semi-autonomous technology, triggering chatter in legal circles about who was liable for the crash and prompting a probe by the National Highway Traffic Safety Administration as well as the National Transportation Safety Board, which typically devotes its attention to mishaps involving planes and trains. Some details remain in dispute, including whether Brown, a former Navy SEAL, might have been watching a Harry Potter movie in a DVD player found in the car.
Musk had anticipated the moment for at least two years, telling drivers to keep their hands on the wheel because they will be accountable if cars on Autopilot crash. Tesla buyers must activate the Autopilot software, which requires them to acknowledge the technology is a beta platform and isn't meant to be used as a substitute for the driver.
Driver's responsibility
When U.S. investigators began evaluating Brown's crash, Tesla doubled down in a statement: “Autopilot is an assist feature. You need to maintain control and responsibility of your vehicle.”
But people will be people and they often don't do what they're supposed to do.
Lawyers compare giving Tesla drivers Autopilot to building a swimming pool without a fence; the property owner should know that neighborhood kids will find it hard to resist and may get hurt.
“There's a concept in the legal profession called an attractive nuisance,” said Tab Turner, another lawyer specializing in auto-defect cases. “These devices are much that way right now. They're all trying to sell them as a wave of the future, but putting in fine print, 'Don't do anything but monitor it.' It's a dangerous concept.”
As with earlier so-called smart features such as anti-lock brakes and electronic stability control, warning drivers that Autopilot might not prevent an accident won't help Tesla in court if the technology is found to be defective, Turner said.
“Warnings alone are never the answer to a design problem,” he said.
Possible arguments
In a court case, lawyers for accident victims or their families would have other lines of attack if Tesla blames accidents on drivers failing to heed warnings. They could assert that Tesla's software is defective because it doesn't do enough to make sure drivers are paying attention.
Attorneys could also argue that, in Brown's case for example, the car should have recognized the tractor-trailer as an obstacle, or that Tesla could have easily updated its system to address such a foreseeable problem.
“Any argument will try to establish that Tesla acted in an unreasonable way that was a cause of the crash,” said Bryant Walker Smith, a University of South Carolina law professor who researches automation and connectivity. “It doesn't even need to be the biggest cause, but just a cause.”
If Brown's May 7 crash doesn't end up in court, others might.
A 77-year-old driver from Michigan, a state that has passed laws allowing semi-autonomous and fully autonomous vehicles, struck a concrete median in Pennsylvania, and his 2016 Model X SUV rolled over. Also this month, a driver in Montana said his Tesla veered off the highway and into a guardrail. Both drivers said their cars were operating on Autopilot at the time, and both were cited for careless driving.
Pennsylvania and Montana are among the 42 states without legislation regulating autonomous and semi-autonomous cars. John Thune, chairman of the U.S. Senate Committee on Commerce, Science and Transportation, has asked Musk to brief the committee on details of Brown's crash, according to an e-mailed statement from the committee.
Musk response
Musk fired back in a tweet, saying the onboard vehicle logs show the Autopilot was turned off in the Pennsylvania crash and that the accident wouldn't have happened if it had been on. The company said the Montana driver hadn't placed his hands on the wheel for more than two minutes while the car was on Autopilot.
Musk and Tesla are certain to argue that while the technology has yet to meet the threshold for an “autonomous vehicle,” the Model S has achieved the best safety rating of any car ever tested.
Even with that record, Consumer Reports on Thursday called on Tesla to disable Autopilot on more than 70,000 vehicles. “By marketing their feature as 'Autopilot,' Tesla gives consumers a false sense of security,” Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports, said in the report called “Tesla's Autopilot: Too Much Autonomy Too Soon.”
Tesla shares fell almost 1 percent Friday to $219.64, the lowest since Monday, and were trading at $220.07 at 2:20 p.m. in New York.
“Tesla is consistently introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” the carmaker said Thursday in a statement. “We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”
Khobi Brooklyn, a spokeswoman for the Palo Alto, California-based carmaker, cited the company's earlier comments on the three accidents and declined to comment further on possible litigation involving Autopilot.
National rules
The U.S. government will soon offer the auto industry guiding principles for safe operation of fully autonomous vehicles, part of a plan that includes $4 billion for safety research by 2026.
For now, the double line between Autopilot and full autonomy is a blurry one.
In 2013, NHTSA released a five-rung autonomous vehicle rating system based on cars' computerized capabilities, ranging from level 0 for “no-automation” to level 4 for “full self-driving automation.”
Tesla is likely to argue its technology has yet to surpass level 2: automation designed to relieve the driver of control of at least two functions. Plaintiffs will counter that the car has been marketed more like a level 3, in which the driver can fully cede control of all safety-critical functions while remaining available for occasional intervention.
“It's great technology, I hope they get this right and put people like us out of business,” said Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas. “There's really no excuse for missing an 18-wheeler.”