An 18-year-old college freshman finishes her final mid-term.

Exhausted but excited, she walks to the parking lot, finds her already packed sedan and heads out for spring break. She pulls away from campus and places the car into autonomous mode.

Wanting to let her friends know she is on her way, she reaches into her purse, grabs her cell phone and starts sending a text. What feels like an instant later, she squints against the harsh fluorescent bulb above as a nurse tells her she was in an accident. In the few seconds she took her eyes off the road, trusting her safety to her car's "autonomy," an SUV broadsided her at an intersection at 50 miles per hour.


New era changes notions of liability

Fortunately, this account is fictitious, but the scenario exemplifies how, from a legal perspective, the autonomous vehicle era will change notions of liability, where even the distracted driver is not necessarily at fault. There will be a whole new playbook, new theories of liability, and human vs. machine accounts that amount to "he said, it said."

Autonomous vehicles will become part of many types of claims, including personal injuries, property damage and, with respect to autonomous cargo trucks, issues such as business interruption and damaged or lost goods. Adjusters are facing a minefield in sorting out fault, coverage and exposure in claims involving autonomous vehicles.


Between full autonomy and total driver control, consumers will have many intermediate options. (Photo: Shutterstock)


Autonomous vehicle claims

When investigating a claim involving an autonomous vehicle, it is important to first determine the vehicle's classification and therefore its level of "autonomy." Between full autonomy and total driver control, consumers will have many intermediate options.

In September 2016, the National Highway Traffic Safety Administration (NHTSA) issued new guidelines identifying six levels of autonomy:

  • Level 0: No automation.

  • Level 1: Some automation (like automatic braking).

  • Level 2: Automation can conduct some of the driving (like lane assistance and adaptive cruise control).

  • Level 3: Driver can cede control over some driving in some circumstances.

  • Level 4: Fully autonomous under certain conditions.

  • Level 5: Fully autonomous for all conditions (no steering wheel required).

Level 2 and 3 vehicles will most complicate fault determinations, due in part to the very concept of rotating responsibility between human and machine. With Level 4 and 5 vehicles, the "driver" is the vehicle itself. Thus, if the vehicle is at fault, the claim would sound only in product liability unless there is evidence of independent negligence, such as improper maintenance.
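The mapping from autonomy level to the liability theories an adjuster would consider first can be sketched as a simple lookup. This is an illustrative sketch only; the function name and theory labels are hypothetical, not drawn from any NHTSA or industry taxonomy.

```python
def liability_theories(level: int) -> list[str]:
    """Hypothetical first-pass mapping from NHTSA autonomy level (0-5)
    to the liability theories most likely in play after a crash."""
    if level not in range(6):
        raise ValueError("NHTSA defines levels 0 through 5")
    if level <= 1:
        # Little or no automation: the driver is presumptively responsible.
        return ["driver negligence"]
    if level in (2, 3):
        # Responsibility rotates between human and machine,
        # so both theories must be investigated.
        return ["driver negligence", "product liability"]
    # Levels 4-5: the "driver" is the vehicle itself, so claims sound in
    # product liability absent independent negligence (e.g., maintenance).
    return ["product liability", "independent negligence (e.g., improper maintenance)"]
```

The Level 2/3 branch returning both theories is the point of the sketch: those vehicles require a dual investigation from the outset.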


Vehicle data recorder

After determining the vehicle's classification, a major priority will be obtaining data from the vehicle's event data recorder or "black box." NHTSA recommends all autonomous vehicles have a black box, which often records information such as speed, seat belt usage and braking. With autonomous vehicles, there could be additional data.

For example, after the fatal May 2016 crash involving a Tesla Model S with "autopilot" (a Level 2 vehicle), the black box showed that autopilot was engaged, that the automatic emergency braking system did not provide any warning or braking, and that the driver took no braking, steering or other actions to avoid the collision.

NHTSA ultimately used this data to help reconstruct the accident and found that the Tesla performed as designed, and that both the vehicle and the driver failed to respond to a crossing tractor-trailer despite its being visible for seven seconds before impact. Note, however, that NHTSA's role is to determine whether there is a defect necessitating a recall, not to determine potential civil liability.
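An adjuster's first pass over black-box data like that described above can be sketched as a simple record-and-flag check. The field names and thresholds here are hypothetical placeholders, not an actual EDR format; real recorders vary by manufacturer.

```python
from dataclasses import dataclass

@dataclass
class EDRSnapshot:
    """Hypothetical pre-crash snapshot with fields of the kind black
    boxes often record (speed, braking) plus automation status."""
    speed_mph: float
    autopilot_engaged: bool
    aeb_warning_issued: bool   # did automatic emergency braking warn or act?
    driver_braked: bool
    driver_steered: bool

def flag_for_review(rec: EDRSnapshot) -> list[str]:
    """Return human-readable flags an investigator would want explained."""
    flags = []
    if rec.autopilot_engaged and not rec.aeb_warning_issued:
        flags.append("automation engaged but no system warning or braking")
    if not (rec.driver_braked or rec.driver_steered):
        flags.append("no evasive action by driver")
    return flags
```

Applied to values matching the Model S account (Autopilot on, no AEB warning, no driver braking or steering), both flags fire, which mirrors the dual human-and-machine failure NHTSA described.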


Cameras may make images possible

In addition to black boxes, some autonomous vehicles will use cameras as component parts. Thus, it is plausible that images may be available and efforts should be made to retrieve them.

After obtaining any available hard data, the autonomous vehicle components should ideally be inspected for pre-existing damage or improper maintenance. The vehicle should also be checked to ensure any software updates were downloaded. Illustrating the importance of updates, four months after the fatal Model S crash, Tesla wirelessly beamed a software update into its vehicles that created new safety measures.


In further determining potential liability, an area of contention to explore involves consumer expectations. (Photo: Shutterstock)


Who's at fault?

In further determining potential liability, an area of contention to explore involves consumer expectations. Among the public criticisms of the Model S is the word "autopilot," which could suggest a total cessation of control more suitable for at least a Level 3 vehicle. For example, following the fatal crash, Consumer Reports criticized the name "autopilot," saying it could give consumers a false sense of security.

The Model S does describe its limitations in the owners' manual. The "Driver Assistance" section includes 52 warnings and six cautions. Then, there is the catch-all: "Never depend on these components to keep you safe. It is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times."


In determining case value and exposure, it's important to realize that American attitudes have not yet caught up to manufacturer excitement. (Photo: Shutterstock)


Anticipating the inattentive driver

However, in its report on the Model S, NHTSA noted that realistically, drivers do not always read their manuals. The new NHTSA guidelines encourage manufacturers of Level 2 and 3 vehicles to keep the inattentive driver in mind.

Manufacturers are listening. The September 2016 Model S update created a "three strikes" feature: the vehicle detects when the driver's hands are off the wheel, provides warnings, and after the third "strike," shuts off the autopilot system.
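The "three strikes" policy described above amounts to a small state machine: count hands-off detections, warn on each, and disengage after the third. The class and method names below are hypothetical; this is a sketch of the described logic, not Tesla's implementation.

```python
class HandsOffMonitor:
    """Hypothetical sketch of a 'three strikes' hands-off policy:
    warn on each detection, disengage automation on the third."""

    def __init__(self, max_strikes: int = 3):
        self.max_strikes = max_strikes
        self.strikes = 0
        self.autopilot_on = True

    def hands_off_detected(self) -> str:
        """Called each time the vehicle detects hands off the wheel."""
        if not self.autopilot_on:
            return "autopilot already off"
        self.strikes += 1
        if self.strikes >= self.max_strikes:
            self.autopilot_on = False   # third strike: shut off autopilot
            return "autopilot disengaged"
        return f"warning {self.strikes} of {self.max_strikes - 1}"
```

From a claims perspective, a log of these warnings and the disengagement event would itself become evidence of what the driver was told and when.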

Other manufacturers, like GM and Audi, are taking a different route and will be installing a camera near the rearview mirror that can monitor the driver's eyes and head to tell if the driver is being attentive.

Not all manufacturers are convinced those warning systems will work, however. Honda and Toyota have added radar, cameras and automatic braking, but they have purposefully avoided an autopilot-like system. Taking it a step further, Google long ago abandoned development of Level 2 vehicles after finding test drivers were not paying attention even when told to do so.


Programs to educate consumers?

To help ensure accurate consumer expectations, NHTSA recommends that manufacturers and dealers develop programs to educate consumers regarding the abilities and limits of their vehicles, and even suggests providing on-road, hands-on training. Thus, the pool of potentially liable parties and theories can expand beyond manufacturers to dealers and distributors on claims of negligent warning, education and training.

Finally, in determining case value and exposure, it is important to realize that American attitudes have not yet caught up to manufacturer excitement. An American Automobile Association survey released in March 2017 found 78 percent of Americans are afraid to ride in a self-driving vehicle. Perhaps movies like Terminator and The Matrix have conditioned us to mistrust machines with our safety.

Fears regarding autonomous vehicles are only heightened by media accounts sensationalizing the autonomous vehicle in an accident — whether it is the fatal Model S crash or the March 2017 viral photo of the Uber autonomous Volvo SUV on its side. In both cases, the autonomous vehicle was ultimately found not at fault by investigators. For the foreseeable future, however, potential jurors may be instinctively distrustful of autonomous vehicles.


New way of thinking about claims investigations

Claims investigations involving autonomous vehicles will simply require a new way of thinking. No longer will it necessarily be conclusive that the texting 18-year-old driver is at fault if it was reasonable for her to rely on her vehicle's autonomy. The autonomous vehicle revolution has begun. It's time to change the playbook.

Eric Ruben ([email protected]) focuses his practice on premises and products liability lawsuits and serves as regional counsel for various tire manufacturers. He represents a wide range of clients, including tire manufacturers, banks, telecommunications companies, insurance companies and individuals.
