Many of us have been looking forward to the ease of driving on autopilot, freeing up time so we can spend it more productively. After the Tesla Model S, purportedly driving in autopilot mode, crashed into the back of a parked fire truck that was responding to an accident on I-405 in Culver City, Calif., we've come to understand that "autopilot" is apparently not what it seems to be.
Tesla has since released a statement that autopilot is intended only to be used by an attentive driver. But Tesla's autopilot website states, "All Tesla vehicles produced in our factory … have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver." The accompanying web video states, "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." The website claims "full self-driving capability … enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver." The phrase "almost all circumstances" is the critical wording here.
|What Does 'Autopilot' Mean?
The term "autopilot" needs to be better understood as it applies to vehicles, however. Merriam-Webster defines autopilot as (1) a device for automatically steering ships, aircraft, and spacecraft; also: the automatic control provided by such a device; and (2) automatic pilot. Automatic pilot is defined as a state or condition in which activity or behavior is regulated automatically in a predetermined or instinctive manner, for example, "doing your job on automatic pilot."
By these definitions, and our understanding of the term, we question the use of the term "autopilot" to describe the operations of the Tesla at the time of the crash. What we now know is that a pickup truck suddenly swerved into the right lane to avoid hitting the fire truck, and with the Tesla traveling at 65 mph, there was no time for the driver to react.
Despite the website claims, the Tesla operator's manual warns that the system is ill-equipped to handle this exact situation. According to the manual, Traffic-Aware Cruise Control cannot detect all objects and may not brake or decelerate for stationary vehicles, especially when the car is traveling over 50 mph (80 km/h) and a vehicle being followed moves out of the driving path, leaving a stationary vehicle or object in front of the car instead.
What does this mean? Simply stated, a vehicle in autopilot mode, or any car currently equipped with adaptive cruise control or automated emergency braking, will not brake to avoid hitting a stopped vehicle. It might even accelerate toward it.
For example, if a vehicle in front of a car with adaptive cruise control changes lanes or turns off the road and there is a stopped vehicle immediately in front of the automated vehicle, the stopped vehicle will not be detected by the adaptive cruise control. The vehicle may even accelerate to reach the set cruise control speed because it sees the roadway as clear. The automated vehicle only sees the vehicle in front abruptly putting on its turn signal and leaving the lane, but it doesn't see the stopped vehicle.
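The cut-out scenario described above can be sketched in a few lines of Python. This is a deliberately simplified illustration of how a speed-based clutter filter in a radar-style system can end up discarding a stopped vehicle; every name, class, and threshold here is invented for the example, and it does not represent Tesla's or any manufacturer's actual code:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float   # range to the detected object
    speed_mps: float    # object's absolute speed (0.0 = stationary)
    in_lane: bool       # whether the object lies in the car's path

def select_target(returns, min_moving_mps=2.0):
    """Pick the nearest in-lane MOVING object to follow.

    At highway speed, near-zero-speed returns are treated as roadside
    clutter (signs, bridges) and dropped -- which is also what drops a
    stopped vehicle left in the lane after a lead car cuts out.
    """
    candidates = [r for r in returns
                  if r.in_lane and r.speed_mps >= min_moving_mps]
    return min(candidates, key=lambda r: r.distance_m, default=None)

# A moving lead car at 25 m plus a stopped fire truck at 60 m:
# the system tracks the lead car and follows it normally.
scene = [RadarReturn(25.0, 28.0, True), RadarReturn(60.0, 0.0, True)]
print(select_target(scene).distance_m)  # 25.0 -- lead car is the target

# The lead car swerves out of the lane; only the stopped truck remains,
# and the stationary-object filter discards it. With no target, cruise
# control sees a "clear" road and may accelerate back to the set speed.
scene = [RadarReturn(25.0, 28.0, False), RadarReturn(60.0, 0.0, True)]
print(select_target(scene))  # None -- stopped truck is ignored
```

In this toy model, the dangerous behavior is not a sensing failure at all: the truck is detected, but a filter designed to ignore stationary roadside clutter cannot distinguish it from a sign or a bridge abutment.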
It's not yet known what types of objects the adaptive cruise control would ignore if they're stationary in the roadway: a downed pole perhaps, or a child stopped on a bicycle? It would seem to be in the interest of public safety to reveal the types of objects the vehicle may not brake for, and to better explain the features of these systems.
|Read the Manual
It's always important to read the vehicle owner's manual, but in the case of newer vehicles equipped with autopilot mode, adaptive cruise control, automated emergency braking, or similar systems, it is vital for the driver to fully understand the benefits and shortcomings of these systems before driving the vehicle.
Likewise, manufacturers need to be cautious when naming their systems. "Autopilot" gives the impression that the vehicle can run on its own without human assistance. If sued, would a court agree with a driver who claimed that, based on the name, he or she believed the vehicle could drive itself? That belief fits the common understanding of the term.
We realize other issues would also be at play, but semantics do matter. Otherwise, there could be more crashes of the type experienced with the Tesla and the fire truck, or perhaps worse. Insurers and agents may need to step in and educate their insureds as manufacturers continue to add more computerized driver assist features to vehicles.
Karen L. Sorrell is an insurance news editor with ALM Media. She can be reached by sending email to [email protected].
The opinions expressed here are the writer's own.