Car Accidents Involving Autonomous Cars: Who Is Liable?

29 May 2017 Dashboard Insights Blog

With automated cars come hopes of safer driving, more efficient commuting, increased productivity, reduced human error, and fewer accidents. However, as self-driving cars become a reality, car accidents may spark legal controversy over who is responsible: the manufacturer or the owner?

In September 2016, the National Highway Traffic Safety Administration (NHTSA) released its Federal Automated Vehicles Policy, which adopted the Society of Automotive Engineers ("SAE") classification levels for automation. The SAE scale classifies automation into six levels, zero through five, based on the required input of the human driver or occupant. Level zero denotes full human control, while level five denotes a fully autonomous vehicle requiring no human input. Cruise control is considered level one, while "partial automation," including parking assist, lane departure warning, and automatic braking, is considered level two. Although Tesla's "Autopilot" feature begins to break the threshold of level two and move into level three, it still requires human input and monitoring to operate. Highly automated vehicles fall between levels three and five.

Despite accidents involving autonomous vehicles, this steady shift from human input toward partial and fully autonomous operation creates a unique and complex legal question not only for the consumer, but also for fellow motorists, manufacturers, and their suppliers.

In March 2017, a Tesla Model X was involved in a minor collision with a Phoenix police officer's vehicle. According to USA Today, the incident barely warranted a police report: there were no injuries or damage, and the contact between the cars was merely a "tap." It drew attention only because the driver alleged that his Tesla was in Autopilot mode. Similar incidents involving Uber, Waymo/Google, and GM autonomous systems have been reported in recent months, but in the overwhelming majority the opposing driver was at fault. As level 1-3, and ultimately level 4-5, autonomous vehicles become more ubiquitous on the roadways, questions have been raised about who will be liable when a vehicle that is fully autonomous, or operating in a fully autonomous mode, is involved in an accident that is the fault of the vehicle rather than the passenger or human driver.

In May 2016, Tesla reported its first driver death with the Autopilot system activated, when a Tesla struck a tractor-trailer that was crossing a highway perpendicular to the flow of traffic. An NHTSA investigation concluded that the driver had at least seven seconds to respond and possibly mitigate or avoid the crash, longer than most drivers have in similar situations, but that distraction likely left the driver unresponsive to the hazard. A review of the Autopilot system installed at the time of the accident noted that it was designed to avoid rear-end collisions, but that the onboard radar was ineffective here because it "tunes out what looks like an overhead road sign to avoid false braking events." Although Tesla has provided extensive information to drivers emphasizing that the Autopilot system requires "continual and full attention" while driving, several autonomous-industry experts have criticized Tesla's roll-out for giving the public the impression that the system is more autonomous and hands-free than it is.

We’ve said it before, but as cars become more autonomous, liability may shift from the driver to the car, and therefore to the manufacturer of the vehicle and/or the supplier of the autonomous component system. But at what point that liability transfers, and by how much, will be up for debate in states across the country over the coming years. As the shift in liability continues, it’ll also be interesting to see if, and how, a shift in litigating these matters might follow.

Historically, individual states have been responsible for determining liability laws, and they are now writing the rules for highly automated vehicles as well. This creates a coordination problem, as each state determines its own rules. According to the NHTSA Federal Automated Vehicles Policy, states should allocate liability among highly automated vehicle owners, operators, passengers, manufacturers, and others when a crash occurs. The question likely to be raised is: if a highly automated vehicle is determined to be at fault in a car accident, who will be held liable? Furthermore, if individual states set liability and fault thresholds at varying levels with little coordination, manufacturers, drivers, and suppliers will be left scrambling when their products cross state lines.

As major and minor manufacturers race to bring autonomous vehicles and systems to market in the coming months, states face allocating liability among those involved in accidents to a degree far more difficult than in the past. Unlike the traditional method of allocating liability, the autonomous vehicle and its system add new players to this calculus: the manufacturer and the supplier of the system. This is largely unknown territory for major vehicle manufacturers and their suppliers, which increases the need for vigilance in system development, maintenance, and driver education beyond past iterations of emerging transit technology.

Please note that Foley Summer Associate Katrina Stencel was a contributing author of this post. The Dashboard Insights team thanks her for her contributions.

