- The Washington Times - Tuesday, October 29, 2024

Self-driving cars have the potential to dramatically change transportation, but collisions between humans and autonomous vehicles pose complicated legal challenges. 

With no driver at the wheel, a crash with an autonomous vehicle, or AV, can mean confusion, especially while the technology is in its infancy. Drivers will have to navigate issues including how to file an insurance claim and, if a lawsuit becomes necessary, whom to sue and how.

“It’s imperfect,” said University of South Carolina law professor Bryant Walker Smith, whose work focuses on AVs. “Crash responders have remarked that to communicate with a vehicle on a crash scene that’s blocking a car in an emergency, they’ve had to stick their head in the window and yell because there was no way that they could hear or speak with that assistant from outside the vehicle.”

The beginning of the process is expected to look similar to how drivers navigate crashes with other humans, even if there is no other human. After a crash with an AV, a human driver would exit their vehicle, contact law enforcement and inform their insurance company of the collision. If the AV is at fault, the driver’s insurance company would contact the AV manufacturer or developer to determine a settlement. 

AV groups and companies have created training programs for law enforcement to ensure smooth responses to collisions involving their vehicles, similar to crashes between human drivers. However, AVs have caused problems for first responders in the past: Waymo robotaxis blocked emergency vehicles 66 times last year, according to the San Francisco Police Department. 

AVs complicate legal liability in collisions by introducing new parties. Unlike traditional collisions, in which one human is typically at fault and liable for injuries or property damage, AVs lack a human operator, shifting legal responsibility to the manufacturer. As a result, victims can't pursue personal injury lawsuits, which rest on proving another person's negligence, because there is no human to hold accountable. Instead, they must pursue product liability lawsuits, aiming to show that the AV malfunctioned or made a critical error.

“When you start to introduce the car making a mistake, you get into product liability rules,” QuantivRisk Chairman and founder Mike Nelson said. “People adjudicating who is at fault are going to have a hard time dealing with percentage responsibility the car manufacturer has versus the driver of the car.”

AV manufacturers such as Volvo and Tesla, as well as robotaxi companies such as Alphabet’s Waymo and General Motors’ Cruise, have promised to take responsibility if their vehicles cause collisions. That could lead to a speedier legal process for victims, but some companies may choose to fight liability claims to protect their image. 

“They will probably choose a few high-profile cases to defend their technologies when they are really confident that their story is solid,” Mr. Smith said. “To say ‘we did everything right, and in fact, the only reason you’re alive is because we responded to your failure better than an ordinary human driver would have.’” 

Product liability lawsuits are more complicated and costly than personal injury suits. To prove an AV — or any product — is defective, expert testimony is typically needed, inflating legal expenses for victims. For example, AVs rely on artificial intelligence, so a lawsuit trying to prove product liability would need an AI expert.  

However, while engineers can demonstrate an AI system's inputs and outputs, they often cannot explain how it reached a specific decision. That lack of transparency complicates proving that an AV malfunctioned or made the wrong decision on the road.

While AVs may alter the typical legal processes drivers navigate after a collision, some industry insiders say they will not fundamentally change the legal landscape.

“Autonomous vehicles are game-changing technology and will deliver incredible benefits to Americans,” Autonomous Vehicle Industry Association CEO Jeff Farah said. “But there is no need to change the liability regime in our country to accommodate AVs — they work within the existing regime that has been in place for years.”

Additionally, drivers might be better off colliding with AVs, even if humans are in the wrong. Some legal research indicates judges may be harsher on AVs than on humans. A 2022 study of hundreds of judges found that more than half assigned greater blame and damages to AVs, even under identical circumstances. As a result, human drivers injured in a collision with a self-driving car may have a better chance of securing payment for damages.

Cruise settled a lawsuit in May after one of its AVs struck and dragged a pedestrian under its wheels late last year. The terms of the settlement haven't been disclosed, but reports suggest it exceeded $8 million in damages.

Cases involving AV collisions are rare so far, making it challenging to gauge the cost and legal complexity for human drivers. Earlier this year, Tesla settled a wrongful death lawsuit with the family of Walter Huang, who died after his Tesla crashed into a highway divider while in Autopilot mode. The settlement has not been disclosed. 

While Tesla’s vehicles are not fully autonomous, the legal arguments made by the plaintiffs in cases against the company could signal potential issues for other AV manufacturers and victims. Huang’s family asserted that Tesla’s driver assistance programs had significant design flaws. They also accused the company of misleading customers through its marketing, which portrayed the technology as fully autonomous. 

While AVs might introduce some complexity to liability laws, the AV industry argues that its vehicles will significantly reduce collisions, meaning most people won’t have to go through the legal system in the first place. 

Waymo in September reported that its driver was involved in 84% fewer airbag-deployed collisions than human drivers, 73% fewer injury-causing collisions and 48% fewer police-involved collisions.

“Because the Waymo Driver is programmed to respect the rules of the road like stop signs and speed limits, and because it never drives drunk, drowsy or distracted, the Waymo Driver prevents many common types of crashes from occurring at all,” Waymo said. 

According to collision data recorded under requirements from the National Highway Traffic Safety Administration, 617 crashes involving AVs have occurred since 2021. Of those, 91 were in the first half of 2024. Most collisions involving AVs haven't resulted in major injuries, with more than 65% occurring in zones with speed limits below 25 mph.

Most crashes between human drivers also don't lead to major injuries. A 2024 analysis of human and AV collisions found that 24.7% of crashes resulted in minor injuries and 68.76% in no injuries. Additionally, the majority of collisions in the U.S. occur below 40 mph, with about 30% happening at speeds under 25 mph.

• Vaughn Cockayne can be reached at vcockayne@washingtontimes.com.

Copyright © 2024 The Washington Times, LLC.