Autonomous cars have been heralded as the future of transportation, but questions remain over who will be held responsible when a self-driving vehicle is involved in a collision.
Automobile manufacturers, insurance companies, and transportation regulators are all wrestling with the liability issue. Manufacturers have been quick to note that technology to make driving fully autonomous is still evolving and even the most advanced self-driving cars require human supervision.
“There is a misconception that just because a vehicle has the capability to self-drive that it doesn’t need human oversight,” an auto lobbyist told The Washington Times. “Autonomous cars are perfectly safe, but like every other piece of technology, they are safer when people are paying attention when they use them.”
The notion was put to the test in 2018 when a self-driving Uber vehicle struck and killed a pedestrian in Arizona. The incident, the first recorded fatality involving a self-driving car, happened even though the vehicle had a backup human driver behind the wheel.
The backup driver, however, was distracted, watching television on a smartphone at the time of the crash. Prosecutors in Arizona eventually charged her with negligent homicide.
Uber itself skirted charges mainly because it had a paid employee at the wheel whose job was to monitor the self-driving car. Auto manufacturers cannot count on the same argument to avoid liability for future accidents.
“The fact [the backup driver] was watching TV makes her an easy and maybe convenient person to accept responsibility,” Ed Walters, a lecturer of robotics law at Georgetown University, told CNN at the height of the case. “Remove that fact and it could easily be Uber.”
Some manufacturers, including Volvo, have responded to the reality by agreeing to accept full liability for what their cars do when in autonomous mode. Others have argued they should only be held liable if the self-driving technology malfunctions and not for human ineptitude.
Still, some are trying to have it both ways.
Tesla, which has dominated the self-driving car market, pitches its Autopilot technology as the car “driving itself.” Yet the company also explicitly warns drivers that they must keep their hands on the wheel at all times and that the technology can only assist with steering, braking, speed, and lane changes without making the car fully autonomous.
“The person in the driver’s seat is only there for legal reasons,” a video on Tesla’s website says. “He is not doing anything. The car is driving itself.”
Who will ultimately wind up liable for accidents when a vehicle is self-driving will be decided by state and federal regulators. At the moment, though, there are few laws and regulations governing self-driving cars on the books.
Auto safety advocates see that as a problem given the rate of recent accidents associated with self-driving cars. A report by the National Highway Traffic Safety Administration found that over a 10-month period, there were nearly 400 collisions involving self-driving vehicles.
“These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” said Steven Cliff, the agency’s administrator.
Tesla, whose vehicles accounted for more than 270 of the incidents, has said the rate of collisions will decrease as its technology continues to evolve and improve.
Earlier this month, the company issued a recall of 362,758 vehicles equipped with its full self-driving software over concerns that the system could cause crashes.
• Haris Alic can be reached at halic@washingtontimes.com.