- The Washington Times - Friday, June 17, 2022

A new government report on partially automated vehicle crashes is stirring safety concerns that could pump the brakes on the development of self-driving cars.

Automakers reported 392 crashes of vehicles equipped with partially automated driver-assistance technology from last July through May 15, the National Highway Traffic Safety Administration announced last Wednesday.

Among them, electric car maker Tesla led the way with 273 accidents, and Honda had 90. Subaru reported 10 crashes, and all other automakers reported five or fewer.

The U.S. safety regulator cautioned in a news release that the report, collected as part of a Standing General Order issued to more than 100 manufacturers last June 21, was “not comprehensive.”

Industry watchdogs said the report justifies greater scrutiny of automakers as they race to produce fully self-driving cars that go beyond current functions like automatic lane-centering.

“This recent report shows the need for us to move slowly with the widespread use of self-driving vehicles. In my opinion, there seems to be some other agenda behind this huge push to fully automate the process of driving,” said Bob Norton, a former assistant general counsel at Chrysler.

Mr. Norton, now a vice president at Hillsdale College in Michigan, added that he’s troubled by the “mixed messages” of some newer car commercials that show families “clapping along with the music on the radio” while a car drives itself.

“We should be emphasizing the fact that drivers need to always keep their hands on the wheel and their eyes on the road,” he said.

Tesla’s accidents involved its Autopilot, “Full Self-Driving,” Traffic Aware Cruise Control and other driver-assistance software. The self-driving software allowed some of its electric cars to roll through stop signs.

Kelly Funkhouser, manager of vehicle technology at Consumer Reports, said accidents could be underreported in the NHTSA tally.

“There needs to be better oversight of these systems, and better standards for collecting, recording, and reporting the data,” Ms. Funkhouser said. “For example, it doesn’t appear that there are any reports from several automakers that we know have systems similar to Autopilot.”

Tesla, which issued sweeping recalls of its electric car models earlier this year amid a federal investigation into its safety practices, did not respond to a request for comment.

In a statement emailed to The Washington Times, American Honda said some accidents may not have appeared in last week’s tally due to the government’s reporting guidelines.

The company wrote that because the crashes were based on “unverified customer statements” about its Advanced Driver Assistance Systems, “it is likely that some reported incidents would not have met NHTSA’s reporting criteria given more definitive data and time.”

“Therefore, Honda urges caution when comparing [required] crash report data between automakers, as apple-to-apple comparisons simply may not be possible at this time,” the automaker said.

Honda’s reporting “thus far has not identified any defects in the ADAS features in Acura or Honda vehicles,” the statement said.

Marc Scribner, a senior transportation policy analyst at the libertarian Reason Foundation, said that while the report “raises serious questions” about self-driving technology, it should not be used “to declare any of the technologies unsafe or guide enforcement actions.”

“Politicians should be careful to understand the limitations of these data and the differences between the classes of automation systems covered by the report in order to avoid counterproductive actions based on elementary technical errors,” Mr. Scribner said.

Other advocates of the technology agreed. Tara Andringa, executive director of the nonprofit Partners for Automated Vehicle Education, said “human factors account for most vehicle crashes.”

“The data and statistics show that AV technology is safe and effective, particularly when compared to humans driving vehicles with no automation or driver assistance features,” Ms. Andringa said.

Safety watchdogs are not so sure.

Joe Young, director of media relations for the Virginia-based Insurance Institute for Highway Safety, said the crash report raises more questions than it answers.

“There isn’t yet any clear evidence that partially automated driver assistance systems provide any safety benefits, so our priority in recent years has been to push for applications of the technology that ensure it won’t make things less safe,” Mr. Young said.

Last week’s report comes as safety concerns and a patchwork of state regulations have slowed the testing of fully self-driving vehicles that operate without human drivers.

A handful of states have allowed manufacturers to test fully automated vehicles in limited circumstances.

An Uber safety driver was charged with negligent homicide after the self-driving test vehicle she was monitoring struck and killed a 49-year-old woman crossing the street in Tempe, Arizona, in 2018.

Last week’s NHTSA report found that 25 companies testing fully automated vehicles reported 130 crashes. The leaders were Google’s Waymo with 62, Transdev Alternative Services with 34 and General Motors-controlled Cruise with 23.

Walter Block, an economist at Loyola University New Orleans, said the jury is still out on self-driving cars.

“One day soon, Automated Vehicles will be better drivers than ordinary human beings. But that time has not yet come,” Mr. Block said.

• Sean Salai can be reached at ssalai@washingtontimes.com.

Copyright © 2024 The Washington Times, LLC.
