The world of self-driving electric vehicles is outpacing laws and regulations from Washington, even as Tesla and other automakers face major technological hurdles to achieving true autonomy.
Underscoring the concerns, Tesla said Thursday that it was recalling 362,758 vehicles with full self-driving programs that could cause crashes. It affects “certain 2016-2023 Model S, Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with Full Self-Driving Beta (FSD Beta) software or pending installation.”
The advisory said the software “may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”
In addition, Tesla said, the system “may respond insufficiently to changes in posted speed limits or not adequately account for the driver’s adjustment of the vehicle’s speed to exceed posted speed limits.” The company said it will release a free “over the air” software update.
No federal or state laws prohibit the use of driver assistance software as long as a driver is alert and behind the wheel. Critics say a patchwork of state laws and executive orders gives regulators broad discretion.
Tesla dominates the driver assistance industry. Online videos show consumers testing their cars’ autopilot abilities, only for the vehicle to make an illegal maneuver or nearly cause an accident, forcing the driver to quickly take back control from the software.
Dan O’Dowd, a California technology entrepreneur and one-time U.S. Senate candidate, has been calling for years for elected leaders to ban Tesla’s driver assistance technology, which is not fully autonomous but is used by many drivers as if it were. He accuses the company and its billionaire chief executive, Elon Musk, of using the general public as guinea pigs while jeopardizing public safety.
“Take it off our roads. It’s not [Tesla’s] roads to put their 2-ton killer robots on,” Mr. O’Dowd told The Washington Times in a phone interview. “They’re using us as a test bed, and they don’t need to.”
He said Tesla has found loopholes in state regulations, such as California’s, that require rigorous testing and safeguards for fully autonomous vehicles, including the robotaxis already in use. Although Tesla markets its vehicles with Autopilot and Full Self-Driving modes, the company skirts certain rules by classifying its products, in the eyes of the law, as merely advanced cruise control features that require human intervention.
As part of a yearslong campaign against Tesla and its self-proclaimed “self-driving” technology, Mr. O’Dowd spent more than half a million dollars to air a 30-second Super Bowl ad in select markets last weekend featuring a Tesla in driver assistance mode mowing down a child-sized mannequin and almost veering head-on into oncoming traffic.
The ad called on the National Highway Traffic Safety Administration to ban the use of Tesla’s self-driving functions, accusing the company of “endangering the public with deceptive marketing and woefully inept engineering.”
The clip caught the attention of Mr. Musk on Twitter, his social media platform. He responded with a laughing emoji to a Tesla supporter who reposted Mr. O’Dowd’s video with a caption thanking him for “spreading the word that Teslas are the leader for general autonomy.”
“They succeeded in sidestepping the relatively little regulation there is,” Mr. O’Dowd said.
Congress has tried for years to spell out what is and is not permissible for autonomous vehicles and driver assistance software, but it has failed because of disagreements, which cut across party lines, over how to regulate the increasingly popular industry.
Sen. Edward J. Markey, Massachusetts Democrat, and some of his colleagues in the party have pushed for regulation and called on the Federal Trade Commission to investigate Tesla’s marketing.
“I was run over by a car when I was 5 years old, and it’s something you never forget,” Mr. Markey said in a brief interview. “I just want to make sure that these vehicles are safe — not just for people in the car but for pedestrians.”
Meanwhile, bipartisan coalitions are trying to pave the way for more self-driving cars.
The National Highway Traffic Safety Administration, which has safety jurisdiction over autonomous vehicles, changed its safety standards last year at the request of manufacturers. As a result, autonomous vehicles engineered to be truly driverless are no longer required to have steering wheels or pedals.
The move paved the way for Zoox, an Amazon-owned driverless venture, which announced this week that it has deployed robotaxis in California to shuttle its employees between office buildings along a short stretch of public road as a test case.
With roughly 160,000 Tesla owners in the U.S. and Canada able to use driver assistance modes on public streets, Mr. O’Dowd and other critics say regulators are making mistakes. Mr. O’Dowd continually submits complaints and warnings to the National Highway Traffic Safety Administration while pressing the agency for answers about its investigation into crashes involving Teslas using driver assistance programs.
NHTSA acting Administrator Ann Carlson told reporters last month that the regulatory agency is “investing a lot of resources” and “working really fast” to finish an investigation that began in August 2021, according to Reuters.
A 2022 NHTSA report found that Teslas accounted for nearly 70% of the 392 crashes involving driver assistance technologies in roughly the previous year.
Tesla says its driver assistance features “require active driver supervision and do not make the vehicle autonomous,” but the company has been the subject of lawsuits, and its technology and marketing have come under fierce public scrutiny. In December, a driver said the car’s driver assistance software caused an eight-vehicle pileup by unexpectedly slamming on the brakes on a busy highway.
Drivers of such vehicles may still be permitted to operate them on roadways, but officials warn that they can run afoul of the law if they are not careful.
In California, law enforcement officials say dozing off behind the wheel of a Tesla in driver assistance mode — as viral videos have depicted — warrants reckless driving charges under current law despite the lack of explicit mention of such technology.
• Ramsey Touchberry can be reached at rtouchberry@washingtontimes.com.