Tesla recalls millions of vehicles amid probe of Autopilot crashes
Tesla is recalling more than two million vehicles over government contentions that its Autopilot system can be misused by drivers, in the midst of a years-long probe by the top US auto-safety regulator into crashes involving the driver-assistance technology.
The safety recall, which covers nearly all Tesla vehicles sold in the US, could dent the carmaker’s reputation. It is unlikely to be expensive to implement and won’t meaningfully limit drivers’ access to Tesla’s advanced driver-assistance features.
The free software remedy Tesla plans to deploy remotely will add more alerts and controls encouraging drivers to remain attentive while using Autopilot to steer within a lane, according to the recall report.
The National Highway Traffic Safety Administration has been scrutinising Tesla’s system after years of high-profile crashes and questions about whether the company was promising more than it could deliver.
Autopilot is among the most well-known advanced driver-assistance systems and comes standard on new Teslas. It is designed to help drivers with tasks such as steering and maintaining a safe distance from other vehicles on the highway but doesn’t make cars autonomous.
The auto-safety agency found that Tesla’s controls regarding Autopilot can be inadequate, potentially leading drivers to misuse the system.
Tesla’s remedy will vary from vehicle to vehicle, depending on its hardware, according to the recall report. It might include making visual alerts more prominent, simplifying how drivers engage and disengage the automatic steering function, and checking in with drivers more often when they are using Autopilot in certain circumstances.
The electric-car maker has long maintained that driving with Autopilot engaged is safer than doing so without it. Tesla, in a recent post on X, formerly Twitter, pointed to internal data showing that crashes were less common when drivers were using Autopilot.
The safety recall covers Tesla Models 3, S, X and Y produced between late 2012 and December 7, 2023, which are equipped with the company’s automatic steering function.
Critics have said the name Autopilot risks giving drivers an inflated sense of the technology’s capabilities. They have also argued that Tesla doesn’t have enough safeguards in place to stop drivers from improperly using the system, putting themselves and others in danger.
Drivers have at times overridden the safety functions to operate the vehicle without their hands on the wheel.
In 2021, NHTSA opened an investigation into Tesla’s driver-assistance system after a series of crashes involving Autopilot and emergency-response vehicles.
The following year, the agency said it had expanded this examination to include a wider range of crashes, not only those at emergency scenes. Forensic data available at the time for 11 of the crashes showed drivers failed to take evasive action in the two to five seconds before the collision, the agency said.
NHTSA said then that it would further assess how drivers interact with Autopilot and the degree to which it might reduce motorists’ attentiveness.
That investigation remains open, NHTSA said on Wednesday (Thursday AEDT).
A Wall Street Journal analysis of dashcam footage and data from a Texas crash shows Tesla’s Autopilot system failed to recognise stopped emergency vehicles.
On Wednesday, a spokesman for the Fauquier County Sheriff’s Office in Virginia said a driver was killed while using Tesla’s Autopilot system when his vehicle collided with a tractor-trailer in July. The driver of the tractor-trailer was originally charged with reckless driving, but an investigation showed the Tesla was travelling at approximately 70mph through a 45mph zone, he added.
Last year, The Wall Street Journal reported that federal prosecutors and the Securities and Exchange Commission were investigating whether Tesla had misled consumers and investors about how its advanced driver-assistance system performed.
The Justice Department was looking at statements that Tesla and its executives made about the safety and functionality of Autopilot, the Journal reported.
Tesla tells drivers using Autopilot to pay attention to the road and to keep their hands on the wheel, but the company’s public messaging has at times appeared inconsistent with that guidance.
Some, including the California Department of Motor Vehicles, have said the language Tesla uses to describe its technology risks giving drivers a false impression of what it can realistically do when in use.
Tesla has denied the California DMV’s claims, arguing that the agency is bringing them under statutes and regulations that are unconstitutional, and saying “they impermissibly restrict Tesla’s truthful and nonmisleading speech about its vehicles and their features”.
Over the years, Tesla has added new driver-assistance features and capabilities, including an upgrade marketed as “Full Self-Driving Capability” that it sells for $US12,000 ($A18,000) upfront or $US199 a month as a subscription service.
As Tesla’s profitability has come under pressure this year, chief executive Elon Musk has talked up the promise of Tesla’s Autopilot technology, describing a vision of the future in which the company’s software generates strong profit margins. He has frequently indicated Tesla is on the verge of developing fully autonomous cars.
The Wall Street Journal