Tesla recalls 2 million vehicles to fix Autopilot software
Tesla is recalling about two million cars to limit the use of the Autopilot function after a long investigation by US regulators.
The US traffic safety regulator has confirmed Tesla is recalling over two million vehicles, almost all of the cars it sold in the US, due to a risk associated with its Autopilot software.
The mass recall follows a two-year investigation into about 1,000 crashes that occurred when the tech was in use.
Documents posted yesterday by US safety regulators said the update would increase warnings and alerts to drivers and even limit the areas where basic versions of Autopilot can operate.
“Tesla has now filed a safety recall with the agency related to its Autopilot software system,” the National Highway Traffic Safety Administration (NHTSA) said, adding that “affected vehicles will receive an over-the-air software remedy.”
In a statement, the regulator said Tesla’s Autopilot system “can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse of the system”.
If Autopilot is misused or if the driver fails to recognise that the function is activated, the risk of an accident could be higher, NHTSA said.
Vehicles will be updated to include more alerts encouraging drivers to keep their hands on the steering wheel.
US-based Tesla, led by tech billionaire Elon Musk, has been hit with several lawsuits stemming from car accidents, and its driver-assistance technology has prompted regulatory probes.
In 2021, NHTSA opened an investigation into 11 crashes in which Teslas using the driver-assistance feature struck stationary emergency vehicles.
Despite its name, Tesla’s Autopilot can automatically steer, accelerate, and brake in its lane, but it is a driver-assist system and cannot drive itself.
Independent tests have shown that the monitoring system is easily fooled.
Drivers have been caught using the system while drunk, and even while sitting in the back seat.
On Tuesday, the manufacturer took to X with a fiery defence of its Autopilot feature after a Washington Post article critiqued Tesla’s safety systems.
While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context. We at Tesla believe that we have a moral obligation to continue…
– Tesla (@Tesla) December 12, 2023
“Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged,” it wrote, pointing to data suggesting fewer crashes when the system was used.
“ … we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury.”
Currently, fully automated vehicles cannot be driven on Australian roads.
The version of Autopilot software available in the US that allows “Full Self-Driving” is not compliant with Australian road rules and is unavailable to local drivers.
– With AFP