Tesla issues recall for almost two million of its cars due to faulty self-driving tech

Tesla Model S (Image credit: Tesla)

Truly self-driving cars will remain in the realm of science fiction for now, as Tesla is recalling almost two million of its vehicles to stop drivers from misusing the Autopilot feature.

This move does not take cars off the road but instead requires them to receive a software update. It comes after the US National Highway Traffic Safety Administration (NHTSA) conducted a two-year investigation into almost 1,000 crashes involving the Tesla feature. According to the official recall notice, the vehicles affected are the 2012-2023 Tesla Model S, 2016-2023 Model X, 2017-2023 Model 3, and 2020-2023 Model Y. The NHTSA told CNN that the system gives “drivers a false sense of security,” making them believe it’s totally safe to activate and use to navigate busy roads. 

This should come as no surprise to some as Autopilot’s poor performance is well-documented online. You can find multiple videos on YouTube showcasing the software suddenly disengaging mid-drive and plowing through dummies in tests. CNN points out in its report that while the owner's manuals state the mode is meant to only be used on “highways [or] limited-access roads with a fully attentive driver”, the company’s messaging has confused drivers into thinking the technology is safe in other scenarios.

Simply put, features like Autopilot and Full Self Driving “do not live up to their names”, according to the NHTSA’s investigation.

Patching self-driving

Tesla took to X (the platform formerly known as Twitter) to defend the safety of Autopilot following a Washington Post report on eight fatal accidents in which the self-driving function “should not have been enabled”. The post is pretty long, but the gist is that the company blames drivers for misusing the technology and claims the publication omitted important facts. It also cites its own figures to back up its argument that Autopilot is safer when engaged.

Soon after the post, Tesla agreed to roll out a software update limiting the use of Autosteer if a driver “fails to demonstrate [they’re] ready to resume control… while the feature is on.” The way it's supposed to work is that you keep your hands on the wheel and stay ready to brake in case you need to make a sudden decision. You can’t just scroll away on your phone, assuming the car will handle everything. It doesn’t work that way, despite what many people have been led to believe. 

Future update

It’s unknown when the patch will be released or exactly what it will entail, though the recall report mentioned earlier states it will add extra notifications reminding drivers to stay alert while Autosteer is turned on. We also don’t know whether the recall will extend to Europe, China, or other regions where Tesla cars are sold. We reached out to Tesla for details and to ask whether it plans to bring the Cybertruck’s self-driving function in line with its other cars.

Although we managed to find a contact email, we don’t expect a response, as Tesla famously dissolved its PR department in 2020. If we do hear back, we’ll update this story.

We should mention that next year, the United States will implement new tax credit rules changing how its electric vehicle tax break works, so it will only apply to a handful of EVs. If you want to know which ones, check out TechRadar's list of the 10 EVs you can get a tax break on in 2024.

Cesar Cadenas
Contributor

Cesar Cadenas has been writing about the tech industry for several years now specializing in consumer electronics, entertainment devices, Windows, and the gaming industry. But he’s also passionate about smartphones, GPUs, and cybersecurity.