
Tesla recalls 12,000 cars over dangerous Full Self-Driving beta error

Elon Musk's autonomous driving dreams are causing more headaches than anything else.

Driver’s cab and intelligent control system of a Tesla Model 3. Zhang Peng/LightRocket/Getty Images

Tesla has chosen to recall almost 12,000 of its electric vehicles sold since 2017 because of a widespread error in the communication system. The error in question can trigger a false forward-collision warning and/or unexpectedly activate the automatic emergency braking system, according to the National Highway Traffic Safety Administration (NHTSA).

The recall affects some — but not all — Model S, Model X, Model 3, and Model Y vehicles purchased since 2017. The NHTSA has determined that a total of 11,704 vehicles may have been affected by the communication problem.

You only have to worry about the recall if you’re one of the die-hard Tesla fans willing to risk their lives testing the beta versions of Tesla’s Full Self-Driving (FSD) software, which is meant to eventually enable an entirely autonomous driving experience. The recall is directly related to version 10.3 of the beta, an update that was so buggy Tesla rolled it back the day after its release.

Those problems were apparently bad enough that the NHTSA felt it needed to step in. That’s never good news for an automaker.

It all goes back to the FSD beta — After years of empty promises, Tesla finally opened its Full Self-Driving software up to a limited testing group earlier this year. Ever since, Tesla has been pumping out updates as quickly as humanly possible to avoid further criticism of the still-nascent software.

That sense of urgency hasn’t exactly worked out well for Tesla. We knew from Elon Musk that version 10.3 of the software had issues, but we didn’t realize just how deep they ran. Unexpected activation of the emergency brakes can be life-threatening to drivers and passersby alike. All because of a “communication disconnect” between two chips.

Not making friends at the NHTSA — Until quite recently, the NHTSA had taken a relatively hands-off approach to autonomous driving tech. The National Transportation Safety Board (NTSB) has had its eye on Tesla’s automated driving features for years now, but the NTSB can only make recommendations.

The NHTSA, on the other hand, has more authority and can force recalls when necessary. The agency finally opened a formal investigation into Autopilot — FSD’s predecessor in many ways — in August, after years of crashes linked to the automation features.

Now that Tesla has demonstrated just how disastrous the FSD beta can be, you can bet the NHTSA will be watching the company more closely than ever. Not that we really needed a failed update rollout to illustrate the finer points of the FSD beta’s shortcomings.