Tesla's latest Full Self-Driving beta was so bad the company retracted it

Musk's rush to release FSD is putting the public in danger.

[Photo: Zusmarshausen, Germany, August 21 — fast-charging stations reserved for Te... | DeFodi Images/Getty Images]

We have known for some time now that beta versions of Tesla’s Full Self-Driving software are very, very bad, but apparently Tesla just caught on. Testers of the latest version (10.3) have had so much trouble with the software that Tesla has decided to temporarily pull it from all vehicles.

Musk tweeted about the rollback on Sunday. “Please note, this is to be expected with beta software,” he wrote. “It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”

Tesla’s Full Self-Driving software doesn’t actually enable fully autonomous driving. It’s really just an expanded version of Tesla’s existing driver-assist Autopilot software, requiring frequent driver interaction to operate. And it costs $10,000 — plus hardware upgrade fees — to even participate in the beta program.

Even with the worst of Full Self-Driving’s problems hidden away behind a mountain of NDAs, it’s painfully obvious Tesla is nowhere near ready to safely allow consumers to test the software on public roads.

Seems…not safe — Customers chosen to be part of Tesla’s Early Access Program are hand-selected based on a “safety score” calculated from data gathered by the car’s sensor arrays. Only those with the highest scores (usually 99 out of 100) receive new versions of the Full Self-Driving software when it’s first released. This ostensibly makes beta testing as safe as possible.

In his tweet announcing the rollback to version 10.2, Musk makes sure to note that it’s “impossible” to fully test new software versions; it’s for this very reason beta testers exist, to find and report bugs. But this is semi-autonomous driving we’re talking about. The stakes are very high. Sending out software so riddled with bugs that you later have to retract it can have life-threatening consequences.

Stop trying to make FSD happen — It’s clear from the speed with which these updates are being released that Musk and Tesla have gone into overdrive with Full Self-Driving. Musk has been promising this feature for years. Now, with scrutiny of the software at an all-time high, Musk is rushing out software updates multiple times every month.

That rush is putting Tesla drivers — and everyone else on public roads with them — at risk. Tesla is endangering many people by failing to properly test its beta software before sending it out.

Full Self-Driving just seems like more trouble than it’s worth, at this point. It’s a headache and a mortal risk. Even Musk has had to admit the software’s “actually not that great.” Tesla hasn’t even been able to make Autopilot safe yet; maybe the company should focus on that before jumping to fully autonomous driving. Just a thought.