
Tesla's latest Full Self-Driving beta shows the technology is still very bad

It’d be easier — and safer — to just drive the car yourself.

Tesla Model 3 on a highway near Wiesbaden, Germany, May 12, 2021. iStock Unreleased/Getty Images

One concern with self-driving technology is how it interacts with the cars and pedestrians around it. Human drivers know the unspoken rules of the road and follow them predictably, so everyone else can act in concert: drivers move through a four-way stop in the order they arrived, for instance, and keep their distance from pedestrians and cyclists. But Tesla's Full Self-Driving software, still in beta, absolutely does not know how to behave properly.

Recently, a Tesla owner named Kim Paquette shared a video on YouTube of her car driving through the streets of Providence, Rhode Island, with Full Self-Driving (FSD) enabled. The software package promises to eventually drive vehicles entirely on its own, but for now Tesla is using a limited pool of owners as guinea pigs, testing the technology and feeding back data as it continues to improve FSD. Paquette's video shows it has a long way to go.

Scary AF — As you watch Paquette’s car try to navigate the narrow, busy streets, it’s harrowing how many times it nearly hits another car before she disengages FSD and takes over. Each time she does, the car emits a “bong” tone, and that tone sounds constantly throughout her 18-minute recording as the car jerks around unpredictably, pulling into intersections and nearly getting T-boned by cars it can’t see around corners.

Other oddities ensue. At one point, the car stops well short of a stop sign, then starts pulling into the intersection before it can see whether anything is coming, forcing Paquette to hit the brake. A human driver would inch forward to check for cross traffic; the Tesla does not. It does other strange things too, like hugging parked cars on the street far too closely. Elsewhere in the video, the car gives cyclists room at the side of the road before suddenly jerking toward them, and Paquette again has to intervene. Later it tries to pass traffic... on a single-lane street, something she says the car attempts often because it doesn’t always recognize that a street has only one lane.

“[The car] seems very confused today,” she comments. “It’s very jerky and unsure.”

Danger to society — The whole video is anxiety-inducing, and it makes it even harder to believe CEO Elon Musk when he proclaims that FSD is “amazing” and can nearly drive itself already. Maybe on the wide, lazy streets of suburban California, but Paquette’s video shows that in a busy city the software isn’t just bad; it’s genuinely dangerous.

Proponents will point out that all this data feeds back into Tesla’s development of autonomous tech. Each time a driver corrects the car, that information in theory flows back into the company’s “neural network,” the machine learning system that uses data from the whole fleet to improve. But this video should make people uncomfortable about sharing the road with a Tesla. Put another way: everyone walking around a city becomes a human test subject whenever wealthy Tesla owners hand control to their vehicles to test out and improve FSD.

When Musk first introduced Autopilot, an advanced driver-assistance system, back in 2015, he said full autonomy would arrive within three years. That timeline has slipped repeatedly, as it has across the rest of the industry; at least the rest of the industry has been testing its technology in far more controlled environments. FSD is likely years away from being usable, and perhaps further if regulators crack down.