We’ve known for a while now that Tesla’s suite of self-driving products, known as Autopilot, is nowhere near as good as Technoking Elon Musk says it is. Still, every now and then it astounds us anew that software with such a clearly long way to go is being used in real cars, on real roads, by real people, many of whom believe it’s more capable than it is. Today’s example is a great one if you like jokes about the obstacles autonomous vehicles still face: Tesla’s vision system has been mistaking the moon for a yellow traffic light.
A video posted by Twitter user Jordan Nelson (@JordanTeslaTech), now gone semi-viral, shows the issue in action. From here, the whole thing is pretty hilarious, really: Nelson is driving along on a night with a perfectly bright yellow moon, and their Tesla repeatedly registers the glowing orb as a yellow traffic light, gently slowing down each time.
Musk hasn’t directly commented on Nelson’s video just yet, but his legion of fanboys has already jumped to the rescue with some prime excuses. And surely Musk himself will reply to the tweet soon. He loves pulling shit like that.
Autofailure — Despite a passing resemblance, the Moon is a chunk of space rock, not a traffic light. It doesn’t even cast any light of its own; it only reflects the sun’s.
This kind of edge-case problem is exactly the sort of thing autonomous systems need to get to grips with if they’re going to be safe enough to take over from human drivers completely.
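To make the failure mode concrete, here’s a deliberately naive sketch. This is not Tesla’s actual pipeline (which relies on trained neural networks, not hand-written thresholds); it just illustrates why a bright yellow blob is ambiguous to a camera without extra context like position, distance, and motion:

```python
# Naive illustration of the ambiguity: a color-threshold "yellow light"
# check. Any sufficiently bright, yellowish blob passes, whether it's a
# signal lamp or a low-hanging moon.

def looks_like_yellow_light(r, g, b):
    """Flag an RGB pixel cluster as a candidate yellow signal:
    red and green channels high, blue channel low."""
    return r > 200 and g > 150 and b < 120

# A yellow traffic lamp and a hazy yellow moon can hit the camera
# with nearly identical pixel values (hypothetical sample colors):
traffic_light = (255, 190, 40)
yellow_moon = (240, 200, 90)

print(looks_like_yellow_light(*traffic_light))  # True
print(looks_like_yellow_light(*yellow_moon))    # True: same colors, wrong object
```

The real system has far more context to draw on, but the sketch captures the core of the edge case: color and brightness alone don’t tell you what an object is.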
Tesla’s Autopilot has also previously been tricked, using a literal piece of tape on a speed limit sign, into speeding up by 50 mph.
Elon Musk’s flourishing fan base is always quick to jump to their Technoking’s defense. The replies under Nelson’s video are flooded with Tesla fanboys who seem to think the “simple” solution to this problem would be… adding future moon trajectory data to Tesla’s neural networks.
Maybe this shouldn’t be allowed? — In relatively safe environments, these slip-ups are funny; in other cases they’re deadly. Tesla’s Autopilot system has been a key player in a number of high-profile car crashes in the last few years, and Tesla has done its damnedest to keep anti-Autopilot talk to a minimum.
A recent rule passed by the National Highway Traffic Safety Administration requires automakers to report any crashes that involved semi-autonomous or full self-driving software — so we should have more accurate data about these crashes moving forward.
Of all the problems with Autopilot, mistaking the Moon for a traffic light is a minor one. It’s also a pertinent reminder that Tesla’s costly autonomous driving package is perhaps not quite ready for prime time.