In spite of what Technoking Elon Musk wants you to think, Tesla’s Autopilot driver-assistance software is still pretty bad. And by pretty bad, we mean it’s often more of a hazard to passengers than a boon.
The National Highway Traffic Safety Administration (NHTSA) has opened a formal investigation into Tesla’s not-quite-autonomous driving system. After years of coasting on subterfuge and withheld information, it’s time for Tesla to be brutally honest about just how unhelpful Autopilot really is.
The NHTSA’s probe is focused on a very specific subset of Autopilot-related crashes: those involving emergency response vehicles like ambulances. The agency has identified 11 crashes since 2018 in which Tesla vehicles, while operating in Autopilot or Traffic-Aware Cruise Control mode, hit vehicles at locations where emergency responders were using flashing lights, cones, flares, or an illuminated arrow board.
Given that Autopilot literally can’t tell the difference between a traffic light and the moon, we’re not all that surprised by this news.
Bringing in the big guns — Tesla has done a pretty good job of keeping crashes involving Autopilot quiet, but they haven’t gone entirely under the radar. The National Transportation Safety Board (NTSB) has had its eye on Tesla’s Autopilot problems for quite a while now and has made multiple safety recommendations to the automaker.
The NTSB doesn’t have the authority to do much more than investigate, though; it can only make recommendations. It told the NHTSA, for example, that Tesla should be required to implement a better system for making sure drivers are paying attention, and that Autopilot should be limited to areas where it can safely operate. The fact that the NHTSA is now stepping in means the problem has finally been elevated beyond the NTSB’s investigative domain.
More self-driving regulation incoming — The NHTSA has taken a mostly laissez-faire approach toward self-driving technology, given that very few companies actually sell the tech right now. That lax outlook seems to be changing. Earlier this summer, the NHTSA announced a new rule requiring automotive companies to report any crash that occurs while a car is in a semi-autonomous or full self-driving mode.
In a statement about the new probe, the NHTSA reminded the public that “no commercially available motor vehicles today are capable of driving themselves.”
The NHTSA has the power to force automakers in the United States to issue recalls. Perhaps the threat of that action will finally push Tesla to be more honest about Autopilot’s limitations. And, if we’re really lucky, Musk might even put measures in place to make Autopilot safer for Tesla owners and the general public.