Tesla’s own engineers don’t think Autopilot is as good as CEO Elon Musk says it is. That information comes from the California Department of Motor Vehicles, citing a conversation it had with Tesla’s director of Autopilot software.
“Elon’s tweet does not match engineering reality per CJ. Tesla is at Level 2 currently,” the California DMV said in notes about a March 9 conference call with Tesla representatives, including Autopilot director CJ Moore. Level 2 refers to semi-autonomous driver assistance software that requires constant human supervision but can handle certain tasks on its own, like stopping the car if it detects an obstacle ahead.
It’s unclear which tweet, in particular, the comments are referring to. Legal transparency group PlainSite obtained a memo from the meeting after making a public records request.
The comments from Tesla are important because they contradict Musk’s own public statements about Autopilot. In recent months he has said that Autopilot could be capable of driving Tesla’s cars autonomously within the next year. Today, the software mostly just follows highway lane markings and curves. The company offers a Full Self-Driving (FSD) package in limited beta that unlocks the ability to drive on local and city streets autonomously, but videos from early testers show it’s really bad, requiring a human to intervene frequently when the software gets confused by pedestrians and other obstacles.
Dangerous bugs — Musk told investors in January that he is “highly confident the car will be able to drive itself with reliability in excess of human this year.” He has also tweeted that the latest FSD beta can drive him around with no intervention. Tesla first began talking about its goal to make Autopilot fully autonomous back in 2015, when Musk said it would be ready in two years. He has repeatedly pushed back that timeline.
The comments from Tesla insiders are also important considering a string of crashes suspected to be linked to Autopilot. Just last month, two men died in a Model S that crashed with nobody in the driver’s seat. Autopilot is supposed to disengage if a driver isn’t holding the steering wheel, but that safeguard is easy to fool. And over the years there have been numerous reports of Tesla drivers acting recklessly, such as falling asleep on the highway and playing games on their phones.
That’s dangerous not just for the Tesla driver but for every other driver, cyclist, or pedestrian around them. Nobody wants a car with a distracted (or absent) driver barreling down public roads at 60 mph or more.
Musk’s comments likely encourage his most loyal followers to test Autopilot’s limits and share their experiences online.
Two U.S. senators recently suggested automakers take further steps to prevent drivers from taking their eyes off the road, such as by using driver-facing cameras to monitor their faces. The National Highway Traffic Safety Administration has criticized Tesla for its roll-out of Autopilot, but the agency doesn’t currently regulate autonomous systems out of fear of hampering innovation.