A Tesla driver in the San Francisco Bay Area is under investigation for repeatedly riding on public streets in the back seat of his vehicle while the car operates in Autopilot driver-assistance mode. It’s not a great look for Tesla, which is already facing criticism over a technology that users regularly abuse and whose capabilities CEO and “Technoking” Elon Musk frequently overstates.
The man has been identified as Param Sharma from San Francisco, a notorious “rich kid of Instagram” who’s had previous run-ins with law enforcement. His most recent antics were first documented by passing drivers several days ago, but despite a new investigation by the California Highway Patrol, Sharma has doubled down on the behavior by posting new videos to social media.
This fresh controversy comes shortly after two people were killed when a Tesla Model S they were riding in crashed into a tree and police reported that nobody was in the driver’s seat. Tesla has denied the claim, but Sharma’s behavior is the latest in a string of instances of Tesla owners pushing the limits of their cars’ driver-assistance software in ways that could harm them and others.
Actual backseat driving — In a video uploaded yesterday to YouTube, the deeply reckless and evidently insecure Sharma can be seen rapping (poorly) from the back seat as his Model 3 drives down a highway. He then pans his camera around to show his left foot on the steering wheel, and an iPad attached to the back of the passenger seat displaying an article titled “Man seen riding in back seat of Tesla with no driver.” Unless he’s pulling some sort of visual deception, it’s clear that the driver’s seat is unoccupied.
The description of the video indicates his car is operating in Tesla’s Full Self-Driving (FSD) mode, an advanced form of Autopilot that’s capable of operating on local and city streets and can do things like make turns and stop at intersections. The software, which costs $10,000, is available in limited beta to select customers, but early test videos show it struggling to complete basic operations, like moving quickly enough to make an unprotected left turn.
Over-promising — CEO Elon Musk has oversold the capabilities of Autopilot, claiming that it’s nearly good enough to replace a human driver. But the company’s head of Autopilot recently contradicted that, telling California’s Department of Motor Vehicles that Musk’s comments do “not match engineering reality.”
Musk first predicted back in 2015 that fully autonomous driving would be ready by 2017, but has repeatedly pushed that timeline back.
Tesla warns drivers that Autopilot is not capable of full autonomy, and the system is supposed to deactivate when a driver takes their hands off the wheel. But it’s easy to fool: Amazon is rife with “arm relief” devices, weights Tesla owners can hang on their steering wheels to trick Autopilot into thinking their hands are still there.
The National Highway Traffic Safety Administration (NHTSA) recently opened a string of investigations into crashes linked to Autopilot. The agency has not created regulations overseeing self-driving software out of fear of slowing innovation, so Tesla can largely roll out Autopilot as it wishes. But if it doesn’t do more to rein in bad behavior, it risks damage to its reputation and regulations it might not like. Plus, you know, more people could get killed.
Tesla should implement more safety measures to prevent this behavior, like a driver-facing camera that monitors attentiveness and seat sensors that ensure someone is in the driver’s seat at all times. Two U.S. Senators recently introduced a bill that would require just that in new vehicles with driver-assistance technology.
Sharma is welcome to get himself killed, but his behavior puts other people’s lives at risk, too. Tesla Autopilot is far from good enough to drive a vehicle on its own. Hopefully, Tesla acts — and the police do, too.