
Tesla can’t withhold information on Autopilot crashes anymore

Residents view a Tesla car at a shopping mall in Huai'an, China, on June 13, 2021.

$23K

The fine Tesla could face per day if it fails to keep the NHTSA properly informed.

Barcroft Media/Getty Images

The National Highway Traffic Safety Administration (NHTSA) has issued a new rule requiring automotive companies to report any crash that occurs while a car is operating in semi-autonomous or fully self-driving mode.

Lax regulations — The agency has long taken a hands-off approach to regulating self-driving technology in the interest of letting U.S. companies lead the industry. But the agency tasked with improving road safety has taken an increasingly antagonistic stance toward Tesla in particular, criticizing the company for rolling out self-driving technologies to customers prematurely and without adequate safeguards.

The NHTSA has said Tesla was partly to blame for a 2018 crash that killed one man, and it more recently opened probes into 23 separate crashes in which Tesla vehicles may have been operating in the company’s Autopilot assisted-driving mode. It is currently investigating a crash in Texas in which two men were killed when their Model S hit a tree with nobody in the driver’s seat.

In those cases, the agency couldn’t immediately determine whether the cars were in Autopilot because Tesla wasn’t required to disclose that information. Under the new guidelines, the company will be required to inform the NHTSA when assisted-driving technologies were enabled at the time of a crash. Failure to comply can result in fines of up to $22,992 per day, up to a maximum penalty of “more than $100 million.”

Safer roads — The NHTSA says the information will "help the agency identify potential safety issues and impacts resulting from the operation of advanced technologies on public roads and increase transparency."

Other companies, including Alphabet’s Waymo and GM’s Cruise, will also be subject to the new rules. They’ve seen far less negative publicity around their self-driving programs because those programs are largely still in private testing and not available to consumers (aside from a small self-driving taxi pilot Waymo operates in Arizona). New data, however, could reveal whether those programs have safety problems of their own.

Tesla believes it can use its existing fleet of customer cars to train its self-driving technology, studying how drivers behave so the software learns to drive the way a human would. But CEO Elon Musk has overstated Autopilot’s capabilities, and Tesla has given some drivers early access to its advanced “Full Self-Driving” software, which is anything but self-driving and frequently requires human intervention.

Delays, delays — Tesla first unveiled Autopilot in 2015 and promised its cars would be fully autonomous by 2017. But most players in the self-driving space have since walked back their early claims after realizing that making a car drive itself is much harder than they expected.

Critics say that semi-autonomous software poses a threat because drivers become complacent, look away from the road, and then are too slow to take back control when the car errs. Musk has agreed with that assessment but continues to tout Autopilot.

Two U.S. senators recently proposed legislation that would require future cars to include driver-monitoring technology that detects when drivers look away from the road.