Big Brother is watching

The one feature that would make Tesla's Autopilot so much safer

Competitor systems all use something called driver monitoring to literally keep an eye on what the person behind the wheel is up to. So why doesn't Tesla?

Tesla is in the news again. Well, Tesla is always in the news because Elon Musk has a Steve Jobsian ability to draw media attention, but this week the news isn't great.

On April 17, a Tesla Model S crashed near Houston, Texas, killing both occupants. Police are convinced that no one was in the driver's seat when the car crashed into a tree at high speed.

It seems that Tesla's Autopilot system — which cannot drive itself and should never be used without an attentive driver in the driver's seat — was being used without an attentive driver in the driver's seat, with fatal results.

Sam Abuelsamid is an e-mobility analyst with Guidehouse Insights, a market intelligence firm with a focus on technology. He has been highly critical of Tesla's Autopilot system for years, not because it doesn't work (Autopilot works well if used properly) but because the system is so easily abused.

"While Tesla states in the instructions that drivers are supposed to keep hands on the wheel and eyes on the road, they do little to enforce it, and in fact, Elon Musk frequently undermines those instructions," Abuelsamid tells Inverse. "He does TV interviews in moving vehicles with his hands off the wheel and retweets videos from fans doing stupid things in the car which, combined with his constant promotion of how safe Autopilot and Full Self-Driving are, creates the impression among many that these vehicles are in fact self-driving."

Here's the background: neither Autopilot nor the Full Self-Driving feature is actually self-driving. The key distinction is whether the driver or the carmaker is legally responsible for what the vehicle does if something goes wrong.

The concept of legal liability is crucial because some entity has to be legally responsible for what a two-ton hunk of metal is doing at all times. As Autopilot is currently deployed, that entity is the driver: there must always be a competent person behind the wheel, ready to take back control while the system is active.

The redesigned Tesla Model S interior with Autopilot rendered on the dashboard screen.

Tesla

For what it is, Autopilot is a competent advanced driver-assistance system (ADAS) that works well within its limits. The problem is that Tesla doesn't have the necessary guardrails to prevent misuse. Most other systems have a key feature Autopilot lacks: they carefully monitor for distracted driving.

The Super Cruise system available in a handful of Cadillac and Chevrolet models, including the new Escalade and the 2022 Silverado pickup truck, uses an infrared camera to watch the driver's eyes. This ensures the driver is actually paying attention to the road and not sleeping, looking at a phone, or engaging in any number of other dangerous distractions.

Ford, BMW, and other carmakers use driver monitoring for their ADAS tech as well. In fact, pretty much every carmaker but Tesla does it.

“Every vehicle, but especially those that are capable of lane-centering, should be equipped with active driver monitoring,” Abuelsamid says. “Tesla only uses the torque sensor in the steering wheel, which is not a reliable gauge of hands on the wheel and tells you nothing at all about where the driver is looking.”

Every new GMC Hummer EV will have GM's Super Cruise ADAS. A camera in the little hump just behind the steering wheel keeps an eye on the driver to ensure they're paying attention to the road ahead.

GMC

“If we are really serious about addressing distracted driving, an infrared camera like the ones used by GM, Ford, BMW, and soon Nissan will actually tell you if the driver is alert and watching the road,” he says. “It can also provide indications of the driver being impaired, drowsy or ill and bring the car to a stop or alert medical attention.”

Elon Musk, who dissolved Tesla's public relations team in 2020, says in a tweet that "data logs recovered so far" from the crashed car in Houston show that Autopilot was not enabled and that Tesla's Full Self-Driving feature had not been purchased for the car. He also asserts that "standard Autopilot would require lane lines to turn on, which this street did not have."


With no Tesla PR team to contact, I tweeted at Musk (seemingly the only way to get a reply) to ask for specifics about the data Tesla has received on the crash, for a response to this article, and whether Autopilot had been active at any point before the vehicle crashed. I received no response.

Both the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) are sending investigators to Texas to ascertain what happened in the crash. We'll likely hear much more about Autopilot and the importance of driver monitoring very soon.
