New legislation could require future cars to monitor for distracted drivers

The move follows a recent fatal Tesla crash in Texas and other accidents involving vehicles with semi-autonomous features.


New legislation proposed on Monday would require all new cars to feature driver-monitoring systems within six years. The bill was drafted by two senators who recently asked regulators to investigate a fatal Tesla crash in Texas.

Police initially reported that nobody was in the driver’s seat of the Model S — one occupant was in the front passenger seat, while the other was in the rear. That suggests the Autopilot driver-assistance mode was active, something Tesla disputes.

Driver complacency — Tesla uses monitoring techniques to ensure someone is in the driver’s seat whenever Autopilot is enabled. Its monitoring is rudimentary, however, mostly relying on sensors to determine whether a driver’s hands are on the steering wheel — and a Consumer Reports test found it’s easy to trick the software by resting a metal chain on the wheel. Accessories designed to defeat the system can also be readily purchased online.
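To see why a hanging chain fools this kind of check, here is a minimal sketch (purely illustrative, not Tesla’s actual implementation; the threshold and sample values are assumptions) of a torque-based hands-on-wheel test, plus a slightly stronger variant that looks for the varying torque a real driver produces:

```python
# Illustrative sketch of torque-threshold "hands on wheel" detection.
# All numbers are hypothetical; this is not any carmaker's real code.

TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque counted as "hands on"
VARIATION_FLOOR_NM = 0.1    # assumed minimum variation a human hand produces

def hands_on_wheel(torque_readings_nm):
    """Naive check: flag the driver as present if any recent torque
    sample on the steering column exceeds the threshold."""
    return any(abs(t) >= TORQUE_THRESHOLD_NM for t in torque_readings_nm)

def hands_on_wheel_strict(torque_readings_nm):
    """Stronger check: also require the torque to vary over time,
    since a weight hung on the wheel applies a nearly constant force."""
    samples = list(torque_readings_nm)
    if not hands_on_wheel(samples):
        return False
    return max(samples) - min(samples) > VARIATION_FLOOR_NM

# A real driver makes small, varying corrections.
attentive_driver = [0.0, 0.5, -0.4, 0.6, 0.1]
# A chain on the wheel applies a constant torque that clears the
# threshold on every sample, even with no hands on the wheel.
hanging_weight = [0.5, 0.5, 0.5, 0.5, 0.5]

print(hands_on_wheel(attentive_driver))         # True
print(hands_on_wheel(hanging_weight))           # True — the naive check is fooled
print(hands_on_wheel_strict(hanging_weight))    # False — constant torque rejected
```

The point of the sketch is that a single threshold measures force, not attention; it cannot distinguish a hand from a weight, which is why camera-based monitoring (discussed below) is considered more robust.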

The newly proposed bill specifies that the technology should attempt to detect what’s going on with a driver’s face, including:

  • Driver distraction
  • Driver disengagement
  • Automation complacency by drivers
  • Foreseeable misuse of advanced driver-assist systems

As other carmakers have rolled out advanced driver-assistance technology, they have adopted camera-based driver monitoring. New cars from Cadillac and Ford both use driver-facing cameras to track attention; if the driver is found not to be looking at the road, assisted driving is disabled.

Privacy is an obvious concern with such technology — internet marketers would love to use a driver’s emotional state and location to target them with ads. The legislation addresses this only briefly:

(3) PRIVACY. — The rule issued under paragraph (1) shall incorporate appropriate privacy and data security safeguards, as determined by the Secretary of Transportation.

Critics have argued that Tesla’s rollout of advanced driver-assistance software has been dangerous because CEO Elon Musk has oversold Autopilot’s capabilities, leading owners to become overconfident in their cars’ abilities — like the driver who slept while his car barreled down the highway, or the Apple engineer who died when his Model X crashed while he was playing games on his phone.

Even as Musk has said the latest version of Autopilot can drive for him much of the time, he has also admitted that driver complacency becomes a problem as autonomous software improves but still occasionally errs. “There is a dangerous transition point,” he said in an interview last year. “Where self-driving is good, but it occasionally has issues, because people maybe get too comfortable, and then they stop paying attention like they should. And then 99.9% of the time, it's good, 1 in 1,000 times it's not.”

Nanny features — If Musk wants to keep rolling out Autopilot and using Tesla owners as guinea pigs, the cars should probably do more to protect drivers from themselves. Other carmakers “nanny” drivers in a variety of ways, such as disabling the media center while the car is in motion. Doing so protects drivers, passengers, and other road users, and also shields carmakers from liability.