Tesla's Autopilot makes drivers less attentive to the road

Critics of the semi-autonomous software have argued that it's dangerous because drivers could be too slow to take over when things go awry.


Tesla has always told its drivers that they must stay attentive to the road at all times when using Autopilot and has disputed incidents where the technology was blamed for a crash. But MIT recently conducted a study to see just how much drivers pay attention to the road when Autopilot is enabled, and the results aren’t great.

There’s a gray area between cars that are fully autonomous, meaning they can operate entirely on their own, and ones that are merely assisted, driving themselves only in limited situations. Tesla has taken the approach of releasing its autonomous technology to customers before it’s entirely ready and adding more capabilities over time. Critics have said this is dangerous, as drivers will come to expect that their cars can do everything, and they’ll fail to take over fast enough when the car makes an error.

Glance measurements — The study looked at 290 human-initiated disengagements of Autopilot, or moments when drivers needed to take over. To gauge attentiveness, the researchers analyzed glance patterns across a range of drivers and found that, with Autopilot enabled, glances off the road were longer on average. Glances down or toward the center console area were also more frequent and lasted longer, with 22 percent exceeding two seconds. Overall, drivers with Autopilot enabled looked at the road less and focused more on areas unrelated to driving.


That doesn’t bode well for Tesla. The National Highway Traffic Safety Administration (NHTSA) has taken a hands-off approach to self-driving technology over the years, in the interest of allowing the technology to mature. But that might be set to change if Tesla continues to push half-baked technology onto drivers.

Crashes involving Teslas in Autopilot mode have led to investigations over the years, and a recent string of crashes prompted NHTSA to open a probe into the company and weigh whether Autopilot should be allowed at all. Tesla is also now required to report any crash that occurs while its cars are in semi-autonomous mode, something that wasn’t required before.