The rise of autonomous cars is something out of a driving instructor’s worst nightmare. Without hordes of teenagers desperate for the freedom of the open road, they’d be out of a job. But instructors around the country say they aren’t worried yet: fully autonomous cars haven’t arrived, and as drivers adapt to new technology, they’ll need teachers more than ever. Still, these systems are already changing how people drive, and not for the better. Driving instructors across the country worry that people are becoming too reliant on the autonomous and driver-assist safety technology available today, even as it saves lives.

In some ways, the technology creates its own problems. Sharon Fife, a past president of the Driving School Association of the Americas and owner of the D and D Driving School in Dayton, Ohio, says there are new risks for drivers who trust their assistive or semi-autonomous systems to do the work of proper defensive driving for them.

“It’s going to make them more relaxed, which makes me more concerned about the other drivers,” Fife told Inverse. “If they think the car is going to stop for them and they don’t have to worry, that would make me a little more concerned.”

The average driver has an accident every five years, she says. While she acknowledges that increased safety functions and partial autonomy undoubtedly make cars safer, they could make drivers more dangerous at the same time.

“To me, the industry is trying to fix the car and not the driver,” Fife says. “They’ve given up on teaching someone how to drive.” In European countries, getting a license requires 80-90 hours of training, which is significantly more than what is required in even the most stringent states in the U.S., she says. “I just don’t think we take driving seriously.”

Fife’s concerns echo those raised by the British House of Lords and top engineers in the U.S., particularly after the death of Joshua Brown, a Tesla owner whose Model S vehicle hit a tractor-trailer while using its Autopilot system last summer.

Tesla Model S cars can even park themselves.

The Driving School Association of the Americas now recommends a section on autonomous systems, Fife says. The idea is to teach drivers how to identify the tech in any car they get in, so they aren’t surprised by, or dependent on, any one kind of system. “You just have to learn with the attitude that you’re going to have to do that with whatever car you get in,” Fife says. She adds archly, “There’s a lot of information in the owner’s manual.”

Across the country, instructors preach a mantra of constant vigilance, even with the most sophisticated autonomous systems.

“I rely on myself, and I teach my students you have to drive in the car as if you don’t have anything,” Mary Kay Speckhart, training manager at the Professional Driving School of Northeast Ohio, tells Inverse. “Should you utilize them and let them assist, yes, but don’t let them do the work for you.”

Driving instructors want you to still look before crossing train tracks, even if the electric barriers are there and you have an autonomous car to help.

Speckhart says many of her young students come in distracted by all of the technology in modern cars, something she blames entirely on bad examples set by the adults in their lives.

“I had to learn how to drive differently because of the safety equipment in the car,” she says. “A lot of adults don’t realize that if they don’t use the safety equipment correctly it can harm you. The adults are never re-educated, so the young people, because they’re inexperienced, when they’re in the car with adults are mimicking them, and they might be mimicking the wrong thing.”

Like most instructors, Speckhart says she appreciates the safety benefits of increasingly autonomous cars. But she also expressed another common fear: that the vehicles’ advanced systems could glitch, or “go stupid,” on unprepared drivers. Christopher Sellers, a driving instructor at the All American Driving School in Sparks, Nevada, shares that worry. “I’m not a computer guy, but when I’m at my desk and my computer freezes up, I think about that,” Sellers tells Inverse. “What if this happened when I was in a car and I couldn’t take control of the car?”

Over a weekend this February, Sellers experienced a similar situation in real life, when an advanced safety feature in his car, his anti-lock brakes, encountered a situation it couldn’t solve. Sellers had driven out to Lake Tahoe for a ski trip, something he’s done regularly in the winter for 35 years. While descending a particularly steep, icy slope, his anti-lock brakes couldn’t bring the car to a stop, forcing him to ease it into second gear and crawl down the hill to spare his brakes. “I had to personally take control of that vehicle because that car wasn’t doing what it was designed to do,” he says. And when he applies that thinking to an even more automated vehicle with a less-educated driver, he gets worried.

“Computers go down,” he says. “My trust in a machine to make a judgment in a split second, I just can’t trust that.”

Sellers has found driver-assist safety features, like systems that warn you when you’ve crossed a lane line, helpful in teaching. Still, he says smart cars aren’t a substitute for well-trained drivers. He joked that there was a way to reconcile the two: Mercedes-Benz has been touting its autonomous car, and he’d be happy to help out.

“It’d be nice to have Mercedes-Benz to come to our office and say, ‘you know what, I’d like your opinion,’” he says. “I’ve already got my hand up.”

Photos via Flickr / Alexandre Prévot, Flickr / ** RCB **, Flickr / delete08

Dyani Sabin is a science writer from small-town Ohio transplanted to New York City. A former biology researcher and library supervisor, she can also be found writing at Scienceline.