Conditionally autonomous cars probably sound cooler than whatever’s parked in your driveway, but in the long run, Jackie DiMarco says it’s full autonomy or bust. DiMarco, chief engineer for Ford’s autonomous vehicles department, told Inverse why “level three” autonomous vehicles are something of a non-starter — and could actually turn out to be more dangerous than a level two vehicle.

First, a quick primer. The Society of Automotive Engineers has outlined six levels of autonomy, ranging from zero (a normal car with no automation) to five (full autonomy, where the driver never needs to take control). The boundaries between levels are hard to pin down exactly, but cruise control is normally taken as level one, Tesla’s Autopilot mode as level two, and cars that can drive unsupervised within a designated city area as level four.

That leaves level three, also known as “conditionally autonomous.” Many engineers understand level three to be where a car is capable of driving itself from A to B like in levels four and five, but the driver still needs to be able to take control at a moment’s notice like with levels one and two.

“That puts you in a difficult spot because you have that sense of security that the vehicle’s driving itself and [you] may be relaxed, but you have to be ready to take over at any moment,” DiMarco says. “We didn’t really see that as the right approach.”

It’s easier to be lulled into a false sense of security with level three autonomy. Imagine you’re on the interstate through Kansas, and the car has been driving itself in a straight line for hours. You’re practically asleep. Then, suddenly, the car’s self-driving program cuts out, and you need to make a decision that could be a matter of life or death.

California, the state with perhaps the most autonomous vehicle testing, specifically tracks “disengagements” — moments when an autonomous system shuts down and hands off control — as one of the most important metrics for measuring system progress and relative safety. The picture is further complicated by the fact that handing control back to a human driver is one of the hardest processes to get right, and among the most susceptible to human error. Taken together, Ford concluded that building an autonomous system around the possibility of disengagements just isn’t worth it.

Many car manufacturers, like Ford, see level four autonomy arriving within the next few years. Gill Pratt, chief executive of the Toyota Research Institute, said in January that level four autonomous cars will start operating in limited environments by 2020. Ole Harms, CEO of Volkswagen company Moia, told Inverse in December that he thinks the company will have its first fleet of autonomous cars up and running by 2021. Ford itself wants to get level four ride-sharing vehicles on the roads by 2021.

Level three seems like an obvious stop-gap between now and then, but Ford thinks it’s better to jump straight to level four. Many of the scenarios consumers dream about, like playing video games instead of driving, aren’t possible with level three, since the driver needs to pay full attention without actually having much to do except stare at the road and wait to take control. It’s the worst of both worlds.

“If you can skip level three and go to level four, then you’re really removing that requirement for the driver to be situationally aware at all times,” DiMarco says.

That doesn’t mean that Ford doesn’t see the value in semi-autonomous driving. Features like lane keep assist can help prevent accidents and make driving less of a chore. Despite ongoing work on full autonomy, DiMarco says that Ford has tripled its investments in level one and two technologies.

“Maybe there are times you want to drive, maybe there are times that you want to let the vehicle do it itself,” DiMarco says. “We’ve certainly drawn up concepts that are very futuristic and cool. That’s the ultimate goal, right? Is that every vehicle would be capable of both. Give everybody the best of both worlds.”

Photos via Getty Images / Justin Sullivan