UK says drivers in self-driving cars shouldn't be liable for accidents

The UK's proposal concerns accidents caused by the failure of self-driving technology.

A Waymo autonomous van entering a parking lot.

The United Kingdom's Law Commission has released a new proposal suggesting that "drivers" in autonomous vehicles shouldn't be held liable for accidents when the software makes an error. Under the law, the user would not be prosecuted for careless driving, speeding, or running red lights while the car is operating in self-driving mode.

One major issue with the proposal is that it stipulates drivers would have between 10 and 40 seconds to regain control of a vehicle once it begins to, say, plow through a red light, during which time they wouldn't be responsible for accidents. That suggests the Law Commission is talking not about fully autonomous vehicles but rather about semi-autonomous ones, like a Tesla in Autopilot mode.

It's a start — That is somewhat concerning in the case of Tesla owners, who have a history of being overly confident and flagrantly ignoring the automaker's warnings to keep their eyes on the road. CEO Elon Musk frequently exaggerates Autopilot's ability to operate vehicles unassisted, with many of his loyal fans only too happy to play guinea pig.

The motoring industry has defined different levels of autonomy, and the Law Commission's proposal doesn't extend through all of them. Tesla Autopilot is only Level 2, meaning a human driver is (at least, from a legal perspective) required to be in full control at all times. Under Levels 3 and 4, drivers can take their eyes and attention off the road for short periods of time, say exit to exit on a highway. Level 5 — the holy grail — is full autonomy, in which a human never needs to watch the road.

The UK will need to comprehensively address these different levels of "self-driving," or else drivers of a Tesla might think they're indemnified from liability when Autopilot isn't actually self-driving but really just glorified lane-keeping technology. There are big differences from Level 2 up to Level 5 that the proposal doesn't address.

It's a good start, however, and should encourage automakers to be more careful about how they market the capabilities of their systems. Competitors to Tesla's self-driving ambitions, like Google's Waymo, have been much more cautious about rolling out their technology and incorporate multiple types of sensors, including Light Detection and Ranging (or Lidar) — which Teslas don't include — to ensure the highest level of safety.

Regulation — Here in the United States, autonomous technology is surprisingly not regulated at the national level. At this stage, self-driving vehicles being tested on public roads are considered normal cars with some extra software thrown in, and don't need exemptions from the federal government to operate. The federal government only deals with issues related to vehicle design, such as by requiring airbags. States can regulate self-driving operations, but rules vary widely between them.