An autonomous car operated by Uber struck and killed a pedestrian in Tempe, Arizona, late Sunday. While Uber’s self-driving cars have been involved in a number of minor collisions in the past, this is the first known fatal accident involving a fully autonomous vehicle on public streets. In the wake of the crash, Uber is reportedly pausing all self-driving vehicle programs.
According to the Tempe Police Department, the pedestrian was crossing the street outside of the crosswalk when she was struck. The vehicle was in self-driving mode at the time, with a human backup driver behind the wheel. It’s currently unclear what ultimately caused the collision, but the Tempe Police Department’s findings will likely have a huge impact on the future of autonomous vehicles. If the investigation determines that Uber’s self-driving car was at fault, it could force an early legal and political reckoning with a question regulators have largely deferred: Who is responsible when an autonomous vehicle crashes into a person?
Because self-driving vehicles are a nascent technology, there is no established legal precedent for assigning culpability when they crash. A San Francisco bicyclist recently sued General Motors after colliding with one of its self-driving cars, but that suit hasn’t yet been decided. Sunday’s crash could instill a sense of urgency in policymakers to codify rules governing self-driving cars.
Arizona is a key testing ground for autonomous vehicles because the state has taken a lax stance toward regulating the self-driving car industry. It was this hands-off approach that brought Uber’s self-driving test fleet to the state after the ridesharing company clashed with California regulators over permitting requirements.
Whatever the investigation finds, Sunday’s tragedy is a watershed moment for the self-driving car industry. If something was wrong with Uber’s autonomous vehicle technology, that’s obviously a huge problem. And if the car was operating as intended, then we know that even a properly functioning autonomous vehicle can’t completely account for the unpredictability of driving.
Ultimately, the collision brings into sharp relief the inevitable question looming over autonomous vehicles: Who do we blame when a human-trained machine is responsible for a death?