Autonomous Uber 'Driver' Involved in Crash Was Streaming Hulu — Is She to Blame?
The modified Volvo struck and killed a 49-year-old woman crossing the street.
The first recorded fatality due to a self-driving car accident may have had more to do with human error than previously thought. Police in Tempe, Arizona, released a report to Reuters late Thursday night that shows 44-year-old Uber “safety” driver Rafaela Vasquez was streaming reality singing competition The Voice on Hulu’s mobile app at approximately the time of the collision that killed 49-year-old Elaine Herzberg.
An Uber spokesperson tells Inverse that the responsibility for ensuring self-driving car accidents don’t occur falls on both the company and the person behind the wheel, but that there’s a zero-tolerance policy for cell phone use while operating a self-driving test vehicle. Police deemed the Arizona fatality “entirely avoidable” had Vasquez been paying attention, but the fact remains that the software Uber was testing on the modified Volvo XC90 erred in deciding not to swerve.
What Went Wrong in Tempe?
R-Telematica collects vehicle data to report on features that can reduce car accidents, including the kind of autopilot software and hardware that Uber’s self-driving cars rely on to avoid obstacles. R-Telematica CEO and Kasko2go co-founder Dmitry Bakutin tells Inverse in an emailed statement that an autonomous vehicle records objects in front of it and decides whether or not to brake based on a hazard-recognition threshold. A low threshold would trigger constant braking, whereas a more precise threshold would vary with road and driving conditions.
“Presumably, the Uber self-driving car accident happened due to the threshold settings for recognizing dangerous objects. The car recognized the person, but did not brake,” Bakutin wrote. “If the threshold settings were not a constant, but a variable function, which is calculated by using our technology and depends on the accident rate of a particular section of the road, then purely theoretically this could have prevented the collision.”
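Bakutin’s distinction between a constant threshold and one that varies with a road section’s accident rate can be illustrated with a minimal sketch. Every name and number below is a hypothetical assumption for illustration, not Uber’s or R-Telematica’s actual logic:

```python
# Illustrative sketch: brake when an object's hazard score exceeds a
# threshold. A fixed threshold ignores road context; a variable one
# tightens (lowers) on accident-prone road sections, as Bakutin suggests.
# All values here are invented for the example.

FIXED_THRESHOLD = 0.8  # constant: the same everywhere


def variable_threshold(base: float, accident_rate: float) -> float:
    """Stricter (lower) threshold where the historical accident rate is higher.

    accident_rate is assumed to be in [0, 1].
    """
    return base * (1.0 - 0.5 * accident_rate)


def should_brake(hazard_score: float, threshold: float) -> bool:
    """Brake if the detected object's hazard score meets the threshold."""
    return hazard_score >= threshold


# A pedestrian scored 0.7 on a high-accident stretch (rate 0.4):
# fixed threshold 0.8 -> no braking; variable threshold 0.8*0.8 = 0.64 -> brake.
print(should_brake(0.7, FIXED_THRESHOLD))                   # False
print(should_brake(0.7, variable_threshold(0.8, 0.4)))      # True
```

In this toy version, the same detection that fails to trigger braking under a fixed threshold does trigger it once the threshold is adjusted for the road section, which is the "purely theoretically" preventable scenario Bakutin describes.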
The National Transportation Safety Board (NTSB) is conducting one of multiple external safety assessments, Uber’s spokesperson says, to determine if the software and hardware technology in the self-driving fleet could be at fault.
Meanwhile, internal investigations are focused on Uber’s safety culture, specifically on procedures and training, including enforcement of the cell phone policy. There’s already a formal written policy, posters around the self-driving Uber offices, and instruction during vehicle operator training — and yet a handful of people other than Vasquez have been terminated for using their personal devices behind the wheel.
How Safe Are Self-Driving Cars, Actually?
Vasquez was responsible for intervening if the car’s autonomous system failed, but Hulu records showed her account streaming an episode of The Voice for 42 minutes, ending at 9:59 p.m. — the approximate time the car struck Herzberg at 44 miles per hour as she crossed the street with her bicycle. Police noted that Herzberg, who was homeless, was unlawfully crossing the four-lane road outside of a marked crosswalk, but that Vasquez might be liable for manslaughter.
The Uber employee originally denied using either of the two cell phones she had in the car with her throughout the test ride, but footage from inside the car released by Tempe police shows Vasquez looking down and seemingly reacting to her device until about half a second before the crash; she slammed on the brakes less than a second after impact.
While Uber wants to relaunch self-driving car tests this summer, it doesn’t have a hard timeline, and it’s unclear where that testing might take place. Besides Arizona, which revoked Uber’s testing permit, California and Michigan have been the friendliest states for autonomous vehicles thus far, but that may change after the series of accidents and growing public distrust of the industry. Waymo, the self-driving car arm of Google’s parent company Alphabet, was predicted to kill 50 people by 2022. Meanwhile, 16 people die in regular car accidents every day, but high-profile cases, like the Tesla Model X crash that fatally engulfed a 38-year-old Apple software engineer in flames, capture media attention and spark fear.
Who Is Responsible for Self-Driving Car Safety?
The eventual goal for Uber’s self-driving fleet is a completely autonomous system that can pick up passengers and deliver them to their destinations safely without a human backup operator. Right now, the developmental system requires human attention at all times, so what happened in Tempe cannot be blamed entirely on the machinery itself. Nevertheless, Uber settled with Herzberg’s family within a week of the accident.
The investigation of that machinery is still ongoing, Eric Weiss, a spokesperson for the NTSB, tells Inverse in an email. Meanwhile, Uber won’t say for certain what it will implement on its end, but a public statement should emerge within the next few weeks. And while its tests are on hold, companies like Waymo continue to advance in the race to rule the autonomous driving market without notable safety setbacks. Wherever Uber places in that race, it will be the first company to navigate the tricky question of who’s to blame when a self-driving car makes a fatal mistake.
Additional reporting by Mike Brown.