Footage of the fatal self-driving Uber crash raises new questions about legal liability.

Last weekend, a 49-year-old woman was struck and killed by an autonomous Uber—which had an operator in the driver’s seat—while she was walking a bike across a road in Tempe, Arizona. As I wrote on Tuesday, it wasn’t clear who would be responsible for her death under current Arizona law, given the state’s lax approach to regulating the new technology.

Tempe Police Chief Sylvia Moir told newspapers earlier this week that an initial probe showed no fault on Uber’s part because the pedestrian came into the street “like a flash.” Dash-cam footage released earlier today, however, paints a slightly different picture of events.

The footage cuts off before showing the car striking the pedestrian, but it captures the events immediately leading up to the crash. Early reports suggested the woman had entered the road as the Uber vehicle approached, but the car’s headlights show her already in the middle of the road less than two seconds before the collision. That may have left too little time for a human driver to react at that speed, but it raises questions about whether the self-driving car’s sensors detected her at all.

One expert told the Associated Press that the short clip “is strongly suggestive of multiple failures of Uber and its system, its automated system, and its safety driver.” Another expert told CNN that, even in the dark, the pedestrian “should have been in [Uber’s] system purview to pick up.”

The footage could prompt Arizona officials to adopt new regulations for future tests in the state. It’s impossible to know whether any single rule would have changed this accident’s outcome, but other jurisdictions have taken steps to prevent similar collisions. Under Nevada law, for example, self-driving cars undergoing open-road tests must be accompanied by a pilot car driven by a human operator.