A Self-Driving Uber Killed a Woman. Whose Fault Is It?

The fatality in Tempe, Arizona, took place in a regulatory vacuum.

When a driverless car kills someone, who’s to blame?

That’s no longer a hypothetical question. A self-driving car operated by Uber struck and killed a woman on a street in Tempe, Arizona, on Sunday night, likely marking a grim milestone for the nascent technology: the first pedestrian killed by such a car on public roads.

Police say the 49-year-old woman was walking a bike across the street, outside the crosswalk, at around 10 p.m. The Uber was traveling at 40 miles per hour in autonomous mode, with an operator in the driver’s seat, when she was hit. Police have not yet determined who was at fault. (The car apparently didn’t slow down, and the operator didn’t appear impaired.) Nonetheless, Uber immediately suspended its self-driving tests in Arizona and nationwide, as many in the tech industry reacted with alarm.

There’s an ongoing debate over legal liability for collisions in which an autonomous vehicle harms someone through no fault of their own. Would the blame lie with the self-driving car’s owner, its manufacturer, a combination of the two, or someone else? In the state’s quest to become a mecca for self-driving cars, Arizona regulators have largely left those questions unanswered, The New York Times reported last year:

Arizona officials said the public is essentially protected by basic rules that require a licensed driver somewhere in the driverless car. They added that they planned to take a back seat to the experts when it comes to rule-making. The state insurance regulator, for example, said he would wait for the insurance industry to guide regulators on liability policies for driverless cars, amid questions about who is responsible in a crash if the car isn’t driven by a human.

This laissez-faire regulatory strategy was designed to entice Silicon Valley companies looking to test products outside of California, which has taken a more cautious approach to the new technology. It worked. The Times reported that Ford, GM, Google, Intel, and Uber are all testing self-driving cars in Arizona. Exact figures are scarce, but at least several hundred driverless vehicles are in use in the state.

“What we see in today’s sad news is another example of tech experimentation outpacing thoughtful regulation,” Elizabeth Joh, a U.C. Davis law professor who specializes in technology and the law, told me on Monday. Questions about who should bear legal responsibility for self-driving car accidents, she said, can draw upon tort law, which wrestles with questions about liability and negligence.

In a Brookings Institution paper published in 2014, UCLA law professor John Villasenor argued that product-liability law offered the best guidance for determining legal fault with an emerging technology like self-driving cars. “Products liability has been one of the most dynamic fields of law since the middle of the 20th century,” he wrote, pointing to the courts’ flexibility in adapting old doctrines to new commercial goods.

These legal approaches pertain only to civil proceedings—lawsuits, in other words. What if a self-driving car commits the equivalent of vehicular manslaughter? “Criminal penalties are a different story,” Claire Cain Miller wrote in the Times in 2014, “for the simple reason that robots cannot be charged with a crime.” As Ryan Calo, a robotics law expert at the University of Washington School of Law, told her, “Criminal law is going to be looking for a guilty mind, a particular mental state—should this person have known better? If you’re not driving the car, it’s going to be difficult.”

Though it has ceded hardware testing to Arizona and other states, California is still breaking new ground on the legal front. In December, the state’s Department of Motor Vehicles rejected a GM-backed proposal that would have shifted liability from the companies onto the consumer if a self-driving car’s sensors weren’t properly maintained. But in March, the state took a step in Arizona’s direction, repealing a rule that required someone to be in the driver’s seat during autonomous tests. The revision won’t go into effect until April.

Villasenor urged Congress to leave this issue to the states, noting that disputes over liability for car manufacturer defects have “always been the province of state courts applying state tort remedies.” But Hill legislators may have other ideas. Last September, the Republican-controlled House of Representatives passed a bill, supported by manufacturers, that “would make it so that states can no longer write legislation that the auto industry considers restrictive.” Instead, the law would allow federal regulators to “make the guidelines more uniform”—and, presumably, more favorable to the auto industry than in some of the most restrictive states. The Senate has yet to take up the measure.

This was not Uber’s first accident involving a self-driving car in Tempe. One year ago, a woman driving a Honda CR-V turned into an intersection and hit a self-driving Uber Volvo, which flipped and damaged two other cars. A Tempe police report found differing accounts of blame among the drivers and passengers involved in the crash, none of whom were seriously injured.

There are a number of arguments in favor of driverless cars. They will reduce traffic jams and cut pollution. They’re also theoretically safer than human drivers, who are sometimes intoxicated, distracted by their smartphones, or otherwise reckless. Some commentators have argued that autonomous-vehicle research shouldn’t be scaled back when tens of thousands of Americans are killed by human-operated cars each year.

Joh noted that focusing on those deaths misses the more immediate point. “This is the deployment of a new technology, with a host of foreseeable issues and questions,” she explained. “The Uber incident should force states to reconsider what safeguards they should have in place beforehand.”

According to the National Conference of State Legislatures, 21 states have laws regulating self-driving vehicles in some way. They run the gamut from wide-open regulatory regimes like Arizona’s to stricter ones like Nevada’s, which requires two operators in a self-driving car during tests on public roads, as well as a pilot vehicle driving directly ahead of the car. While it’s impossible to know whether these precautions would have prevented what happened in Tempe, they could reduce the chance of similar accidents in the future.

That further regulations may be useful isn’t a slight against self-driving cars as a whole. Even the most utopian tech evangelists must have known that autonomous vehicles would eventually be involved in fatal accidents. Silicon Valley and the auto industry have a responsibility to make these cars as safe as possible, but the onus is also on state legislators to build a regulatory landscape that protects everyone else.