Uber's Self-Driving Vehicle Kills a Pedestrian: Where Do Autonomous Vehicles Go From Here?

Tempe, Arizona has been a testing ground for self-driving vehicles for a few years now. The city welcomed the opportunity, and until this week the experiment had gone smoothly. But on Sunday evening, the testing took a turn for the worse when one of the autonomous vehicles struck and killed a pedestrian in the street. A backup driver was in the vehicle but did not intervene, making this the first fatality to call into question the credibility and safety of a vehicle without a driver.

Tempe serves as a testing ground for plenty of companies developing autonomous vehicles, from Uber to a handful of other technology companies toying with the idea. The idea itself is quite attractive, and if done correctly, it could make the roads much safer: no human error, no distracted drivers, no drunk drivers. But even with human error removed, what happens when there's a technology glitch?

Think about your cell phone, your computer, the navigation system in your car. These are all technological systems, and you, just as I, have inevitably faced some frustration from time to time when one fails to do its job. "Stupid technology," we say, and then go about our day. But what happens when that same kind of glitch occurs in an autonomous vehicle?

So, whose fault is it?

Footage from cameras facing inside and outside the vehicle has led the police chief to question who is at fault for this incident. The chief's assessment came as a shock to many, and led some to question the integrity of the police department. According to the chief's statement, "it would have been difficult to avoid this collision in any kind of mode based on how she [the pedestrian] came from the shadows right into the roadway." According to reports, the woman was also crossing outside of a crosswalk, making it harder for the autonomous vehicle to anticipate her entering the roadway.

But there was a backup driver, a human being, inside the vehicle who is supposed to take over if need be. It's safe to assume this would be a circumstance in which a backup driver should intervene. So how do we determine who is at fault: the self-driving vehicle, the backup driver, or the woman crossing the street, from the shadows, after dark?

Where does Uber go from here?

While the investigation is ongoing, Uber has suspended its testing. The company is working with authorities to better understand what happened and where to go from here. For now, the future of Uber's self-driving vehicles is unclear. Accidents will inevitably happen on the roadway; fender benders are a part of everyday life for drivers. But when there is no driver, how do we determine where the fault lies?