Self-driving Policies, Tech of Uber Face Questions Following Deadly Crash


In the wake of the deadly crash in which one of Uber's self-driving SUVs struck and killed a pedestrian in Arizona, a report from the New York Times has dug into the company's program and found that it significantly trails the competition. Specifically, while Waymo, the former Google project, said its cars averaged 5,600 miles between incidents where a test driver had to take control, and GM's Cruise averaged some 1,200 miles, Uber's documents reveal that it was struggling to consistently meet an internal goal of roughly 13 miles.

The New York Times report also confirms what Jalopnik discovered: unlike every other company testing self-driving car technology, Uber uses only a single driver to monitor both safety and performance. Nissan, Ford, and Toyota all confirmed that using two operators is company policy. Waymo, meanwhile, said that since 2015 it has used a single driver when running "validated" software and hardware, but it adds a second tester whenever any of that changes, or for new cities, drivers, and kinds of roads.

The NYT report also notes that, unlike California, which requires publicly available disengagement reports, Arizona has no such requirement, and Uber's tests in California have not been running long enough for it to file one there. Additionally, Dara Khosrowshahi, Uber's new CEO, is said to have considered shutting down the program.

Another major question that has surfaced is whether, and when, the car's sensors were able to pick up the victim, Elaine Herzberg.

Velodyne Lidar produces the sensors that, alongside cameras, the self-driving cars use to "see" their surroundings, and the company's president addressed the question in a statement to Bloomberg.

In an email sent after the release of the crash video, Thoma Hall said: "Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our Lidar doesn't make the decision to put on the brakes or get out of her way…The problem lies elsewhere."

Transparency will likely make or break public trust in autonomous tech after an incident like this, so Uber will need to explain more about which systems were active during the crash and what they actually responded to.