Published in Transportation

Software failed in Uber pedestrian fatality

29 May 2018

NTSB confirms

The US National Transportation Safety Board (NTSB) has issued a preliminary report on the first Uber pedestrian fatality. In short, the sensors worked and the software failed.

The good news for the whole automotive and transportation industry is that, according to the preliminary report, the sensors worked as expected, spotting Elaine Herzberg about six seconds before impact. That would give a self-driving system or an attentive operator enough time to bring the car to a stop, even at the relatively high speed of 43 miles per hour.
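As a rough sanity check of that six-second window, here is a back-of-the-envelope stopping calculation. The deceleration (7 m/s², roughly full braking on dry asphalt) and the 1.5 s reaction time are generic textbook assumptions, not figures from the NTSB report:

```python
# Back-of-the-envelope check that six seconds was enough to stop.
# Deceleration and reaction time are textbook assumptions, not
# values taken from the NTSB preliminary report.

MPH_TO_MS = 0.44704

def stopping_profile(speed_mph, decel=7.0, reaction_time=1.5):
    v = speed_mph * MPH_TO_MS            # initial speed in m/s
    braking_time = v / decel             # seconds to go from v to 0
    braking_dist = v ** 2 / (2 * decel)  # metres covered while braking
    return {
        "total_time_s": reaction_time + braking_time,
        "total_dist_m": v * reaction_time + braking_dist,
    }

profile = stopping_profile(43)
print(profile)  # total stopping time comes out well under six seconds
```

Under these assumptions the car needs roughly 4.2 seconds and about 55 metres to stop from 43 mph, comfortably inside the six seconds the sensors provided.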

While 43 mph (69.2 km/h) is not an extreme speed, Uber's self-driving system is still a technology under evaluation, and one automotive executive independently told Fudzilla that a car at this stage of testing should be driven more slowly than that. Neither the driver nor the car broke the official speed limit, but a lower speed would probably have given the victim a better chance of surviving the impact.

Software got confused

The official preliminary report stated:

As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.

At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision.

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior.

The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

The sad part is that the vehicle involved in the fatality was a modified Volvo XC90 SUV. This vehicle ships with emergency braking capabilities by default, but Uber automatically disabled them while its own software was active.

Software classification

The big issue was the software's object classification: a pedestrian walking a bicycle across the road was probably a scenario never programmed into the Uber self-driving system. Pedestrians are expected to walk or run, and cyclists to ride; a person pushing a bicycle fits neither category cleanly, which would explain the flip-flopping classifications.

It is hard to get to the bottom of things without access to Uber's self-driving system, but this would be our best guess.
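To illustrate why flip-flopping labels are dangerous, here is a deliberately naive, hypothetical tracker that discards an object's motion history whenever its class label changes. Nothing here is Uber's actual code; it is a sketch of the failure mode the NTSB report describes:

```python
# Hypothetical sketch: a tracker that resets its observation history
# on every reclassification never accumulates enough points to
# extrapolate a crossing path. Illustration only, not Uber's code.

class NaiveTrack:
    def __init__(self):
        self.label = None
        self.positions = []          # observations under the current label

    def update(self, label, position):
        if label != self.label:      # reclassified: start over
            self.label = label
            self.positions = []
        self.positions.append(position)

    def predicted_velocity(self):
        # Needs at least two points under a stable label to extrapolate.
        if len(self.positions) < 2:
            return None
        (x0, y0), (x1, y1) = self.positions[-2], self.positions[-1]
        return (x1 - x0, y1 - y0)

track = NaiveTrack()
# Labels flip on every detection, as in the report's description.
for label, pos in [("unknown", (0, 0)), ("vehicle", (1, 0)),
                   ("bicycle", (2, 0)), ("vehicle", (3, 0))]:
    track.update(label, pos)

print(track.predicted_velocity())  # None: no stable history to predict from
```

A tracker that kept its history across label changes would have had four consistent positions to extrapolate from; resetting on each flip leaves it blind to the pedestrian's trajectory.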

In the meantime, the operator was looking down; she claims she was watching not her phone but the touchscreen used to monitor the self-driving software. Right…

She did manage to apply the brakes a fraction of a second before impact, reducing the speed to 39 mph at the moment of collision. Had she been paying attention to the road, the death would most likely have been avoided.

The confusing part is that the software saw the pedestrian walking through three lanes before the impact and, despite having six seconds, enough time to apply the emergency brakes, it did nothing. Even if the pedestrian walking a bike registered as a false positive, the system should never actually hit an obstacle of any kind.

Car must avoid collision 

The key task of driving is avoiding collisions. I remember that at the time of the fatal collision I was at Nvidia's GTC technology conference, chatting with a few of our colleagues close to the automotive industry.

Even back then, Fudzilla strongly suggested that it was unlikely that sensors such as lidar or radar had failed to detect the pedestrian. These sensors are incredibly precise and rarely fail outright. Lidar, with its laser pulses, paints a detailed picture of the vehicle's surroundings. It is up to the software developers to make sense of the signals and to detect and classify objects such as trees, cars, trucks, bushes, stones, boxes, cones, potholes, animals, cyclists or anything else that might be on the street.

Even with false positives, a car should never hit anything, as even hitting a large rabbit or pheasant can seriously damage the car and put its occupants in serious danger.

Elaine Herzberg will make it into the history books, and her family will probably end up with some sort of compensation, but the fact that this death could have been avoided proves the whole point of self-driving: humans can easily get distracted.

Had the Uber operator been paying attention, this death could have been prevented. In the USA alone there were 40,100 road fatalities involving human-driven vehicles in 2017.

Just a week ago, a fatal accident in Hungary left nine people dead when a passenger van slammed into a truck while the driver was streaming on Facebook Live. One day in the not-so-distant future, ADAS systems will be able to prevent such deaths.

Last modified on 29 May 2018
