The death of a woman killed by one of Uber’s self-driving cars shows that we are still a long way from transitioning to an autonomous driving utopia.
Elaine Herzberg, 49, was hit by a self-driving Volvo XC90 SUV in Arizona in March. Uber suspended the program in Arizona as well as in Pittsburgh, San Francisco and Toronto.
The fact that driverless vehicle experts say the crash could have been avoided shows how complicated the issue is.
Cortica, a firm that develops artificial intelligence for autonomous vehicles, analysed the dash cam video and concluded the car, which failed to brake or swerve before the collision, had enough time to react and potentially save Ms Herzberg’s life.
Cortica CEO Igal Raichelgauz said the firm’s self-driving AI system detected Ms Herzberg 0.9 seconds before impact, when the car was about 15 metres from her, travelling at 61km/h.
He said the autonomous car’s cameras and radar system should have had enough time to pick up the pedestrian and react to the situation.
“The advantage of machine response time and control means the right actions could be made to certainly mitigate the damage,” Mr Raichelgauz told CNET.
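Cortica’s figures are internally consistent, and some rough kinematics show why Mr Raichelgauz spoke of mitigating the damage rather than avoiding it entirely. The sketch below checks the reported numbers; the braking deceleration of 7 m/s² is an assumed typical dry-road value, not a figure from the investigation.

```python
# Sanity check of the reported figures (0.9 s, 15 m, 61 km/h).
# The 7 m/s^2 deceleration is an illustrative assumption.

speed_kmh = 61
speed_ms = speed_kmh / 3.6               # about 16.9 m/s
distance_m = 15

time_to_impact = distance_m / speed_ms   # about 0.89 s, matching the reported 0.9 s

decel = 7.0                              # assumed full-braking deceleration, m/s^2
stopping_distance = speed_ms**2 / (2 * decel)   # about 20.5 m, beyond the 15 m available

# Speed at impact if full braking had begun immediately: v^2 = v0^2 - 2*a*d
impact_speed_ms = (speed_ms**2 - 2 * decel * distance_m) ** 0.5  # about 8.8 m/s (~32 km/h)

print(f"time to impact: {time_to_impact:.2f} s")
print(f"stopping distance: {stopping_distance:.1f} m")
print(f"impact speed with immediate braking: {impact_speed_ms * 3.6:.0f} km/h")
```

Under these assumptions a full stop in 15 metres was not possible, but immediate braking would have roughly halved the impact speed, which is what “mitigate the damage” implies.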
There was a human safety driver at the wheel, but the video showed she was clearly distracted, looking down and away from the road.
The car should have detected the woman crossing the road. It was fitted with Lidar (Light Detection and Ranging) sensors, which are how Uber’s autonomous cars detect the world around them. Lidar is supposed to work well in the dark.
A top executive for the maker of Lidar sensors said she was ‘baffled’ as to why the vehicle failed to recognise Ms Herzberg. Marta Thoma Hall, President of Velodyne Lidar, said the company doesn’t believe its technology failed.
“We are as baffled as anyone else,” Ms Hall wrote in an email.
“Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our Lidar doesn’t make the decision to put on the brakes or get out of her way.”
Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles, told CNET the dash cam footage seemed to show the vehicle was at fault.
“Although this video isn’t the full picture, it strongly suggests a failure by Uber’s automated driving system.
“The victim is obscured by darkness – but she is moving on an open road. Lidar and radar absolutely should have detected her and classified her as something other than a stationary object.”
One possible explanation is that a faulty machine learning process caused the software to interpret the pedestrian and her bike as a plastic bag or piece of cardboard. Even little things, like a few patches of tape on a stop sign, have been observed to fool these systems. Self-driving cars have also been known to see shimmering exhaust as solid objects.
One system could not see through certain kinds of weather – such as fog – even when objects were still clearly visible to the human eye.
If the classification is off, the system’s predictions may be off too: these systems expect humans to move in one way and plastic bags in another. If classification is the problem, Uber might have to collect hundreds of thousands of additional images to retrain its system.
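The link between classification and prediction can be sketched in a few lines. This is an illustrative toy, not Uber’s software: the class names and motion models are hypothetical, but they show how the same object, labelled two different ways, produces very different predicted paths and therefore different braking decisions.

```python
# Illustrative sketch (hypothetical classes and motion models):
# a planner's prediction depends on the class assigned by perception.

def predict_path(obj_class, position, seconds=1.0):
    """Predict where an object will be after `seconds`, based on its class."""
    x, y = position
    if obj_class == "pedestrian":
        # Pedestrians are expected to keep crossing at walking speed (~1.4 m/s),
        # so the planner should anticipate them entering the lane.
        return (x + 1.4 * seconds, y)
    if obj_class == "plastic_bag":
        # Debris is assumed to stay put, so the planner may decide
        # no braking is needed.
        return (x, y)
    # Unknown objects: assume stationary.
    return (x, y)

# Same object, two classifications, two very different predictions:
print(predict_path("pedestrian", (0.0, 0.0)))   # predicted to move into the lane
print(predict_path("plastic_bag", (0.0, 0.0)))  # treated as stationary
```

If perception flips between such labels from frame to frame, the predicted trajectory flips with it, which is one way a detected object can still fail to trigger braking.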
Ms Herzberg’s death was not the first involving self-driving technology. In 2016, a man in Florida was killed while at the wheel of a Tesla when using its Autopilot feature, which uses a computer vision-based vehicle detection system that differs from the technology used by Uber.
It is not clear if or when Uber will return the vehicles to the road.