WASHINGTON -- Uber Technologies’ self-driving vehicle unit lacked an effective safety culture when one of its test vehicles struck and killed a pedestrian in Tempe, Arizona, last year, U.S. National Transportation Safety Board Chairman Robert Sumwalt said Tuesday.
“The inappropriate actions of both the automated driving system as implemented and the vehicle’s human operator were symptoms of a deeper problem, the ineffective safety culture that existed at the time,” Sumwalt said as he opened a board meeting to determine the probable cause of the collision.
The probe is the NTSB’s first to examine a fatal crash involving a self-driving test vehicle. The case is being closely watched in the emerging autonomous vehicle industry, a sector that has attracted billions of dollars in investment from companies such as General Motors and Google parent Alphabet in an attempt to transform transportation.
Elaine Herzberg, 49, was hit and killed by an Uber self-driving car as she walked her bicycle across a road at night. Uber halted self-driving car tests after the crash. Investigative information released since the March 2018 collision highlighted a series of lapses -- both technological and human -- that the board may cite as having contributed to the crash. Uber resumed self-driving testing late last year in Pittsburgh.
The Uber vehicle’s radar sensors first observed Herzberg about 5.6 seconds before impact, while she was still outside the vehicle’s lane of travel, and initially classified her as a vehicle. The self-driving computers then changed her classification to different types of objects several times and failed to predict that her path would cross the lane of the self-driving test crossover, according to the NTSB.
The modified Volvo crossover being tested by Uber was not programmed to recognize and respond to pedestrians walking outside of marked crosswalks, nor did the system allow the vehicle to brake automatically ahead of an imminent collision. The responsibility to avoid accidents fell to the single safety driver monitoring the vehicle’s automation system, while other companies place a second human in the vehicle for added safety.
The safety driver was streaming a television show on her mobile phone in the moments before the crash, despite company policy prohibiting drivers from using mobile devices, according to police. The NTSB has also said that Uber’s Advanced Technologies Group that was testing self-driving cars on public streets in Tempe did not have a standalone safety division, a formal safety plan, standard operating procedures or a manager focused on preventing accidents.
Uber made extensive changes to its self-driving system after several reviews of its operation and findings by NTSB investigators. The company told the NTSB that the new software would have correctly identified Herzberg and triggered controlled braking to avoid her more than 4 seconds before the original impact, the NTSB has said.