Uber's 'ineffective safety culture' blamed for driverless car crash
WASHINGTON - The chairman of the US National Transportation Safety Board (NTSB) offered a harsh critique of Uber Technologies Inc's overall approach to safety in draft findings released on Tuesday of a fatal self-driving car crash in Arizona in March 2018.
"The inappropriate actions of both the automatic driving system as implemented and the vehicle's human operator were symptoms of a deeper problem," NTSB Chairman Robert Sumwalt said, citing the "ineffective safety culture that existed at the time of Uber."
The NTSB submitted the draft findings in Sumwalt's written testimony to Congress ahead of a Senate hearing Wednesday, while its determination of probable cause in the crash was ongoing. Reuters reviewed the draft testimony.
According to Sumwalt's draft testimony, the probable cause of the Uber crash was the failure of the vehicle operator to monitor the driving environment "because she was visually distracted throughout the trip by her personal cell phone."
Cited as contributing factors were Uber's inadequate safety risk-assessment procedures and ineffective oversight of the vehicle operator. The testimony said the findings were "subject to change pending the board's adoption of the final report."
A spokeswoman for Uber's self-driving car effort, Sarah Abboud, did not immediately comment on Tuesday but said before the hearing that the company has "adopted critical program improvements to further prioritize safety."
The testimony also cited the pedestrian’s crossing outside a crosswalk and the Arizona Department of Transportation’s insufficient oversight of autonomous vehicle testing.
Sumwalt's testimony also said the board was recommending that the National Highway Traffic Safety Administration require entities testing self-driving vehicles to submit a safety self-assessment report to the agency, and that the agency determine whether those plans include appropriate safeguards.
While Uber has made improvements, Sumwalt's testimony cited ongoing concerns about safety issues in the self-driving vehicle sector. "We remain concerned regarding the safety culture of the numerous other developers who are conducting similar testing," it said.
The NTSB's recommendations will likely reverberate across the industry. The Arizona crash was the first-ever death attributed to an autonomous vehicle and prompted significant safety concerns about the nascent self-driving car industry, which is working to get vehicles into commercial use.
The crash killed 49-year-old Elaine Herzberg as she was walking a bicycle across a street at night in Tempe, Arizona.
The NTSB previously disclosed that the Uber vehicle had significant software flaws, noting the software failed to properly identify Herzberg as a pedestrian and did not include a consideration for jaywalking pedestrians. Herzberg was not crossing at an intersection. Uber had also deactivated a Volvo automatic emergency braking system in the XC90 test vehicle it had modified.
The NTSB said on Tuesday it planned to identify the need for "safety risk management requirements for testing automated vehicles on public roads," signaling a broader question about how advanced vehicles are tested and about U.S. government oversight.
In the aftermath of the crash, Uber suspended all testing of self-driving vehicles. It resumed testing last December in Pennsylvania with revised software and significant new restrictions and safeguards.
In March, prosecutors in Arizona said Uber was not criminally liable in the self-driving car crash. Police have investigated whether the safety driver, who was behind the wheel and supposed to respond in the event of an emergency, should face criminal charges.
Police have said the crash was "entirely avoidable" and that the backup driver was watching "The Voice" TV program at the time of the crash.
The NTSB said Uber failed to adequately monitor backup safety drivers and lacked other significant safety rules.

REUTERS