The Arizona Republic reports the National Transportation Safety Board found that the main factor in a March 2018 incident, in which a pedestrian was struck and killed by a self-driving Uber in Tempe, Arizona, was the vehicle’s operator, as “she was watching ‘The Voice’ on her phone instead of the road.” However, the NTSB “identified several other contributory causes in its final report submitted on Tuesday.” The Republic says “many things went wrong,” including “Uber’s failure to program its cars to predict the movement of people jaywalking, and the company’s decisions to turn off the standard Volvo emergency brakes and to require their own system to pause a full second before emergency braking.”
The Detroit News reports the NTSB “voted unanimously on Tuesday that the probable cause of the crash was ‘the failure of the vehicle operator to monitor the environment and the operation of the automated driving system because she was visually distracted throughout her trip by her personal cellphone.’” NTSB Chairman Robert Sumwalt said, “This crash was about testing the development of automated driving systems on public roads. Its lessons should be studied by any company in any state.” The News adds that “NTSB board member Jennifer Homendy took the White House to task for resisting calls from safety advocates to make it mandatory for automakers to submit safety assessments of their self-driving test programs. She said it was ‘laughable’ that the Trump administration has argued it does not have the legal authority to force carmakers to make safety assessments public.”
The AP reports the NTSB “criticized the National Highway Traffic Safety Administration, the government’s road safety agency, for failing to lead in regulating tests [of autonomous vehicles] on public roads.” With regard to NHTSA, Homendy said, “In my opinion they’ve put technology advancement here before saving lives,” adding, “There’s no requirement. There’s no evaluation. There’s no real standards issued.” According to the AP, “NHTSA has issued voluntary guidelines including safety assessment reports from autonomous vehicle companies, but only 16 have filed such reports, the NTSB said.” The AP adds that “the board voted to recommend that NHTSA require companies to turn in the reports and set up a process for evaluating them.”
Reuters reports the NTSB noted “a ‘lack of federal safety standards’ for automated driving systems.” Sumwalt said, “The collision was the last link of a long chain of actions and decisions made by an organization that unfortunately did not make safety the top priority.” Reuters adds that Sumwalt “praised Uber’s cooperation.” He said, “I did notice that when I talked to their CEO he did not hang up on me,” adding, “It would be easy just to thumb it off. Blow it off. Say, NTSB, they’re wrong, they’re bad, and hang up on us. But Uber has not done that.”
From a news release of the American Association for Justice.