Self-Driving Uber Did Not Have Emergency Braking Turned On When It Hit Pedestrian, NTSB Says


The self-driving Uber that struck and killed a pedestrian in March initially misidentified the woman as a vehicle and was deliberately put on the road without its emergency braking system turned on, federal investigators said Thursday.

“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior,” according to a preliminary report by the National Transportation Safety Board.

The human backup driver did not begin braking until after 49-year-old Elaine Herzberg was hit crossing a dark Tempe, Ariz., thoroughfare, the NTSB said.

Federal safety investigators have not given a cause for the crash. But they described a series of initial findings that raise far-reaching questions about Uber’s decision making, engineering and approach to safety as it worked to develop a potentially lucrative driverless system on public roads.

The 2017 Volvo XC90 that struck and killed Herzberg comes factory-equipped with an automatic emergency braking function, called City Safety, according to the NTSB. Uber disables that and several other safety features when the car is being controlled by its self-driving system, but keeps them on when the car is being driven manually by a person.

Uber has indicated that the Volvo crash mitigation systems are designed to assist drivers, not to be part of a self-driving system.

In response to questions, an Uber spokeswoman provided a statement that did not address specific findings. It said the company has worked closely with the NTSB and is reviewing the safety of its program. “We’ve also brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture, and we look forward to sharing more on the changes we’ll make in the coming weeks,” the statement said.

According to the NTSB, the Uber SUV’s sensors first detected Herzberg “about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.”

Then at 1.3 seconds before Herzberg was hit, “the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision,” the NTSB said. But since emergency braking maneuvers were “not enabled” by Uber, it was up to the human safety driver to take over. The NTSB, without comment, outlined the inherent disconnect in Uber’s procedures.

“The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator,” according to the preliminary report.
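The report’s timeline lends itself to a simple illustration. Below is a minimal, hypothetical Python sketch of that decision flow; the names, data structures, and intermediate timestamps are invented for the example and do not come from Uber’s software. Only the rough timeline (first detection about 6 seconds out, an emergency-braking determination at 1.3 seconds), the shifting classification, and the disabled-braking/no-alert behavior are taken from the NTSB’s description.

```python
from dataclasses import dataclass

# Hypothetical, simplified reconstruction of the decision flow the NTSB
# describes; none of these names reflect Uber's actual software.

@dataclass
class Detection:
    seconds_to_impact: float
    classification: str  # what the perception system currently thinks it sees

EMERGENCY_BRAKING_ENABLED = False   # per the NTSB: disabled under computer control
OPERATOR_ALERT_IMPLEMENTED = False  # per the NTSB: no alert mechanism existed

def handle(detection: Detection) -> str:
    """Return the action taken for one perception update."""
    needs_emergency_brake = detection.seconds_to_impact <= 1.3
    if not needs_emergency_brake:
        return f"track object classified as {detection.classification!r}"
    if EMERGENCY_BRAKING_ENABLED:
        return "apply emergency braking"
    if OPERATOR_ALERT_IMPLEMENTED:
        return "alert safety driver to intervene"
    return "rely silently on safety driver"  # the gap the report highlights

# Timeline per the preliminary report: first detection ~6 s out at 43 mph,
# with the classification shifting as the paths converged. The intermediate
# times here are illustrative guesses.
timeline = [
    Detection(6.0, "unknown object"),
    Detection(4.0, "vehicle"),
    Detection(2.0, "bicycle"),
    Detection(1.3, "bicycle"),
]

for d in timeline:
    print(f"{d.seconds_to_impact:>4.1f}s before impact: {handle(d)}")
```

Run as written, the final update prints “rely silently on safety driver,” which is precisely the handoff the NTSB report flags: the system knows braking is needed but neither brakes nor tells the human.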

One reason Uber would have disabled automatic emergency braking in its self-driving cars is to avoid the herky-jerky ride that results when the car’s cameras or sensors repeatedly flag potential problems ahead and trigger braking, experts said. The NTSB said Uber was seeking to reduce “erratic vehicle behavior.”

Uber may have been seeking “to reduce the number of ‘false positives,’ where the computer potentially misclassifies a situation and the automatic emergency braking engages unnecessarily,” said Constantine Samaras, a robotics expert and assistant engineering professor at Pittsburgh’s Carnegie Mellon University. “False positives like that could also be dangerous, especially at higher speeds.”

Samaras said that instead of false positives, there appeared to be a false negative in this case.

“The car saw the pedestrian six seconds before impact but misclassified them until 1.3 seconds before impact. Even at that point, the computer determined that emergency braking was needed, but the function was disabled and there is no mechanism to alert the driver,” he said.

“We know that humans are a terrible back-up system. We’re easily distracted and we have slower reaction times,” Samaras added. “Alerting the driver to these types of situations before the crash seems like a no-brainer.”
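The tradeoff Samaras describes can be made concrete with a toy example. The sketch below is illustrative only: the confidence scores, scenes, and thresholds are invented for the example and reflect nothing about Uber’s actual classifier. It shows how a low decision threshold produces false positives (braking for a harmless object) while a high one produces false negatives (ignoring a real hazard).

```python
# Illustrative only: a toy confidence threshold showing the false-positive /
# false-negative tradeoff. All scores and thresholds are invented.

def brake_decision(hazard_confidence: float, threshold: float) -> bool:
    """Brake only when the classifier's hazard confidence clears the threshold."""
    return hazard_confidence >= threshold

scenes = {
    "plastic bag drifting across the lane": 0.35,  # harmless, low confidence
    "pedestrian pushing a bicycle":         0.55,  # real hazard, ambiguous score
}

for threshold in (0.3, 0.7):
    print(f"threshold={threshold}")
    for scene, confidence in scenes.items():
        action = "BRAKE" if brake_decision(confidence, threshold) else "ignore"
        print(f"  {scene}: {action}")
```

At the low threshold the car brakes for the plastic bag (a false positive, the “erratic behavior” Uber sought to avoid); at the high threshold it ignores the pedestrian (a false negative, the failure mode in this crash).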

The safety driver, Rafaela Vasquez, is seen in a Tempe police video looking down several times just before the crash. She said she was looking at elements of Uber’s self-driving system, and not her cellphones, which she said she did not use until she called 911. The NTSB continues to investigate that and other elements of the crash, and added that “the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.”

Vasquez was not tested for alcohol or drugs, but police said she showed no signs of impairment.

The victim, Herzberg, tested positive for marijuana and methamphetamine, according to the NTSB. She was dressed in dark clothes, walking outside the crosswalk. The bicycle she was pushing had reflectors, the NTSB said, but they were facing away from the oncoming Uber.

Data retrieved after the crash “showed that all aspects of the self-driving system were operating normally at the time of the crash,” investigators said, and there were no error messages.

Investigators said Uber’s system is not designed to warn the safety driver that he or she should stop the car in such a situation. That design decision echoes shortcomings seen in recent decades in other fields where people have increasingly relied on automation, according to Duke University robotics expert Missy Cummings.

“This lesson has been written in blood over and over and over again,” said Cummings, director of the university’s Humans and Autonomy lab. She cited the Three Mile Island nuclear accident in 1979, as well as several airplane crashes, saying in both circumstances engineers decided not to give human operators critical information they needed to try to prevent tragedies.

Cummings said computer vision also remains problematic, often struggling to determine which objects on the road present actual dangers and sometimes getting confused by something as innocent as a plastic bag. The communication between the vision and braking systems can also be fuzzy and fail for a variety of reasons, she said. “This is why we need to be careful about putting these cars on the road before we work these issues out,” Cummings said.

Uber announced this week that it is shutting down its self-driving operation in Arizona, and laying off nearly 300 workers, mostly backup drivers. The company has been talking with state and local officials in Pennsylvania in hopes of restarting testing on public roads this summer in Pittsburgh, where its main driverless research group is based.

But Uber has irked Pittsburgh Mayor William Peduto, an enthusiastic early Uber supporter who is now emphasizing concerns about safety. He has demanded, for instance, that the company agree to limit its self-driving cars to 25 miles per hour in the city, no matter the posted speed limit, since slower speeds increase the chance pedestrians will survive being hit.

“They never responded to the conditions he raised in the talks,” Peduto spokesman Timothy McNulty said.

Driverless car legislation passed by the U.S. House last year and pending in the Senate would not allow localities to make such demands. Automakers and driverless tech companies have pushed hard in Washington against what they warn could be a debilitating “patchwork” of state and local regulations that would stymie the economic and safety benefits they see with driverless cars.

“If Uber agreed to the local demands voluntarily the restrictions wouldn’t be subject to state or federal statute,” McNulty said.

(c) 2018, The Washington Post · Michael Laris 


