Uber Accident: A Setback for Autonomous Vehicles?
March 19, 2018

Pete Goldin
ITSdigest

Last night, March 18, a pedestrian was struck and killed by one of Uber's self-driving cars in Tempe, AZ. In response, Uber halted testing of autonomous vehicles across North America on Monday.

Tempe police stated: "The vehicle involved is one of Uber's self-driving vehicles. It was in autonomous mode at the time of the collision, with a vehicle operator behind the wheel."
The Washington Post stated: "Missy Cummings, a robotics expert at Duke University who has been critical of the swift rollout of driverless technology across the country, said the computer-vision systems for self-driving cars are 'deeply flawed' and can be 'incredibly brittle,' particularly in unfamiliar circumstances. Companies have not been required by the federal government to prove that their robotic driving systems are safe. 'We’re not holding them to any standards right now,' Cummings said, arguing that the National Highway Traffic Safety Administration should provide real supervision."

"The fact is these things will save lives and we need to get there"

The Post also quoted Timothy Carone, an associate teaching professor specializing in autonomous systems at the University of Notre Dame, who said: "fatal crashes involving autonomous vehicles, while tragic, will become more commonplace as testing is introduced and further expanded. The road testing is the only way the systems can learn and adjust to their environments, eventually reaching a level of safety that cuts down on the number of motor vehicle deaths overall ... It’s going to be difficult to accept the deaths … but at some point you’ll start to see the curve bend ... The fact is these things will save lives and we need to get there."

"This tragic accident underscores why we need to be exceptionally cautious when testing and deploying autonomous vehicle technologies on public roads," said Senator Edward J. Markey (D-MA), a member of the Senate Commerce, Science and Transportation Committee. "If these technologies are to reap their purported safety, efficiency, and environmental benefits, we must have robust safety, cybersecurity, and privacy rules in place before these vehicles are traveling our roadways to prevent such tragedies from occurring. I’m committed to work with my Senate Colleagues on developing a comprehensive autonomous vehicle legislative package that ensures these important protections are included."

Ironically, just this past Friday Uber and Waymo "urged Congress to pass sweeping legislation to speed the introduction of self-driving cars into the United States," Reuters reported. "Some congressional Democrats have blocked the legislation over safety concerns, and Monday’s fatality could hamper passage of the bill, congressional aides said Monday."

"There should be a national moratorium on all robot car testing on public roads"

"There should be a national moratorium on all robot car testing on public roads until the complete details of this tragedy are made public and are analyzed by outside experts so we understand what went so terribly wrong,” said John M. Simpson, Consumer Watchdog’s Privacy and Technology Project Director. "Arizona has been the wild west of robot car testing with virtually no regulations in place. That’s why Uber and Waymo test there. When there’s no sheriff in town, people get killed."

"Uber simply cannot be trusted to use public roads as private laboratories without meaningful safety standards and regulations," Simpson added.

The question: Will this tragedy ultimately push autonomous vehicle researchers to focus even more on safety, or will it set the industry back and cause states like Arizona to rethink their driverless-friendly policies?
