Uber Halts Nationwide Testing Of Self-Driving Vehicles Following Death of Pedestrian
Even as robotics experts, universities, and tech luminaries sound the alarm about a potential future filled with killer robots powered by artificial intelligence, this technology has already arrived … minus the stringent ethics.
Fox News is reporting that a woman in Tempe, Arizona was struck and killed near a crosswalk by an Uber vehicle that was in full autonomous mode at the time of the accident, even though a human operator was inside the vehicle. Fox stated that this is “an incident believed to be the first of its kind.”
While it is strictly correct that this is the first pedestrian killed, regular readers of Activist Post might recall that in July 2016 I warned about disturbing indications that such a death would be inevitable.
At the time, I highlighted the failure of Tesla’s Autopilot sensors to detect an oncoming tractor-trailer, a crash that killed the driver. Before that, there were ominous signs of this potential when Google’s self-driving cars were first involved in failures that resulted in their being hit, and later actually caused an accident with a bus. As I stated then:
These incidents and dilemmas have thus far occurred during training and testing, which might mitigate some of the seriousness, but they nonetheless point to genuine flaws that should preclude these vehicles from being widely employed.
Now that autonomous vehicles have been unleashed upon the public, we are starting to see the unfortunate ramifications. To Uber’s credit, it has at least announced a halt to all autonomous testing nationwide.
Aside from the technical challenges, questions have been raised about the ethics and morality that will be required in certain fatal situations. Is it right to sacrifice the lives of some to save others?
The standards are already becoming morally complex. Google X’s Chris Urmson, the company’s director of self-driving cars, said the company was trying to work through some difficult problems. Where to turn – toward the child playing in the road or over the side of the overpass?
Google has come up with its own Laws of Robotics for cars: “We try to say, ‘Let’s try hardest to avoid vulnerable road users, and beyond that try hardest to avoid other vehicles, and then beyond that try to avoid things that don’t move in the world,’ and then to be transparent with the user that that’s the way it works,” Urmson said. (Source)
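The ordering Urmson describes is, at bottom, a simple ranked-priority rule. As a purely illustrative sketch — the category names and numeric ranks here are my own assumptions, not Google’s actual code — it might look something like this:

```python
# Hypothetical avoidance-priority ranking, per Urmson's description:
# vulnerable road users first, then other vehicles, then static objects.
# Lower number = try harder to avoid. All names here are illustrative.
AVOIDANCE_PRIORITY = {
    "pedestrian": 0,     # vulnerable road users
    "cyclist": 0,
    "vehicle": 1,        # other vehicles
    "static_object": 2,  # things that don't move in the world
}

def rank_obstacles(obstacles):
    """Sort detected obstacle types so the most critical come first."""
    return sorted(obstacles, key=lambda o: AVOIDANCE_PRIORITY[o])

print(rank_obstacles(["static_object", "vehicle", "pedestrian"]))
# pedestrians sort ahead of vehicles, which sort ahead of static objects
```

Of course, a fixed ranking like this sidesteps the harder dilemmas the article raises — it says nothing about the child in the road versus the drop off the overpass.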
The truth is that researchers are still in the process of developing foolproof sensor systems and artificial intelligence that can properly recognize all surroundings and achieve true situational awareness, yet these vehicles continue to be deployed in the real world. It is also worth noting that the general public is overwhelmingly wary of having A.I. vehicles on public roads: Fox News cites a 78 percent disapproval rate.
Now we will wait to see if the response to this event will be a technological solution or a political one. As The Daily Sheeple rightly notes, this very well could be a crisis that the government can’t let go to waste. Currently, regulations for autonomous vehicles tend to vary by state. Will this Uber accident spur quick calls for stricter federal oversight?
The fatal crash will most likely prompt an even bigger and more overbearing government response, complete with new regulations for self-driving cars. Legislators are already debating how much freedom the private sector should have. The proposed bills would preempt states from establishing their own laws overseeing autonomous testing, which could clash with California’s well-established system. But the legislation has stalled in the Senate, with several lawmakers “expressing concern about the amount of leeway offered to the private sector.” Translation: an intrusive government is debating how much freedom, if any, the private sector deserves. (Repeat: “we are free.”)