
Are self-driving cars really safer?

A former Uber executive says no. Before we throw away the Driver’s Handbook…

This might be a good time to look more closely at the claims we’ve heard over the years that self-driving cars will improve safety:

People are so bad at driving cars that computers don’t have to be that good to be much better. Any time you stand in line at the D.M.V. and look around, you’re like, Oh, my God, I wish all these people were replaced by computer drivers. Ten to 20 years out, driving your car will be viewed as equivalently immoral as smoking cigarettes around other people is today. – venture capitalist Marc Andreessen

Andrew Goldman, “Bubble? What Bubble?” at New York Times (July 7, 2011)

This fearful thinking might be standing in the way of real progress. Because if you recognize that self-driving cars are going to prevent car accidents, AI will be responsible for reducing one of the leading causes of death in the world. – Mark Zuckerberg

Biz Carson, “Mark Zuckerberg: We shouldn’t worry about AI overtaking humans ‘unless we really mess something up’” at Business Insider (February 28, 2016)

GM believes self-driving cars can significantly avoid accidents and crashes caused by human behavior, and eventually lead to safer transportation. – General Motors Chairman and CEO Mary Barra

GM Corporate Newsroom, “Mary Barra Outlines GM’s Road Map for Safer, Better and More Sustainable Transportation Solutions” at General Motors (September 15, 2017)

Tesla *with* Autopilot engaged is twice as safe & continues to make steady improvements – Elon Musk at Twitter (January 8, 2019)

But what if self-driving cars are not safer? What if—as deployed in the real world—they drive worse than humans?


Robbie Miller, the former Uber executive who, just days before an autonomous Uber struck and killed a pedestrian, warned the company of problems with its self-driving cars, says that the industry has “propelled us into the realm of safety theater—meaning creating the illusion of safety instead of actually delivering on safety.”

That’s not the half of it. In April, Miller released a study claiming self-driving vehicles were actually recording incident rates higher than that of your typical motorist. Contrasting data from the Strategic Highway Research Program (SHRP) and the California DMV, he concluded that autonomous test vehicles created more injuries per mile than the average human motorist with a few years of practice.

That’s not what we’re being sold. Automakers have repeatedly suggested that AV testing is a gateway to a safer world, with major breakthroughs close at hand. But Miller argued that focusing on the number of miles a manufacturer covers with its self-driving fleet doesn’t yield much more than reduced public safety.

Matt Posky, “Uber Whistleblower: Autonomous Vehicles Need New Safety Metrics, Aren’t Really Any Safer” at The Truth About Cars (August 19, 2019)
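The dispute turns on a simple normalization: raw incident counts mean nothing until they are divided by miles driven. Here is a minimal sketch of that comparison in Python; the function name and every number in it are hypothetical placeholders, not figures from Miller’s study or from the SHRP and California DMV datasets:

```python
def incidents_per_million_miles(incidents: int, miles: float) -> float:
    """Normalize a raw incident count to a per-million-mile rate so that
    fleets with very different total mileage can be compared directly."""
    return incidents / (miles / 1_000_000)

# Hypothetical placeholder numbers -- NOT figures from Miller's study.
av_rate = incidents_per_million_miles(incidents=12, miles=2_000_000)
human_rate = incidents_per_million_miles(incidents=4, miles=1_000_000)

print(f"AV fleet:       {av_rate:.1f} incidents per million miles")    # 6.0
print(f"Human baseline: {human_rate:.1f} incidents per million miles")  # 4.0
```

On these placeholder inputs the autonomous fleet looks worse despite logging twice the mileage, which is the shape of the comparison Miller draws from the real data.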

In his study, *Closing the Curtains on Safety Theater*, Miller argues that the industry’s metrics are misleading, especially the key metric of “Total miles driven,” which is a “leading indicator of how likely a self-driving vehicle program is to cause injury or property damage.”

Why is it misleading? Researchers at the RAND Corporation found that it is nearly impossible to clock enough miles to ensure a self-driving vehicle is safe:

Given that current traffic fatalities and injuries are rare events compared with vehicle miles traveled, we show that fully autonomous vehicles would have to be driven hundreds of millions of miles and sometimes hundreds of billions of miles to demonstrate their safety in terms of fatalities and injuries. Under even aggressive testing assumptions, existing fleets would take tens and sometimes hundreds of years to drive these miles — an impossible proposition if the aim is to demonstrate performance prior to releasing them for consumer use.

Nidhi Kalra, Susan M. Paddock, “How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?” at RAND Corporation
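The arithmetic behind RAND’s conclusion is standard rare-event statistics. As a minimal sketch (a textbook Poisson confidence bound, not code from the RAND study): to show with 95% confidence that a fleet’s fatality rate is at least as low as the human rate of roughly 1.09 deaths per 100 million miles (the 2013 U.S. figure the RAND report uses), the fleet would have to log about 275 million miles without a single fatality.

```python
import math

def miles_to_demonstrate(rate_per_mile: float, confidence: float = 0.95) -> float:
    """Failure-free miles needed to show, at the given confidence level,
    that the true failure rate is no worse than rate_per_mile.
    Assumes failures are rare, independent events (a Poisson model)."""
    return -math.log(1.0 - confidence) / rate_per_mile

# U.S. human-driver fatality rate cited in the RAND report:
# roughly 1.09 fatalities per 100 million vehicle miles (2013).
human_fatality_rate = 1.09 / 100_000_000

miles = miles_to_demonstrate(human_fatality_rate)
print(f"About {miles / 1e6:.0f} million fatality-free miles needed")  # ~275 million
```

And that is the easy case. Demonstrating that a fleet is actually *better* than humans, rather than merely no worse, pushes the requirement into the billions of miles, which is where RAND’s “hundreds of billions” figure comes from.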

Miller’s safety calculations point the same way and challenge the industry’s safety message: he posits that, based on available measures, self-driving cars cause more injuries and more property damage than driver-controlled vehicles.

He admits that other drivers cause some of these accidents and that self-driving vehicles do a “good job at avoiding collisions where they would be found at-fault.” But he continues:

…they are significantly worse at avoiding crashes that should be preventable. This is because self-driving technology has not advanced enough and safety drivers don’t intervene early enough.

Robbie Miller, “Closing the Curtains on Safety Theater” at Medium (Pronto AI)


We’ve made a similar point here at Mind Matters News: Automated vehicle technology can lead to worse driving, in part by creating disengaged drivers.

Sadly, even those tasked with protecting us from ourselves get googly-eyed over the science-fiction promises. The National Highway Traffic Safety Administration (NHTSA), the government agency charged with ensuring safe roads and vehicles, uncritically parrots the industry marketing literature: “When you consider more than 37,133 people died in motor vehicle-related crashes in the U.S. in 2017, you begin to grasp the lifesaving benefits of driver assistance technologies.” (NHTSA, “Automated Vehicles for Safety”)

And worse, next month, “Congress is expected to push for legislation that paves the way for widespread deployment of self-driving vehicles” (Automotive News, August 19, 2019) even though there is no consensus on safety standards.

Let me be clear: a grossly misguided faith (in all the senses of that word) is driving the self-driving car hype. There is little data to support the industry’s safety claims and, if Miller is to be believed, what data we do have says that autonomous vehicles are less safe than human drivers.

Technology is best used when it helps us be better at what we do rather than replace what we do. It is time for self-driving car proponents to return to reality and admit this. It is the only way we’ll be safe.


Also by Brendan Dixon on safety issues around self-driving cars:

Does a Western bias affect self-driving cars? How a driver is expected to act varies by culture. Self-driving cars (autonomous vehicles) will need to adapt to different rules and we will, very likely, need to change those rules to make the vehicles work.

Should Tesla’s Autopilot feature be illegal? A recent study from the United Kingdom on driver competence suggests that maybe it should.

and

Autopilot is not just another word for “asleep at the wheel” As a recent fatal accident in Florida shows, even sober, attentive drivers often put too much trust into Tesla’s Autopilot system, with disastrous results.


Brendan Dixon

Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Brendan Dixon is a Software Architect with experience designing, creating, and managing projects of all sizes. His first foray into Artificial Intelligence was in the 1980s, when he built an Expert System to assist in the diagnosis of software problems at IBM. Since then, he’s worked as both a Principal Engineer and a Development Manager for industry leaders such as Microsoft and Amazon, as well as numerous start-ups. While he spent most of that time on other types of software, he has remained engaged and interested in Artificial Intelligence.
