Light streaks from moving cars at night (Photo by Xavier Lee on Unsplash)

Autopilot Is NOT Just Another Word for “Asleep at the Wheel”

As a recent fatal accident in Florida shows, even sober, attentive drivers often put too much trust in Tesla’s Autopilot system, with disastrous results

On March 1 of this year, 50-year-old Jeremy Beren Banner died because his Tesla’s Autopilot failed to “see” a truck crossing his path:

According to the Palm Beach County Sheriff’s Office, Friday’s crash took place on State Road 7, near Pero Family Farms just north of Atlantic Avenue.

A report from the Sheriff’s Office said the tractor-trailer was making a left turn onto a divided highway to head north when the southbound 2018 Tesla Model 3 hit the semi’s driver side, tearing off the Tesla’s roof as it passed under the trailer.

Aric Chokey and Tom Krisher, “Tesla crash: Officials likely to probe if Autopilot driving system played role in most recent fatality” at SunSentinel

The Autopilot system likely misinterpreted the truck as an overpass and assumed it was safe to pass under. It was not.

Curiously, Banner’s accident bears a marked similarity to the one that killed another Florida driver in 2016:

The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver — who Tesla notes is ultimately responsible for the vehicle’s actions, even with Autopilot on — nor the car noticed the big rig or the trailer “against a brightly lit sky” and brakes were not applied. In a tweet, Tesla CEO Elon Musk said that the vehicle’s radar didn’t help in this case because it “tunes out what looks like an overhead road sign to avoid false braking events.”

Jordan Golson, “Tesla driver killed in crash with Autopilot active, NHTSA investigating” at The Verge

Tesla, naturally, has responded to the NTSB report on the Palm Beach crash:

Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance. (Emphasis added)

The Verge notes that “It is at least the fourth fatal crash involving Autopilot.”

Basically, Tesla is making the point we continue to make here at Mind Matters News: AI systems, including driver-assistance features such as Tesla’s poorly named Autopilot, do their best work when combined with a human.

Tesla, like other (so-called) self-driving car ventures, has oversold the capabilities of its system. Elon Musk regularly inflates what Tesla’s Autopilot can do and makes unrealistic predictions (such as his claim that a “million” Tesla robotaxis will be operating by the end of 2020).

Like all tools, AI systems, when used correctly, can augment our abilities, but they are nowhere near replacing us. And we endanger ourselves, and others, when we believe they can.

Suspected DUI stop: Driver explained Tesla had been set on autopilot / CHP San Francisco

So, for once, I agree with Tesla: Autopilot can improve safety when used by an attentive driver. But that’s not how Tesla names or sells the system, nor is it how customers use it.

Earlier this month, police in the Netherlands had trouble pulling over a driver for tailgating: the driver was drunk and asleep at the wheel of his Tesla. “Eventually, the officers managed to wake the driver up using a siren…”

That wasn’t the first time this type of behavior was recorded. Two other drivers had pulled similar stunts, one in Palo Alto (“It took seven miles to pull over a Tesla with a seemingly asleep driver”) and another on the San Francisco Bay Bridge (“When police woke the man up, he assured officers that everything was fine because the car was ‘on autopilot.’”). The police arrested him on suspicion of DUI and tweeted, “Car towed (no it didn’t drive itself to the tow yard).”

But, as the fatal accident in Florida shows, even sober, attentive drivers often put too much trust in Tesla’s Autopilot system, with disastrous results.

We need to step away from the overhyped expectations for (so-called) self-driving cars, and all similar AI ventures: Fully autonomous systems we can trust are not going to arrive anytime soon:

“From 2020, you will be a permanent backseat driver,” The Guardian said in 2015. Fully autonomous vehicles will “drive from point A to point B and encounter the entire range of on-road scenarios without needing any interaction from the driver,” Business Insider wrote in 2016.

It’s clear now that many of these estimates were overblown; just look at the trouble Uber had in Arizona.

Ben Dickson, “The Predictions Were Wrong: Self-Driving Cars Have a Long Way to Go” at PC Mag

Instead, we need to recognize that well-designed, properly used AI systems (even Tesla’s Autopilot) can help us perform human tasks better. For example, in a recent development in medicine, AI might help doctors detect lung cancer faster and more reliably.

They cannot replace us.

Also by Brendan Dixon: News from the real world of self-driving taxis: not yet. Waymo includes a human in all their “robotaxis,” just in case, because the vehicles (at last report) were still confounded by common conditions.

Further reading: Guess what? You already own a self-driving car. Yes, the car you own today is probably a “self-driving” car and you may not know it. But that is because of the creative ways the term can be defined. (Jonathan Bartlett)

and

Are Tesla’s robot taxis a phantom fleet? Jonathan Bartlett suspects that a dire quarterly report is powering the fleet, not genuine innovation.


Brendan Dixon

Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Brendan Dixon is a Software Architect with experience designing, creating, and managing projects of all sizes. His first foray into Artificial Intelligence was in the 1980s, when he built an Expert System to assist in the diagnosis of software problems at IBM. Since then, he’s worked as both a Principal Engineer and a Development Manager for industry leaders, such as Microsoft and Amazon, and for numerous start-ups. While he spent most of that time working on other types of software, he’s remained engaged and interested in Artificial Intelligence.
