Crosswalk with fake car and pedestrians (Photo by ke wen on Unsplash)

Does a Western Bias Affect Self-Driving Cars?

How a driver is expected to act varies by culture

Traffic rules vary widely around the globe, as people cope with different circumstances:

On a recent research trip, I found myself in the back of a taxi in New Delhi, amid a dizzying mix of car horns and exhaust. What initially seemed like complete chaos—a flurry of cars, buses, auto-rickshaws, motorbikes, and the occasional cow—turned out to be a flow of traffic where everyone works together to keep everything moving. People gave each other just enough space to merge into lanes or roundabouts without stopping. This is considered rude in the West as we’re taught to wait for those who have the right of way, but it’s perfectly normal in India.

So I began to wonder: How would a California-trained autonomous vehicle handle this traffic culture?

Jasper Dekker, “The problem with autonomous cars that no one’s talking about” at Fast Company

The rules of the road we take for granted are not followed everywhere. And it’s not just a matter of which side of the road we drive on. In Japan, splashing water on pedestrians is not only discourteous, it is illegal. It is also illegal in the Canadian province of Prince Edward Island. In France, you must have a breathalyzer on board (though, to complicate matters, there is no longer a fine for noncompliance).

What about inconsiderate honking? In Britain, it is illegal to honk your car’s horn unless the car is moving and other drivers are present. But, Dekker notes, drivers in New Delhi will “give audible signals to let human drivers know they’re in their blind spot.” That would probably be considered inconsiderate honking in Newcastle.

Even rule-bound countries have hair-raising intersections that defy codification. I would enjoy watching a Tesla trying to navigate the Arc de Triomphe roundabout in Paris.

Just as not everyone writes the letters of the alphabet in the same way* yet we can all read the result, not everyone follows the same rules of the road, yet experienced locals tend to know what the rules (really) are. Self-driving cars (autonomous vehicles) will need to adapt to those differing rules, and we will, very likely, need to change the rules to make the vehicles work.

One change might be prohibiting pedestrians on roads that permit self-driving cars. Jonathan Bartlett, for example, argues that such roads should be treated as a “virtual rail,” with, among other things, “Signage to make clear that the roadway is a virtual rail, thus the cars may not have drivers.”

I am skeptical that autonomous vehicles will ever freely roam our roads—the varying conditions exceed any amount of training data. There are just too many exceptions that can put life at risk. But I agree with Dekker that carmakers and governments need to define what a safe “training program entails and how AVs [Autonomous Vehicles] can graduate.”

If self-driving proponents and makers are actually concerned with making our roads safer—versus living out a sci-fi fantasy based on 1960s cartoons—then they should welcome such proposals. If, on the other hand, they do not, then it’s fair for us to ask: What’s really driving their vision? I suspect it’s green, flat, and useful for buying things but has nothing to do with keeping us safe. Will we allow ourselves to be so easily fooled?

*Note: There is a history to this problem in the computer industry. Years ago, before the recent improvements in machine learning (the art of getting computers to do things without each and every step being explicitly programmed), one big computer challenge was recognizing handwriting. That difficulty brought about the demise of more than one too-early device, such as Apple’s Newton (1993), about which you can read at a museum page for old computing equipment.

One company, which has long since lost its way due to other problems, devised a clever workaround: Instead of creating a computer that could recognize handwriting, why not develop one that could recognize something close to how humans write, and then train humans to write those characters? Palm OS’s Graffiti recognition system succeeded for a number of years by reaching a compromise between what computers could do and what humans could do.


Also by Brendan Dixon on issues around self-driving cars:

Self-driving cars: Following the money up a cooling trail. The market for lithium for electric car batteries is slowing. One way we can assess entrepreneurs’ claims (think Elon Musk) is to ask, what physical components does the product require and how is the market responding?

Should Tesla’s Autopilot feature be illegal? A recent study from the United Kingdom on driver competence suggests that maybe it should.

Even Uber didn’t believe in Uber’s self-driving taxis. We found that out after Google’s Waymo sued the company.

True Believer Loses Faith in Fully Self-Driving Cars. Levandowski sees the future—and it is tech aids for safer driving.

The Real Future of Self-Driving Cars Is — Better Human Drivers! Manufacturers are improving safety by incorporating warning systems developed for self-driving cars into conventional models.

and

Autopilot is not just another word for “asleep at the wheel.” As a recent fatal accident in Florida shows, even sober, attentive drivers often put too much trust in Tesla’s Autopilot system, with disastrous results.


Brendan Dixon

Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Brendan Dixon is a Software Architect with experience designing, creating, and managing projects of all sizes. His first foray into Artificial Intelligence was in the 1980s, when he built an Expert System to assist in the diagnosis of software problems at IBM. Since then, he’s worked both as a Principal Engineer and a Development Manager for industry leaders, such as Microsoft and Amazon, and for numerous start-ups. While he has spent most of that time on other types of software, he’s remained engaged and interested in Artificial Intelligence.
