
AI Is No Match for Ambiguity

Many simple sentences confuse AI but not humans

Groucho Marx (1890–1977) used to start one of his quips with “I once shot an elephant in my pajamas.” That seems clear enough, but then he follows up with “How he got into my pajamas I’ll never know.”

The punch line depends on the ambiguity of the opening line. At first, we interpret the words in a common-sense way; we assume that Groucho was wearing his own pajamas. The joke consists in surprising us with a grammatically possible but fantastic alternative. Contemporary comedian Emo Philips quips that ambiguity is the devil’s volleyball.1

Computers have no sense of humor. Given the sentence without context, they don’t have a clue who is wearing Groucho’s pajamas. A class of such ambiguous expressions called Winograd Schemas2 continues to baffle AI software.

An example of a Winograd Schema from Gary Smith’s fun book The AI Delusion is the sentence

I can’t cut that tree down with that axe. It is too small.

Does the vague pronoun “It” refer to the tree or the axe? Humans immediately understand that “It” refers to the axe. AI would not be so sure.

Note that the meaning of “It” can be clarified by simply changing the word “small” to “thick.” Then we get:

I can’t cut that tree down with that axe. It is too thick.

The pronoun “It” now obviously refers to the tree.

A third word substitution points “It” to an implied subject that is not part of the sentence:

I can’t cut that tree down with that axe. It is too late.

Now “It” refers neither to the axe nor to the tree but to the time of day, which we infer from the adjective “late.”

Still other word choices can render the meaning of the sentence irresolvably ambiguous for humans as well as for AI. For example:

I can’t cut that tree down with that axe. It is too cursed.

The word “cursed” can refer either to the tree or to the axe. Here, even human resolution is impossible without context.

Ambiguities like this, Smith explains, worried IBM Watson’s handlers before Watson played the TV quiz show Jeopardy in 2011. Questions containing ambiguity would give Watson trouble and thus give the humans an edge. However, the question writers were unwilling to compromise their questions to favor the computer, so all agreed to use old questions from the vault. Famously, IBM Watson then whupped the human Jeopardy players in the overall competition. (See our podcast with Smith, “When I Nod My Head, Hit It!” And Other Commands that Confuse AI.)

The original Winograd Schema3, developed by Stanford computer science professor Terry Winograd, is

The city councilmen refused the demonstrators a permit because they feared violence.

The city councilmen are clearly those who “feared violence.” If we change the word “feared” to “advocated,” we get:

The city councilmen refused the demonstrators a permit because they advocated violence.

The “demonstrators” would clearly be the ones who “advocated violence.” At least, that’s clear to humans. With no added context, a computer, lacking common sense, would have problems interpreting either version of the sentence.

Here are some other Winograd Schemas where the change of a single word redirects a pronoun reference.4 In the following examples, changing the verb “given” to “received” changes whom the pronoun “she” refers to:

Joan made sure to thank Susan for all the help she had given.

Joan made sure to thank Susan for all the help she had received.

Here’s a Winograd Schema where the meaning of the sentence is altered by changing the word “fast” to “slow”:

The delivery truck zoomed by the school bus because it was going so fast.

The delivery truck zoomed by the school bus because it was going so slow.

Again, we interpret these sentences easily and without ambiguity. A computer has no common sense and will likely be puzzled. A few more schemas to try out are available in Davis’s collection.4

But won’t the great AI of the future get around this problem? Maybe. That’s the goal of a competition called the Winograd Schema Challenge.5 Smith notes in The AI Delusion that AI success at these events so far is only a bit above 50%. Since random guessing on such problems scores 50%, that is hardly an impressive figure.

The bottom line: we have yet to instill in AI the common sense needed to solve simple Winograd Schemas.


Notes:
1 Comedian Emo Philips often uses ambiguities as gag lines.
2 See Ernest Davis, Leora Morgenstern, and Charles Ortiz, “The Winograd Schema Challenge,” New York University.
3 “Winograd Schema Challenge,” Wikipedia.
4 These examples are taken from a collection of Winograd Schemas prepared by Ernest Davis of the Computer Science Department at New York University.
5 Nuance Communications, Inc. sponsors the Commonsense Reasoning ~ Winograd Schema Challenge as an alternative to the Turing test for computer intelligence.

Don’t miss Robert J. Marks’s discussion with Gary Smith of a variety of AI delusions in AI Delusions: A statistics expert sets us straight and The US 2016 Election: Why Big Data Failed

More from the strange world of AI by Robert J. Marks:

Big Data Can Lie: Simpson’s Paradox

and

Things Exist That Are Unknowable: A tutorial on Chaitin’s number

Further reading: Robert J. Marks: There are things about human beings that you cannot write code for

