
Jeffrey Shallit, a computer scientist, doesn’t know how computers work

Patterns in computers only have meaning when they are caused by humans programming and using them.

Materialist computer scientist Jeffrey Shallit, who believes that computers think, takes issue with a recent post in which I point out that computation is not thinking because computation inherently lacks meaning and meaning is the hallmark of thinking. He responds:

Egnor claims that “Mental activity always has meaning—every thought is about something. Computation always lacks meaning in itself.” This is a classic blunder, made by people who have little understanding of the nature of computation. Of course computations have meaning. When we sum the infinite series 1+1/4+1/9+… using a program such as Maple, by typing sum(1/n^2,n=1..infinity); who can reasonably deny that the answer π²/6 it produces has meaning?

Jeffrey Shallit, “Yet More Unsubstantiated Claims by Egnor” at Recursivity

Let me help Dr. Shallit understand how his computer works. A programmer wrote the program so that when Shallit presses a key on his keyboard (“1”, say), electrons in his computer flow in a specific pattern determined by the program. When Shallit presses all of the keys needed to enter his infinite sum, the patterns of electrons generated by his input and the programmer’s program give rise to a pattern of electrons on his computer screen, which he then reads as the answer to his sum. All of the thinking belongs to Shallit and the programmer. The computer does no thinking at all.
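
To make the mechanics concrete, here is a minimal sketch of the same summation, written in Python with the SymPy library rather than Maple (an illustrative substitution, not Shallit’s actual session). The library mechanically applies symbol-rewriting rules that its programmers encoded; whatever meaning the printed result has is supplied by the people who wrote those rules and by the person who reads the output.

# A sketch of Shallit's example, using Python's SymPy library as a stand-in
# for Maple. The program rewrites symbols according to rules its programmers
# encoded; it attaches no meaning to the expression it prints.
from sympy import Sum, symbols, oo

n = symbols('n', positive=True, integer=True)
result = Sum(1 / n**2, (n, 1, oo)).doit()  # symbolic evaluation of the series

print(result)  # pi**2/6 -- meaningful only to the humans who read it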

It’s analogous to a book, say, Shakespeare’s Hamlet. Consider the paper, the ink, and the press that prints the book. Shakespeare understood “Something is rotten in the state of Denmark,” and the reader understands it. The graphic designer can understand it. But the printing press, the ink, and the paper don’t understand it.

All of the meaning in a book or in a computation comes from the humans who create and use it. Neither the book nor the computer understands anything. Computation is a means of expressing human thought, just as printing a book is. Neither is any kind of thought in itself. Shallit continues:

[Egnor’s] classic error was debunked as long ago as 1843, when Ada Lovelace wrote, “Many persons who are not conversant with mathematical studies, imagine that because the business of the engine is to give its results in numerical notation, the nature of its processes must consequently be arithmetical and numerical, rather than algebraical and analytical. This is an error. The engine can arrange and combine its numerical quantities exactly as if they were letters or any other general symbols; and in fact it might bring out its results in algebraical notation, were provisions made accordingly.”

Jeffrey Shallit, “Yet More Unsubstantiated Claims by Egnor” at Recursivity

Shallit misunderstands Lovelace’s point. Breaking with the view in her day that computation was merely a means of processing numbers, Ada Lovelace (1815–1852) argued that computation could also be symbolical and could be used to carry out processes according to algebraic and other kinds of rules. This is of course true but it does not mean that computers can think algebraically or think at all, nor does Lovelace even speak to that issue. All numerical or symbolic meaning in computation is provided by human users and programmers. Sans humans, computation is merely electrons moving around, without meaning.

Here’s an illustration: After a recent storm, a helicopter pilot spots stones that vaguely and messily appear to form “S O S” on the windswept beach of a deserted island. Assume that the pilot is not familiar with international distress signals of Western origin. He doesn’t know whether the rather unusual pattern formed by the stones, the analogue of a computation here, has any significance, so he contacts the local naval base. If they tell him that no meaning is associated with the pattern “S O S”, he might conclude that the stones were just an unusual outcome of the storm winds. That is, if no human mind was involved in making the pattern, “S O S” has no meaning. If, on the other hand, he is told that S O S is a well-recognized rescue signal, he would know that there were humans marooned on the island, perhaps survivors of a shipwreck during the storm.

Patterns in nature never have meaning unless caused by a mind. Patterns in computers only have meaning when they are caused by humans programming and using them. Computational patterns in themselves, discounting humans, have no intrinsic meaning. Computation is not thought.

Shallit doesn’t give up easily:

[If] you want examples related to the real world, just consider the data collected and processed to produce weather predictions. If these computations had no meaning, how is it that short-term weather forecasts are so accurate?

Jeffrey Shallit, “Yet More Unsubstantiated Claims by Egnor” at Recursivity

Goodness gracious, what a clueless assertion. Weather data consists of numbers collected by humans who measure variables (temperature, wind speed, and so on). Humans program computers to correlate certain input data patterns with output data patterns, according to human meanings and purposes. Humans interpret the resulting weather data, and they write and use computer programs to make their interpretations easier and more accurate, in the same way that a bookkeeper uses a spreadsheet to organize financial data. Computers can help us predict the weather, but they don’t know whether it’s raining or shining. Computers don’t “know” anything.
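
A toy sketch makes that division of labor plain (the thresholds and variable names here are invented, purely for illustration). The program correlates human-measured inputs with human-chosen output labels; only the humans who write and read it take “rain likely” to say anything about the weather.

# A toy forecasting rule with invented thresholds, purely for illustration.
# The function matches numbers to strings; the humans who measured the inputs
# and who read the output supply all of the meaning.
def forecast(humidity_percent: float, pressure_hpa: float) -> str:
    if humidity_percent > 85 and pressure_hpa < 1005:
        return "rain likely"
    return "rain unlikely"

print(forecast(humidity_percent=92.0, pressure_hpa=998.0))   # rain likely
print(forecast(humidity_percent=40.0, pressure_hpa=1020.0))  # rain unlikely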

For a (final) example of what should by now be an obvious point, consider using your word processor to type the word “lie”. You, of course, intend a specific meaning—perhaps you mean “recline”. But “lie” is a homonym; it can mean “recline” or “tell a falsehood.”

Your computer, which is incapable of thought of any kind, doesn’t know whether you mean “recline” or “tell a falsehood”. From the standpoint of computation, “lie” is merely a pattern of electron flow caused by keystrokes, and that pattern is exactly the same regardless of the meaning of the homonym. Computation knows nothing of meaning so your computer can’t by itself discern different meanings of homonyms.
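 
A minimal sketch shows the point: whatever the writer intends by “lie”, the pattern the computer stores is bit-for-bit identical.

# Whatever the typist means by "lie", the stored pattern is identical.
meant_as_recline = "lie"      # intended meaning: "recline"
meant_as_untruth = "lie"      # intended meaning: "tell a falsehood"

print(meant_as_recline.encode("utf-8"))   # b'lie'
print(meant_as_untruth.encode("utf-8"))   # b'lie'
print(meant_as_recline.encode("utf-8") == meant_as_untruth.encode("utf-8"))  # True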

The only way your computer could “discern” the various meanings associated with “lie” would be if the word processor were programmed to respond differently to different contexts for “lie.” Of course, programmers have done that, enabling word processors to check syntax and grammar. But all of the meaning is put in by humans. The word processor itself generates no meaning because it has no thoughts.

Computation is the physical mapping of states of matter to other states of matter according to a set of rules. The meanings of the states of matter that go into and out of the computation, and the rules by which the computation proceeds, come entirely from the minds of the humans who program, input, and read the computation. There is no inherent meaning in the material states, any more than there is inherent meaning in scattered stones on a wave-swept beach that accidentally spell out “S O S” or in a pattern of electrons that spells “lie” on a screen.
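
One last minimal sketch of that claim: the table below maps input states to output states according to a fixed rule. One reader may take the states as truth values and call the rule “exclusive or”; another may take them as numbers and call it “addition modulo 2.” Neither reading is in the table; both come from the humans doing the reading.

# A rule-governed mapping of states. The interpretations ("exclusive or",
# "addition modulo 2") are supplied by human readers, not by the mapping itself.
table = {
    (0, 0): 0,
    (0, 1): 1,
    (1, 0): 1,
    (1, 1): 0,
}

print(table[(1, 1)])  # 0, under either interpretation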

It’s remarkable that Dr. Shallit—a professor of computer science—doesn’t understand computation. Materialism is a kind of intellectual disability that afflicts even the well-educated. To put it simply, machines don’t and can’t think. Dr. Shallit’s wristwatch doesn’t know what time it is. Dr. Shallit’s iPod doesn’t enjoy the music it plays or listen to his phone calls. His television doesn’t like or dislike movies. And his computer doesn’t, and can’t, think.


Note: The post with which Jeffrey Shallit took issue is “The mind is the opposite of a computer: Matthew Cobb, a materialist, only scratches the surface when he explains why your brain is not a computer. Mental activity always has meaning—every thought is about something. Computation, by contrast, always lacks meaning in itself. A word processing program doesn’t care about the opinion that you’re expressing when you use it. In fact, what makes computation so useful is that it doesn’t have its own meaning. Because the mind always has meaning and computation never does, the mind is the opposite of computation.” (Michael Egnor)

Neurosurgeon Michael Egnor’s earlier tussles with computer scientist Jeffrey Shallit:

Can animals “reason”? My challenge to Jeffrey Shallit. He believes that animals can engage in abstract thinking. What abstractions do they reason about? Shallit is denying the obvious. That animals can think only about concrete things, and not about abstract ones, is obvious. All of our experience with animals tells us this.

Do either machines—or brains—really learn? A further response to Jeffrey Shallit: Actually, brains don’t learn either. Only minds learn. Learning is an ability of human beings, considered as a whole, to acquire new knowledge, not an ability of human organs considered individually.

and

Machines really can learn! A computer scientist responds to my parable: Jeffrey Shallit argues that a computer is not just a machine, but something quite special.


Michael Egnor

Professor of Neurosurgery and Pediatrics, State University of New York, Stony Brook
Michael R. Egnor, MD, is a Professor of Neurosurgery and Pediatrics at State University of New York, Stony Brook, has served as the Director of Pediatric Neurosurgery, and is an award-winning brain surgeon. He was named one of New York’s best doctors by New York Magazine in 2005. He received his medical education at Columbia University College of Physicians and Surgeons and completed his residency at Jackson Memorial Hospital. His research on hydrocephalus has been published in journals including Journal of Neurosurgery, Pediatrics, and Cerebrospinal Fluid Research. He is on the Scientific Advisory Board of the Hydrocephalus Association in the United States and has lectured extensively throughout the United States and Europe.
