University College London cognitive neuroscientist Brian Butterworth, author of a forthcoming book, Can Fish Count? (Basic Books, 2022), reckons that, one way or another, in a modern urban society, we process about 16,000 numbers in an average day. Numbers create conceptual relationships between vastly different things. From the publisher’s introduction to his book, we learn, “The philosopher Bertrand Russell once observed that realizing that a pair of apples and the passage of two days could somehow both be represented by the concept we call ‘two’ was one of the most astonishing discoveries anyone had ever made.”
At The Scientist, Catherine Offord, discussing his work, offers a critical distinction between estimations of quantity and actual counting: “Our perception of quantity, separate from counting or estimation of magnitude more generally, is foundational to human cognition, according to some neuroscientists”:
This ability, known as numerosity perception, is distinct from counting—the process of keeping a tally while going through a set of objects—and is present in infants long before they learn words or symbols for particular numbers. It is evident, too, among adults in isolated human populations that typically don’t use numbers much in their daily lives.

– Catherine Offord, “Is Your Brain Wired for Numbers?” at The Scientist (October 1, 2021)
Some researchers think that this rough sense of number, called numerosity, is simply an extension of the ability to notice the size of one thing in relation to another. Others argue that it is an independent development, “hardwired” into our brains, and even that it is the origin of the ability to reason with numbers, which includes arithmetic.
Testing the hardwiring thesis, Andreas Nieder reported spikes of activity in the prefrontal cortex of macaques reacting to three objects of interest, relative to two or four such objects. He and colleagues later recorded evidence of neurons in the avian endbrains of crows that respond to specific numbers of objects from one to five. Crows don’t have a neo- or prefrontal cortex so, as Offord notes, the researchers suggested convergent evolution (convergence on a common goal rather than common ancestry) as an explanation. Even so, they say, the ability is probably innate, appearing in the course of development rather than through learning.
Harvey notes that at least some activity reported as number-specific may instead be related to attention or other aspects of task performance rather than to numerosity per se, and adds in an email that it’s unlikely that macaques and humans, which diverged more than 20 million years ago and have different brain structures, are using exactly the same neural machinery.

– Catherine Offord, “Is Your Brain Wired for Numbers?” at The Scientist (October 1, 2021)
Now, Nieder’s group isn’t saying that numerosity is inherited from a common ancestor. In the case of macaques vs. crows, inheritance from a common ancestor of both would take us a long way back into the history of life. But the quality has only been shown to be highly developed in a few distantly related life forms. That is why the researchers think that the ability is more likely a result of convergent evolution. They are suggesting that it is an “innate” quality that developed differently in different life forms that were all seeking to solve the same problem — should they take risks and expend energy to acquire this food resource or that one?
The reality is that we don’t have the information we need to decide as yet whether numerosity is a hardwired trait or simply an outcome of general awareness of one’s environment. We would need neuroscience methods with consistent resolution down to the single neuron to find out whether some neurons are specialized for specific numbers.
Meanwhile, the debate heats up when we talk about human math skills: “Humans’ use of symbols to represent numerical concepts and perform calculations is unique across the phylogenetic tree,” as Offord puts it at The Scientist. Nieder told her that the debate is “probably not on the way to being settled… Almost nothing in this domain is really agreed upon by everyone.”
Another problem to throw into the mix is that much that we humans take for granted as practical math started out as abstractions. Take the Pythagorean theorem that we probably learned in school: “In a right-angled triangle, the square of the hypotenuse is equal to the sum of the squares of the other two sides.” This seems to have started out as a formula discussed by early mathematicians and geometers. Gradually, mathematicians realized that it could be used to compute a variety of measurements that have immense practical value.
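One such practical payoff, sketched here for illustration (the function name and points are our own examples, not from the sources above), is computing the straight-line distance between two points, which is just the theorem applied to the horizontal and vertical differences:

```python
# Illustrative sketch: the Pythagorean theorem as a distance formula.
# For a right triangle with legs a and b, the hypotenuse c satisfies
# a^2 + b^2 = c^2, so c = sqrt(a^2 + b^2).
import math

def distance(x1, y1, x2, y2):
    a = x2 - x1  # one leg of the right triangle
    b = y2 - y1  # the other leg
    return math.sqrt(a * a + b * b)  # the hypotenuse

# The classic 3-4-5 right triangle:
print(distance(0, 0, 3, 4))  # 5.0
```

This single abstract relation underlies surveying, navigation, and graphics alike, which is the sense in which an abstraction acquired “immense practical value.”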
It’s the same with the concept of zero. In the sixth century A.D., Indian philosophers grasped its role as both a placeholder and a mathematical value. Zero proved immensely helpful later in commercial and other everyday mathematics. Clearly, there is an element in human number sense that requires a high level of ability to process abstractions before most practical benefits can become apparent. And that may be even harder to explain than numerosity.
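Zero’s placeholder role can be made concrete with a small sketch (illustrative only, not a historical reconstruction): in positional notation, each digit’s value depends on its position, and zero is what holds the other digits in their correct places.

```python
# Illustrative sketch of positional notation: zero as a placeholder.
# Each digit's contribution depends on its position; a zero holds a
# position open so the digits around it keep their correct values.
def digits_to_value(digits, base=10):
    value = 0
    for d in digits:
        value = value * base + d
    return value

print(digits_to_value([2, 5]))     # 25
print(digits_to_value([2, 0, 5]))  # 205: the zero pushes the 2 to the hundreds place
```

Without a symbol for “nothing in this position,” 25 and 205 would be indistinguishable — which is why the placeholder insight mattered long before zero was treated as a number in its own right.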
Here are some fun recent items on animals and number sense:
Pigeons can solve the Monty Hall problem. But can you? The dilemma pits human folk intuition against actual probability theory, with surprising results. In one 2010 study, pigeons outperformed humans in the three-doors test but in a second 2012 study, they only beat preschoolers, not college kids.
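The counterintuitive arithmetic behind the three-doors test is easy to check by simulation. The sketch below is our own minimal illustration (not code from the cited studies): over many games, switching wins about two-thirds of the time, staying about one-third.

```python
# Monty Hall simulation (illustrative sketch): estimate the win rate
# for the "stay" and "switch" strategies over many random games.
import random

def play(switch, rng):
    doors = [0, 1, 2]
    car = rng.choice(doors)   # the prize is behind one door
    pick = rng.choice(doors)  # the contestant's initial choice
    # The host opens a door that hides no car and wasn't picked.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch, trials=100_000, seed=0):
    rng = random.Random(seed)
    return sum(play(switch, rng) for _ in range(trials)) / trials

print(f"stay:   {win_rate(False):.3f}")  # close to 1/3
print(f"switch: {win_rate(True):.3f}")   # close to 2/3
```

The simulation makes vivid why folk intuition (“50/50 once a door is open”) fails: the host’s forced reveal concentrates the remaining probability on the unchosen, unopened door.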
Is our “number sense” biology, culture — or something else? It’s a surprisingly controversial question with a — perhaps unsettling — answer. Mathematics supports a dualist view of the universe. Both concrete and abstract, depending. Both the Chimp Chocolate Stakes and Chaitin’s Unknowable Number.
Why animals can count but can’t do math. A numerical cognition researcher outlines the differences between recognizing numbers and doing math. Psychologist Silke Goebel says that the cardinality principle — that the last number reached when counting a set gives the total number of items in it — takes children some time to learn.