Proving a negative is difficult. Think about it. Demonstrating that there are no leafy green crows, for example, is hard to do without exhaustively examining every crow in existence. On the other hand, proving that there are no crows naturally emblazoned with the text of the King James Bible is a bit easier. Proving a negative is possible if the claim is extreme enough. Such a result is known as a no-go theorem.
One of the most profound no-go theorems can be found in quantum physics. Physicist John Bell (1928–1990) proved — entirely from first principles — that there is a fundamental difference between how particles interact classically compared with how they interact within quantum physics.
In classical physics, when particles are spatially separated, measurements on one particle do not affect the other particle. However, within quantum physics, particles can become entangled, resulting in measurements on one particle impacting the other spatially separated particle. These two modes of interaction result in very different probability distributions of possible interactions. By taking enough experimental measurements, we can conclusively demonstrate that entangled particles behave fundamentally differently from classical non-entangled particles.
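The gap between the two probability distributions can be illustrated with the CHSH form of Bell's test. The sketch below is illustrative only: it uses the standard quantum prediction for a pair of entangled spins, E(a, b) = -cos(a - b), and the well-known measurement angles; it does not simulate an actual experiment.

```python
import numpy as np

# Quantum correlation between spin measurements on an entangled
# (singlet) pair, along directions separated by angle (a - b).
def quantum_correlation(a, b):
    return -np.cos(a - b)

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Any classical (local hidden variable) model satisfies |S| <= 2;
# quantum mechanics reaches 2*sqrt(2) at these angles.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = (quantum_correlation(a, b)
     - quantum_correlation(a, b_prime)
     + quantum_correlation(a_prime, b)
     + quantum_correlation(a_prime, b_prime))

print(abs(S))  # ~2.828, exceeding the classical bound of 2
```

Experiments estimate each correlation E(a, b) by averaging many individual measurements, which is exactly where the next point, the Law of Large Numbers, comes in.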
While Bell’s theorem is defined mathematically over an infinity of possible situations (which we cannot test in our finite universe), there is a reason why we can approach certainty regarding an infinite result with a finite number of measurements. This reason is known as the Law of Large Numbers.
This law states that the average, which is defined over a finite number of measurements, approaches the expected value (also called the mean), which is defined over an infinite number of measurements. Thus, even though we are always left with a probabilistic result after a finite number of measurements, this probability gets closer and closer to one as we take more measurements.
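The Law of Large Numbers is easy to watch in action. A minimal sketch with a fair coin (a hypothetical example, not part of any Bell experiment): the sample average converges toward the expected value of 0.5 as the number of flips grows.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Average of n fair coin flips, where heads = 1 and tails = 0.
# The expected value over infinitely many flips is exactly 0.5.
def sample_average(n):
    flips = [random.randint(0, 1) for _ in range(n)]
    return sum(flips) / n

for n in (10, 1_000, 100_000):
    print(n, sample_average(n))  # averages drift toward 0.5
```

With 10 flips the average can easily be 0.3 or 0.7, but by 100,000 flips it sits within a fraction of a percent of 0.5, which is why finite experimental runs can approach certainty about an infinitely defined quantity.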
This brings us to a more general result known as the conservation of information. Design theorists William Dembski and Robert J. Marks defined the law of conservation of information in their 2009 paper “Conservation of Information in Search” and then proved the result in their follow-on 2010 paper “The Search for a Search.” The conservation of information (COI) says that the expected active information produced by any combination of random and deterministic processes is guaranteed to be zero or less. Active information itself measures the difference between two probability distributions.
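A minimal sketch of the quantity involved, assuming the definitions from the Dembski-Marks papers: if p is the probability of finding the target by blind sampling and q is the probability under some alternative search, the active information is I+ = log2(q / p), the difference between the two distributions' information measures. The numbers below are hypothetical, chosen only to make the arithmetic clean.

```python
import math

def active_information(p_baseline, q_search):
    """Active information I+ = log2(q/p): how many bits a search
    gains (or loses) relative to blind sampling of the target."""
    return math.log2(q_search / p_baseline)

# Hypothetical example: the target is 1 of 1024 equally likely outcomes.
p = 1 / 1024        # blind-search success probability
q_better = 1 / 64   # an assisted search that finds the target more often
q_worse = 1 / 4096  # a misinformed search that does worse than blind chance

print(active_information(p, q_better))  # prints 4.0 (bits gained)
print(active_information(p, q_worse))   # prints -2.0 (bits lost)
```

COI concerns the expectation of this quantity: averaged over how search strategies are themselves chosen, the theorem says the expected gain cannot exceed zero.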
We can see that the conservation of information is a generalization of Bell’s no-go theorem in quantum mechanics. It contrasts two probability distributions, then takes the expectation to get a hard limit. Finally, we can measure whether this limit is met by averaging a large number of physical measurements.
One outcome is that Dembski and Marks’ law has not only been empirically validated by one of the most tested physics hypotheses in recent history; the conservation of information is also a general result that can be applied to any physical phenomenon for which we can calculate two different distributions. This includes biological history and the theory of mind, two areas where contrasting theories explain the physical phenomena we see.
The benefit of Dembski and Marks’ law is we have as close to certainty as we can get with empiricism. While Dembski’s other concept, “complex specified information,” is defined probabilistically, the conservation of information is a strict inequality.
Consequently, while complex specified information always leaves us in the realm of probability, regardless of what we observe in the physical world, the conservation of information allows us to approach certainty, just as we can be almost completely certain in the case of Bell’s theorem.
Further reading on information theory:
But is determinism true? Does science show that we are fated to want whatever we want? (Michael Egnor)
At the movies: can AI restore blurred images? Working with pixels, we can do remarkable things—as long as we are not asking for magic (Robert J. Marks)
Why information theory is like a good run. Information theory can help us understand a wide range of fields besides computers. (Eric Holloway)
COVID-19: When 900 bytes shut down the world. A great physicist warned us, information precedes matter and energy: Bit before it.