Why Information Entropy Is Not Thermal Entropy
Kish & Ferry (2018) proved that information entropy and thermal entropy are apples and oranges. What this means for Digital Circuitality.
A Joke That Became a Crisis
When Claude Shannon was deciding what to call his new measure of uncertainty in communication channels, John von Neumann reportedly told him: "Call it entropy. Nobody understands entropy, so in a debate you will always have the advantage."
It was a good joke. But seventy years later, the naming collision between Shannon's information entropy and Boltzmann-Clausius thermodynamic entropy has produced a genuine scientific crisis — one that reaches deep into computer science, physics, and the foundations of computation itself.
The Confusion
For decades, most of computer science has assumed that information entropy (Shannon) and thermal entropy (Boltzmann/Clausius) are intimately related. The bridge between them is Landauer's principle, which states that erasing one bit of information must dissipate at least kT ln(2) joules of energy as heat, where k is Boltzmann's constant and T is the temperature of the environment.
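For a sense of scale, the bound is easy to evaluate. The snippet below (a back-of-the-envelope check, not drawn from any cited paper) computes kT ln(2) at room temperature:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0            # room temperature in kelvin

# Landauer's claimed minimum heat dissipated per erased bit
bound = k_B * T * math.log(2)
print(f"kT ln(2) at {T:.0f} K = {bound:.3e} J")               # ~2.87e-21 J
print(f"                = {bound / 1.602176634e-19:.4f} eV")  # ~0.0179 eV
```

Roughly three zeptojoules per bit, many orders of magnitude below what practical logic gates dissipate, which is why the bound was long treated as a distant theoretical floor rather than an engineering constraint.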
Since Rolf Landauer proposed this in 1961, it has been treated as physical law. Entire research programs — reversible computing, quantum thermodynamics of information, Maxwell's demon resolutions — have been built on the assumption that information processing has an irreducible thermodynamic cost. Textbooks state it. Papers cite it. Conferences assume it.
But what if the two entropies are not the same quantity at all?
The Refutation: Kish and Ferry (2018)
In 2018, Laszlo B. Kish and David K. Ferry published a rigorous analysis proving that information entropy and thermal entropy are fundamentally different quantities — "apples and oranges" that cannot be equated. Their key findings:
1. Thermal entropy is objective. It is a property of the physical system itself. It does not depend on who is measuring it or what instrument is used. A gas at temperature T in volume V has a definite thermodynamic entropy regardless of the observer.
2. Information entropy is subjective. It depends on the measurement instrument, the observer's knowledge, and the chosen encoding. The same physical system can have different information entropies depending on how you measure it and what you already know (a short sketch after this list makes this concrete).
3. They can be separated in space and time. The information about a system and the system's thermodynamic state can exist in completely different locations at completely different times. This alone makes a general equivalence impossible.
4. Information entropy can violate the Third Law of Thermodynamics. At absolute zero, thermodynamic entropy reaches a minimum. Information entropy has no such constraint — it can take any value regardless of temperature.
Reference: L.B. Kish and D.K. Ferry, "Information entropy and thermal entropy: apples and oranges," J. Comput. Electron. 17, 43-50 (2018).
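Finding 2 can be made concrete with a toy example. The sketch below (our illustration, not from the paper) computes the Shannon entropy of one and the same physical system as seen by two instruments with different resolution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One physical system: a fair six-sided die.
# Observer A's instrument resolves every face.
h_fine = shannon_entropy([1/6] * 6)       # log2(6) ~ 2.585 bits

# Observer B's instrument only distinguishes odd from even.
h_coarse = shannon_entropy([1/2, 1/2])    # exactly 1 bit

print(f"fine-grained observer:   {h_fine:.3f} bits")
print(f"coarse-grained observer: {h_coarse:.3f} bits")
```

Same die, same physics, two different information entropies. The thermodynamic entropy of the die, by contrast, does not care who is looking.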
Zero-Energy Erasure
Even before the 2018 paper, Kish and collaborators had already struck at the heart of Landauer's principle. In 2016, they demonstrated that information erasure can occur with zero or even negative energy dissipation through thermalization in double-potential-well memories.
The mechanism is straightforward: a memory element with two potential wells (representing 0 and 1) can be erased by allowing the system to thermalize — to reach thermal equilibrium with its environment. This process does not require the minimum kT ln(2) energy dissipation that Landauer predicted. In certain configurations, it can even release energy.
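A caricature of that mechanism, with invented parameters and no pretense of modeling the paper's physics: a symmetric two-state memory left in contact with a heat bath forgets its contents by equilibration alone.

```python
import random

def thermalize(bit, flip_prob=0.05, steps=500, rng=random):
    """Toy two-well memory: at each step, thermal noise flips the
    state with probability flip_prob (equal-depth wells, so neither
    state is energetically favored). No work is done on the system."""
    state = bit
    for _ in range(steps):
        if rng.random() < flip_prob:
            state = 1 - state
    return state

# Store a 1 in every trial, then let the memory equilibrate.
rng = random.Random(42)
trials = 5_000
ones = sum(thermalize(1, rng=rng) for _ in range(trials))
print(f"P(state = 1 after thermalization) ~ {ones / trials:.3f}")  # ~0.5
```

After equilibration the occupation is 50/50 regardless of what was stored: the bit is gone, and nothing in this toy model charges the kT ln(2) toll.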
This means Landauer's erasure bound is not just an approximation that future technology might approach — it is fundamentally wrong as a universal physical law.
Reference: L.B. Kish, C.G. Granqvist, S.P. Khatri, and F. Peper, "Zero and negative energy dissipation at information-theoretic erasure," J. Comput. Electron. 15, 335-339 (2016).
The Key Insight for Software
Here is where this physics debate becomes directly relevant to software engineering. Kish's 2016 paper contains a remarkable result (Equations 11-12): in a deterministic computer with error-free memory, the information entropy is always zero.
Think about what this means. A deterministic program that takes input X and always produces output Y has no informational uncertainty. There is no randomness, no ambiguity, no missing information. The Shannon entropy of its output, given its input, is exactly zero.
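That claim can be checked numerically. The sketch below (ours, not a reproduction of the paper's Equations 11-12) estimates the conditional entropy H(Y|X) from samples, for a deterministic function and for a noisy one:

```python
import math
import random
from collections import Counter

def conditional_entropy(pairs):
    """Estimate H(Y|X) in bits from observed (x, y) samples:
    H(Y|X) = sum over (x, y) of p(x, y) * log2(p(x) / p(x, y))."""
    n = len(pairs)
    joint = Counter(pairs)
    marginal_x = Counter(x for x, _ in pairs)
    return sum(c / n * math.log2((marginal_x[x] / n) / (c / n))
               for (x, _), c in joint.items())

rng = random.Random(0)
xs = [rng.randrange(16) for _ in range(100_000)]

# Error-free deterministic program: one output per input.
deterministic = [(x, (3 * x + 7) % 16) for x in xs]
# Noisy variant: the output also depends on a hidden coin flip.
noisy = [(x, (x + rng.randrange(2)) % 16) for x in xs]

print(f"H(Y|X), deterministic: {conditional_entropy(deterministic):.4f} bits")  # 0.0000
print(f"H(Y|X), noisy:         {conditional_entropy(noisy):.4f} bits")          # ~1.0000
```

The deterministic map lands on exactly zero, not approximately zero: with no randomness in the input-to-output relation, there is no uncertainty left to measure.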
This is precisely what Φc = 1 means in Digital Circuitality. A formally verified, deterministic system — one where every input maps to exactly one output through a verified transformation — has zero informational uncertainty. When the Circuitality Coefficient reaches unity, the system's information entropy reaches zero. The two conditions are equivalent.
What This Changes for Digital Circuitality
Our framework originally referenced Landauer's principle as part of its thermodynamic analogy. Thanks to Prof. Kish's guidance, we have corrected this. The implications actually strengthen the framework:
The framework is now purely information-theoretic. Our verification metrics no longer rely on any contested relationship between information and physical energy. They measure informational uncertainty — pure Shannon entropy — without claiming that this uncertainty has a thermodynamic cost.
Φc = 1 means zero informational uncertainty. When the Circuitality Coefficient reaches unity, the system has zero information entropy. Not zero physical energy, not zero heat dissipation — zero uncertainty about what the system will do. This is a statement about knowledge and determinism, not about physics.
No dependency on contested physics. By removing the Landauer connection, Digital Circuitality no longer depends on any disputed physical claim. The framework stands on pure information theory — Shannon (1948), well-established and uncontroversial — plus formal verification, which is pure mathematics.
The correction makes the framework stronger, not weaker. A theory that depends on fewer assumptions is more robust than one that depends on more.
Brillouin's Negentropy
The correct historical inspiration is Leon Brillouin (1953), who proposed that information is negentropy — the negative of entropy. Gaining information about a system reduces your uncertainty, which is analogous to reducing entropy. This is an elegant idea, and it motivated much of Digital Circuitality's early development.
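The intuition translates directly into Shannon's terms: a measurement M can reduce, and never increase, the expected uncertainty about a system X, since H(X) - H(X|M) = I(X; M) >= 0. Here is a minimal numerical illustration, with a made-up four-state system and detector accuracy:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# X: four equally likely states, so H(X) = 2 bits before measuring.
h_prior = entropy([0.25] * 4)

# A noisy detector reports the true state with probability 0.85,
# otherwise one of the other three states (0.05 each). By symmetry
# the posterior over X given any reading is (0.85, 0.05, 0.05, 0.05).
h_posterior = entropy([0.85, 0.05, 0.05, 0.05])

print(f"H(X) before measurement:  {h_prior:.3f} bits")      # 2.000
print(f"H(X|M) after measurement: {h_posterior:.3f} bits")  # ~0.848
print(f"negentropy gained: {h_prior - h_posterior:.3f} bits")
```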
However, even Brillouin's negentropy principle has limitations. Kish and Ferry (2018) show that it is not a general law — the relationship between information and thermodynamic entropy is more nuanced than a simple negation. There are cases where gaining information does not correspond to any thermodynamic change, and cases where thermodynamic changes carry no informational content.
Digital Circuitality takes the safest possible path: pure Shannon information theory, with physical analogies used as metaphor and intuition, never as foundation. We say that a verified system "behaves like" a low-entropy physical system because it is deterministic and predictable. We do not claim that verification literally reduces thermodynamic entropy or saves physical energy.
The metaphor is powerful. The physics would be wrong.
Seventy Years of Confusion, Resolved
Von Neumann's joke has had a long run. For seventy years, the conflation of information entropy and thermal entropy has muddied the waters in physics, computer science, and everything in between. Researchers have built careers on the assumption that erasing a bit costs energy, that Maxwell's demon is defeated by Landauer's principle, that computation has irreducible thermodynamic limits.
Kish and Ferry resolved this confusion with mathematical rigor. The two entropies are different quantities with different properties, different domains, and different physical meanings. They share a name and a functional form — and nothing else.
Digital Circuitality builds on the resolution. By grounding our framework in pure information theory — where it belongs — we inherit the mathematical certainty of Shannon's work without the baggage of contested thermodynamic claims. The result is a framework that is cleaner, more honest, and more durable.
Sometimes the strongest move in science is admitting what you got wrong and building on the correction.
Published by the BRIK-64 team. For more on Digital Circuitality, see What Is Digital Circuitality?, EVA Algebra Deep Dive, and Precision as Domain.