BRIK64
Foundations

The science behind Digital Circuitality.

A deterministic, verified computational system has informational entropy zero. This is a statement about information, not about heat.

[01] FORMAL DEFINITION

Digital Circuitality

Shannon (1948) defined the entropy of a discrete source as H(X) = -Σ p(x_i) log_2 p(x_i). When a system is completely deterministic — every input produces exactly one output through every path — the probability distribution collapses to a point mass on the single correct result, and H(X) = 0.
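The collapse from positive entropy to zero can be checked directly from the definition. A minimal sketch (the function name is illustrative, not part of the framework):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum p_i * log2(p_i), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A noisy source: four equally likely outcomes -> 2 bits of uncertainty.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A deterministic system: all probability mass on one outcome -> H(X) = 0.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0
```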

A system S exhibits Digital Circuitality if and only if Φc(S) = 1.

Full coherence means zero informational uncertainty: when Φc certifies a system, every state is known and every path is verified.

Φc = 1 certifies that every input domain is bounded, every operation verified, every output range proven, and no execution path is undefined. There is no informational uncertainty because there is no unknown state.

Conventional software operates with informational uncertainty > 0: unverified execution paths, unexplored states, unbounded inputs. Testing reduces informational uncertainty but never eliminates it; as Dijkstra (1976) observed, “Testing shows the presence of bugs, never their absence.”

Digital Circuitality eliminates informational uncertainty by construction, not by sampling.
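The distinction between sampling and construction can be made concrete with a toy operation over a bounded domain. The operation below is an illustrative assumption, not part of BRIK-64; the point is that a bounded input domain can be checked exhaustively rather than probed:

```python
def saturating_add(a: int, b: int) -> int:
    """Toy operation with a bounded 8-bit input domain and output range."""
    return min(a + b, 255)

# Sampling (testing): a handful of points, residual uncertainty remains.
samples = [(0, 0), (1, 2), (200, 100)]
assert all(0 <= saturating_add(a, b) <= 255 for a, b in samples)

# Exhaustion of the bounded domain: every input pair checked,
# no execution path with unknown behavior.
assert all(0 <= saturating_add(a, b) <= 255
           for a in range(256) for b in range(256))
```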

[02] THERMODYNAMIC ANALOGY

The analogy and its limits

The term “thermodynamic” in Digital Circuitality is an analogy, not a physical claim. A physical circuit is coherent when energy flows from source to sink without leaks, all signal paths are closed, and the circuit reaches steady state.

Physical circuit property → Computational metric
Energy flow without leaks → Transfer efficiency
Closed signal paths → Circuit closure (Φc)
Signal integrity → Signature verification
Full connectivity → Verification completeness
Circuit complexity → Structural complexity metrics

What the analogy does NOT claim: No physical energy cost from Φc = 1. No equivalence between computational and thermodynamic coherence. No claim that thermodynamic laws govern compilation.

[03] EVA ALGEBRA

Composition operators

SEQ

Sequential

Do A, then B. Output of A feeds input of B.

PAR

Parallel

Do A and B independently. No data dependency.

COND

Conditional

If X then A, else B. Both branches verified.

Each operator preserves the correctness of its operands. If Part A works and Part B works, their composition is guaranteed to work. This is what hardware has always had — and software never did.
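The three operators can be sketched as ordinary function combinators. This is an illustrative model of the algebra, not the BRIK-64 implementation; the names mirror the text:

```python
def SEQ(a, b):
    """Do A, then B: the output of A feeds the input of B."""
    return lambda x: b(a(x))

def PAR(a, b):
    """Do A and B independently on a pair of inputs; no data dependency."""
    return lambda pair: (a(pair[0]), b(pair[1]))

def COND(pred, a, b):
    """If pred(x) then A, else B; both branches are total functions."""
    return lambda x: a(x) if pred(x) else b(x)

double = lambda x: 2 * x
inc = lambda x: x + 1

assert SEQ(double, inc)(3) == 7          # inc(double(3))
assert PAR(double, inc)((3, 3)) == (6, 4)
assert COND(lambda x: x < 0, abs, double)(-5) == 5
```

If each operand is total and correct on its domain, each combinator is total and correct by construction, which is the compositionality claim in miniature.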

[04] COHERENCE METRICS FRAMEWORK

CMF: Three metrics, one condition

Φc

Circuit Closure

Certifies that every branch has a complete input-to-output path. No dangling operations. The computational analog of a closed electrical loop.

Integrity

Signal Integrity

Verifies that observed behavior matches the expected specification exactly. The analog of signal integrity — no distortion, no noise.

Coverage

Verification Completeness

All paths have been verified. No execution path has unknown behavior. The analog of full connectivity in a circuit.

Certification Condition

Certification is binary. All three conditions — closure, integrity, and coverage — must hold simultaneously. There is no partial certification.

If any condition fails, the program does not compile.
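The binary nature of the condition can be sketched as a strict conjunction. The field names and the 1.0 thresholds are assumptions for illustration, not the actual CMF interface:

```python
from dataclasses import dataclass

@dataclass
class CoherenceMetrics:
    phi_c: float      # circuit closure
    integrity: float  # signal integrity
    coverage: float   # verification completeness

def certified(m: CoherenceMetrics) -> bool:
    """Certification is binary: all three conditions must hold at once."""
    return m.phi_c == 1.0 and m.integrity == 1.0 and m.coverage == 1.0

assert certified(CoherenceMetrics(1.0, 1.0, 1.0))
# Any shortfall, however small, fails certification outright.
assert not certified(CoherenceMetrics(1.0, 1.0, 0.999))
```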

[05] INFORMATION THEORY BASIS

Informational entropy ≠ thermal entropy

Recent research in information physics has demonstrated that informational entropy and thermal entropy are fundamentally different quantities. Treating them as interchangeable is a category error.

For Digital Circuitality, the consequence is direct: the coherence framework measures informational entropy, not thermal entropy. No thermodynamic claims are needed for the framework to be rigorous. The verification operates on purely informational foundations.

[06] BRILLOUIN CONNECTION

From Landauer to Brillouin as inspiration

Digital Circuitality draws conceptual inspiration from Brillouin’s work on the relationship between information and entropy, while operating on purely informational foundations grounded in Shannon’s framework.

The system does not depend on any physical thermodynamic claims. It acknowledges the historical inspiration while maintaining rigorous separation between informational and physical domains.

[07] DETERMINISTIC VERIFICATION

Zero informational uncertainty by construction

A formally verified, deterministic system has zero informational uncertainty. Every state is known, every path verified, every domain bounded, the circuit is closed.

This is what Φc = 1 means in Digital Circuitality: the system’s informational entropy is zero — not by testing, but by mathematical construction.

[08] UNIVERSAL TRANSPILATION

Transpilation through informational closure

Traditional transpilers operate at the syntactic level: parse an AST in one language, emit an AST in another. BRIK-64 operates at the semantic level — extracting the computational essence (what it computes, not how it’s expressed) and encoding it as a PCD circuit.

The critical property: if two programs in different languages produce the same PCD circuit, they are functionally equivalent. PCD captures the informational content of computation independent of syntactic vehicle.

The arithmetic

  • 10 input languages → PCD → 14 output targets
  • 10 + 14 = 24 components for 10 × 14 = 140 transpilation paths
  • Same architectural idea as LLVM (frontends + IR + backends)
  • The addition LLVM doesn’t have: formal equivalence certification
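The economy of the hub-and-spoke arrangement is just arithmetic: m front ends and n back ends meeting at one intermediate representation cost m + n components but cover m × n paths. A one-line check:

```python
# m front ends, n back ends, one shared IR in the middle.
m, n = 10, 14
components = m + n   # each language binds to the IR once
paths = m * n        # every frontend/backend pairing comes for free
assert (components, paths) == (24, 140)
```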

The TCE certifies that the PCD circuit is closed (Φc = 1), guaranteeing the computation is deterministic, total, and information-preserving. The equivalence is algebraic, not tested.

[09] REFERENCES

Academic foundations

Shannon, C.E. (1948)

A Mathematical Theory of Communication.

Bell Syst. Tech. J. 27, 379–423

Foundation: informational entropy, the framework in which the entire system operates.

Brillouin, L. (1953)

The Negentropy Principle of Information.

J. Appl. Phys. 24, 1152–1163

Inspiration: conceptual information-entropy connection. Not foundation, inspiration.

Dijkstra, E.W. (1976)

A Discipline of Programming.

Prentice-Hall

Motivation: “Testing shows the presence of bugs, never their absence.”

Kish, L.B. et al. (2016–2018)

Research on the distinction between informational and thermal entropy.

Journal of Computational Electronics

Foundational: informational entropy and thermal entropy are distinct quantities.

ACKNOWLEDGMENT

Prof. Laszlo B. Kish (Texas A&M University) reviewed the foundational theoretical framework of Digital Circuitality. His research on the distinction between informational and thermal entropy informed the theoretical foundations of the system.

The logical chain

1. Shannon (1948) establishes that deterministic systems have zero informational entropy

2. Modern research confirms informational entropy is distinct from thermal entropy

3. A deterministic, verified computer has zero informational uncertainty

4. BRIK-64 builds a compiler that certifies this property by mathematical construction

5. The BPU materializes this certification in silicon, where verification is physical and non-maskable