Foundations of Inference
We present a simple and clear foundation for finite inference that unites and significantly extends the approaches of Kolmogorov and Cox. Our approach is based on quantifying lattices of logical statements in a way that satisfies general lattice symmetries. With other applications such as measure theory in mind, our derivations assume minimal symmetries, relying on neither negation nor continuity nor differentiability. Each relevant symmetry corresponds to an axiom of quantification, and these axioms are used to derive a unique set of quantifying rules that form the familiar probability calculus. We also derive a unique quantification of divergence, entropy and information.
💡 Research Summary
The paper “Foundations of Inference” proposes a minimalist, symmetry‑driven framework for finite inference that unifies and extends the classic Kolmogorov probability axioms and Cox’s desiderata for rational belief. The authors begin by modeling a finite collection of logical statements as a partially ordered set equipped with two binary operations—join (∨) and meet (∧)—which form a lattice. No assumptions about negation, continuity, or differentiability are made; the only structural requirements are the usual lattice symmetries: commutativity, associativity, and distributivity.
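The lattice structure described above can be illustrated concretely. The following minimal sketch (an illustrative construction, not taken from the paper) models statements as subsets of a set of atomic outcomes, with join as union and meet as intersection, and checks the three symmetries the summary names: commutativity, associativity, and distributivity.

```python
from itertools import chain, combinations

# Illustrative sketch: finite logical statements modeled as subsets of a set
# of mutually exclusive atomic outcomes. Join (v, logical OR) is set union;
# meet (^, logical AND) is set intersection. Together they form a
# distributive lattice. The atom names "a", "b", "c" are arbitrary.

def powerset(atoms):
    """Every subset of `atoms`, i.e. every statement in the lattice."""
    s = list(atoms)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

join = frozenset.union          # logical OR
meet = frozenset.intersection   # logical AND

lattice = powerset({"a", "b", "c"})

# Verify the lattice symmetries over all triples of statements.
for A in lattice:
    for B in lattice:
        assert join(A, B) == join(B, A)                 # commutativity
        assert meet(A, B) == meet(B, A)
        for C in lattice:
            assert join(A, join(B, C)) == join(join(A, B), C)   # associativity
            assert meet(A, join(B, C)) == join(meet(A, B), meet(A, C))  # distributivity
print("lattice symmetries hold")
```

With three atoms the lattice has 2³ = 8 elements, so the exhaustive check above runs over all 512 triples instantly.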
A quantification function Q is introduced to assign a real number to each lattice element. Four axioms, each directly reflecting a lattice symmetry, govern Q: (1) non‑negativity (Q(A) ≥ 0 for all A), (2) normalization (Q(⊤) = 1 for the top element representing the tautology), (3) additivity for disjoint joins (if A∧B = ⊥ then Q(A∨B) = Q(A)+Q(B)), and (4) compatibility with the distributive law, so that Q respects the lattice identity A∧(B∨C) = (A∧B)∨(A∧C). From these axioms the familiar probability calculus follows uniquely. Notably, the complement law Q(¬A) = 1 − Q(A) emerges as a theorem rather than an assumption, showing that negation is a derived operation within the lattice.
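These axioms can be checked numerically on a small example. The sketch below (the atom weights are an arbitrary illustration, not from the paper) builds a quantification Q on the powerset lattice by summing non-negative weights over atoms, verifies non-negativity, normalization, and additivity for disjoint joins, and then confirms that the complement law Q(¬A) = 1 − Q(A) holds as a consequence rather than an assumption.

```python
from itertools import chain, combinations

# Illustrative sketch: quantify the powerset lattice on three atoms by
# summing non-negative weights. The weights below are assumed example
# values chosen only so that Q(top) = 1.
weights = {"a": 0.5, "b": 0.3, "c": 0.2}
TOP = frozenset(weights)   # top element: the tautology
BOT = frozenset()          # bottom element: the contradiction

def subsets(s):
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def Q(A):
    """Quantification: total weight of the atoms in statement A."""
    return sum(weights[x] for x in A)

lattice = subsets(TOP)

assert all(Q(A) >= 0 for A in lattice)        # (1) non-negativity
assert abs(Q(TOP) - 1.0) < 1e-12              # (2) normalization
for A in lattice:                             # (3) additivity for disjoint joins
    for B in lattice:
        if A & B == BOT:
            assert abs(Q(A | B) - (Q(A) + Q(B))) < 1e-12

# Complement law follows as a theorem: Q(not A) = 1 - Q(A).
for A in lattice:
    assert abs(Q(TOP - A) - (1.0 - Q(A))) < 1e-12
print("axioms verified; complement law follows")
```

Note that the complement TOP − A is computed from the lattice itself; no negation operation was assumed in defining Q.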
Having established a probability measure, the authors turn to information‑theoretic quantities. By demanding that any divergence measure respect the same lattice symmetries and be additive over independent components, they prove a uniqueness theorem that forces the divergence to take the Kullback‑Leibler form D(P‖Q) = ∑ p_i log(p_i/q_i). Consequently, Shannon entropy H(P) = −∑ p_i log p_i and mutual information I(P;Q) arise naturally as special cases. Importantly, the derivations avoid any continuity or differentiability assumptions; the results are purely algebraic consequences of the lattice structure.
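The Kullback‑Leibler form and its additivity over independent components can be demonstrated numerically. The short sketch below (the distributions are arbitrary illustrative values) implements D(P‖Q) and Shannon entropy in the forms the summary cites, and checks that the divergence of a product of independent distributions equals the sum of the component divergences.

```python
import math

# Illustrative sketch: Kullback-Leibler divergence and Shannon entropy,
# plus a numerical check of additivity over independent components.

def kl(p, q):
    """D(P||Q) = sum_i p_i log(p_i / q_i), with the convention 0 log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    """H(P) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Two independent components (example values).
p1, q1 = [0.7, 0.3], [0.5, 0.5]
p2, q2 = [0.2, 0.8], [0.4, 0.6]

# Joint distributions of the independent pair are the products.
p12 = [a * b for a in p1 for b in p2]
q12 = [a * b for a in q1 for b in q2]

# Additivity: D(P1 x P2 || Q1 x Q2) = D(P1||Q1) + D(P2||Q2).
assert abs(kl(p12, q12) - (kl(p1, q1) + kl(p2, q2))) < 1e-12
print("KL divergence is additive over independent components")
```

The additivity check works because log(p_i p_j / (q_i q_j)) splits into log(p_i/q_i) + log(p_j/q_j), which is exactly the independence property the uniqueness theorem demands.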
The paper also discusses the relationship to measure theory. A σ‑algebra in the infinite case is a complete lattice, so the same quantification axioms apply, providing a seamless bridge between the finite lattice framework and classical probability measures. This demonstrates that the proposed axioms not only reproduce Kolmogorov’s theory but also extend it to settings where traditional analytic assumptions are unavailable.
In the concluding section the authors argue that their symmetry‑based axiomatization subsumes both Kolmogorov’s and Cox’s approaches. Every Kolmogorov probability space satisfies the lattice axioms, and every Coxian rational belief system respects the same additive and normalization constraints. By stripping away extraneous assumptions, the framework offers a more parsimonious foundation for inference, while simultaneously delivering a unique, mathematically rigorous derivation of divergence, entropy, and information. The work therefore opens the door to new applications in areas such as quantum logic, non‑classical probability, and generalized measure theory, where the underlying logical structure may deviate from Boolean logic but still retain lattice symmetries.