A statistical mechanical interpretation of algorithmic information theory III: Composite systems and fixed points
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed in our previous works [K. Tadaki, Local Proceedings of CiE 2008, pp.425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol.5407, pp.422-440, 2009], where we introduced thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T), and statistical mechanical entropy S(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature itself as a thermodynamic quantity: for each of the thermodynamic quantities above, the computability of its value at temperature T gives a sufficient condition for T in (0,1) to be a fixed point on partial randomness. In this paper, we develop the statistical mechanical interpretation of AIT further and pursue its formal correspondence to normal statistical mechanics. The thermodynamic quantities in AIT are defined based on the halting set of an optimal computer, which is a universal decoding algorithm used to define the notion of program-size complexity. We show that there are infinitely many optimal computers which give completely different sufficient conditions for each of the thermodynamic quantities in AIT. We do this by introducing the notion of composition of computers into AIT, which corresponds to the notion of composition of systems in normal statistical mechanics.
💡 Research Summary
This paper deepens the statistical‑mechanical interpretation of algorithmic information theory (AIT) by establishing a formal correspondence with ordinary statistical mechanics and by introducing a notion of "composition of computers" that mirrors the composition of physical systems. Building on the author's earlier works (Tadaki 2008, 2009), the paper first recalls the thermodynamic quantities defined from an optimal computer \(U\): the partition function
\(Z_U(T)=\sum_{p\in\mathrm{Dom}(U)}2^{-|p|/T}\),
the free energy \(F_U(T)=-T\log_2 Z_U(T)\), the average energy \(E_U(T)=\sum_{p\in\mathrm{Dom}(U)}|p|\,2^{-|p|/T}/Z_U(T)\), and the entropy \(S_U(T)=(E_U(T)-F_U(T))/T\). Here the temperature \(T\in(0,1)\) coincides with the "partial randomness" of a real number, a refined measure of compressibility expressed via program‑size complexity \(K(\cdot)\).
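To make these definitions concrete, here is a minimal numerical sketch. The finite prefix-free set `toy_domain` is a hypothetical stand-in for the halting set \(\mathrm{Dom}(U)\), which for a real optimal computer is infinite; the functions simply transcribe the four formulas above.

```python
import math

# Hypothetical toy stand-in for Dom(U): a small prefix-free set of binary
# "programs". A real optimal computer has an infinite domain; this finite
# set only illustrates the defining formulas.
toy_domain = ["0", "10", "110", "1110"]

def Z(T, domain):
    """Partition function: Z(T) = sum over p of 2^(-|p|/T)."""
    return sum(2.0 ** (-len(p) / T) for p in domain)

def F(T, domain):
    """Free energy: F(T) = -T * log2 Z(T)."""
    return -T * math.log2(Z(T, domain))

def E(T, domain):
    """Average energy: E(T) = sum over p of |p| * 2^(-|p|/T), divided by Z(T)."""
    return sum(len(p) * 2.0 ** (-len(p) / T) for p in domain) / Z(T, domain)

def S(T, domain):
    """Entropy: S(T) = (E(T) - F(T)) / T."""
    return (E(T, domain) - F(T, domain)) / T

# The thermodynamic identity F = E - T*S holds by construction:
T = 0.5
assert math.isclose(F(T, toy_domain), E(T, toy_domain) - T * S(T, toy_domain))
```

For the infinite domain of an actual optimal computer, \(Z_U(T)\) converges for \(T\in(0,1)\) by Kraft's inequality; the toy sum above sidesteps convergence entirely by being finite.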
The central novelty is the definition of a composition operation \(C_1\oplus C_2\) for two optimal computers \(C_1\) and \(C_2\). A program for the composed computer consists of a pair \((p_1,p_2)\); it halts iff both \(p_1\) halts on \(C_1\) and \(p_2\) halts on \(C_2\). This construction yields the exact additive relations familiar from thermodynamics:
\(Z_{C_1\oplus C_2}(T)=Z_{C_1}(T)\,Z_{C_2}(T)\),
\(F_{C_1\oplus C_2}(T)=F_{C_1}(T)+F_{C_2}(T)\),
\(E_{C_1\oplus C_2}(T)=E_{C_1}(T)+E_{C_2}(T)\),
\(S_{C_1\oplus C_2}(T)=S_{C_1}(T)+S_{C_2}(T)\). Consequently, the composed system behaves thermodynamically as the sum of its parts, confirming that the AIT framework can faithfully model composite physical systems.
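The multiplicative and additive relations can be checked numerically on toy domains. The sketch below uses two hypothetical finite prefix-free sets in place of \(\mathrm{Dom}(C_1)\) and \(\mathrm{Dom}(C_2)\), and models a program of the composed computer as the concatenation \(p_1 p_2\), so its length is \(|p_1|+|p_2|\); since both sets are prefix-free, each concatenation decodes uniquely.

```python
import math

# Hypothetical toy domains standing in for Dom(C1) and Dom(C2);
# real optimal computers have infinite prefix-free domains.
dom1 = ["0", "10", "110"]
dom2 = ["0", "11"]

def Z(T, domain):
    """Partition function Z(T) = sum over p of 2^(-|p|/T)."""
    return sum(2.0 ** (-len(p) / T) for p in domain)

def F(T, domain):
    """Free energy F(T) = -T * log2 Z(T)."""
    return -T * math.log2(Z(T, domain))

# A program of C1 ⊕ C2 is a pair (p1, p2) of total length |p1| + |p2|,
# modeled here by concatenation (unambiguous since dom1, dom2 are prefix-free).
composed = [p1 + p2 for p1 in dom1 for p2 in dom2]

T = 0.75
# Partition functions multiply; free energies add.
assert math.isclose(Z(T, composed), Z(T, dom1) * Z(T, dom2))
assert math.isclose(F(T, composed), F(T, dom1) + F(T, dom2))
```

The product rule for \(Z\) is immediate from \(2^{-(|p_1|+|p_2|)/T}=2^{-|p_1|/T}\,2^{-|p_2|/T}\), and additivity of \(F\), \(E\), and \(S\) follows from it by taking logarithms and derivatives.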
Using this composition, the author constructs an infinite family of distinct optimal computers \(\{U_i\}_{i\in\mathbb{N}}\) by repeatedly composing a base optimal computer \(U\) with itself (e.g., \(U_i = \underbrace{U\oplus\cdots\oplus U}_{i\text{ times}}\)). For each \(U_i\) the four thermodynamic quantities are scaled versions of those for \(U\), but crucially the computability of a quantity at a given temperature depends on the particular computer. The main theorem (Theorem 3.1) states that for any two different indices \(i\neq j\),
- the computability of \(Z_{U_i}(T)\) does not imply the computability of \(Z_{U_j}(T)\);
- the same independence holds for \(F\), \(E\), and \(S\).
In other words, each computer \(U_i\) yields its own set of "fixed points" of partial randomness: temperatures \(T\in(0,1)\) at which the computability of the corresponding thermodynamic quantity suffices for \(T\) to be a fixed point on partial randomness, with these sufficient conditions differing completely across the infinite family.