The Tsallis entropy and the Shannon entropy of a universal probability
We study the properties of Tsallis entropy and Shannon entropy from the point of view of algorithmic randomness. In algorithmic information theory, there are two equivalent ways to define the program-size complexity K(s) of a given finite binary string s. In the standard way, K(s) is defined as the length of the shortest input string for the universal self-delimiting Turing machine to output s. In the other way, the so-called universal probability m is introduced first, and then K(s) is defined as −log₂ m(s) without reference to the concept of program-size. In this paper, we investigate the properties of the Shannon entropy, the power sum, and the Tsallis entropy of a universal probability by means of the notion of program-size complexity. We determine the convergence or divergence of each of these three quantities, and evaluate its degree of randomness if it converges.
💡 Research Summary
The paper investigates the behavior of three information‑theoretic quantities—Shannon entropy, the power sum ∑ m(s)^q, and Tsallis entropy—when applied to a universal probability distribution m, using the framework of algorithmic randomness. A universal probability is a lower‑computable semi‑measure that dominates all other such semi‑measures up to a multiplicative constant; it is closely linked to Kolmogorov (program‑size) complexity K(s) by the relation K(s)=‑log₂ m(s)+O(1).
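In symbols, the two properties just mentioned are the standard ones, stated here for convenience:

```latex
% Domination: for every lower-computable semimeasure P there is a
% constant c_P > 0 such that
\[
  m(s) \;\ge\; c_P \cdot P(s) \qquad \text{for all } s \in \{0,1\}^{*},
\]
% and the coding theorem (Levin) ties m to program-size complexity:
\[
  K(s) \;=\; -\log_2 m(s) + O(1).
\]
```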
First, the authors prove that the Shannon entropy H(m) = −∑ m(s) log₂ m(s) diverges to infinity. The key technical tool is Theorem 6, which shows that for any infinite r.e. set A and any total recursive function f with lim f(n) = ∞, the sum ∑_{U(p)∈A} f(|p|)·2^{−|p|} diverges. Choosing A = {0,1}* and f(n) = n yields divergence of ∑ K(s)·2^{−K(s)}, which lower‑bounds H(m) up to multiplicative and additive constants, since m(s) agrees with 2^{−K(s)} to within a multiplicative constant. This establishes Corollary 7: H(m) = ∞ for every universal probability m.
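Filling in the step from the divergent sum to H(m) = ∞: a short derivation, assuming only the coding theorem K(s) = −log₂ m(s) + O(1):

```latex
% From K(s) = -\log_2 m(s) + O(1) there are constants c, c' > 0 with
% \log_2 (1/m(s)) \ge K(s) - c  and  m(s) \ge c' \, 2^{-K(s)}, so
\[
  H(m)
  = \sum_s m(s) \log_2 \frac{1}{m(s)}
  \;\ge\; \sum_s m(s)\bigl(K(s) - c\bigr)
  \;\ge\; c' \sum_s K(s)\, 2^{-K(s)} \;-\; c ,
\]
% using \sum_s m(s) \le 1; the last sum diverges by Theorem 6,
% hence H(m) = \infty.
```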
Next, the paper turns to the power sum P_q(m) = ∑ m(s)^q. For q ≥ 1, Theorem 8(i) shows that if q is right‑computable, then P_q(m) converges to a left‑computable real number that is at least weakly Chaitin 1/q‑random, so the limit carries a degree of randomness of at least 1/q. Conversely, for 0 < q < 1, Theorems 8(ii) and 9 prove that P_q(m) diverges to infinity; moreover, right‑computability of such a power sum would force q itself to be weakly Chaitin 1/q‑random, a converse relationship between the computability of the sum and the randomness of the exponent.
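The convergence/divergence dichotomy can be illustrated numerically with a toy stand‑in for m. The sketch below (all names hypothetical, not from the paper) replaces the non‑computable K(s) with the standard computable upper bound |s| + 2 log₂(|s|+1) + O(1); it illustrates only the dichotomy in q, not the randomness properties of the true limit.

```python
import math

# Toy stand-in for the (non-computable) universal probability m:
# m(s) = 2^(-k_hat(|s|)), using the standard computable upper bound
# K(s) <= |s| + 2*log2(|s|+1) + O(1).  Purely illustrative.
def k_hat(n: int) -> float:
    return n + 2 * math.log2(n + 1) + 1

def partial_power_sum(q: float, max_len: int) -> float:
    # There are 2^n strings of each length n, and each contributes
    # m(s)^q = 2^(-q * k_hat(n)) to the power sum P_q.
    return sum(2**n * 2 ** (-q * k_hat(n)) for n in range(max_len + 1))

for q in (2.0, 0.5):
    print(q, [round(partial_power_sum(q, L), 3) for L in (10, 20, 40)])
# q = 2.0 -> partial sums level off (convergence, as for q > 1);
# q = 0.5 -> partial sums blow up (divergence, as for 0 < q < 1).
```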
The authors also address the natural conjecture that for any universal probability and any computable q>1, the sum P_q(m) would be 1/q‑compressible (i.e., its compression rate equals 1/q). Theorem 10 disproves this by constructing a specific universal probability m that forces P_q(m) to be weakly Chaitin random for every computable q>1, thus not 1/q‑compressible. The construction modifies a base universal probability r by assigning a specially chosen value to the empty string λ, ensuring that m remains universal while m(λ)^q inherits the randomness of Chaitin’s Ω‑type constant θ.
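To make the one‑point modification concrete, here is a minimal sketch of how such a construction can preserve universality; this is our illustration of the idea, and the paper's actual choice of θ and scaling may differ:

```latex
\[
  m(\lambda) = \theta,
  \qquad
  m(s) = \tfrac{1}{2}\, r(s) \quad \text{for } s \neq \lambda ,
\]
% with 0 < \theta \le 1/2 left-computable and weakly Chaitin random.
% Then \sum_s m(s) \le \theta + \tfrac{1}{2} \le 1, so m is a
% lower-computable semimeasure; m(s) \ge \tfrac{1}{2} r(s) keeps m
% universal; and m(\lambda)^q = \theta^q carries the randomness of
% \theta into P_q(m).
```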
Finally, the paper extends the analysis to Tsallis entropy, defined for a semi‑probability distribution p as S_q(p) = (1 − ∑ p_i^q)/(q − 1). For a universal probability m, Theorem 11 shows that when q > 1, S_q(m) can attain any computable degree of randomness; essentially, the same flexibility observed for the power sum carries over to Tsallis entropy. When 0 < q < 1, Theorem 12 demonstrates that S_q(m) diverges to infinity, mirroring the behavior of the power sum in this regime. The authors also verify consistency with classical information theory: as q → 1, the Tsallis entropy recovers the Shannon entropy, which for m is infinite by Corollary 7.
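The q → 1 limit mentioned above is the standard one; a minimal derivation, assuming ∑ p_i = 1 and writing p_i^q = p_i·e^{(q−1) ln p_i}:

```latex
\[
  S_q(p)
  = \frac{1 - \sum_i p_i^{\,q}}{q - 1}
  = \frac{1 - \sum_i p_i\, e^{(q-1)\ln p_i}}{q - 1}
  \;\xrightarrow[\;q \to 1\;]{}\; -\sum_i p_i \ln p_i ,
\]
% using \sum_i p_i = 1 and e^{x} = 1 + x + O(x^2); for p = m the
% right-hand side is infinite, matching Corollary 7.
```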
Overall, the paper establishes a deep connection between algorithmic randomness (via Kolmogorov complexity), computability notions (right‑/left‑computable reals), and statistical measures of information. It shows that universal probabilities are so “uneven” that their Shannon entropy is infinite, yet their generalized entropies (power sums, Tsallis) can be finely tuned to exhibit prescribed randomness degrees, depending on the parameter q. These results bridge algorithmic information theory with non‑extensive statistical mechanics, suggesting potential applications in cryptography, complexity theory, and the study of random processes where the degree of randomness, rather than mere convergence, is of central interest.