Enumerable Distributions, Randomness, Dependence


Mutual information I in infinite sequences (and in their finite prefixes) is essential in the theoretical analysis of many situations. Yet its right definition has long been elusive. I address it by generalizing Kolmogorov complexity theory from measures to semimeasures, i.e., infimums of sets of measures. Being concave rather than linear functionals, semimeasures are quite delicate to handle. Yet they adequately capture various theoretical and practical scenarios. A simple lower bound $i(\alpha{:}\beta) = \sup_{x\in\mathbb{N}}\bigl(K(x) - K(x|\alpha) - K(x|\beta)\bigr)$ for information turns out to be tight for Martin-Löf random $\alpha, \beta$. For all sequences, $I(\alpha{:}\beta)$ is characterized by the minimum of $i(\alpha':\beta')$ over random $\alpha', \beta'$ with $U(\alpha') = \alpha$, $U(\beta') = \beta$.


💡 Research Summary

The paper tackles a long‑standing problem in algorithmic information theory: how to define mutual information I for infinite binary sequences (and their finite prefixes) in a way that is both mathematically robust and applicable to practical scenarios. Traditional definitions rely on probability measures, but measures are linear functionals and often fail to capture the subtleties of non‑computable or non‑enumerable distributions that arise when dealing with infinite objects. To overcome this limitation, the author generalizes Kolmogorov‑Complexity‑based reasoning from measures to semimeasures—functions that are the infimum of a family of probability measures. Because semimeasures are concave rather than linear, they are more delicate to manipulate, yet they naturally encompass all enumerable distributions and therefore provide a richer framework for analyzing information in algorithmic contexts.
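As a concrete illustration, the canonical example of such an object (standard in the Kolmogorov-complexity literature, not a construction specific to this paper) is Levin's universal lower-semicomputable semimeasure $\mathbf{M}$, defined over a universal monotone machine $U$ by

\[
\mathbf{M}(x) \;=\; \sum_{p \,:\, U(p)\ \text{extends}\ x} 2^{-|p|}.
\]

It satisfies $\mathbf{M}(x) \ge \mathbf{M}(x0) + \mathbf{M}(x1)$ with inequality rather than equality, because some programs produce $x$ but never extend it to a longer output; that lost mass is precisely the defect separating a semimeasure from a measure.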

The central technical contribution is a lower bound for mutual information, denoted

\[
i(\alpha{:}\beta) \;=\; \sup_{x\in\mathbb{N}}\bigl(K(x) - K(x|\alpha) - K(x|\beta)\bigr),
\]

where $K$ denotes (prefix) Kolmogorov complexity. The paper shows this bound is tight when $\alpha$ and $\beta$ are Martin-Löf random, and that for arbitrary sequences $I(\alpha{:}\beta)$ is characterized as the minimum of $i(\alpha':\beta')$ over random $\alpha', \beta'$ with $U(\alpha') = \alpha$ and $U(\beta') = \beta$.
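Since $K$ is uncomputable, the bound cannot be evaluated exactly. Still, a toy numerical sketch can convey its shape: below, compressed length stands in as a crude computable upper bound for $K$, and the supremum over all of $\mathbb{N}$ is restricted to a finite candidate set. The helper names (`K_approx`, `K_cond_approx`, `i_lower_bound`) are hypothetical, introduced here only for illustration; this is not the paper's construction.

```python
import zlib

def K_approx(x: bytes) -> int:
    # Crude stand-in for K(x): length in bits of the zlib-compressed string.
    # This is only an upper bound on Kolmogorov complexity, used for illustration.
    return 8 * len(zlib.compress(x))

def K_cond_approx(x: bytes, y: bytes) -> int:
    # Crude stand-in for K(x|y): the extra compressed bits needed for x
    # once y has already been compressed.
    return max(0, 8 * len(zlib.compress(y + x)) - 8 * len(zlib.compress(y)))

def i_lower_bound(alpha: bytes, beta: bytes, candidates) -> int:
    # sup over candidate strings x of K(x) - K(x|alpha) - K(x|beta),
    # restricted to a finite candidate set.
    return max(K_approx(x) - K_cond_approx(x, alpha) - K_cond_approx(x, beta)
               for x in candidates)
```

With this sketch, two prefixes sharing a common pattern yield a larger estimate than two unrelated ones, mirroring the intuition that $i(\alpha{:}\beta)$ measures information common to $\alpha$ and $\beta$.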

