Entropic Inference
In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes’ rule, and therefore unifies the two themes of these workshops – the Maximum Entropy and the Bayesian methods – into a single general inference scheme.
💡 Research Summary
The paper “Entropic Inference” presents a unified theoretical framework that brings together two historically distinct approaches to probabilistic reasoning—Maximum Entropy (MaxEnt) and Bayesian updating—under the single principle of Maximum Relative Entropy (ME). The authors begin by re‑examining the notion of information from an epistemological standpoint. Rather than treating information merely as a measure of signal strength or channel capacity (as in Shannon’s theory), they define it as “a constraint that changes the beliefs of a rational agent.” This definition naturally dovetails with the Bayesian view, where beliefs are encoded in probability distributions and are revised when new evidence arrives. However, Bayesian theory itself does not prescribe a unique, principled rule for how the revision should be performed beyond the formal statement of Bayes’ theorem.
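For reference, the Bayesian revision in question can be written as follows (the notation is generic rather than taken from the paper: θ stands for the quantities of interest, x for the observed data, q for the prior and p for the updated distribution):

$$p(\theta) \;\equiv\; q(\theta \mid x) \;=\; \frac{q(\theta)\, q(x \mid \theta)}{q(x)}.$$

Note that Bayes’ theorem by itself only relates probabilities within the single distribution q; the substantive updating question is why the new belief p(θ) should be set equal to the conditional q(θ | x), and this is precisely the gap the ME framework is meant to close.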
To fill this gap, the authors employ an eliminative induction strategy. They start with a broad class of possible updating rules and systematically discard those that violate a set of desiderata: (1) consistency with prior information, (2) invariance under re‑parameterisation (coordinate‑independence), (3) additivity for independent subsystems, (4) continuity, and (5) the requirement that updating by a null constraint leaves the prior unchanged. Remarkably, after imposing all these criteria, only one functional survives—the logarithmic relative entropy (also known as the Kullback‑Leibler divergence):
$$S[p,q] \;=\; -\int dx\; p(x)\,\log\frac{p(x)}{q(x)},$$

where q(x) is the prior and p(x) a candidate posterior. Maximizing S[p,q] subject to the constraints imposed by the new information (equivalently, minimizing the Kullback–Leibler divergence of p from q) selects the ME posterior.
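To make the maximization concrete, here is a minimal numerical sketch for a discrete space with a single expectation constraint. Everything in it (the function name max_rel_entropy, the dice example, the root-finding bracket) is illustrative rather than taken from the paper; it relies on the standard result that the maximizer has the exponential form p_i ∝ q_i e^{−λ f_i}, with the Lagrange multiplier λ fixed by the constraint.

```python
import numpy as np
from scipy.optimize import brentq

def max_rel_entropy(q, f, F):
    """Return p maximizing S[p, q] = -sum_i p_i log(p_i / q_i)
    subject to sum_i p_i = 1 and sum_i p_i f_i = F.

    The maximizer has the exponential form p_i ~ q_i * exp(-lam * f_i);
    we solve for the multiplier lam numerically.
    """
    def constraint_gap(lam):
        w = q * np.exp(-lam * f)
        p = w / w.sum()
        return p @ f - F  # zero when the expectation constraint holds

    lam = brentq(constraint_gap, -50.0, 50.0)  # bracket chosen for this example
    w = q * np.exp(-lam * f)
    return w / w.sum()

# Illustrative example: uniform prior over die faces, mean constrained to 4.5
q = np.ones(6) / 6.0
f = np.arange(1.0, 7.0)
p = max_rel_entropy(q, f, 4.5)
print(np.round(p, 4), p @ f)  # distribution tilted toward high faces, mean 4.5
```

With a uniform prior q this reduces to ordinary MaxEnt (the classic Brandeis dice problem); choosing a non-uniform q shows how ME updates an arbitrary prior under the same machinery.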