Information and Entropy
What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes’ rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops – the Maximum Entropy and the Bayesian methods – into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
💡 Research Summary
The paper tackles the age‑old question “What is information? Is it physical?” by adopting a Bayesian perspective in which information is defined not as a physical commodity but as any constraint that forces a rational agent to revise its beliefs. In this view, information is the driver of belief change: it limits the set of admissible probability distributions and thereby compels an update from a prior to a posterior.
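To see how a constraint compels an update in the simplest setting, consider observing a datum. The sketch below is a toy illustration in Python (the joint prior and its numbers are invented for this example, not taken from the paper): the observation restricts the admissible posteriors to those whose marginal over the data sits entirely on the observed value, and the update that respects this constraint while otherwise deferring to the prior is ordinary conditioning, i.e. Bayes' rule, the special case mentioned in the abstract.

```python
import numpy as np

# Toy joint prior q(h, d): a binary hypothesis h (rows) and a binary
# datum d (columns). The numbers are purely illustrative.
q_joint = np.array([[0.30, 0.10],
                    [0.15, 0.45]])

# Information arrives as a constraint: d = 1 was observed, so every
# admissible posterior must put its full marginal weight on that column.
d_obs = 1
column = q_joint[:, d_obs]

# The update that honors the constraint while staying as close as
# possible to the prior is conditioning: p(h) = q(h | d = 1).
posterior_h = column / column.sum()
print(posterior_h)  # -> [0.1818..., 0.8181...]
```

In the paper's framework this is not a separate postulate: imposing the data constraint and maximizing the relative entropy introduced below yields exactly this conditional distribution.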
To formalize the update process, the author introduces an “eliminative induction” framework. All conceivable updating rules are taken as candidates, and four rational desiderata—consistency, coordinate‑invariance, subsystem independence, and the ability to handle arbitrary constraints—are imposed. The only functional that survives these stringent criteria is the logarithmic relative entropy
$$S[p, q] \,=\, -\int dx\; p(x)\,\log\frac{p(x)}{q(x)},$$

where $q(x)$ is the prior and the preferred posterior is the distribution $p(x)$ that maximizes $S[p, q]$ subject to the constraints that encode the new information.
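As a sketch of how an ME update can be computed in practice, the snippet below (my own illustration; the function name `me_update` and all numbers are assumptions, not from the paper) maximizes $S[p, q]$ over discrete distributions subject to a single expectation constraint $\sum_i p_i f_i = F$. The variational solution is an exponential tilting of the prior, $p_i \propto q_i\, e^{\lambda f_i}$, with the multiplier $\lambda$ fixed by the constraint; since the constrained mean is monotone in $\lambda$, bisection suffices.

```python
import numpy as np

def me_update(prior, f, F, lam_lo=-50.0, lam_hi=50.0, tol=1e-12):
    """Maximize S[p, q] = -sum_i p_i log(p_i / q_i) over distributions p
    subject to sum_i p_i f_i = F. The maximizer is the exponential tilt
    p_i = q_i * exp(lam * f_i) / Z(lam); bisect on lam to meet the constraint."""
    q = np.asarray(prior, dtype=float)
    f = np.asarray(f, dtype=float)

    def tilted(lam):
        w = q * np.exp(lam * (f - f.mean()))  # shift f for numerical stability
        return w / w.sum()

    lo, hi = lam_lo, lam_hi
    assert tilted(lo) @ f <= F <= tilted(hi) @ f, "F outside attainable range"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tilted(mid) @ f < F:
            lo = mid  # constrained mean too small: increase lambda
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))

# With a uniform prior the update reduces to plain MaxEnt, the other
# special case named in the abstract: the classic die whose mean is
# constrained to 4.5 instead of the fair value 3.5.
faces = np.arange(1, 7)
p = me_update(np.ones(6) / 6, faces, 4.5)
print(np.round(p, 4), p @ faces)  # tilted toward high faces; mean = 4.5
```

Replacing the uniform prior with any other $q$ exhibits the general ME behavior: the update is the minimal revision of the prior compatible with the constraint.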