On Shannon-Jaynes Entropy and Fisher Information

The fundamentals of the Maximum Entropy principle as a rule for assigning and updating probabilities are revisited. The Shannon-Jaynes relative entropy is vindicated as the optimal criterion for use with an updating rule. A constructive rule is justified which assigns the probabilities least sensitive to coarse-graining. The implications of these developments for interpreting physics laws as rules of inference upon incomplete information are briefly discussed.
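For reference, the Shannon–Jaynes relative entropy named in the abstract is conventionally written as follows (this is the standard definition from the maximum-entropy literature, not a formula quoted from the paper):

```latex
S[p \,|\, q] \;=\; -\sum_i p_i \ln \frac{p_i}{q_i}
```

This is the negative of the Kullback–Leibler divergence of p from the prior q, so S[p|q] ≤ 0 with equality exactly when p = q; maximizing it subject to constraints keeps the updated distribution as close to the prior as the new information allows.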


💡 Research Summary

The paper revisits the Maximum Entropy Principle (MEP) as a systematic rule for assigning and updating probability distributions when only incomplete information is available. It begins by distinguishing two conceptual stages: (i) the prior assignment of a probability distribution in the absence of full data, and (ii) the posterior update when new constraints or observations become known.
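To make the updating stage concrete, the constrained maximization of S[p|q] under a single expectation-value constraint has the standard exponential-family solution p_i ∝ q_i e^(−λ f_i), with the multiplier λ fixed by the constraint. The sketch below (not code from the paper; the function name and the dice example are illustrative) solves this numerically:

```python
import numpy as np
from scipy.optimize import brentq

def maxent_update(q, f, F):
    """Update prior q to the posterior p that maximizes the
    Shannon-Jaynes relative entropy S[p|q] = -sum_i p_i ln(p_i/q_i)
    subject to the expectation constraint sum_i p_i f_i = F.

    The variational solution has the exponential-family form
        p_i = q_i * exp(-lam * f_i) / Z(lam),
    with the Lagrange multiplier lam fixed by the constraint.
    """
    def posterior(lam):
        w = q * np.exp(-lam * f)
        return w / w.sum()

    # The constraint residual <f>_p - F is monotone in lam,
    # so a simple bracketing root-finder pins down lam.
    residual = lambda lam: posterior(lam) @ f - F
    lam = brentq(residual, -20.0, 20.0)
    return posterior(lam), lam

# Jaynes's dice example: a uniform prior over six faces,
# updated to satisfy a measured mean face value of 4.5.
q = np.full(6, 1.0 / 6.0)
f = np.arange(1.0, 7.0)
p, lam = maxent_update(q, f, 4.5)
print(np.round(p, 4), round(lam, 4))
```

With a uniform prior the rule reduces to ordinary maximum entropy; supplying a non-uniform q shows how the same variational principle serves as an updating rule rather than only an assignment rule, which is exactly the two-stage distinction drawn above.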

In the prior stage, the authors argue that the most unbiased choice is the distribution least sensitive to coarse-graining; that is, a distribution whose entropy changes minimally when the underlying microscopic variables are averaged over a finite resolution. Formulating this requirement as a variational problem, they show that the functional to be minimized contains two competing terms: the Shannon–Jaynes entropy S[p|q] and a Fisher-information term that quantifies the distribution's sensitivity to coarse-graining. The paper closes with a brief discussion of what this construction implies for reading physical laws as rules of inference upon incomplete information.
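The summary breaks off before the second term is written out; given the paper's title, it is the Fisher information. A standard background result that makes the "sensitivity to coarse-graining" reading precise (de Bruijn's identity, not an equation quoted from the paper) states that if p_ε denotes p smoothed by a Gaussian kernel of variance ε, then

```latex
\left. \frac{d}{d\epsilon}\, H(p_\epsilon) \right|_{\epsilon \to 0^+}
  \;=\; \tfrac{1}{2}\, I[p],
\qquad
I[p] \;=\; \int \! dx \; p(x) \,\bigl(\partial_x \ln p(x)\bigr)^2 ,
```

where H is the differential entropy. A distribution whose entropy changes least under coarse-graining is thus one of small Fisher information, which is how the two quantities of the title come to compete within a single variational functional.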

