The Art of Probability Assignment


The problem of assigning probabilities when little is known is analyzed for the case where the quantities of interest are physical observables, i.e., can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes’ theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that probabilities be assigned so that they are least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Rényi distance between the original and the shifted distribution.
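The discrete-case objective mentioned above, a Rényi distance between a distribution and its shifted copy, can be illustrated with a minimal sketch. The distribution, the cyclic shift, and the choice of order α = 2 below are illustrative assumptions, not taken from the paper:

```python
import math

def renyi_divergence(p, q, alpha=2.0):
    """D_alpha(P || Q) = 1/(alpha - 1) * log( sum_i p_i^alpha * q_i^(1 - alpha) )."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# An example discrete distribution and its one-step cyclic shift
p = [0.1, 0.2, 0.4, 0.2, 0.1]
q = p[-1:] + p[:-1]  # shifted copy of p

print(renyi_divergence(p, q))  # positive: the shift moves probability mass
print(renyi_divergence(p, p))  # ~0: the divergence vanishes when nothing shifts
```

The divergence is zero exactly when the shift leaves the distribution unchanged, which is why a "distance between the original and the shifted distribution" can serve as a sensitivity measure.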


💡 Research Summary

The paper tackles the longstanding problem of assigning probabilities to physical observables when prior knowledge is scarce. It frames probability assignment as a Bayesian inference task, emphasizing that the choice of prior distribution plays a crucial role when data are limited, while becoming virtually irrelevant in the large‑sample regime. The author first demonstrates that with abundant observations the posterior distribution is dominated by the likelihood, leading to “prior insensitivity”: different reasonable priors converge to nearly identical posteriors. This validates the common practice of using non‑informative or weakly informative priors in high‑statistics experiments.
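The prior-insensitivity claim is easy to check numerically. The following sketch (my own illustration, not code from the paper) uses a conjugate Beta-Bernoulli model: two quite different priors give very different posterior means after 5 observations, but nearly identical ones after 5000:

```python
def beta_posterior(a, b, heads, tails):
    """Posterior mean of a coin's bias under a Beta(a, b) prior,
    after observing `heads` successes and `tails` failures."""
    return (a + heads) / (a + b + heads + tails)

# Two deliberately different priors for the same observable
priors = {"uniform Beta(1,1)": (1, 1), "skewed Beta(10,2)": (10, 2)}

for n in (5, 5000):  # scarce vs. abundant data, 60% heads in both cases
    heads = int(0.6 * n)
    tails = n - heads
    means = {name: beta_posterior(a, b, heads, tails)
             for name, (a, b) in priors.items()}
    print(n, {name: round(m, 3) for name, m in means.items()})
```

With n = 5 the two posterior means differ by roughly 0.2; with n = 5000 they agree to three decimal places, since the likelihood dominates the prior.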

In contrast, when only a handful of measurements are available, the posterior can be heavily biased by the prior. To mitigate this, the paper proposes a principle of “minimum sensitivity to prior variations.” The idea is to select a posterior that changes as little as possible when the prior is perturbed within a plausible neighbourhood. For continuous variables the author shows that this criterion is mathematically equivalent to minimizing the Fisher information of the posterior subject to the physical constraints (e.g., known moments, conservation laws). The Fisher information functional, in its standard form I[p] = ∫ (p′(x))² / p(x) dx, then serves as the objective of this constrained variational problem.
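A small numerical check makes the variational principle concrete. Among unit-variance densities, the Gaussian is known to minimize the Fisher information functional I[p] = ∫ (p′)²/p dx; the sketch below (my own illustration, with the grid, the Laplace comparison density, and the finite-difference scheme as assumptions) discretizes I[p] and compares a Gaussian against a Laplace density of the same variance:

```python
import math

def fisher_information(p, dx):
    """Discretized I[p] = integral of (p')^2 / p dx, using central differences."""
    total = 0.0
    for i in range(1, len(p) - 1):
        dp = (p[i + 1] - p[i - 1]) / (2 * dx)
        total += dp * dp / p[i] * dx
    return total

# Grid on [-8, 8] and two unit-variance densities
dx = 0.01
xs = [-8 + i * dx for i in range(1601)]
gauss = [math.exp(-x * x / 2) / math.sqrt(2 * math.pi) for x in xs]
b = 1 / math.sqrt(2)  # Laplace scale so that variance 2*b^2 = 1
laplace = [math.exp(-abs(x) / b) / (2 * b) for x in xs]

print(fisher_information(gauss, dx))    # close to 1, the analytic value 1/sigma^2
print(fisher_information(laplace, dx))  # larger, close to the analytic value 1/b^2 = 2
```

Under the same variance constraint, the Gaussian's Fisher information is about half the Laplace's, consistent with the posterior-selection rule described above: the minimum-Fisher-information density is the one least sensitive to perturbations.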

