Maximum-entropy moment-closure for stochastic systems on networks

Moment-closure methods are popular tools to simplify the mathematical analysis of stochastic models defined on networks, in which high dimensional joint distributions are approximated (often by some heuristic argument) as functions of lower dimensional distributions. Whilst undoubtedly useful, several such methods suffer from issues of non-uniqueness and inconsistency. These problems are solved by an approach based on the maximisation of entropy, which is motivated, derived and implemented in this article. A series of numerical experiments are also presented, detailing the application of the method to the Susceptible-Infective-Recovered model of epidemics, as well as cautionary examples showing the sensitivity of moment-closure techniques in general.


💡 Research Summary

The paper addresses a fundamental problem in the analysis of stochastic processes defined on networks: the need to close an infinite hierarchy of equations that describe the evolution of joint probability distributions of increasing order. Traditional moment‑closure techniques approximate higher‑order joint distributions by functions of lower‑order marginals (e.g., the pair approximation). While useful, these methods suffer from two serious drawbacks: non‑uniqueness (different closures can be constructed from the same set of low‑order marginals) and inconsistency (the approximated higher‑order distribution may not reproduce the given marginals, leading to unnormalised or contradictory probabilities). A classic illustration is the pair approximation applied to triangles, where the commonly used formula fails to be a proper probability distribution.
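The triangle failure can be seen in a few lines. The sketch below is illustrative and not taken from the paper: it applies a common Kirkwood-style pair closure, P(x1,x2,x3) ≈ P(x1,x2)P(x2,x3)P(x1,x3)/(P(x1)P(x2)P(x3)), to three perfectly correlated binary nodes, where the closure sums to 2 rather than 1 and so is not a probability distribution.

```python
import numpy as np

# Pair and single-node marginals of three perfectly correlated binary nodes:
# the true joint puts mass 1/2 on (0,0,0) and 1/2 on (1,1,1).
pair = np.array([[0.5, 0.0], [0.0, 0.5]])   # same for every edge of the triangle
single = np.array([0.5, 0.5])               # same for every node

# Kirkwood-style pair closure, summed over all eight joint states
total = 0.0
for a, b, c in np.ndindex(2, 2, 2):
    num = pair[a, b] * pair[b, c] * pair[a, c]
    den = single[a] * single[b] * single[c]
    total += num / den

print(total)  # 2.0: the two fully correlated states each contribute 1
```

Only the states (0,0,0) and (1,1,1) have non-zero closure value (each equal to 1), so the "distribution" has total mass 2 while still reproducing no mixed states, illustrating the inconsistency described above.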

To overcome these issues, the author proposes a maximum‑entropy (MaxEnt) moment‑closure. The idea is simple yet powerful: given a collection of known marginal distributions (P_g(x_g)) for a set of subgraphs (\mathcal G), the unknown full distribution (P_G(x)) should be chosen as the distribution that maximises the Shannon entropy (S[P_G] = -\sum_{x} P_G(x) \log P_G(x)), subject to the constraint that (P_G) reproduces each of the given marginals. By construction this closure is unique and consistent: it agrees with all the supplied low‑order information while assuming nothing beyond it.
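The constrained maximisation generally has no closed form, but for small motifs it can be computed numerically. A minimal sketch, not the paper's implementation: iterative proportional fitting (IPF) converges to the maximum-entropy joint distribution over a triangle of binary nodes consistent with given pair marginals.

```python
import numpy as np

def maxent_closure_triangle(pair_targets, n_iter=500):
    """Maximum-entropy joint over three binary nodes whose pair marginals
    match `pair_targets`, computed by iterative proportional fitting.
    `pair_targets[(a, b)]` is the 2x2 target marginal for nodes a < b."""
    joint = np.full((2, 2, 2), 1.0 / 8.0)   # uniform start: the unconstrained entropy maximum
    for _ in range(n_iter):
        for a, b in [(0, 1), (1, 2), (0, 2)]:
            other = 3 - a - b                # index of the node summed out
            current = joint.sum(axis=other)  # current marginal on (a, b)
            ratio = np.divide(pair_targets[(a, b)], current,
                              out=np.ones_like(current), where=current > 0)
            joint = joint * np.expand_dims(ratio, axis=other)
    return joint

# Demo: take pair marginals from an arbitrary reference joint q; the closure
# returns the highest-entropy joint consistent with them, not q itself.
rng = np.random.default_rng(0)
q = rng.random((2, 2, 2))
q /= q.sum()
targets = {(0, 1): q.sum(axis=2), (1, 2): q.sum(axis=0), (0, 2): q.sum(axis=1)}
p = maxent_closure_triangle(targets)
entropy = lambda d: -(d * np.log(d)).sum()
```

Each IPF sweep rescales the joint so one pair marginal matches its target exactly; for consistent, strictly positive marginals the iteration converges to the unique MaxEnt solution, so `entropy(p) >= entropy(q)` while both share the same pair marginals.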

