The dynamics of message passing on dense graphs, with applications to compressed sensing


Approximate message passing algorithms have proved extremely effective in reconstructing sparse signals from a small number of incoherent linear measurements. Extensive numerical experiments further showed that their dynamics are accurately tracked by a simple one-dimensional iteration termed state evolution. In this paper we provide the first rigorous foundation for state evolution. We prove that it indeed holds asymptotically in the large system limit for sensing matrices with independent and identically distributed Gaussian entries. While our focus is on message passing algorithms for compressed sensing, the analysis extends beyond this setting to a general class of algorithms on dense graphs. In this context, state evolution plays the role that density evolution has for sparse graphs. The proof technique is fundamentally different from the standard approach to density evolution, in that it copes with the large number of short loops in the underlying factor graph. It relies instead on a conditioning technique recently developed by Erwin Bolthausen in the context of spin glass theory.


💡 Research Summary

The paper addresses a fundamental gap in the theoretical understanding of Approximate Message Passing (AMP) algorithms, which have become a cornerstone for sparse signal recovery in compressed sensing. While extensive simulations have shown that the empirical performance of AMP can be accurately tracked by a one‑dimensional recursion known as state evolution, a rigorous proof of this phenomenon had been lacking. The authors fill this void by proving that state evolution holds asymptotically in the large‑system limit (N → ∞ with a fixed measurement ratio δ = M/N) for sensing matrices whose entries are independent and identically distributed (i.i.d.) Gaussian.
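The large-system setting described above can be made concrete with a small sketch. The code below sets up the standard compressed-sensing measurement model y = A x₀ + w with an i.i.d. Gaussian sensing matrix and a fixed ratio δ = M/N; the sparsity level, noise scale, and the 1/√M normalization of A are illustrative choices, not taken from the paper.

```python
import numpy as np

# Sketch of the measurement model assumed in the summary:
#   y = A x0 + w,  A with i.i.d. Gaussian entries,  delta = M/N fixed.
# Parameter values (N, delta, eps, sigma) are illustrative assumptions.
rng = np.random.default_rng(0)

N, delta = 2000, 0.5            # signal length and measurement ratio
M = int(delta * N)              # number of measurements
sigma = 0.1                     # noise standard deviation
eps = 0.1                       # fraction of nonzero signal entries

# Sparse signal: each entry is nonzero with probability eps
x0 = rng.normal(size=N) * (rng.random(N) < eps)

# i.i.d. Gaussian sensing matrix, normalized so columns have unit norm
# in expectation (one common convention)
A = rng.normal(size=(M, N)) / np.sqrt(M)
w = sigma * rng.normal(size=M)
y = A @ x0 + w
```

The asymptotic claim is about N → ∞ with δ held fixed; a finite instance like this one is only a proxy, but at these sizes the state-evolution prediction is already typically accurate.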

The proof does not follow the traditional density‑evolution route used for sparse factor graphs, because the factor graph underlying AMP is dense and contains an enormous number of short cycles. Instead, the authors adapt a conditioning technique originally introduced by Erwin Bolthausen in the context of spin‑glass theory. By conditioning on the current iterate and carefully decomposing the Gaussian measurement matrix, they show that the residuals at each iteration behave as if they were perturbed by fresh Gaussian noise that is independent of the past. This key observation allows them to derive a closed‑form scalar recursion for the effective noise variance τ_t^2:

 τ_{t+1}^2 = σ^2 + (1/δ) E[ (η_t(X_0 + τ_t Z) − X_0)^2 ],

where X_0 is distributed as a single signal entry, Z ~ N(0, 1) is independent of X_0, σ^2 is the measurement noise variance, and η_t is the denoiser applied at iteration t.
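The recursion can be checked numerically by running AMP alongside it. The sketch below uses the soft-threshold denoiser with threshold α·τ_t (a common choice; the value of α, the signal prior, and all problem sizes are assumptions for illustration). The Onsager correction term in the residual update is what makes the effective noise behave as fresh Gaussian noise, which is exactly the property the conditioning argument establishes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative problem instance (all parameter values are assumptions)
N, delta, sigma, eps, alpha = 4000, 0.5, 0.05, 0.1, 1.5
M = int(delta * N)
x0 = rng.normal(size=N) * (rng.random(N) < eps)
A = rng.normal(size=(M, N)) / np.sqrt(M)
y = A @ x0 + sigma * rng.normal(size=M)

def eta(u, theta):
    # Soft-threshold denoiser; eta'(u) = 1 where |u| > theta
    return np.sign(u) * np.maximum(np.abs(u) - theta, 0.0)

x, z = np.zeros(N), y.copy()
tau2 = sigma**2 + np.mean(x0**2) / delta   # state-evolution initialization
for t in range(20):
    theta = alpha * np.sqrt(tau2)
    pseudo = x + A.T @ z                   # effective observation of x0
    x_new = eta(pseudo, theta)
    # Onsager term: (1/delta) * average of eta'(pseudo) over coordinates
    onsager = np.mean(np.abs(pseudo) > theta) / delta
    z = y - A @ x_new + onsager * z
    x = x_new
    # Scalar state-evolution update, expectation by Monte Carlo
    Z = rng.normal(size=100_000)
    X0 = rng.normal(size=100_000) * (rng.random(100_000) < eps)
    tau2 = sigma**2 + np.mean((eta(X0 + np.sqrt(tau2) * Z, theta) - X0)**2) / delta

mse_empirical = np.mean((x - x0) ** 2)
```

After a few iterations the empirical mean-square error of the AMP iterate closely tracks (τ_t^2 − σ^2)·δ, the state-evolution prediction; the paper proves this agreement becomes exact as N → ∞.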

