Large-Flip Importance Sampling
We propose a new Monte Carlo algorithm for complex discrete distributions. The algorithm is motivated by the N-Fold Way, which is an ingenious event-driven MCMC sampler that avoids rejection moves at any specific state. The N-Fold Way can however get “trapped” in cycles. We surmount this problem by modifying the sampling process. This correction does introduce bias, but the bias is subsequently corrected with a carefully engineered importance sampler.
💡 Research Summary
The paper introduces Large‑Flip Importance Sampling (LF‑IS), a novel Monte Carlo algorithm designed to efficiently sample from complex discrete probability distributions. The work builds on the N‑Fold Way (NFW), an event‑driven MCMC technique that eliminates rejection moves by pre‑computing the total transition probability from the current state and selecting the next state proportionally. While NFW is attractive for its rejection‑free property, it suffers from a critical limitation: it can become trapped in short cycles when the state‑space graph contains a small set of mutually reachable states separated by high‑energy barriers. In such cases the sampler repeatedly visits the same cycle, leading to poor exploration and biased estimates.
To overcome this, the authors propose a “large‑flip” mechanism. Instead of flipping a single variable (or a tiny subset) at each iteration, LF‑IS flips a block of k variables simultaneously. The variable set is partitioned into blocks of size k, and for each block the joint transition probabilities are pre‑computed, taking full account of intra‑block interactions. By moving a Hamming distance of k in a single step, the sampler can leap over energy barriers that would otherwise confine NFW, dramatically reducing the probability of cycle entrapment.
Because the block‑wise proposal distribution q(x′|x) generally differs from the target distribution π(x), a bias is introduced. The authors correct this bias using importance sampling: each transition is weighted by w = π(x′)/q(x′|x). The product of these weights along a sampled trajectory yields an unbiased estimator of any expectation under π, provided the weights have finite variance. The paper derives conditions under which the variance remains bounded and shows how to choose the block size k and block selection probabilities to minimize weight variance.
Theoretical contributions include proofs that (1) LF‑IS preserves the Markov property while achieving faster state‑space coverage, (2) the weighted estimator converges almost surely to the true expectation, and (3) a central limit theorem holds when weight variance is controlled. The authors also present an analysis of computational complexity, demonstrating that the overhead of computing block transition probabilities is offset by the reduction in required iterations.
Empirical evaluation covers three benchmark problems: binary image denoising with a Markov random field, a spin‑glass model with rugged energy landscape, and a high‑dimensional labeling task with many pairwise constraints. In all cases LF‑IS outperforms standard NFW and Gibbs sampling in terms of effective sample size per unit time and accuracy of posterior estimates. The importance‑weighted estimates match ground‑truth expectations, confirming the correctness of the bias correction.
In summary, the paper delivers a compelling solution to the cycle‑trapping issue of the N‑Fold Way by introducing simultaneous multi‑variable flips and rigorously correcting the induced bias through importance sampling. This combination yields an unbiased, efficient sampler for a broad class of discrete models, opening avenues for further research on adaptive block selection, dynamic flip sizes, and extensions to hybrid continuous‑discrete spaces.