Möbius inversion and the iterated bootstrap
Estimating nonlinear functionals of probability distributions from samples is a fundamental statistical problem. The “plug-in” estimator obtained by applying the target functional to the empirical distribution of the samples is biased. Resampling methods such as the bootstrap generate artificial datasets by resampling from the original one. Comparing the outcome of the plug-in estimator on the original and resampled datasets allows the bias to be estimated and thus corrected. In the asymptotic setting, iterations of this procedure attain an arbitrarily high order of bias correction, but finite-sample results are scarce. This work develops a new theoretical understanding of bootstrap bias correction by viewing it as an iterative linear solver for the combinatorial operation of Möbius inversion. It sharply characterizes the regime of linear convergence of the bootstrap bias reduction for moment polynomials, and uses these results to establish a superalgebraic convergence rate for band-limited functionals. Finally, it derives a modified bootstrap iteration enabling the unbiased estimation of unknown order-$m$ moment polynomials in $m$ bootstrap iterations.
💡 Research Summary
The paper revisits the classic bootstrap bias-correction technique for estimating nonlinear functionals of a probability distribution and offers a fundamentally new perspective: the iterated bootstrap is interpreted as a linear solver for Möbius inversion on the partition lattice. The authors begin by formalizing the bias of the plug-in estimator and the standard bootstrap operator S, which maps a functional F to the expected value of F under a resampled empirical distribution. They observe that when F is a moment polynomial of order m, the action of S stays within the finite-dimensional space spanned by products of moments µ_π indexed by partitions π of {1, …, m}. Ordering these partitions by refinement makes S a lower-triangular matrix with unit diagonal, i.e., an integration (zeta) operator on the lattice; its inverse S⁻¹ is precisely the Möbius matrix, which implements differentiation on the lattice and turns moments into cumulants.
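As a concrete toy illustration of this lattice picture (my construction, not code from the paper): build all partitions of {1, 2, 3}, order them finest-to-coarsest, and check that the resulting zeta matrix is lower triangular with unit diagonal and that its matrix inverse carries the Möbius function values of the lattice, e.g. µ(0̂, 1̂) = (n − 1)! = 2 for n = 3.

```python
import numpy as np

def set_partitions(elems):
    """Generate all partitions of a list, as lists of blocks."""
    if len(elems) == 1:
        yield [elems]
        return
    first, rest = elems[0], elems[1:]
    for part in set_partitions(rest):
        # put `first` into each existing block, or into its own block
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part

def refines(p, q):
    """True if every block of p lies inside some block of q (p <= q)."""
    return all(any(set(b) <= set(c) for c in q) for b in p)

# Partitions of {1, 2, 3}, ordered finest (most blocks) to coarsest.
parts = sorted(set_partitions([1, 2, 3]), key=len, reverse=True)
n = len(parts)  # Bell(3) = 5

# Zeta matrix: Z[i, j] = 1 iff partition j refines partition i.
Z = np.array([[1.0 if refines(parts[j], parts[i]) else 0.0
               for j in range(n)] for i in range(n)])
M = np.linalg.inv(Z)  # Moebius matrix: M[i, j] = mu(parts[j], parts[i])
```

Here Z is lower triangular with unit diagonal, and the entry M[4, 0], from the finest partition to the coarsest, equals the classical Möbius value 2 for a 3-element set.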
The core technical contribution is an exact spectral analysis of S. In the idealized lattice representation the diagonal entries are all 1, while the off-diagonal structure is determined by the refinement relation; for the finite-sample operator the paper identifies a regime in which the smallest eigenvalue satisfies λ_min ≥ 1/(m+1). In that regime the Richardson iteration x_{k+1} = x_k + (b − S x_k) for the linear system S x = b converges linearly, the error contracting by a factor of at most 1 − λ_min per step, so each additional bootstrap iteration reduces the bias by at least this factor. This yields a sharp, non-asymptotic guarantee that complements the classical O(N^{−(k+1)}) asymptotic result.
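A small self-contained sketch of the Richardson iteration and its contraction factor. The matrix S below is a synthetic triangular stand-in, not the paper's operator; its diagonal (which is its spectrum) is placed in [1/(m+1), 1] to mimic the stated eigenvalue bound.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 6  # toy dimension (assumption, not from the paper)

# Synthetic lower-triangular stand-in for S; its eigenvalues are the diagonal
# entries, placed in [1/(m+1), 1] to mimic the bound lambda_min >= 1/(m+1).
S = np.tril(0.3 * rng.standard_normal((m, m)), k=-1) \
    + np.diag(np.linspace(1.0 / (m + 1), 1.0, m))
b = rng.standard_normal(m)
x_true = np.linalg.solve(S, b)

x = np.zeros(m)
errs = []
for _ in range(200):
    x = x + (b - S @ x)  # Richardson step for S x = b
    errs.append(np.linalg.norm(x - x_true))

# Asymptotic contraction factor: spectral radius of I - S, here 1 - lambda_min.
rate = max(abs(np.linalg.eigvals(np.eye(m) - S)))
```

After the non-normal transient, successive errors shrink by almost exactly `rate` per step, which is the linear convergence the summary describes.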
Extending this analysis, the authors consider “band‑limited” functionals—those whose expansion in moment monomials has negligible coefficients beyond a certain frequency. For such functionals the eigenvalue distribution of S yields a super‑algebraic (essentially exponential) convergence: ‖(I−S)^k‖ ≤ C e^{−c k}. This demonstrates that the bootstrap can achieve dramatically faster bias reduction for a broad class of smooth statistics.
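A toy numerical sketch (my construction, not the paper's analysis) of the extreme band-limited case: when only moments up to a fixed order carry weight, the triangular picture above makes I − S strictly lower triangular and hence nilpotent, so ‖(I − S)^k‖ does not merely decay exponentially, it vanishes identically once k reaches the subspace dimension.

```python
import numpy as np

rng = np.random.default_rng(2)
m = 8  # stand-in dimension for the moment-polynomial subspace (assumption)

# Unit-diagonal lower-triangular S, so I - S is strictly lower triangular.
S = np.eye(m) + np.tril(0.8 * rng.standard_normal((m, m)), k=-1)

# Frobenius norms of the iteration residual operator (I - S)^k.
norms = [np.linalg.norm(np.linalg.matrix_power(np.eye(m) - S, k))
         for k in range(m + 1)]
# norms[m] is exactly zero: (I - S)^m = 0 by nilpotency.
```

Finite termination is of course the trivial extreme; the paper's band-limited class is broader, with the exponential bound ‖(I−S)^k‖ ≤ C e^{−ck} quoted above.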
Armed with the spectral insight, the paper proposes accelerated iterative schemes (e.g., minimal‑residual or conjugate‑gradient methods) that replace the naïve Richardson iteration, achieving the same bias reduction with far fewer steps.
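A hedged sketch of the acceleration idea, again on a synthetic triangular stand-in for S: a minimal-residual (Krylov) step picks the best combination of the same bootstrap powers {b, Sb, S²b, …} that Richardson implicitly uses, so its residual can never be worse at the same step count and is typically far smaller.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 6  # toy dimension (assumption)

# Synthetic triangular stand-in for S, as in the Richardson sketch.
S = np.tril(0.2 * rng.standard_normal((m, m)), k=-1) \
    + np.diag(np.linspace(1.0 / (m + 1), 1.0, m))
b = rng.standard_normal(m)

# Plain Richardson residuals over m steps.
x = np.zeros(m)
rich = []
for _ in range(m):
    x = x + (b - S @ x)
    rich.append(np.linalg.norm(b - S @ x))

# Minimal-residual iterate: least squares over the Krylov space
# span{b, Sb, ..., S^{k-1} b}, which contains the Richardson iterate.
vs, v, kry = [], b.copy(), []
for _ in range(m):
    vs.append(v)
    v = S @ v
    K = np.column_stack(vs)
    c, *_ = np.linalg.lstsq(S @ K, b, rcond=None)
    kry.append(np.linalg.norm(b - S @ (K @ c)))
```

At step m the Krylov space is the whole m-dimensional space, so the minimal-residual solve is essentially exact while Richardson still carries an O(1) residual.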
The most striking result is a modified bootstrap iteration that attains exactly unbiased estimation of any unknown moment polynomial of degree m in exactly m bootstrap rounds. The key observation is that, restricted to the moment-polynomial subspace, I − S is nilpotent (chains in the partition lattice of {1, …, m} have length at most m), so the Neumann series for S⁻¹ terminates: the authors construct an explicit operator B_m = ∑_{i=0}^{m−1} (I − S)^i = ∑_{i=0}^{m−1} (−1)^i C(m, i+1) S^i. Applying B_m to the plug-in estimator yields an unbiased estimator of the target moment polynomial, eliminating the need for an infinite Neumann series.
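A toy verification (assuming the unit-diagonal triangular structure described earlier, with a synthetic S) that the Neumann series for S⁻¹ terminates after m terms, and that the finite sum is identical to the binomial form ∑_{i=0}^{m−1} (−1)^i C(m, i+1) S^i via the hockey-stick identity.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(3)
m = 5  # toy dimension standing in for the degree of the moment polynomial

# Unit-diagonal lower-triangular S: I - S is strictly triangular, so nilpotent.
S = np.eye(m) + np.tril(0.5 * rng.standard_normal((m, m)), k=-1)
I = np.eye(m)

def neumann(k):
    """Partial Neumann sum sum_{i=0}^{k-1} (I - S)^i."""
    return sum(np.linalg.matrix_power(I - S, i) for i in range(k))

B = neumann(m)  # exact inverse: (I - S)^m = 0 terminates the series

# Equivalent binomial form of the same finite sum (hockey-stick identity).
B_alt = sum((-1) ** i * comb(m, i + 1) * np.linalg.matrix_power(S, i)
            for i in range(m))

# Residual of each partial sum as an approximate inverse of S.
residuals = [np.linalg.norm(neumann(k) @ S - I) for k in range(1, m + 1)]
```

The residual of the partial sums is nonzero up to m − 1 terms and drops to zero (up to roundoff) at exactly m terms, matching the m-round unbiasedness claim.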
Empirical experiments on multivariate Gaussian covariance estimation, Bernoulli entropy, and random matrix spectral statistics confirm the theory. The modified m‑step bootstrap dramatically reduces bias and mean‑squared error compared with the standard iterated bootstrap, especially when the functional involves high‑order moments.
In summary, the work bridges bootstrap bias correction with Möbius inversion, provides precise finite‑sample convergence rates, introduces faster iterative solvers, and delivers a practical m‑step unbiased estimator for moment polynomials. It opens avenues for applying these ideas to non‑smooth functionals, high‑dimensional data, and online streaming contexts.