A Backward Particle Interpretation of Feynman-Kac Formulae
We design a particle interpretation of Feynman-Kac measures on path spaces based on a backward Markovian representation combined with a traditional mean field particle interpretation of the flow of their final time marginals. In contrast to traditional genealogical tree-based models, these new particle algorithms can be used to compute normalized additive functionals “on-the-fly” as well as their limiting occupation measures with a given precision degree that does not depend on the final time horizon. We provide uniform convergence results w.r.t. the time horizon parameter as well as functional central limit theorems and exponential concentration estimates. We also illustrate these results in the context of computational physics and imaginary time Schrödinger-type partial differential equations, with a special interest in the numerical approximation of the invariant measure associated with $h$-processes.
💡 Research Summary
The paper introduces a novel particle algorithm for approximating Feynman‑Kac measures on path spaces. Traditional particle methods rely on genealogical trees: particles are propagated forward, and their ancestral lines are tracked to reconstruct the Feynman‑Kac distribution on path space. While mathematically sound, this approach suffers from two practical drawbacks. First, additive functionals (e.g., time‑integrated observables) can only be evaluated after the whole trajectory has been simulated, which forces a post‑processing step and incurs a memory cost proportional to the time horizon. Second, the variance of the genealogical estimator grows with the horizon, so achieving a prescribed accuracy for long‑time problems requires an impractically large particle population.
To overcome these limitations, the authors combine two ideas. (i) They retain the classic mean‑field particle interpretation for the flow of the final‑time marginals, which guarantees the usual law‑of‑large‑numbers and central‑limit behavior when the number of particles N → ∞. (ii) They introduce a backward Markov representation of the Feynman‑Kac semigroup. In this representation, each particle at the current time step is associated with a backward transition kernel that describes how the particle could have arrived from a previous state, weighted by the potential function. By sampling these backward kernels, one can update the contribution of additive functionals “on‑the‑fly” without storing the whole trajectory.
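Schematically, backward Markov representations of this type use a kernel of the following form (a sketch consistent with the description above; the paper's exact notation and normalization may differ):

```latex
\mathbb{M}_n(x_n,\,dx_{n-1})
  \;=\;
  \frac{\eta_{n-1}(dx_{n-1})\, G_{n-1}(x_{n-1})\, H_n(x_{n-1}, x_n)}
       {\int \eta_{n-1}(dy)\, G_{n-1}(y)\, H_n(y, x_n)}
```

Here η_{n-1} is the time‑(n−1) marginal of the flow, G_{n-1} the potential, and H_n(y, x) the density of the forward kernel M_n(y, dx). Replacing η_{n-1} by the empirical measure of the particle cloud turns this kernel into a discrete distribution over the current particles, which is precisely what the backward sampling step draws from.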
The algorithm proceeds as follows. At each discrete time n, a set of N particles {Xⁱₙ} is available together with normalized weights proportional to the product of the potential functions up to time n. A backward kernel Kₙ(x,·) is constructed from the forward Markov kernel Mₙ and the potential Gₙ₋₁. Each particle draws a predecessor Yⁱₙ₋₁ ∼ Kₙ(Xⁱₙ,·). The additive functional Aₙ = Σ_{k=0}^{n-1} f_k(X_k) is then updated by adding f_{n-1}(Yⁱₙ₋₁) with the appropriate weight. Because the backward draws already incorporate the particle weights, the estimator of the normalized additive functional remains consistent at every step, and its variance does not accumulate with n.
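The backward step described above can be sketched as follows for a model whose forward kernel has a known density. This is a minimal illustration, not the paper's implementation; the names `backward_step`, `M_density`, `G`, and `f` are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def backward_step(prev_particles, cur_particles, A_prev, M_density, G, f):
    """One on-the-fly update of the additive functional A_n = sum_k f_k(X_k).

    prev_particles : (N,) particle states at time n-1
    cur_particles  : (N,) particle states at time n
    A_prev         : (N,) running additive-functional values attached to
                     the time-(n-1) particles
    M_density      : M_density(y, x) = density of the forward kernel M_n(y, dx)
    G              : potential function G_{n-1}
    f              : summand f_{n-1} of the additive functional
    """
    N = len(cur_particles)
    A_new = np.empty(N)
    for i, x in enumerate(cur_particles):
        # Backward kernel over the previous cloud: predecessor j is chosen
        # with probability proportional to G_{n-1}(X^j) * M_n(X^j, x).
        w = np.array([G(y) * M_density(y, x) for y in prev_particles])
        w /= w.sum()
        j = rng.choice(N, p=w)
        # Update the running sum without storing the whole trajectory.
        A_new[i] = A_prev[j] + f(prev_particles[j])
    return A_new
```

Only the running values `A_prev` and the current cloud are kept in memory, which is what makes the additive functional available "on-the-fly" with O(N) storage.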
The theoretical contributions are substantial. The authors prove uniform Lᵖ convergence of the particle approximations with respect to the time horizon T: for any fixed N, the error bound does not deteriorate as T grows. They establish a functional central limit theorem (FCLT) showing that the fluctuation process of the normalized additive functional converges to a Gaussian process with an explicitly characterized covariance operator. Moreover, using spectral gap properties of the underlying Markov kernel, they derive exponential concentration inequalities of the form P(|estimate – true| ≥ ε) ≤ C exp(−c N ε²), where the constants C and c are independent of T. These results guarantee that a moderate particle size yields high‑probability accuracy even for very long simulations.
The paper also presents a detailed numerical study. The authors apply the backward particle scheme to the imaginary‑time Schrödinger equation, which can be written as a Feynman‑Kac representation of the ground‑state wavefunction. They focus on the computation of the invariant measure of an h‑process, a diffusion conditioned on survival under a potential. Compared with a standard forward‑only particle filter, the backward algorithm achieves a two‑ to three‑fold reduction in mean‑square error for the same particle budget, and the error remains essentially constant when the simulation horizon is increased by an order of magnitude. The experiments confirm the theoretical uniform convergence and illustrate the practical advantage of on‑the‑fly additive functional estimation.
Implementation details are discussed: the backward kernel can be evaluated analytically for many common models (e.g., Gaussian random walks, Langevin dynamics) or approximated via importance sampling when closed forms are unavailable. Resampling is performed only when the effective sample size falls below a threshold, preserving diversity while keeping computational cost linear in N. The overall per‑step complexity is O(N), and memory usage scales as O(N), making the method suitable for large‑scale parallelization on GPUs or distributed clusters.
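The ESS-triggered resampling rule mentioned above can be sketched as follows; the function name and the default threshold of 0.5·N are illustrative choices, not taken from the paper:

```python
import numpy as np

def maybe_resample(particles, weights, threshold=0.5, rng=None):
    """Multinomial resampling triggered by the effective sample size (ESS).

    Resamples only when ESS = 1 / sum(w_i^2) drops below threshold * N,
    preserving particle diversity while keeping the cost linear in N.
    Returns the (possibly resampled) particles and normalized weights.
    """
    rng = rng or np.random.default_rng()
    N = len(particles)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    ess = 1.0 / np.sum(w ** 2)
    if ess < threshold * N:
        # Degenerate weights: draw N indices with probability w and
        # reset to uniform weights.
        idx = rng.choice(N, size=N, p=w)
        return particles[idx], np.full(N, 1.0 / N)
    return particles, w
```

With uniform weights the ESS equals N and no resampling occurs; as the weights degenerate, the ESS approaches 1 and resampling is triggered.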
In conclusion, the work provides a rigorous and computationally efficient framework for particle approximations of Feynman‑Kac measures that eliminates the dependence of estimator precision on the time horizon. By exploiting a backward Markov representation, the algorithm enables real‑time computation of normalized additive functionals and their occupation measures with provable uniform convergence, functional CLTs, and exponential concentration. The methodology opens new avenues for long‑time stochastic simulation in statistical physics, quantum Monte Carlo, Bayesian inference, and any domain where Feynman‑Kac formulas arise.