Bayesian Conditional Monte Carlo Algorithms for Sequential Single and Multi-Object Filtering
Bayesian filtering aims at sequentially tracking a hidden process from an observed one. In particular, sequential Monte Carlo (SMC) techniques propagate weighted trajectories over time to represent the posterior probability density function (pdf) of the hidden process given the available observations. On the other hand, Conditional Monte Carlo (CMC) is a variance reduction technique which replaces the estimator of a moment of interest by its conditional expectation given another variable. In this paper we show that, up to some adaptations, one can exploit the time-recursive nature of SMC algorithms to propose natural temporal CMC estimators of some point estimates of the hidden process, which outperform the associated crude Monte Carlo (MC) estimator regardless of the number of samples. We next show that our Bayesian CMC estimators can be computed exactly, or approximated efficiently, in some hidden Markov chain (HMC) models; in some jump Markov state-space systems (JMSS); as well as in multitarget filtering. Finally, our algorithms are validated via simulations.
💡 Research Summary
Bayesian filtering seeks to infer a hidden dynamic process from noisy observations, a task that underpins applications ranging from navigation to target tracking. Sequential Monte Carlo (SMC) methods, commonly known as particle filters, approximate the posterior density by propagating a set of weighted particles through time. While flexible, SMC suffers from weight degeneracy: with a finite number of particles, the weight mass progressively concentrates on a few particles, inflating estimator variance. Traditional variance-reduction techniques such as resampling or sophisticated proposal distributions alleviate but do not eliminate this problem.
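To make the setting concrete, here is a minimal bootstrap particle filter on a toy linear-Gaussian state-space model. This is an illustrative sketch only; the model, parameters, and resampling scheme are generic choices, not the paper's algorithms.

```python
import numpy as np

# Toy linear-Gaussian model (illustrative assumption, not from the paper):
#   x_t = 0.9 x_{t-1} + v_t,   y_t = x_t + w_t,   v_t, w_t ~ N(0, 1).
rng = np.random.default_rng(0)
T, N = 50, 500

# Simulate a hidden trajectory and its noisy observations.
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
y = x + rng.standard_normal(T)

# Bootstrap filter: propagate, weight by the likelihood, resample.
particles = rng.standard_normal(N)
est = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + rng.standard_normal(N)
    logw = -0.5 * (y[t] - particles) ** 2      # Gaussian log-likelihood (up to a constant)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * particles)             # weighted estimate of E[x_t | y_{1:t}]
    idx = rng.choice(N, size=N, p=w)           # multinomial resampling fights degeneracy
    particles = particles[idx]

print(np.mean((est - x) ** 2))                 # filtered mean-squared error
```

Resampling keeps the particle cloud from collapsing onto a single trajectory, but it only mitigates weight degeneracy; the variance-reduction idea discussed next attacks the estimator itself.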
Conditional Monte Carlo (CMC) is a statistical device that replaces a crude Monte Carlo estimator of a moment by its conditional expectation given an auxiliary variable, thereby reducing variance by the Rao-Blackwell theorem. The authors observe that the recursive nature of SMC provides a natural auxiliary variable: the particle set at the previous time step. By exploiting this, they construct a "temporal CMC" estimator for any point estimate of the hidden state, e.g., the posterior mean or mode. Concretely, for a function f(x_t) of the current state, the desired posterior expectation is θ_t = E[f(x_t) | y_{1:t}], the mean of f under the filtering distribution.
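The core CMC mechanism can be seen in a deliberately simple setting, unrelated to the paper's temporal estimators: we estimate E[Y²] where X ~ N(0, 1) and Y | X ~ N(X, 1), so that E[Y² | X] = X² + 1 is available in closed form and the true value is 2.

```python
import numpy as np

# Crude MC vs. conditional MC (Rao-Blackwellization) on a toy problem.
# Assumed setup: X ~ N(0,1), Y | X ~ N(X,1); target E[Y^2] = E[X^2] + 1 = 2.
rng = np.random.default_rng(1)
N = 10_000
X = rng.standard_normal(N)
Y = X + rng.standard_normal(N)

crude = np.mean(Y ** 2)         # crude estimator: average the raw samples
cmc = np.mean(X ** 2 + 1.0)     # CMC estimator: average the closed-form E[Y^2 | X]

# By the law of total variance, Var(E[Y^2 | X]) <= Var(Y^2), so the CMC
# estimator's variance can never exceed the crude estimator's.
print(crude, cmc)
```

The paper's contribution is to make this conditioning step work recursively inside an SMC filter, where the previous particle set plays the role of X above and the conditional expectations can be computed exactly or approximated efficiently in HMC, JMSS, and multitarget settings.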