An ILUES-based adaptive Gaussian process method for multimodal Bayesian inverse problems

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the original arXiv source.

Inverse problems are prevalent in both scientific research and engineering applications. In the context of Bayesian inverse problems, sampling from the posterior distribution can be particularly challenging when the forward models are computationally expensive. This challenge is further compounded when the posterior distribution is multimodal. To address this issue, we propose a Gaussian process (GP)-based method to indirectly build surrogates for the forward model. Specifically, the unnormalized posterior density is expressed as the product of an auxiliary density and an exponential GP surrogate. Iteratively, the auxiliary density converges to the posterior distribution, starting from an arbitrary initial density. However, the efficiency of GP regression is highly influenced by the quality of the training data. Therefore, we utilize the iterative local updating ensemble smoother (ILUES) to generate high-quality samples that are concentrated in regions with high posterior probability. Subsequently, based on the surrogate model and mode information extracted using a clustering method, Markov chain Monte Carlo (MCMC) with a Gaussian mixture (GM) proposal is used to draw samples from the auxiliary density. Through numerical examples, we demonstrate that the proposed method can accurately and efficiently represent the posterior with a limited number of forward simulations.


💡 Research Summary

This paper addresses two major challenges that arise in Bayesian inverse problems (BIPs) when the forward model is computationally expensive and the posterior distribution is multimodal: (1) the high cost of evaluating the forward model at every iteration of a Markov chain Monte Carlo (MCMC) sampler, and (2) the tendency of standard MCMC algorithms to become trapped in a single mode, failing to explore the full posterior landscape. To overcome these difficulties, the authors propose a three‑stage framework that combines an adaptive Gaussian process (GP) surrogate, the Iterative Local Updating Ensemble Smoother (ILUES), and a Gaussian‑mixture proposal distribution derived from clustering the ILUES samples.
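The third stage above (clustering the ILUES ensemble and building a Gaussian-mixture proposal from the discovered modes) can be sketched in plain NumPy. The two-mode toy ensemble, the simple k-means routine, and all names below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble concentrated around two posterior modes (an assumed stand-in
# for samples produced by ILUES in the paper's framework).
samples = np.vstack([
    rng.normal([-2.0, 0.0], 0.3, size=(200, 2)),
    rng.normal([2.0, 0.0], 0.3, size=(200, 2)),
])

def kmeans(x, k, iters=50):
    """Minimal k-means; initial centers taken from opposite ends of the data
    so the two well-separated clusters are found deterministically."""
    centers = x[[0, len(x) - 1]].copy()
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(samples, k=2)

# Mixture components: empirical weight, mean, and covariance per cluster.
weights = np.array([(labels == j).mean() for j in range(2)])
covs = [np.cov(samples[labels == j].T) for j in range(2)]

def sample_gm(n):
    """Draw n proposals from the fitted Gaussian-mixture proposal."""
    comp = rng.choice(2, size=n, p=weights)
    return np.array([rng.multivariate_normal(centers[j], covs[j]) for j in comp])

proposals = sample_gm(1000)
```

Proposals drawn this way can jump between modes in a single MCMC step, which is what lets the sampler escape the single-mode trapping described above.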

The first component rewrites the unnormalized posterior π̃(θ|d) as the product of an auxiliary density p(θ) and an exponential GP surrogate: π̃(θ|d) = p(θ)·exp(f(θ)), where f(θ) = log(π̃(θ|d)/p(θ)) is the log-ratio approximated by the GP.
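A minimal 1-D illustration of this factorization, assuming a toy bimodal posterior and a broad Gaussian auxiliary density (both invented for this sketch): fit a GP mean to f(θ) = log(π̃(θ)/p(θ)) at a few design points, then recover the unnormalized posterior as p(θ)·exp(f̂(θ)).

```python
import numpy as np

def pi_tilde(theta):
    # Toy bimodal unnormalized posterior (an assumption for illustration).
    return np.exp(-0.5 * (theta - 2.0) ** 2) + np.exp(-0.5 * (theta + 2.0) ** 2)

def p_aux(theta):
    # Auxiliary density: a broad zero-mean Gaussian (an arbitrary initial choice).
    return np.exp(-0.5 * (theta / 4.0) ** 2) / (4.0 * np.sqrt(2 * np.pi))

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

# Training data: the log-ratio f at a handful of design points.
theta_tr = np.linspace(-5, 5, 12)
f_tr = np.log(pi_tilde(theta_tr)) - np.log(p_aux(theta_tr))

# GP posterior mean with zero prior mean and a small nugget for stability.
K = rbf(theta_tr, theta_tr) + 1e-8 * np.eye(len(theta_tr))
alpha = np.linalg.solve(K, f_tr)

theta_te = np.linspace(-5, 5, 201)
f_hat = rbf(theta_te, theta_tr) @ alpha

# Surrogate unnormalized posterior: p(theta) * exp(f_hat(theta)).
pi_hat = p_aux(theta_te) * np.exp(f_hat)
```

Because the GP only has to capture the smooth log-ratio rather than the forward model itself, a small design set already reproduces both posterior modes; in the paper's iterative scheme, p(θ) is then updated toward the posterior and the GP is refit on ILUES-selected points.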

