Fused $L_{1/2}$ prior for large scale linear inverse problem with Gibbs bouncy particle sampler
In this paper, we study a Bayesian approach to solving large-scale linear inverse problems arising in various scientific and engineering fields. We propose a fused $L_{1/2}$ prior with edge-preserving and sparsity-promoting properties and show that it can be formulated as a Gaussian mixture Markov random field. Since the density function of this family of priors is neither log-concave nor Lipschitz, gradient-based Markov chain Monte Carlo methods cannot be applied to sample the posterior. We therefore present a Gibbs sampler in which all the conditional posteriors involved have closed-form expressions. The Gibbs sampler works well for small problems but becomes computationally intractable for large-scale ones because of the need to sample from a high-dimensional Gaussian distribution. To reduce the computational burden, we construct a Gibbs bouncy particle sampler (Gibbs-BPS) based on a piecewise deterministic Markov process. This new sampler combines elements of the Gibbs sampler with the bouncy particle sampler, and its computational complexity is an order of magnitude smaller. We show that the new sampler converges to the target distribution. With computed tomography examples, we demonstrate that the proposed method is competitive with popular existing Bayesian methods and is highly efficient for large-scale problems.
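To make the abstract's second ingredient concrete, the following is a minimal, hedged sketch of a plain bouncy particle sampler (not the paper's Gibbs-BPS) targeting a standard Gaussian $N(0, I)$, where the potential is $U(x)=\|x\|^2/2$ and the bounce rate along a ray admits a closed-form event-time draw. All function and parameter names here are our own illustrative choices.

```python
import numpy as np

def bps_standard_gaussian(n_events=20000, dim=2, lam_ref=1.0, seed=0):
    """Minimal bouncy particle sampler (BPS) targeting N(0, I), where
    U(x) = ||x||^2 / 2 and grad U(x) = x.  Illustrative sketch only,
    not the paper's Gibbs-BPS.  Returns the time-averaged mean and
    second moment computed exactly along the piecewise-linear path."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    v = rng.standard_normal(dim)
    total_t, m1, m2 = 0.0, np.zeros(dim), np.zeros(dim)
    for _ in range(n_events):
        # Bounce rate along the ray: lambda(t) = max(0, <grad U(x+tv), v>)
        #                                      = max(0, a + b t),
        # with a = <x, v> and b = <v, v>; invert its integrated hazard.
        a, b = x @ v, v @ v
        e = rng.exponential()
        if a >= 0:
            t_bounce = (np.sqrt(a * a + 2.0 * b * e) - a) / b
        else:  # rate is zero until t = -a/b, then grows linearly
            t_bounce = -a / b + np.sqrt(2.0 * e / b)
        t_ref = rng.exponential(1.0 / lam_ref)  # velocity refreshment clock
        t = min(t_bounce, t_ref)
        # Accumulate exact segment integrals of x and x^2 (componentwise)
        total_t += t
        m1 += x * t + v * t**2 / 2.0
        m2 += x**2 * t + x * v * t**2 + v**2 * t**3 / 3.0
        x = x + t * v  # deterministic linear drift between events
        if t_ref < t_bounce:
            v = rng.standard_normal(dim)  # refresh: draw a fresh velocity
        else:
            g = x  # grad U at the bounce point
            v = v - 2.0 * (v @ g) / (g @ g) * g  # specular reflection
    return m1 / total_t, m2 / total_t

mean, second_moment = bps_standard_gaussian()
# For N(0, I), the trajectory averages should approach 0 and 1.
```

Note that estimates are averaged along the continuous trajectory rather than at event times, since event-time points alone are not distributed according to the target.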
💡 Research Summary
This paper addresses the challenge of Bayesian inference for large-scale linear inverse problems, such as computed tomography (CT) image reconstruction, where an unknown image must be recovered from noisy, incomplete measurements. Traditional regularisation techniques (e.g., L1, total variation, horseshoe) promote sparsity or preserve edges, but many of them correspond to non-convex, non-Lipschitz priors that rule out gradient-based Markov chain Monte Carlo (MCMC) methods.
The authors introduce a fused L1/2 prior that simultaneously enforces sparsity on pixel values and edge‑preserving smoothness on horizontal and vertical differences. Mathematically the prior takes the form
$$\pi(x) \;\propto\; \exp\Big\{-\lambda \sum_i |x_i|^{1/2} \;-\; \delta \sum_i \big(|(D_h x)_i|^{1/2} + |(D_v x)_i|^{1/2}\big)\Big\},$$
where $D_h$ and $D_v$ denote the horizontal and vertical first-difference operators and $\lambda, \delta > 0$ are rate parameters (notation reconstructed here from the description above).
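A prior of this fused $L_{1/2}$ type, with an $L_{1/2}$ term on pixel values plus $L_{1/2}$ terms on horizontal and vertical first differences, can be sketched as a negative log-density (up to a constant). The function name and the parameterisation via `lam` and `delta` are our own illustrative assumptions:

```python
import numpy as np

def fused_l_half_penalty(x, lam=1.0, delta=1.0):
    """Negative log-density (up to an additive constant) of a fused
    L_{1/2}-type prior on a 2-D image x: an L_{1/2} term on pixel
    values plus L_{1/2} terms on horizontal and vertical first
    differences.  Hypothetical parameterisation for illustration."""
    x = np.asarray(x, dtype=float)
    sparsity = np.sum(np.abs(x) ** 0.5)  # sparsity on pixel values
    dh = np.diff(x, axis=1)              # horizontal differences
    dv = np.diff(x, axis=0)              # vertical differences
    fusion = np.sum(np.abs(dh) ** 0.5) + np.sum(np.abs(dv) ** 0.5)
    return lam * sparsity + delta * fusion

# A constant image incurs no difference penalty, only the pixel term:
img = np.ones((4, 4))
print(fused_l_half_penalty(img))  # 16 pixels * |1|^{1/2} + 0 = 16.0
```

Note the non-convexity: each $|\cdot|^{1/2}$ term has unbounded slope at zero, which is exactly why gradient-based MCMC is inapplicable and the paper turns to Gibbs-type samplers instead.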