Estimation of instrument and noise parameters for inverse problem based on prior diffusion model
This article addresses the estimation of observation parameters (response and error parameters) in inverse problems. The focus is on cases where regularization is introduced in a Bayesian framework and the prior is modeled by a diffusion process. In this context, posterior sampling is notoriously difficult, but a recent paper proposes a notably simple and effective solution, which in turn offers remarkable additional flexibility for estimating observation parameters. The proposed strategy defines an optimal estimator for both the observation parameters and the image of interest, and also provides a means of quantifying uncertainty. In addition, MCMC algorithms allow for the efficient computation of estimates and properties of the posteriors, while offering some guarantees. The paper presents several numerical experiments that clearly confirm the computational efficiency and the quality of both the estimates and the uncertainty quantification.
💡 Research Summary
The paper tackles the joint estimation of observation parameters (instrument response ι and noise parameters η = {mₑ, vₑ}) together with the latent image x₀ in a linear inverse problem y = H(ι) x₀ + e. While Bayesian regularization with diffusion‑based priors (e.g., denoising diffusion probabilistic models) has shown impressive expressive power for image reconstruction, the non‑linear nature of the diffusion prior makes posterior sampling and, consequently, hyper‑parameter inference extremely challenging.
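To make the observation model concrete, here is a minimal sketch of the forward operator y = H(ι) x₀ + e, assuming (as in the paper's experiments) that H(ι) is a convolution with a Lorentzian PSF of width ι and that e is i.i.d. Gaussian noise with mean mₑ and variance vₑ. The exact PSF parameterisation used by the authors may differ; this version is only illustrative.

```python
import numpy as np

def lorentzian_psf(size, width):
    # Isotropic 2-D Lorentzian point-spread function, normalised to sum to 1.
    # (The paper's exact parameterisation of the PSF may differ.)
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = width / (2.0 * np.pi * (xx**2 + yy**2 + width**2) ** 1.5)
    return psf / psf.sum()

def forward_model(x0, iota, m_e, v_e, rng):
    # y = H(iota) x0 + e, with circular convolution implemented via FFT
    # and additive Gaussian noise e ~ N(m_e, v_e) applied pixel-wise.
    psf = np.fft.ifftshift(lorentzian_psf(x0.shape[0], iota))
    Hx = np.real(np.fft.ifft2(np.fft.fft2(x0) * np.fft.fft2(psf)))
    e = rng.normal(m_e, np.sqrt(v_e), size=x0.shape)
    return Hx + e
```

A single call, e.g. `forward_model(x0, 0.9, 0.1, 0.0025, rng)`, simulates one observation under the ground-truth settings reported in the experiments.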
To overcome this, the authors build on a recently proposed G‑DPS (Gibbs‑Diffusion Posterior Sampling) algorithm. The diffusion prior is expressed through a forward Markov chain p⁺₀:ₜ and a backward chain p⁻₀:ₜ, both consisting of Gaussian transitions. The backward chain is parameterised by a neural network µ_θₜ(·) that is trained to minimise the KL divergence between forward and backward joint distributions, thereby ensuring that the two chains are almost identical. This symmetry allows each latent variable xₜ (t = 0,…,T) to have a Gaussian conditional posterior, which can be sampled analytically using FFT (for t = 0) or simple linear combinations (for t > 0).
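As a reminder of what "Gaussian transitions" means here, a single step of the forward chain can be sketched as below, using the standard DDPM parameterisation x_t | x_{t−1} ~ N(√(1−β_t) x_{t−1}, β_t I). The learned backward mean µ_θₜ(·) is a trained network and is not reproduced here; the noise schedule β_t is an assumption of this sketch.

```python
import numpy as np

def forward_transition(x_prev, beta_t, rng):
    # One Gaussian step of the forward (noising) chain:
    #   x_t | x_{t-1} ~ N(sqrt(1 - beta_t) * x_{t-1}, beta_t * I)
    # The backward chain mirrors this with a learned mean mu_theta_t(x_t).
    return np.sqrt(1.0 - beta_t) * x_prev + np.sqrt(beta_t) * rng.normal(size=x_prev.shape)
```

With β_t = 0 the step is the identity; as β_t grows, the sample is pushed toward pure Gaussian noise, which is why the chain ends in a tractable reference distribution.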
For the observation parameters, conjugate priors are deliberately chosen: a Gaussian prior N(m₀, p₀⁻¹) for the noise mean mₑ and a Gamma prior G(a₀, b₀) for the precision γₑ = 1/vₑ. These choices render the conditional posteriors for mₑ and γₑ also Gaussian and Gamma, respectively, enabling direct Gibbs updates. The instrument parameter ι, however, receives only a uniform prior U(·), which leads to a non‑standard conditional posterior. The authors therefore embed a random‑walk Metropolis‑Hastings step with a Gaussian proposal to update ι.
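The conjugate updates above follow the textbook Normal/Gamma scheme. A sketch, assuming the residual r = y − H(ι)x₀ is available and using the rate parameterisation of the Gamma distribution (the paper's exact hyper-parameter choices are not specified here):

```python
import numpy as np

def gibbs_update_noise(residual, m0, p0, a0, b0, gamma, rng):
    # Conjugate conditional draws for the noise mean m_e and precision gamma_e,
    # given the residual r = y - H(iota) x0.
    n = residual.size
    # m_e | gamma_e, r ~ N( (p0*m0 + gamma*sum(r)) / (p0 + n*gamma), 1/(p0 + n*gamma) )
    p_n = p0 + n * gamma
    m_n = (p0 * m0 + gamma * residual.sum()) / p_n
    m = rng.normal(m_n, 1.0 / np.sqrt(p_n))
    # gamma_e | m_e, r ~ Gamma(a0 + n/2, b0 + 0.5 * sum((r - m)^2))  (rate form)
    a_n = a0 + 0.5 * n
    b_n = b0 + 0.5 * np.sum((residual - m) ** 2)
    gamma = rng.gamma(a_n, 1.0 / b_n)  # NumPy's gamma takes shape and *scale* = 1/rate
    return m, gamma
```

Because both conditionals are sampled exactly, these two updates cost essentially nothing per Gibbs iteration, which is precisely the motivation for the conjugate design.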
The full Gibbs sampler proceeds as follows: (1) sample γₑ and mₑ from their conjugate conditionals; (2) propose a new ι and accept/reject via Metropolis‑Hastings; (3) update the extended image set x₀:ₜ using the G‑DPS block‑Gibbs scheme that alternates between forward and backward diffusion conditionals. After each iteration the empirical mean of x₀ (an approximation of the posterior mean) is refreshed, and the algorithm stops when successive updates differ by less than a preset tolerance.
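Step (2) of the loop, the random-walk Metropolis‑Hastings update for ι, can be sketched as follows; `log_post` stands for the (unnormalised) conditional log-posterior of ι given everything else, and the proposal scale `step` is a tuning parameter assumed here for illustration.

```python
import numpy as np

def metropolis_step_iota(iota, log_post, step, rng):
    # Random-walk Metropolis-Hastings update with a Gaussian proposal.
    # Returns the new state and whether the proposal was accepted.
    prop = iota + step * rng.normal()
    log_alpha = log_post(prop) - log_post(iota)
    if np.log(rng.uniform()) < log_alpha:
        return prop, True
    return iota, False
```

Within the full sampler, this step is interleaved with the conjugate noise updates and the G‑DPS block update of x₀:ₜ, so ι is explored conditionally on the current image and noise state.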
Experimental validation uses 32 × 32 MNIST digits. The ground‑truth PSF is a Lorentzian with width ι* = 0.9, the noise mean mₑ* = 0.1 and standard deviation σₑ* = 0.05. After a short burn‑in (~250 iterations), the Markov chains stabilize. Estimated parameters are ι̂ = 0.82 (≈ 9 % error), m̂ₑ = 0.103 (≈ 3 % error) and v̂ₑ = 0.00254 (≈ 1 % error). The reconstructed image closely matches the true image, and pixel‑wise 95 % credible intervals (posterior mean ± 2 · standard deviation) contain the ground‑truth values, demonstrating reliable uncertainty quantification. Two‑dimensional marginal plots further confirm that the joint posterior mass concentrates around the true parameter values.
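The pixel‑wise credible intervals reported above are straightforward to compute from the retained MCMC samples of x₀; a minimal sketch of the mean ± 2·std construction:

```python
import numpy as np

def credible_interval(samples, k=2.0):
    # Pixel-wise approximate 95% credible interval from MCMC samples of x0
    # (samples has shape [n_draws, H, W]): posterior mean +/- k posterior std.
    mean = samples.mean(axis=0)
    std = samples.std(axis=0)
    return mean - k * std, mean + k * std
```

Checking that the ground-truth image falls inside these intervals pixel by pixel is exactly the calibration check performed in the experiments.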
Key contributions are: (i) a unified Bayesian framework that couples diffusion priors with observation‑parameter inference; (ii) a conjugate‑prior design that preserves analytically tractable conditionals for noise parameters, dramatically reducing MCMC computational cost; (iii) the integration of a Metropolis‑Hastings step for the instrument parameter within a Gibbs loop, enabling efficient exploration of a mixed continuous‑discrete posterior; and (iv) extensive empirical evidence that the method yields accurate point estimates and well‑calibrated uncertainty measures. The approach is readily applicable to other imaging domains where the forward operator is uncertain, such as medical tomography, astronomical deconvolution, and remote sensing, offering a practical route to jointly recover images and calibrate the measurement system.