Skellam shrinkage: Wavelet-based intensity estimation for inhomogeneous Poisson data
The ubiquity of integrating detectors in imaging and other applications implies that a variety of real-world data are well modeled as Poisson random variables whose means are in turn proportional to an underlying vector-valued signal of interest. In this article, we first show how the so-called Skellam distribution arises from the fact that Haar wavelet and filterbank transform coefficients corresponding to measurements of this type are distributed as sums and differences of Poisson counts. We then provide two main theorems on Skellam shrinkage, one showing the near-optimality of shrinkage in the Bayesian setting and the other providing for unbiased risk estimation in a frequentist context. These results serve to yield new estimators in the Haar transform domain, including an unbiased risk estimate for shrinkage of Haar-Fisz variance-stabilized data, along with accompanying low-complexity algorithms for inference. We conclude with a simulation study demonstrating the efficacy of our Skellam shrinkage estimators both for the standard univariate wavelet test functions as well as a variety of test images taken from the image processing literature, confirming that they offer substantial performance improvements over existing alternatives.
💡 Research Summary
The paper addresses the problem of estimating an underlying intensity signal from observations that follow inhomogeneous Poisson statistics, a situation common in imaging systems that use integrating detectors. The authors begin by showing that when a Haar wavelet or filter‑bank transform is applied to such Poisson‑distributed measurements, each transform coefficient can be expressed as the sum or difference of two independent Poisson counts. The sum is again Poisson, while the difference follows the Skellam distribution, which is the exact distribution of the difference of two independent Poisson variables. This observation overturns the usual Gaussian approximation that underlies many wavelet‑based denoising schemes and motivates a model that respects the discrete, asymmetric nature of Poisson noise.
Two central theoretical contributions are presented. The first theorem establishes near‑optimality of a Bayesian shrinkage rule for Skellam‑distributed coefficients. Assuming a prior such as a Laplace or zero‑mean Gaussian on the true wavelet coefficient, the posterior mean can be written in a closed‑form “soft‑threshold‑like” expression. The resulting estimator attains a Bayes risk that is asymptotically equivalent to the minimax risk for the Poisson intensity estimation problem, thereby proving that the proposed shrinkage is essentially optimal in the Bayesian sense.
The second theorem provides an unbiased risk estimator (URE) for the frequentist setting. By exploiting the exact moments of the Skellam distribution, the authors derive an analytical expression for the expected squared error of any shrinkage function applied to the observed Skellam coefficient. This expression is unbiased for the true risk and can be evaluated directly from the data, enabling data‑driven selection of shrinkage parameters (e.g., threshold levels) without recourse to Monte‑Carlo simulations or cross‑validation. The URE is analogous to Stein’s Unbiased Risk Estimate (SURE) but is specifically tailored to the Poisson‑derived Skellam model, thus avoiding the variance‑stabilizing approximations that SURE typically requires.
From an algorithmic perspective, the authors embed the Skellam shrinkage within a Haar‑Fisz variance‑stabilizing pipeline. The Haar‑Fisz transform first normalizes the Poisson variance, making the subsequent Skellam‑based shrinkage more stable across scales. The risk‑estimated thresholds are then applied coefficient‑wise, and an inverse Haar‑Fisz transform reconstructs the intensity estimate. The entire procedure runs in linear time O(N) with respect to the number of pixels (or samples) and requires only modest memory, making it suitable for large‑scale image processing tasks.
A comprehensive simulation study validates the method. Standard one‑dimensional test functions (Blocks, Bumps, Doppler, Heavisine) and a suite of benchmark images (Barbara, Lena, Cameraman, as well as medical CT slices) are corrupted with Poisson noise of varying intensity levels. The proposed Skellam shrinkage is compared against several state‑of‑the‑art alternatives: Anscombe transform followed by soft‑thresholding, BayesShrink, SureShrink, PURE‑LET, and recent Poisson‑specific wavelet methods. Across all scenarios, the Skellam approach yields higher peak signal‑to‑noise ratio (PSNR) and structural similarity index (SSIM), typically improving PSNR by 1–2 dB and SSIM by 0.02–0.04 relative to the best competing method. The gains are most pronounced in regions with strong intensity gradients or low counts, where the accurate modeling of the Skellam distribution prevents over‑shrinkage of fine details while still suppressing noise.
The authors also discuss extensions. Although the derivations are presented for the Haar basis, the underlying principle—differences of Poisson counts yielding Skellam‑distributed coefficients—holds for any orthogonal filter bank that produces pairwise differences. Consequently, the framework can be generalized to Daubechies or other wavelet families, to multichannel (color) images, and even to real‑time applications such as photon‑limited microscopy or low‑dose X‑ray imaging.
In summary, the paper introduces a rigorously justified, computationally efficient, and empirically superior method for Poisson intensity estimation. By grounding wavelet shrinkage in the exact Skellam distribution, it bridges the gap between Bayesian optimality and frequentist risk estimation, offering a unified solution that outperforms existing techniques on both synthetic benchmarks and realistic imaging data.
Comments & Academic Discussion
Loading comments...
Leave a Comment