Bayesian Post-Processing Methods for Jitter Mitigation in Sampling
Minimum mean squared error (MMSE) estimators of signals from samples corrupted by jitter (timing noise) and additive noise are nonlinear, even when the signal prior and additive noise have normal distributions. This paper develops a stochastic algorithm based on Gibbs sampling and slice sampling to approximate the optimal MMSE estimator in this Bayesian formulation. Simulations demonstrate that this nonlinear algorithm can improve significantly upon the linear MMSE estimator, as well as the EM algorithm approximation to the maximum likelihood (ML) estimator used in classical estimation. Effective off-chip post-processing to mitigate jitter enables greater jitter to be tolerated, potentially reducing on-chip ADC power consumption.
💡 Research Summary
The paper addresses the problem of reconstructing a signal from samples that are corrupted by both timing jitter and additive white Gaussian noise (AWGN). While the minimum mean‑squared‑error (MMSE) estimator is optimal in a Bayesian sense, it becomes a highly nonlinear function of the observations as soon as jitter is present, even when both the signal prior and the additive noise are Gaussian. Classical approaches therefore resort to linear MMSE approximations or to maximum‑likelihood (ML) estimation via the Expectation‑Maximization (EM) algorithm. Both of these strategies suffer from fundamental limitations: the linear estimator cannot capture the nonlinear distortion introduced by jitter, and the EM‑ML method is prone to local‑optimum convergence and depends heavily on the choice of initial parameters.
To overcome these drawbacks, the authors formulate the reconstruction problem within a full Bayesian framework. The continuous-time signal $s(t)$ is modeled as a multivariate Gaussian random vector with mean $\mu_s$ and covariance $\Sigma_s$. The jitter at each sampling instant, $\tau_i$, is assumed to be independent zero-mean Gaussian with variance $\sigma_\tau^2$. The additive noise $n_i$ is also Gaussian with variance $\sigma_n^2$. The observation model is

$$y_i = s(t_i + \tau_i) + n_i,$$

where $t_i$ denotes the $i$-th nominal sampling instant.
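The Gibbs-sampling approach to this observation model can be sketched in a few lines. The code below is an illustrative approximation under assumed toy parameters, not the paper's implementation: the signal is a short sinc expansion with i.i.d. Gaussian coefficients, and a discrete grid over each $\tau_i$ stands in for the slice-sampling step used in the paper. Conditioned on the jitter, the signal posterior is Gaussian (a linear model), so the sampler alternates exact Gaussian draws of the coefficients with grid draws of each jitter value, and averages the coefficient draws to approximate the MMSE estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy instance of y_i = s(t_i + tau_i) + n_i with s(t) = sum_k x_k sinc(t - k),
# x ~ N(0, I), tau_i ~ N(0, sigma_tau^2), n_i ~ N(0, sigma_n^2). All values assumed.
K = 8
sigma_tau, sigma_n = 0.25, 0.1
t = np.arange(K)
grid = np.linspace(-3 * sigma_tau, 3 * sigma_tau, 41)   # candidate jitter values

def sinc_matrix(tau):
    """Sampling matrix whose rows evaluate the sinc basis at jittered instants."""
    return np.sinc(np.subtract.outer(t + tau, np.arange(K)))

# Generate one ground-truth realization and its observations.
x_true = rng.standard_normal(K)
tau_true = sigma_tau * rng.standard_normal(K)
y = sinc_matrix(tau_true) @ x_true + sigma_n * rng.standard_normal(K)

# Gibbs sweeps: alternate (a) x | tau, y (Gaussian linear-model posterior) and
# (b) tau_i | x, y (sampled from a grid-discretized conditional).
tau = np.zeros(K)
x_sum = np.zeros(K)
n_iter, burn_in = 400, 100
for it in range(n_iter):
    A = sinc_matrix(tau)
    # (a) Posterior of x given tau is N(mu, Sigma), precision I + A^T A / sigma_n^2.
    Sigma = np.linalg.inv(np.eye(K) + A.T @ A / sigma_n**2)
    mu = Sigma @ A.T @ y / sigma_n**2
    x = mu + np.linalg.cholesky(Sigma) @ rng.standard_normal(K)
    # (b) Each tau_i conditional: N(0, sigma_tau^2) prior times Gaussian likelihood;
    # only y_i depends on tau_i, so the conditionals factor across i.
    for i in range(K):
        s_grid = np.sinc(np.subtract.outer(t[i] + grid, np.arange(K))) @ x
        logp = -0.5 * (grid / sigma_tau) ** 2 - 0.5 * ((y[i] - s_grid) / sigma_n) ** 2
        p = np.exp(logp - logp.max())
        tau[i] = rng.choice(grid, p=p / p.sum())
    if it >= burn_in:
        x_sum += x

x_mmse = x_sum / (n_iter - burn_in)     # posterior-mean (MMSE) approximation
print(np.mean((x_mmse - x_true) ** 2))
```

Averaging post-burn-in draws of `x` approximates the posterior mean, which is exactly the MMSE estimate; the nonlinearity the linear estimator misses enters through the jitter-dependent sampling matrix `A`.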