Filtering Additive Measurement Noise with Maximum Entropy in the Mean

The purpose of this note is to show how the method of maximum entropy in the mean (MEM) may be used to improve parametric estimation when the measurements are corrupted by a high level of noise. The method is developed in the context of a concrete example: the estimation of the parameter of an exponential distribution. We compare the performance of our method with the Bayesian and maximum likelihood approaches.


💡 Research Summary

The paper presents a practical application of the Maximum Entropy in the Mean (MEM) principle to improve parametric estimation when measurements are heavily corrupted by additive noise. The authors focus on a concrete example: estimating the scale parameter θ of an exponential distribution from noisy observations. In the model, each observed datum X_i is the sum of a true exponential sample Y_i ~ Exp(θ) and an independent Gaussian noise term Z_i with mean zero and variance σ², i.e., X_i = Y_i + Z_i. The central challenge is that conventional estimators—maximum likelihood (ML) and Bayesian methods—perform poorly under high noise levels. ML becomes biased because the likelihood function no longer reflects the true data-generating process, while Bayesian inference depends on the choice of prior and can still suffer from bias or inflated variance when the prior is non‑informative.

MEM addresses these issues by constructing a probability distribution for the observed data that maximizes Shannon entropy subject to a moment constraint derived from the data. Specifically, the method seeks a density p(x) that maximizes

H(p) = −∫ p(x) log p(x) dx

while satisfying the average‑value constraint E_p