On approximation of smoothing probabilities for hidden Markov models
We consider the smoothing probabilities of a hidden Markov model (HMM). We show that, under fairly general conditions on the HMM, exponential forgetting still holds, and that the smoothing probabilities can be well approximated by those of a double-sided HMM. This makes it possible to apply ergodic theorems. As an application we consider pointwise maximum a posteriori segmentation and show that the corresponding risks converge.
💡 Research Summary
The paper investigates the smoothing probabilities of hidden Markov models (HMMs) and establishes that, under fairly general conditions, these probabilities exhibit exponential forgetting. In other words, the influence of observations far in the past (or future) on the posterior distribution of a hidden state at time t decays exponentially fast. The authors relax the usual strong mixing assumptions, requiring only that the transition matrix and emission probabilities have strictly positive lower bounds and that the initial distribution is absolutely continuous with respect to the stationary distribution. Under these mild hypotheses they prove a total-variation bound of (schematically) the form
\[
\big\| P(X_t \in \cdot \mid Y_{1:n}) - P(X_t \in \cdot \mid Y_{m:n}) \big\|_{\mathrm{TV}} \le C \rho^{\,t-m}, \qquad 0 < \rho < 1,
\]
so the effect of observations at distance t − m from the state of interest decays geometrically.
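Exponential forgetting can be seen numerically. Below is a minimal sketch (not code from the paper): a toy two-state HMM with strictly positive transition and emission probabilities, as the theorem assumes. We compute the smoothing probability at a time t from the full observation record via the standard scaled forward–backward recursion, then from a window of radius k around t, and watch the discrepancy shrink as k grows. All parameter values are illustrative assumptions.

```python
import numpy as np

def forward_backward(A, B, pi, obs):
    """Scaled forward-backward; returns smoothing probabilities P(X_t | obs)."""
    n, S = len(obs), len(pi)
    alpha = np.zeros((n, S))  # scaled forward variables
    c = np.zeros(n)           # per-step normalizers
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, n):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.ones((n, S))    # scaled backward variables
    for t in range(n - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1]) / c[t + 1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Illustrative model: every entry strictly positive, as the paper's
# assumptions require.
A = np.array([[0.7, 0.3], [0.4, 0.6]])   # transition matrix
B = np.array([[0.8, 0.2], [0.3, 0.7]])   # emission matrix
pi = np.array([4 / 7, 3 / 7])            # stationary distribution of A

# Simulate a hidden path and its observations.
rng = np.random.default_rng(0)
n, t = 201, 100
x = np.zeros(n, dtype=int)
y = np.zeros(n, dtype=int)
x[0] = rng.choice(2, p=pi)
y[0] = rng.choice(2, p=B[x[0]])
for s in range(1, n):
    x[s] = rng.choice(2, p=A[x[s - 1]])
    y[s] = rng.choice(2, p=B[x[s]])

# Smoothing at time t from the full record vs. a window of radius k.
full = forward_backward(A, B, pi, y)[t]
diffs = []
for k in (2, 4, 8, 16):
    window = forward_backward(A, B, pi, y[t - k:t + k + 1])[k]
    diffs.append(np.abs(full - window).max())
    print(f"k = {k:2d}  |full - windowed| = {diffs[-1]:.2e}")
```

The printed discrepancies shrink roughly geometrically in k: once the window extends a moderate distance beyond t, adding further observations barely moves the posterior, which is exactly the forgetting property the paper quantifies.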