Online Expectation-Maximisation


Tutorial chapter on the Online EM algorithm to appear in the volume ‘Mixtures’ edited by Kerrie Mengersen, Mike Titterington and Christian P. Robert.


💡 Research Summary

The chapter “Online Expectation‑Maximisation” provides a tutorial on extending the classic EM algorithm to streaming and large‑scale data settings where the full dataset cannot be stored or processed in batch mode. It begins by recalling the standard EM framework: the E‑step computes the conditional expectation of the complete‑data sufficient statistics given the observations and the current parameter estimate, and the M‑step maximises the resulting expected complete‑data log‑likelihood. The main limitation of this approach is that each iteration requires a full pass over all observations, which is infeasible for massive datasets or continuously arriving data streams.
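The batch E‑step/M‑step cycle described above can be sketched on a toy model. The example below fits a two‑component, unit‑variance Gaussian mixture in one dimension; the data, initial values, and iteration count are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative synthetic data: a two-component 1-D Gaussian mixture
# (30% mass at mean -2, 70% mass at mean 3, unit variances).
y = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])

# Arbitrary initial parameters: mixing weight of component 0, and the two means.
w, mu = 0.5, np.array([-1.0, 1.0])

def normal_pdf(x, m):
    # Density of N(m, 1) evaluated at x.
    return np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2.0 * np.pi)

for _ in range(50):
    # E-step: posterior responsibilities, i.e. conditional expectations of the
    # component-indicator sufficient statistics under the current parameters.
    p0 = w * normal_pdf(y, mu[0])
    p1 = (1.0 - w) * normal_pdf(y, mu[1])
    r0 = p0 / (p0 + p1)
    # M-step: closed-form maximiser of the expected complete-data log-likelihood.
    w = r0.mean()
    mu = np.array([np.sum(r0 * y) / np.sum(r0),
                   np.sum((1.0 - r0) * y) / np.sum(1.0 - r0)])
```

Note that every iteration touches all of `y`; this repeated full pass is exactly what the online variant removes.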

The core contribution of the online EM method is to replace the full‑batch E‑step with a stochastic‑approximation update of the sufficient statistics. When a new observation (Y_t) arrives, the algorithm uses the current parameter estimate (\theta_{t-1}) to compute the conditional expectation of the complete‑data sufficient statistic for that single observation, (\mathbb{E}_{\theta_{t-1}}[s(X_t, Y_t) \mid Y_t]), where (X_t) is the latent variable associated with (Y_t), and blends it into a running statistic with a decreasing step size (\gamma_t):

(\hat S_t = \hat S_{t-1} + \gamma_t \big( \mathbb{E}_{\theta_{t-1}}[s(X_t, Y_t) \mid Y_t] - \hat S_{t-1} \big).)

The M‑step is then applied to this running statistic, (\theta_t = \bar\theta(\hat S_t)), where (\bar\theta) is the same maximisation mapping as in batch EM. Under standard step‑size conditions ((\sum_t \gamma_t = \infty) and (\sum_t \gamma_t^2 < \infty)), the parameter sequence converges to a stationary point of the observed‑data likelihood, while each update processes only one observation.
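The stochastic update above can be sketched on the same toy two‑component Gaussian mixture. The streaming source, the step‑size schedule (\gamma_t = (t+10)^{-0.6}) (the offset is a practical choice so that early steps do not erase the initialisation), and the initial values are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

def stream(n):
    # Illustrative data source: observations arrive one at a time from a
    # two-component 1-D mixture (30% N(-2,1), 70% N(3,1)).
    for _ in range(n):
        if rng.random() < 0.3:
            yield rng.normal(-2.0, 1.0)
        else:
            yield rng.normal(3.0, 1.0)

def normal_pdf(x, m):
    # Density of N(m, 1) evaluated at x.
    return np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2.0 * np.pi)

# Running sufficient statistics S_t: expected component indicators and
# expected indicator-weighted observations, initialised arbitrarily.
s_w = np.array([0.5, 0.5])
s_wy = np.array([-0.5, 0.5])
w, mu = 0.5, np.array([-1.0, 1.0])

for t, y_t in enumerate(stream(20000), start=1):
    gamma = (t + 10.0) ** -0.6   # step size; satisfies the usual conditions
    # E-step on the single new observation, under theta_{t-1}.
    p = np.array([w, 1.0 - w]) * normal_pdf(y_t, mu)
    r = p / p.sum()
    # Stochastic-approximation update: S_t = S_{t-1} + gamma * (E[s | Y_t] - S_{t-1}).
    s_w = s_w + gamma * (r - s_w)
    s_wy = s_wy + gamma * (r * y_t - s_wy)
    # M-step: the same mapping theta = theta_bar(S) as in batch EM,
    # applied to the running statistics.
    w = s_w[0] / s_w.sum()
    mu = s_wy / s_w
```

Each observation is used once and discarded, so memory use is constant in the length of the stream; the running statistics play the role that full-data sums play in batch EM.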

