Variable Metric Stochastic Approximation Theory
We develop a variable metric stochastic approximation theory. In doing so, we obtain a convergence theory for a large class of online variable metric methods, including the recently introduced online versions of the BFGS algorithm and its limited-memory variant, LBFGS. We also discuss the implications of our results for learning from expert advice.
💡 Research Summary
The paper introduces a comprehensive convergence theory for stochastic approximation (SA) algorithms that incorporate a time‑varying positive‑definite metric matrix, a framework the authors call Variable Metric Stochastic Approximation (VMSA). Classical SA, epitomized by the Robbins‑Monro scheme, updates a parameter vector θ using a scalar step size αₜ and a noisy gradient gₜ: θₜ₊₁ = θₜ − αₜ gₜ. VMSA generalizes this to θₜ₊₁ = θₜ − αₜ Hₜ gₜ, where Hₜ ∈ ℝⁿˣⁿ is a symmetric positive‑definite matrix that may change at each iteration.
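The generalized update θₜ₊₁ = θₜ − αₜ Hₜ gₜ can be sketched as follows. Everything here is illustrative rather than taken from the paper: a 2-D quadratic objective with Gaussian gradient noise, and a hypothetical AdaGrad-style diagonal metric (clipped to the uniform bounds discussed below) standing in for Hₜ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not the paper's setup): minimize
# f(theta) = 0.5 * theta^T A theta, whose minimizer is theta = 0.
A = np.diag([10.0, 1.0])
theta = np.array([5.0, 5.0])
m, M = 0.05, 5.0                  # uniform metric bounds: m I <= H_t <= M I
v = np.zeros(2)                   # running second-moment estimate of the gradient

for t in range(1, 2001):
    g = A @ theta + 0.1 * rng.standard_normal(2)         # unbiased noisy gradient g_t
    v = 0.9 * v + 0.1 * g**2                             # AdaGrad-style statistic (illustrative)
    H = np.diag(np.clip(1.0 / np.sqrt(v + 1e-8), m, M))  # time-varying PD diagonal metric H_t
    alpha = 1.0 / t                                      # sum alpha_t = inf, sum alpha_t^2 < inf
    theta = theta - alpha * H @ g                        # VMSA update: theta_{t+1} = theta_t - alpha_t H_t g_t

print(np.linalg.norm(theta))      # the iterate ends up near the minimizer at 0
```

With Hₜ = I this reduces to the classical Robbins-Monro recursion; the clipped diagonal metric is just one simple way to make the iteration adapt to per-coordinate curvature while respecting the bounds the theory requires.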
The authors first lay out three essential assumptions: (1) the metric matrices are uniformly bounded, i.e., m I ≼ Hₜ ≼ M I for constants 0 < m ≤ M; (2) the stochastic gradient is unbiased with bounded second moments; (3) the step sizes satisfy the usual SA conditions ∑αₜ = ∞ and ∑αₜ² < ∞. Under these premises, they construct a Lyapunov function and employ martingale convergence arguments to prove almost‑sure convergence of θₜ to a root of the expected gradient, as well as an O(∑αₜ²) bound on the mean‑square error. Crucially, they show that if the variation of Hₜ is controlled (e.g., ‖Hₜ₊₁ − Hₜ‖ = O(αₜ)), the convergence rate matches that of the classic fixed‑metric SA, while the asymptotic constant can be dramatically reduced because the metric adapts to the local geometry of the objective.
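Assumption (1) can be enforced in practice by projecting a candidate metric onto the set {H : m I ≼ H ≼ M I}, for instance by clipping its eigenvalues. The helper below is a generic sketch of that projection, not a procedure from the paper.

```python
import numpy as np

def clip_spectrum(H, m, M):
    """Project a symmetric matrix onto {H : m I <= H <= M I} by clipping its eigenvalues."""
    w, V = np.linalg.eigh(H)                     # eigendecomposition H = V diag(w) V^T
    return V @ np.diag(np.clip(w, m, M)) @ V.T   # rebuild with eigenvalues forced into [m, M]

# A metric violating both bounds for m = 0.1, M = 10:
H = np.array([[0.001, 0.0],
              [0.0,  50.0]])
Hc = clip_spectrum(H, 0.1, 10.0)
w = np.linalg.eigvalsh(Hc)
print(w)  # both eigenvalues now lie in [0.1, 10]
```

The projection keeps the eigenvectors of H and only moves the offending eigenvalues to the nearest admissible value, so a well-conditioned metric passes through unchanged.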
Having established the general theory, the paper proceeds to instantiate VMSA with two prominent online quasi-Newton methods: the online BFGS algorithm and its limited-memory variant (LBFGS). In the online setting, each new data point yields a stochastic gradient, and the BFGS update of the inverse-Hessian approximation, Hₜ₊₁ = (I − ρₜ sₜ yₜᵀ) Hₜ (I − ρₜ yₜ sₜᵀ) + ρₜ sₜ sₜᵀ with sₜ = θₜ₊₁ − θₜ, yₜ = gₜ₊₁ − gₜ, and ρₜ = 1/(yₜᵀ sₜ), is applied to these noisy quantities; the limited-memory variant stores only a short history of (sₜ, yₜ) pairs rather than the full matrix, keeping the cost linear in the dimension.
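The standard BFGS inverse-Hessian update, driven by stochastic gradients, can be sketched as follows. The quadratic test problem, noise level, and step-size schedule are illustrative assumptions; the curvature check yᵀs > 0 (which keeps H positive definite) is a common safeguard, not a detail quoted from the paper.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse-Hessian approximation H,
    given step s = theta_{t+1} - theta_t and gradient difference y."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)

rng = np.random.default_rng(1)
A = np.diag([1.0, 0.25])          # illustrative quadratic: grad f(theta) = A theta
theta = np.array([3.0, -2.0])
H = np.eye(2)                     # initial metric

for t in range(1, 201):
    g = A @ theta + 0.01 * rng.standard_normal(2)        # noisy gradient at theta_t
    step = -(1.0 / t) * H @ g                            # variable metric step
    g_new = A @ (theta + step) + 0.01 * rng.standard_normal(2)
    s, y = step, g_new - g
    if y @ s > 1e-12:             # curvature condition: only update when it preserves PD
        H = bfgs_update(H, s, y)
    theta = theta + step

print(np.linalg.norm(theta))      # the iterate ends up close to the minimizer at 0
```

As the iteration proceeds, H absorbs curvature information from the (s, y) pairs and approaches A⁻¹ along the explored directions, which is exactly the metric adaptation the VMSA analysis covers.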