A Bregman Extension of quasi-Newton updates II: Convergence and Robustness Properties
We propose an extension of quasi-Newton methods and investigate the convergence and robustness properties of the proposed update formulae for the approximate Hessian matrix. Fletcher studied a variational problem that derives the approximate Hessian update formula of quasi-Newton methods. We point out that this variational problem is identical to the minimization of the Kullback-Leibler divergence, a discrepancy measure between two probability distributions. We then introduce the Bregman divergence as an extension of the Kullback-Leibler divergence and derive extended quasi-Newton update formulae based on the variational problem with the Bregman divergence. The proposed update formulae belong to the class of self-scaling quasi-Newton methods. We study the convergence properties of the proposed quasi-Newton methods and, moreover, apply tools from robust statistics to analyze the robustness of the Hessian update formulae against numerical errors introduced by the inexact line search for the step length. As a result, we find that the influence of the inexact line search is bounded only for the standard BFGS formula for the Hessian approximation. Numerical studies are conducted to verify the usefulness of the tools borrowed from robust statistics.
💡 Research Summary
The paper introduces a broad family of quasi‑Newton update formulas for the Hessian approximation by replacing the traditional Kullback‑Leibler (KL) divergence with a general Bregman divergence in the underlying variational problem. Fletcher’s classic variational formulation—minimizing the KL divergence between the new approximation (B_{k+1}) and the current one (B_k) while enforcing the secant condition (B_{k+1}s_k = y_k)—is first re‑interpreted as an explicit KL‑minimization problem. Recognizing that KL divergence is a special case of Bregman divergence, the authors generalize the objective to the Bregman divergence

$$D_\varphi(B_{k+1}, B_k) = \varphi(B_{k+1}) - \varphi(B_k) - \langle \nabla\varphi(B_k),\, B_{k+1} - B_k \rangle,$$

minimized subject to the secant condition, where the potential choice (\varphi(B) = -\log\det B) recovers the KL case underlying the standard update.
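A minimal numerical sketch of the ingredients above: the standard BFGS update (which the paper identifies as the KL special case), a check that it satisfies the secant condition B_{k+1}s_k = y_k, and the Bregman divergence generated by the log-det potential. The vectors `s` and `y` are hypothetical toy values, not data from the paper.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation:
    B_{k+1} = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def bregman_logdet(X, Y):
    """Bregman divergence generated by phi(B) = -log det(B):
    D(X, Y) = tr(Y^{-1} X) - log det(Y^{-1} X) - n,
    i.e. the KL-type measure discussed in the paper (up to scaling)."""
    n = X.shape[0]
    M = np.linalg.solve(Y, X)
    return np.trace(M) - np.log(np.linalg.det(M)) - n

# Hypothetical toy step/gradient-difference pair with y^T s > 0.
B = np.eye(3)
s = np.array([1.0, 0.5, -0.2])
y = np.array([2.0, 1.0, 0.1])

B_next = bfgs_update(B, s, y)

print(np.allclose(B_next @ s, y))        # secant condition holds: True
print(bregman_logdet(B, B))              # divergence of B from itself: 0.0
print(bregman_logdet(B_next, B) > 0.0)   # strictly positive otherwise: True
```

Note that the secant condition holds by construction for any update of this form, while the divergence term is what the variational problem minimizes among all symmetric positive-definite matrices satisfying it.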