Preconditioning without a preconditioner: faster ridge-regression and Gaussian sampling with randomized block Krylov subspace methods

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

We describe a randomized variant of the block conjugate gradient method for solving a single positive-definite linear system of equations. Our method provably outperforms preconditioned conjugate gradient with a broad class of Nyström-based preconditioners, without ever explicitly constructing a preconditioner. In analyzing our algorithm, we derive theoretical guarantees for new variants of Nyström preconditioned conjugate gradient which may be of separate interest. We also describe how our approach yields state-of-the-art algorithms for key data-science tasks such as computing the entire ridge regression regularization path and generating multiple independent samples from a high-dimensional Gaussian distribution.


💡 Research Summary

The paper introduces a randomized block‑conjugate‑gradient (block‑CG) algorithm for solving a single symmetric positive‑definite linear system (A+μI)x = b, where A ∈ ℝ^{d×d} and μ ≥ 0. The key innovation is to augment the right‑hand side vector b with a random Gaussian sketch matrix Ω ∈ ℝ^{d×m} and run block‑CG on the combined block B = [b, Ω] ∈ ℝ^{d×(m+1)}: the extra random columns enrich the block Krylov subspace from which the approximate solution is extracted, which is what yields the preconditioning-like acceleration without ever forming a preconditioner.
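The idea described above can be sketched in NumPy: append Gaussian columns to b, run a standard block‑CG iteration on the augmented right‑hand side, and read off the first column of the block solution. This is a minimal illustration of the augmentation trick, not the paper's exact algorithm; the function name, stopping rule, and parameter defaults are our own choices.

```python
import numpy as np

def randomized_block_cg(A, mu, b, m=8, tol=1e-10, max_iter=500, seed=0):
    """Solve (A + mu*I) x = b by running block CG on the augmented
    block B = [b, Omega], where Omega is a d x m Gaussian sketch.
    Illustrative sketch of the augmentation idea, not the paper's method."""
    d = b.shape[0]
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((d, m))
    B = np.column_stack([b, Omega])        # d x (m+1) block right-hand side

    Aop = lambda V: A @ V + mu * V         # apply A + mu*I without forming it
    X = np.zeros_like(B)
    R = B - Aop(X)                         # block residual
    P = R.copy()                           # block search directions
    for _ in range(max_iter):
        Q = Aop(P)
        # Small (m+1) x (m+1) projected systems; lstsq guards against
        # rank deficiency when some columns converge early.
        alpha = np.linalg.lstsq(P.T @ Q, P.T @ R, rcond=None)[0]
        X = X + P @ alpha
        R_new = R - Q @ alpha
        # Stop once the column we actually care about (b) has converged.
        if np.linalg.norm(R_new[:, 0]) <= tol * np.linalg.norm(b):
            break
        beta = np.linalg.lstsq(P.T @ Q, -Q.T @ R_new, rcond=None)[0]
        P = R_new + P @ beta               # keep directions A-conjugate
        R = R_new
    return X[:, 0]
```

In exact arithmetic, each block‑CG step expands the Krylov subspace by m+1 directions at once, which is the mechanism the paper exploits to match the acceleration of Nyström-based preconditioners.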

