Fast, High-Accuracy, Randomized Nullspace Computations for Tall Matrices

Notice: This research summary and analysis were automatically generated using AI technology. For complete accuracy, please refer to the original arXiv source.

In this paper, we develop RLOBPCG, an efficient method for computing a small number of singular triplets corresponding to the smallest singular values of large, tall matrices. The algorithm combines a randomized preconditioner from sketch-and-precondition techniques with the LOBPCG eigensolver: a small sketch is used to construct a high-quality preconditioner, and LOBPCG is run on the Gram matrix to refine the singular vectors. Under the standard subspace embedding assumption and a modest gap between the two smallest singular values, we prove that RLOBPCG converges geometrically to the singular vector associated with the smallest singular value. In numerical experiments, RLOBPCG achieves near-optimal accuracy on matrices with up to $10^6$ rows, outperforming classical LOBPCG and Lanczos methods by up to $12\times$ while remaining robust in cases where other iterative methods fail to converge.


💡 Research Summary

The paper introduces RLOBPCG, a novel algorithm for computing a few (typically 1–10) singular vectors associated with the smallest singular values of very tall matrices (m ≫ n, with n on the order of thousands). Computing the smallest singular values is notoriously difficult because standard iterative eigensolvers such as Lanczos or LOBPCG often stall when the singular value gap is small or when the matrix is nearly rank‑deficient. The authors address this challenge by marrying two ideas: (i) a randomized sketch‑based preconditioner and (ii) the Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) eigensolver.
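To make the subspace embedding idea concrete, the following small NumPy experiment checks the defining property on a random tall matrix. The Gaussian sketch, the dimensions, and the scaling by 1/√d are illustrative assumptions, not the paper's specific choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, d = 2000, 50, 400  # tall matrix; sketch size d = O(n)

A = rng.standard_normal((m, n))
# Gaussian sketch scaled so that E[||Sx||^2] = ||x||^2
S = rng.standard_normal((d, m)) / np.sqrt(d)

# Subspace embedding property: ||Sx|| ≈ ||x|| for every x in range(A)
x = A @ rng.standard_normal(n)
ratio = np.linalg.norm(S @ x) / np.linalg.norm(x)
print(f"||Sx|| / ||x|| = {ratio:.3f}")  # close to 1
```

For a Gaussian sketch the distortion over an n-dimensional subspace is roughly √(n/d), which is why d = O(n) already gives a useful embedding.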

The algorithm proceeds in four stages. First, a random embedding matrix S ∈ ℂ^{d×m} is drawn, where d = O(n) or O(n log n). S is required to be a subspace embedding for A, i.e., it approximately preserves the Euclidean norm of every vector in the column space of A. Second, the sketch SA is formed and its thin SVD, SA = U Σ V*, is computed. The right singular vectors V and the inverse singular values Σ^{-1} are used to build a preconditioner P = V Σ^{-1}. Third, the initial iterate v₀ is set to the last column of V (the right singular vector of the sketch associated with its smallest singular value) and normalized. Finally, LOBPCG is applied to the Gram matrix A*A with preconditioner P P*. In each LOBPCG iteration a search direction w_i = P P* A*(Av_i) − ‖Av_i‖² v_i is formed, orthogonalized against the current basis {v_i, x_i}, normalized, and then combined with the two existing basis vectors to form a three‑dimensional trial subspace T = span{v_i, x_i, w_i}, on which a Rayleigh–Ritz step produces the next iterate.

