A Riemannian Alternating Descent Ascent Algorithmic Framework for Nonconvex-Linear Minimax Problems on Riemannian Manifolds
In this paper, we consider a class of nonconvex-linear minimax problems on Riemannian manifolds, which find wide applications in machine learning and signal processing. For solving this class of problems, we develop a flexible Riemannian alternating descent ascent (RADA) algorithmic framework. Within this framework, we propose two easy-to-implement yet efficient algorithms that alternately perform one or multiple projected/Riemannian gradient descent steps and a proximal gradient ascent step at each iteration. We show that the proposed RADA algorithmic framework can find both an $\varepsilon$-Riemannian-game-stationary point and an $\varepsilon$-Riemannian-optimization-stationary point within $\mathcal{O}(\varepsilon^{-3})$ iterations, achieving the best-known iteration complexity. We also reveal intriguing similarities and differences between the algorithms developed within our proposed framework and existing algorithms, thus providing important insights into the improved efficiency of the former. Lastly, we present numerical results on sparse principal component analysis (PCA), fair PCA, and sparse spectral clustering to demonstrate the superior performance of the proposed algorithms.
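To make the alternating structure described above concrete, here is a minimal, hedged sketch (not the authors' exact RADA scheme) of one such loop on a toy nonconvex-linear instance: a Riemannian gradient descent step on the unit sphere for the primal variable, followed by a projected gradient ascent step on a box-constrained dual variable. The objective $-x^\top A x + \langle y, x\rangle$, the step sizes, and the function name `rada_sketch` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rada_sketch(A, lam=0.1, eta=0.01, sigma=0.1, iters=500, seed=0):
    """Hedged sketch of a RADA-style alternating loop (illustrative only):
    one Riemannian gradient descent step on the sphere for x, then one
    projected gradient ascent step on the dual box {||y||_inf <= lam}."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)          # initialize on the unit sphere
    y = np.zeros(n)
    for _ in range(iters):
        # Euclidean gradient of phi(x, y) = -x^T A x + <y, x> w.r.t. x
        g = -2.0 * A @ x + y
        # Riemannian gradient: project g onto the tangent space at x
        rg = g - (x @ g) * x
        # Descent step followed by retraction (renormalization)
        x = x - eta * rg
        x /= np.linalg.norm(x)
        # Projected gradient ascent on y over the box {||y||_inf <= lam};
        # for this linear coupling the projection is a simple clip
        y = np.clip(y + sigma * x, -lam, lam)
    return x, y
```

With `lam=0.0` the dual variable stays at zero and the loop reduces to Riemannian gradient descent for the leading eigenvector of `A`, which gives a quick sanity check of the primal update.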
💡 Research Summary
Problem Setting
The paper addresses the following class of problems, which are nonconvex in a primal variable $x$ constrained to a Riemannian manifold and linear in a dual variable $y$; a generic instance of this class can be written as
$$\min_{x \in \mathcal{M}} \; \max_{y \in \mathcal{Y}} \; f(x) + \langle A(x), y \rangle,$$
where $\mathcal{M}$ is a Riemannian manifold (e.g., the sphere or the Stiefel manifold), $\mathcal{Y}$ is a compact convex set, $f$ is smooth and possibly nonconvex, and $A$ is a smooth map, so that the inner maximization is linear in $y$.
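As an illustration of how one of the applications named in the abstract can be cast as a nonconvex-linear minimax problem, the $\ell_1$ penalty in sparse PCA can be rewritten via the standard dual-norm identity $\lambda\|X\|_1 = \max_{\|Y\|_\infty \le \lambda} \langle Y, X\rangle$ (a textbook reformulation, not a derivation taken from the paper; $S$ denotes the sample covariance matrix):

```latex
% Sparse PCA over the Stiefel manifold St(n,p) = { X : X^T X = I_p }:
% negative explained variance plus an l1 sparsity penalty.
\min_{X \in \mathrm{St}(n,p)} \; -\operatorname{tr}\!\left(X^\top S X\right) + \lambda \|X\|_1
% Dualizing the l1 term with lambda ||X||_1 = max_{||Y||_inf <= lambda} <Y, X>
% yields a minimax problem that is nonconvex in X and linear in Y:
= \min_{X \in \mathrm{St}(n,p)} \; \max_{\|Y\|_\infty \le \lambda} \; -\operatorname{tr}\!\left(X^\top S X\right) + \langle Y, X \rangle
```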