Gradient Hard Thresholding Pursuit for Sparsity-Constrained Optimization


Hard Thresholding Pursuit (HTP) is an iterative greedy selection procedure for finding sparse solutions of underdetermined linear systems. This method has been shown to have strong theoretical guarantees and impressive numerical performance. In this paper, we generalize HTP from compressive sensing to a generic problem setup of sparsity-constrained convex optimization. The proposed algorithm iterates between a standard gradient descent step and a hard thresholding step, with or without debiasing. We prove that our method enjoys guarantees analogous to those of HTP in terms of rate of convergence and parameter estimation accuracy. Numerical evidence shows that our method is superior to state-of-the-art greedy selection methods in sparse logistic regression and sparse precision matrix estimation tasks.


💡 Research Summary

This paper extends the Hard Thresholding Pursuit (HTP) algorithm, originally designed for sparse recovery in compressive sensing, to the broader class of sparsity‑constrained convex optimization problems of the form
\[
\min_{x \in \mathbb{R}^p} f(x) \quad \text{subject to} \quad \|x\|_0 \le k,
\]
where \(f\) is a convex loss and \(\|x\|_0\) counts the number of nonzero entries of \(x\).
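To make the iteration described in the abstract concrete, the following is a minimal NumPy sketch of the gradient-step-plus-hard-thresholding loop (the variant without the debiasing step). The function names, step size, and iteration count are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

def hard_threshold(x, k):
    # Keep the k largest-magnitude entries of x, zero out the rest.
    out = x.copy()
    out[np.argsort(np.abs(x))[:-k]] = 0.0
    return out

def graht_pursuit(grad_f, x0, k, eta=0.1, iters=100):
    # Sketch: alternate a gradient descent step on f with hard
    # thresholding onto the set of k-sparse vectors.
    x = hard_threshold(x0, k)
    for _ in range(iters):
        x = hard_threshold(x - eta * grad_f(x), k)
    return x
```

For example, with a least-squares loss \(f(x) = \tfrac{1}{2}\|Ax - b\|_2^2\) one would pass `grad_f = lambda x: A.T @ (A @ x - b)`. The debiasing variant mentioned in the abstract would additionally re-minimize \(f\) restricted to the selected support after each thresholding step.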

