An Alternating Direction Method for Finding Dantzig Selectors
In this paper, we study the alternating direction method for finding Dantzig selectors, which were first introduced in [8]. In particular, at each iteration we apply the nonmonotone gradient method proposed in [17] to approximately solve one subproblem of this method. We compare our approach with a first-order method proposed in [3]. The computational results show that our approach usually outperforms that method in terms of CPU time while producing solutions of comparable quality.
💡 Research Summary
The paper addresses the computational challenge of solving the Dantzig selector problem, a high‑dimensional regression formulation that seeks the sparsest coefficient vector β subject to the infinity‑norm constraint ‖Xᵀ(y − Xβ)‖∞ ≤ λ. While the Dantzig selector can be cast as a linear program, direct solvers become impractical when the number of variables p is large, as the problem size grows quadratically with p. To overcome this bottleneck, the authors propose an algorithm based on the Alternating Direction Method (ADM), also known as ADMM, which splits the original problem into two coupled subproblems involving β and an auxiliary variable z that represents the residual term Xᵀ(y − Xβ).
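Written out in the notation of the summary, the problem and its standard linear-programming reformulation (the latter is the textbook cast, not quoted verbatim from the paper) are:

```latex
\min_{\beta \in \mathbb{R}^p} \; \|\beta\|_1
\quad \text{s.t.} \quad \|X^{T}(y - X\beta)\|_\infty \le \lambda ,
```

which, introducing a bound variable $t \in \mathbb{R}^p$, becomes the LP

```latex
\min_{\beta,\, t} \; \sum_{i=1}^{p} t_i
\quad \text{s.t.} \quad
-t \le \beta \le t, \qquad
-\lambda \mathbf{1} \le X^{T}(y - X\beta) \le \lambda \mathbf{1}.
```

The LP doubles the number of variables to 2p, and its constraint matrix involves the dense p×p Gram matrix XᵀX, which is why direct LP solvers become impractical as p grows.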
In the ADM framework, an augmented Lagrangian is formed with a penalty parameter ρ and a Lagrange multiplier u. The algorithm then iterates three steps: (1) update β by minimizing the sum of the ℓ₁ norm of β and a quadratic penalty term, (2) update z by projecting onto the ℓ∞‑ball of radius λ, and (3) update the multiplier u in the usual ADMM fashion. The novelty lies in step (1): instead of using a standard monotone proximal gradient method, the authors apply the nonmonotone gradient method introduced in [17] to solve this subproblem approximately at each iteration.
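The three steps above can be sketched in NumPy as follows. This is a minimal illustration, not the authors' implementation: the β-subproblem here is handled by a few plain proximal-gradient steps with soft-thresholding, as a simple stand-in for the nonmonotone gradient method of [17], and the function name, parameter defaults, and iteration counts are all illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dantzig_admm(X, y, lam, rho=1.0, outer_iters=100, inner_iters=5):
    """Sketch of an ADM scheme for the Dantzig selector
        min ||beta||_1  s.t.  ||X^T (y - X beta)||_inf <= lam,
    with auxiliary variable z standing for X^T (y - X beta)."""
    n, p = X.shape
    A = X.T @ X                      # p-by-p Gram matrix
    b = X.T @ y
    beta = np.zeros(p)
    z = np.zeros(p)
    u = np.zeros(p)                  # Lagrange multiplier
    L = np.linalg.norm(A, 2) ** 2    # ||A^T A||_2; Lipschitz const of the
                                     # smooth part of the beta-subproblem is rho*L
    step = 1.0 / (rho * L)
    for _ in range(outer_iters):
        # (1) beta-step: approximately minimize
        #     ||beta||_1 + (rho/2) * ||b - A beta - z + u/rho||^2
        # by a few proximal-gradient iterations (stand-in for NGM).
        for _ in range(inner_iters):
            r = b - A @ beta - z + u / rho
            grad = -rho * (A.T @ r)
            beta = soft_threshold(beta - step * grad, step)
        # (2) z-step: project b - A beta + u/rho onto the l_inf ball of radius lam
        z = np.clip(b - A @ beta + u / rho, -lam, lam)
        # (3) multiplier update in the usual ADMM fashion
        u = u + rho * (b - A @ beta - z)
    return beta
```

As a sanity check, when X is orthonormal (XᵀX = I) the Dantzig selector has the closed-form solution βᵢ = soft(yᵢ, λ), and the iteration above recovers it.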