BAGEL: Projection-Free Algorithm for Adversarially Constrained Online Convex Optimization
Projection-based algorithms for Constrained Online Convex Optimization (COCO) achieve optimal $\mathcal{O}(T^{1/2})$ regret guarantees but face scalability challenges due to the computational complexity of projections. To circumvent this, projection-free methods utilizing Linear Optimization Oracles (LOO) have been proposed, albeit typically achieving slower $\mathcal{O}(T^{3/4})$ regret rates. In this work, we examine whether the $\mathcal{O}(T^{1/2})$ rate can be recovered in the projection-free setting by strengthening the oracle assumption. We introduce BAGEL, an algorithm utilizing a Separation Oracle (SO) that achieves $\mathcal{O}(T^{1/2})$ regret and $\tilde{\mathcal{O}}(T^{1/2})$ cumulative constraint violation (CCV) for convex cost functions. Our analysis shows that by leveraging an infeasible projection via the SO, we can match the time-horizon dependence of projection-based methods with $\tilde{\mathcal{O}}(T)$ oracle calls, albeit with an additional dependence on the geometry of the action set. This establishes a specific regime where projection-free methods can attain the same convergence rates as projection-based counterparts.
💡 Research Summary
The paper tackles the fundamental scalability bottleneck of constrained online convex optimization (COCO): the need for costly Euclidean projections or constrained convex optimization subroutines at every round. While projection-based methods achieve the optimal $\mathcal{O}(\sqrt{T})$ regret, they become impractical in high-dimensional settings because each iteration requires solving a potentially expensive projection problem. Recent projection-free approaches replace the projection oracle with a linear optimization oracle (LOO). However, LOO-based algorithms only attain $\mathcal{O}(T^{3/4})$ regret (or similar) and consequently lag behind projection-based methods.
The authors propose a new projection-free algorithm called BAGEL (Blocked Adaptive online Gradient descEnt with infeasible projection) that leverages a stronger oracle: the Separation Oracle (SO). An SO, given a query point, either certifies that the point lies inside the convex feasible set $K$ or returns a hyperplane separating the point from $K$. By using this richer information, BAGEL can implement an "infeasible projection" step that moves an infeasible iterate back toward $K$ with only a constant number of SO calls per round.
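For intuition, here is a minimal sketch of a separation oracle for the Euclidean ball, together with a generic halfspace-projection loop that uses the returned hyperplanes to pull an infeasible point toward the set. This is an illustrative construction, not the paper's exact infeasible-projection routine; the names `ball_separation_oracle`, `approx_projection_via_so`, and the `max_calls` budget are assumptions for this sketch.

```python
import numpy as np

def ball_separation_oracle(x, radius=1.0):
    """Separation oracle for the Euclidean ball K = {y : ||y|| <= radius}.

    Returns (True, None) if x is in K; otherwise (False, (a, b)) where the
    hyperplane a^T y <= b holds for all y in K but is violated by x.
    """
    norm = np.linalg.norm(x)
    if norm <= radius:
        return True, None
    a = x / norm   # outward unit normal at the boundary point nearest x
    b = radius     # a^T y <= radius on K, while a^T x = norm > radius
    return False, (a, b)

def approx_projection_via_so(x, oracle, max_calls=20):
    """Illustrative sketch (not the paper's routine): repeatedly query the
    oracle and project onto each violated halfspace until x is feasible
    or the oracle-call budget is exhausted."""
    for _ in range(max_calls):
        inside, hyperplane = oracle(x)
        if inside:
            break
        a, b = hyperplane
        # Euclidean projection onto the halfspace {y : a^T y <= b}
        x = x - ((a @ x - b) / (a @ a)) * a
    return x
```

For the ball, a single halfspace projection already lands on the boundary; for general convex sets the loop may need several separating hyperplanes, which is why BAGEL's guarantee counts total oracle calls.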
The algorithm consists of three main components:
- Adaptive Online Gradient Descent (OGD) – At each round the algorithm receives the gradient of the cost function $f_t$ and the subgradient of the constraint function(s) $g_t$. Instead of a fixed step size, BAGEL uses an adaptive step size.
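The exact step-size rule is cut off in the source text; the sketch below uses a standard AdaGrad-style adaptive step, $\eta_t = D / \sqrt{\sum_{s \le t} \|g_s\|^2}$, purely as an illustrative stand-in. The function name and the `diameter` parameter $D$ are assumptions, not the paper's notation.

```python
import numpy as np

def adaptive_ogd_step(x, grad, grad_sq_sum, diameter=1.0):
    """One adaptive OGD update with a scalar AdaGrad-style step size.

    grad_sq_sum accumulates sum_s ||g_s||^2 over past rounds, and
    eta_t = diameter / sqrt(grad_sq_sum). This step-size rule is an
    illustrative assumption; the paper's exact rule is truncated above.
    """
    grad_sq_sum += float(np.dot(grad, grad))
    eta = diameter / np.sqrt(grad_sq_sum)
    return x - eta * grad, grad_sq_sum
```

In an online loop, the returned accumulator is threaded through the rounds, so the step size shrinks automatically as observed gradients accumulate.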