A Partially Derivative-Free Proximal Method for Composite Multiobjective Optimization in the Hölder Setting
This paper presents an algorithm for solving multiobjective optimization problems involving composite functions, in which each iteration minimizes a quadratic model that approximates $F(x) - F(x^k)$ and that can be built without exact derivatives. We establish assumptions on the component functions of the composition and provide a comprehensive convergence and complexity analysis. Specifically, we prove that the proposed method reaches a weakly $\varepsilon$-approximate Pareto point in at most $\mathcal{O}\left(\varepsilon^{-\frac{\beta+1}{\beta}}\right)$ iterations, where $\beta$ denotes the Hölder exponent of the gradient. The algorithm incorporates gradient approximations and a scaling matrix $B_k$ to balance accuracy against computational cost. Numerical experiments on a collection of benchmark problems illustrate the practical behavior of the proposed approach and its competitiveness with existing composite algorithms.
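The "gradient approximations" mentioned in the abstract are, in derivative-free settings, typically finite-difference estimates of the smooth part's Jacobian. The sketch below is illustrative only: the helper name `fd_jacobian` and the forward-difference step `h` are assumptions, not notation from the paper.

```python
import numpy as np

def fd_jacobian(F, x, h=1e-6):
    """Forward-difference approximation of the Jacobian of a
    vector-valued objective F: R^n -> R^m.

    Hypothetical helper for illustration; the paper's method may use a
    different approximation scheme or step-size rule.
    """
    Fx = np.asarray(F(x), dtype=float)
    n, m = x.size, Fx.size
    J = np.empty((m, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h                      # perturb one coordinate at a time
        J[:, i] = (np.asarray(F(x + e), dtype=float) - Fx) / h
    return J
```

Each column costs one extra evaluation of $F$, so building the full approximation needs $n$ function calls per iteration, which is the price a derivative-free method pays for avoiding exact gradients.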
💡 Research Summary
The paper introduces a partially derivative‑free proximal method (PDFPM) for solving composite multi‑objective optimization problems.
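To make the iteration concrete, one common template for such composite problems (an assumption based on standard multiobjective proximal-gradient methods, not necessarily the paper's exact formulation) takes each objective as $F_j = f_j + g_j$ with $f_j$ smooth with a Hölder-continuous gradient and $g_j$ convex. With approximate gradients $d_j^k \approx \nabla f_j(x^k)$ and the scaling matrix $B_k$ from the abstract, the per-iteration subproblem can be sketched as

$$
x^{k+1} \in \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \; \max_{1 \le j \le m} \Bigl\{ \langle d_j^k,\, x - x^k \rangle + g_j(x) - g_j(x^k) \Bigr\} + \tfrac{1}{2} (x - x^k)^{\top} B_k (x - x^k),
$$

where the max over the objectives models the decrease of $F(x) - F(x^k)$ and the quadratic term keeps the step proximal.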