Optimization Problems with Difference of Tangentially Convex Functions under Uncertainty

This paper investigates a specific class of nonsmooth nonconvex optimization problems in the face of data uncertainty, namely, robust optimization problems whose objective function can be expressed as a difference of two tangentially convex (DTC) functions. More precisely, we develop a range of nonsmooth calculus rules to establish relationships between the Fréchet and limiting subdifferentials of a particular maximum function and the tangential subdifferentials of its constituent functions. We then derive optimality conditions for problems involving DTC functions, employing generalized constraint qualifications within the framework of the tangential subdifferential. Several illustrative examples are presented to demonstrate the obtained results.
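The problem class described above can be sketched schematically as follows; the notation, the uncertainty set $\mathcal{U}$, and the constraint set $C$ are illustrative assumptions for exposition, not necessarily the paper's exact formulation:

```latex
% Illustrative sketch (assumed notation): a robust DTC problem.
% For each uncertain parameter u in a set U, the functions
% g(., u) and h(., u) are tangentially convex; the robust
% counterpart minimizes the worst case of their difference.
\min_{x \in C} \; \sup_{u \in \mathcal{U}} \bigl[ g(x, u) - h(x, u) \bigr]
```

In this reading, the calculus rules mentioned in the abstract would relate subdifferentials of the inner supremum (a maximum-type function) to the tangential subdifferentials of $g$ and $h$.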

