This paper investigates a specific class of nonsmooth nonconvex optimization problems under data uncertainty, namely robust optimization problems in which the objective function can be expressed as a difference of two tangentially convex (DTC) functions. More precisely, we develop a range of nonsmooth calculus rules that relate the Fréchet and limiting subdifferentials of a particular maximum function to the tangential subdifferentials of its constituent functions. We then derive optimality conditions for problems involving DTC functions, employing generalized constraint qualifications formulated within the framework of the tangential subdifferential. Several illustrative examples are presented to demonstrate the obtained results.
Real-life optimization problems often encounter data uncertainty due to incomplete information, prediction errors, and measurement errors. Problems with uncertain data arise frequently in fields such as logistics, finance, water management, energy management, and machine learning, as well as in major emergencies such as the COVID-19 pandemic. As a result, significant attention has been devoted to these types of optimization problems from both theoretical and practical perspectives over the past two decades (see, e.g., [7]).
The primary objective of this study is to investigate optimality conditions for a specific class of optimization problems that involve data uncertainty in the constituent functions:
$$(\mathrm{RP}_S) \qquad \min\ \psi_0(x) \quad \text{s.t.} \quad x \in S,$$
where $\psi_0 : \mathbb{R}^n \to \mathbb{R} \cup \{+\infty\}$ and $S$ is a nonempty closed subset of $\mathbb{R}^n$.
Robust optimization is a computationally powerful deterministic approach applicable to many classes of optimization problems involving data uncertainty. The goal of robust optimization is to find a worst-case solution that satisfies all possible realizations of the constraints, thereby immunizing the optimization problem against uncertain parameters. There are several methods for applying robust optimization, and the choice of method typically depends on the problem being solved.
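To fix ideas, here is a minimal illustration of this worst-case approach (an assumed toy instance, not one of the paper's examples). An uncertain objective $f(x, u)$, with the parameter $u$ ranging over an uncertainty set $U$, is immunized by minimizing its worst-case value
$$\psi_0(x) := \sup_{u \in U} f(x, u).$$
For instance, $f(x, u) = ux - x^2$ with $U = [-1, 1]$ gives $\psi_0(x) = |x| - x^2$, a nonsmooth nonconvex function that is nonetheless the difference of the convex function $|x|$ and the smooth function $x^2$.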
Over the last three decades, numerous studies have examined nonsmooth and nonconvex programming problems, particularly DC optimization problems, where the objective and/or constraint functions are expressed as the difference of two convex functions. Indeed, DC programming and the DC Algorithm (DCA) are widely recognized as essential tools for addressing nonsmooth and nonconvex programming problems. Recent work [12] introduced a broader class of nonconvex, nonsmooth problems where functions are expressed as differences of two tangentially convex (DTC) functions. Motivated by these advances, we derive optimality conditions for both unconstrained and constrained optimization problems under data uncertainty. Our approach combines robust optimization techniques with the tangential subdifferential concept. Specifically, we:
1. Analyze a special nonsmooth maximum function, proving its DTC property using variational analysis.
2. Explore relationships between the Fréchet subdifferential, limiting subdifferential, and tangential subdifferential for constituent functions.
3. Employ generalized error bounds and Abadie constraint qualifications within the tangential subdifferential framework to establish optimality conditions for the robust DTC problem $(\mathrm{RP}_S)$.
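Recall that, in the standard sense (cf. [12]), a function is tangentially convex at $x$ when its directional derivative $f'(x; \cdot)$ exists, is finite, and is convex as a function of the direction; in particular, every convex function and every differentiable function is tangentially convex. Thus, for example, the worst-case function $\psi_0(x) = |x| - x^2$ of the illustration above is DTC at every point.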
Unlike prior studies, we relax the convexity assumption on the uncertainty sets and feasible sets, as well as the concavity requirement for the functions with respect to the uncertain parameters. Our results generalize existing work, which is recovered when these stronger assumptions hold, and yield sharper local optimality conditions under weaker constraint qualifications. The remainder of the paper is organized as follows. Section 2 contains the basic definitions and preliminary results of convex and nonsmooth analysis needed in the article. Several results in nonsmooth analysis that characterize suitable properties of a specific maximum function are established in Section 3. In Section 4, two robust constraint qualifications are introduced based on the concept of tangential subdifferential, and the relationship between them is explored. Finally, in Section 5, new optimality conditions for problem $(\mathrm{RP}_S)$ are proved for two special types of objective functions. Several examples are given to illustrate the results of the paper.
In this section, we recall some basic definitions and results from nonsmooth analysis needed throughout the paper. For more details we refer the reader to [2][3][4][5].
Throughout the paper, we use the following standard notation. $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space, whose norm is written $\|\cdot\|$. The inner product of two vectors $x, y \in \mathbb{R}^n$ is $\langle x, y \rangle = \sum_{i=1}^{n} x_i y_i$. By $B$ we denote the closed unit ball centered at the origin of $\mathbb{R}^n$, while $B_\delta(x)$ stands for the closed ball centered at $x$ with radius $\delta > 0$. For a given subset $S$ of $\mathbb{R}^n$, the distance function $d_S : \mathbb{R}^n \to \mathbb{R}_+$ is defined by $d_S(x) := \inf_{y \in S} \|y - x\|$. The negative polar cone, the closure, and the convex hull of $S$ are denoted by $S^-$, $\operatorname{cl} S$, and $\operatorname{co} S$, respectively.
If $S$ is closed and $x \in S$, then the contingent cone of $S$ and the Fréchet normal cone to $S$ at $x$ are defined, respectively, by
$$T(S, x) := \{ d \in \mathbb{R}^n : \exists\, t_k \downarrow 0,\ \exists\, d_k \to d \ \text{such that}\ x + t_k d_k \in S \ \text{for all}\ k \},$$
$$\hat{N}(S, x) := \Big\{ v \in \mathbb{R}^n : \limsup_{y \to x,\ y \in S} \frac{\langle v, y - x \rangle}{\|y - x\|} \le 0 \Big\}.$$
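To illustrate these definitions with a standard example (added here; not from the paper), take the convex set $S = \{(x_1, x_2) \in \mathbb{R}^2 : x_2 \ge |x_1|\}$ and $x = (0, 0)$. Since $S$ is itself a closed cone, $T(S, x) = S$, while
$$\hat{N}(S, x) = \{ (v_1, v_2) \in \mathbb{R}^2 : v_2 \le -|v_1| \} = T(S, x)^-,$$
in accordance with the fact that for convex sets the Fréchet normal cone is the negative polar of the contingent cone.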
Complementary to lower semicontinuity of $f$, we say that $f$ is upper semicontinuous (u.s.c.) at $x$ if $-f$ is l.s.c. at this point. The lower and upper Dini derivatives of $f$ at $x$ in the direction $d \in \mathbb{R}^n$ are defined, respectively, by
$$f^-(x; d) := \liminf_{t \downarrow 0} \frac{f(x + t d) - f(x)}{t}, \qquad f^+(x; d) := \limsup_{t \downarrow 0} \frac{f(x + t d) - f(x)}{t}.$$
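These two quantities can differ, as the classical oscillating example shows: for $f(x) = x \sin(1/x)$ when $x \neq 0$ and $f(0) = 0$, the difference quotient $\frac{f(t) - f(0)}{t} = \sin(1/t)$ oscillates between $-1$ and $1$ as $t \downarrow 0$, so $f^-(0; 1) = -1$ while $f^+(0; 1) = 1$.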
The directional derivative of $f$ at $x$ in the direction $d$ is given by
$$f'(x; d) := \lim_{t \downarrow 0} \frac{f(x + t d) - f(x)}{t}, \tag{1}$$
when the limit in (1) exists. Following [14], if $f$ is l.s.c. around $x \in \operatorname{dom} f$, then the presubdifferential or Fréchet subdifferential of $f$ at $x$ is defined by
$$\hat{\partial} f(x) := \Big\{ v \in \mathbb{R}^n : \liminf_{y \to x,\ y \neq x} \frac{f(y) - f(x) - \langle v, y - x \rangle}{\|y - x\|} \ge 0 \Big\},$$
and the limiting or Mordukhovich subdifferential of $f$ at $x$ by
$$\partial_L f(x) := \big\{ v \in \mathbb{R}^n : \exists\, x_k \to x \ \text{with}\ f(x_k) \to f(x),\ \exists\, v_k \in \hat{\partial} f(x_k) \ \text{with}\ v_k \to v \big\}.$$
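The two subdifferentials may differ substantially, as a standard example (added here for illustration) shows: for the concave function $f(x) = -|x|$ on $\mathbb{R}$, no $v$ satisfies the above inequality at $x = 0$, so $\hat{\partial} f(0) = \emptyset$, whereas $f$ is differentiable at every $x \neq 0$ with $\nabla f(x) = -\operatorname{sign}(x)$, and passing to the limit yields $\partial_L f(0) = \{-1, 1\}$.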