Bounds for Distributionally Robust Optimization Problems


We study distributionally robust optimization (DRO) problems with uncertainty sets consisting of high-dimensional random vectors that are close in the multivariate Wasserstein distance to a reference random vector. We give conditions when the images of these sets under scalar-valued aggregation functions are contained in and contain uncertainty sets of univariate random variables defined via a univariate Wasserstein distance. This provides lower and upper bounds for the solution to general multivariate DRO problems that are computationally tractable. Furthermore, we generalize the results to uncertainty sets characterized by Bregman-Wasserstein divergences, which allows for asymmetric deviations from the reference random vector. Moreover, for DRO problems with risk measure criterion in the class of signed Choquet integrals, we derive semi-analytic formulae for the upper and lower bounds and the distribution that attains these bounds.
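For the last claim, recall that a signed Choquet integral is a quantile-weighted functional ρ_h(X) = ∫₀¹ F_X⁻¹(1−u) dh(u), where the distortion h has bounded variation and h(0) = 0. A minimal empirical sketch of this definition (the function name and the discretization are illustrative, not taken from the paper):

```python
import numpy as np

def signed_choquet(sample, h):
    """Empirical signed Choquet integral rho_h(X) = int_0^1 F_X^{-1}(1-u) dh(u),
    for a distortion h of bounded variation with h(0) = 0 (illustrative helper)."""
    x = np.sort(np.asarray(sample, dtype=float))[::-1]  # descending: F^{-1}(1-u) as u grows
    u = np.linspace(0.0, 1.0, len(x) + 1)
    return float(np.dot(x, np.diff(h(u))))              # sum of x_i * (h(u_{i+1}) - h(u_i))

sample = np.random.default_rng(1).normal(size=10_000)
mean = signed_choquet(sample, lambda u: u)                       # h(u) = u recovers E[X]
es90 = signed_choquet(sample, lambda u: np.minimum(u / 0.1, 1))  # h for Expected Shortfall, alpha = 0.9
```

With h(u) = u the integral recovers the mean; with h(u) = min(u/(1−α), 1) it yields the Expected Shortfall at level α, one of the standard members of this class.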


💡 Research Summary

The paper addresses the computational challenges of distributionally robust optimization (DRO) when the uncertainty set consists of high‑dimensional random vectors that are close to a reference distribution in the multivariate Wasserstein distance. The authors’ central contribution is a set‑inclusion result that links the image of a multivariate Wasserstein ball under a scalar aggregation function to a univariate Wasserstein ball around the aggregated random variable.

Formally, let X ∈ Lⁿₚ be a random vector, g : ℝⁿ → ℝ a K‑Lipschitz function, and Uⁿ_ε(X) the multivariate Wasserstein ball of radius ε around X. Theorem 1 proves that g(Uⁿ_ε(X)) is contained in the univariate Wasserstein ball of radius Kε around g(X): since g is K‑Lipschitz, Wₚ(g(Y), g(X)) ≤ K·Wₚ(Y, X) for every Y ∈ Uⁿ_ε(X).
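The containment direction admits a quick empirical check. A minimal sketch, not the paper's construction: the aggregation g(x) = Σᵢ xᵢ is √n‑Lipschitz with respect to the Euclidean norm, so the univariate W₁ distance between the aggregated samples must be bounded by √n times the multivariate W₁ distance between the original samples.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
n, m = 3, 500  # dimension, sample size

# Empirical reference sample X and a small perturbation Y of it
X = rng.normal(size=(m, n))
Y = X + 0.1 * rng.normal(size=(m, n))

# Multivariate W1 between equal-size empirical measures via optimal matching
cost = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
row, col = linear_sum_assignment(cost)
w1_multi = cost[row, col].mean()

# Aggregation g(x) = sum(x) is sqrt(n)-Lipschitz w.r.t. the Euclidean norm
K = np.sqrt(n)
w1_uni = wasserstein_distance(X.sum(axis=1), Y.sum(axis=1))

print(w1_uni, K * w1_multi)  # the first value never exceeds the second
```

The inequality holds for any pair of samples, since the matched coupling pushed forward through g is feasible for the univariate transport problem.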

