TY - JOUR
T1 - An approximation scheme for distributionally robust nonlinear optimization
AU - Milz, Johannes
AU - Ulbrich, Michael
N1 - Publisher Copyright:
© 2020 Society for Industrial and Applied Mathematics
PY - 2020
Y1 - 2020
N2 - We consider distributionally robust optimization problems (DROPs) with nonlinear and nonconcave dependence on uncertain parameters. The DROP can be written as a nonsmooth, nonlinear program with a bilevel structure; the objective function and each of the constraint functions are suprema of expected values of parametric functions taken over an ambiguity set of probability distributions. We define ambiguity sets through moment constraints and, to make the computation of first-order stationary points tractable, approximate the nonlinear functions by quadratic expansions with respect to the parameters, resulting in lower-level problems defined by trust-region problems and semidefinite programs. Subsequently, we construct computationally tractable smoothing functions for the approximate lower-level functions, employing strong duality for trust-region problems, and show that gradient consistency holds. We formulate smoothed DROPs, apply a homotopy method that dynamically decreases the smoothing parameters, and establish its convergence to stationary points of the approximate DROP under mild assumptions. Our scheme also provides a new approach to robust nonlinear optimization. We perform numerical experiments and comparisons with other methods on a well-known test set, assuming the design variables are subject to implementation errors; this provides a representative set of numerical examples.
AB - We consider distributionally robust optimization problems (DROPs) with nonlinear and nonconcave dependence on uncertain parameters. The DROP can be written as a nonsmooth, nonlinear program with a bilevel structure; the objective function and each of the constraint functions are suprema of expected values of parametric functions taken over an ambiguity set of probability distributions. We define ambiguity sets through moment constraints and, to make the computation of first-order stationary points tractable, approximate the nonlinear functions by quadratic expansions with respect to the parameters, resulting in lower-level problems defined by trust-region problems and semidefinite programs. Subsequently, we construct computationally tractable smoothing functions for the approximate lower-level functions, employing strong duality for trust-region problems, and show that gradient consistency holds. We formulate smoothed DROPs, apply a homotopy method that dynamically decreases the smoothing parameters, and establish its convergence to stationary points of the approximate DROP under mild assumptions. Our scheme also provides a new approach to robust nonlinear optimization. We perform numerical experiments and comparisons with other methods on a well-known test set, assuming the design variables are subject to implementation errors; this provides a representative set of numerical examples.
KW - Distributionally robust optimization
KW - Gradient consistency
KW - Robust optimization
KW - Semidefinite programming
KW - Smoothing functions
KW - Smoothing methods
KW - Trust-region problem
UR - http://www.scopus.com/inward/record.url?scp=85093873902&partnerID=8YFLogxK
U2 - 10.1137/19M1263121
DO - 10.1137/19M1263121
M3 - Article
AN - SCOPUS:85093873902
SN - 1052-6234
VL - 30
SP - 1996
EP - 2025
JO - SIAM Journal on Optimization
JF - SIAM Journal on Optimization
IS - 3
ER -