An approximation scheme for distributionally robust nonlinear optimization

Johannes Milz, Michael Ulbrich

Research output: Contribution to journal › Article › peer-review


Abstract

We consider distributionally robust optimization problems (DROPs) with nonlinear and nonconcave dependence on uncertain parameters. The DROP can be written as a nonsmooth, nonlinear program with a bilevel structure: the objective function and each constraint function are suprema of expected values of parametric functions, taken over an ambiguity set of probability distributions. We define ambiguity sets through moment constraints and, to make the computation of first-order stationary points tractable, approximate the nonlinear functions by quadratic expansions with respect to the parameters, so that the lower-level problems become trust-region problems and semidefinite programs. Subsequently, we construct computationally tractable smoothing functions for the approximate lower-level functions by employing strong duality for trust-region problems, and we show that gradient consistency holds. We formulate smoothed DROPs, apply a homotopy method that dynamically decreases the smoothing parameters, and establish its convergence to stationary points of the approximate DROP under mild assumptions. Our scheme also provides a new approach to robust nonlinear optimization. We perform numerical experiments and comparisons to other methods on a well-known test set, assuming the design variables are subject to implementation errors; this yields a representative set of numerical examples.
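For orientation, a schematic form of the bilevel structure described above can be sketched as follows; the symbols $x$ (decision variables), $\xi$ (uncertain parameters), $f$ and $g_i$ (parametric objective and constraint functions), and the moment data $\mu$, $\Sigma$ are illustrative notation assumed here, not necessarily that of the paper:

\[
\min_{x \in X} \ \sup_{P \in \mathcal{P}} \mathbb{E}_P\bigl[f(x,\xi)\bigr]
\quad \text{s.t.} \quad
\sup_{P \in \mathcal{P}} \mathbb{E}_P\bigl[g_i(x,\xi)\bigr] \le 0, \quad i = 1,\dots,m,
\]

with an ambiguity set defined through moment constraints, for example

\[
\mathcal{P} = \bigl\{\, P :\ \mathbb{E}_P[\xi] = \mu,\ \ \mathbb{E}_P\bigl[(\xi-\mu)(\xi-\mu)^{\top}\bigr] \preceq \Sigma \,\bigr\}.
\]

Replacing $f(x,\cdot)$ and $g_i(x,\cdot)$ by quadratic expansions in $\xi$ turns each inner supremum into a problem built from trust-region subproblems and semidefinite programs, which is the tractable lower-level structure that the smoothing and homotopy steps exploit.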

Original language: English
Pages (from-to): 1996-2025
Number of pages: 30
Journal: SIAM Journal on Optimization
Volume: 30
Issue number: 3
DOIs
State: Published - 2020

Keywords

  • Distributionally robust optimization
  • Gradient consistency
  • Robust optimization
  • Semidefinite programming
  • Smoothing functions
  • Smoothing methods
  • Trust-region problem
