Joint Estimation and Robustness Optimization

Taozeng Zhu, Jingui Xie, Melvyn Sim

Research output: Contribution to journal › Article › peer-review



Many real-world optimization problems have input parameters estimated from data whose inherent imprecision can lead to fragile solutions that may impede desired objectives and/or render constraints infeasible. We propose a joint estimation and robustness optimization (JERO) framework to mitigate estimation uncertainty in optimization problems by seamlessly incorporating both the parameter estimation procedure and the optimization problem. Toward that end, we construct an uncertainty set that incorporates all of the data, and the size of the uncertainty set is based on how well the parameters are estimated from that data when using a particular estimation procedure: regressions, the least absolute shrinkage and selection operator, and maximum likelihood estimation (among others). The JERO model maximizes the uncertainty set’s size and so obtains solutions that—unlike those derived from models dedicated strictly to robust optimization—are immune to parameter perturbations that would violate constraints or lead to objective function values exceeding their desired levels. We describe several applications and provide explicit formulations of the JERO framework for a variety of estimation procedures. To solve the JERO models with exponential cones, we develop a second-order conic approximation that limits errors beyond an operating range; with this approach, we can use state-of-the-art second-order conic programming solvers to solve even large-scale convex optimization problems.
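To make the core idea concrete, the following is a minimal toy sketch (not the paper's actual model) of the JERO principle for a single robust linear constraint. Assume a constraint aᵀx ≤ b whose coefficient vector is estimated as a_hat, with a hypothetical box uncertainty set U(ρ) = {a : ‖a − a_hat‖∞ ≤ ρ}; the worst case over U(ρ) gives a_hat·x + ρ‖x‖₁ ≤ b, so the largest admissible radius for a fixed x is ρ(x) = (b − a_hat·x)/‖x‖₁. In the JERO spirit, we then choose x to maximize ρ(x) subject to an objective target c·x ≥ τ, here by brute-force grid search. All names and numbers below are illustrative assumptions.

```python
def jero_toy(a_hat, b, c, tau, grid):
    """Brute-force the largest uncertainty-set radius over candidate solutions x.

    For each x on the grid that meets the objective target c.x >= tau,
    compute the largest box radius rho for which the worst-case constraint
    a_hat.x + rho * ||x||_1 <= b still holds, and keep the best x.
    """
    best_x, best_rho = None, float("-inf")
    for x in grid:
        if sum(ci * xi for ci, xi in zip(c, x)) < tau:
            continue  # objective target not met
        norm1 = sum(abs(xi) for xi in x)
        if norm1 == 0:
            continue  # radius undefined at x = 0
        rho = (b - sum(ai * xi for ai, xi in zip(a_hat, x))) / norm1
        if rho > best_rho:
            best_x, best_rho = x, rho
    return best_x, best_rho

# Illustrative instance: a_hat = (1, 1), b = 4, objective x1 + x2 >= 2.
a_hat, b, c, tau = (1.0, 1.0), 4.0, (1.0, 1.0), 2.0
grid = [(i / 10, j / 10) for i in range(31) for j in range(31)]
x_star, rho_star = jero_toy(a_hat, b, c, tau, grid)
# Here rho(x) = (4 - s)/s with s = x1 + x2 >= 2, so the maximal radius is
# attained at s = 2, giving rho_star = 1.0.
```

This illustrates the contrast the abstract draws with classical robust optimization: instead of fixing the uncertainty set's size in advance, the size itself is the quantity being maximized, subject to the solution still meeting the desired objective level.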

Original language: English
Pages (from-to): 1659-1677
Number of pages: 19
Journal: Management Science
Issue number: 3
State: Published - Mar 2022


  • data-driven optimization
  • parameter estimation
  • robust optimization
  • robustness optimization

