Distributionally Robust Bayesian Optimization

Johannes Kirschner, Ilija Bogunovic, Stefanie Jegelka, Andreas Krause

Research output: Contribution to journal › Conference article › peer-review

44 Scopus citations

Abstract

Robustness to distributional shift is one of the key challenges of contemporary machine learning. Attaining such robustness is the goal of distributionally robust optimization, which seeks a solution to an optimization problem that is worst-case robust under a specified distributional shift of an uncontrolled covariate. In this paper, we study such a problem when the distributional shift is measured via the maximum mean discrepancy (MMD). For the setting of zeroth-order, noisy optimization, we present a novel distributionally robust Bayesian optimization algorithm (DRBO). Our algorithm provably obtains sub-linear robust regret in various settings that differ in how the uncertain covariate is observed. We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
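The inner worst-case problem described in the abstract can be sketched in a few lines: given a reference distribution over a finite set of contexts, the robust objective takes the worst-case expectation over all distributions within an MMD ball around it. The sketch below is illustrative only, not the paper's DRBO algorithm; the kernel, contexts, payoff values, and the grid-based candidate set are all assumed for the example.

```python
import math

def rbf(a, b, lengthscale=1.0):
    """RBF (Gaussian) kernel on scalar contexts (illustrative choice)."""
    return math.exp(-((a - b) ** 2) / (2 * lengthscale ** 2))

def mmd_sq(p, q, K):
    """Squared MMD between distributions p and q on a shared finite support,
    via the kernel mean embeddings: p^T K p - 2 p^T K q + q^T K q."""
    n = len(p)
    return sum((p[i] * p[j] - 2 * p[i] * q[j] + q[i] * q[j]) * K[i][j]
               for i in range(n) for j in range(n))

def robust_value(f_vals, p_ref, K, eps, candidates):
    """Worst-case expectation of f over candidate distributions q with
    MMD(q, p_ref) <= eps -- a brute-force stand-in for the inner problem."""
    feasible = [q for q in candidates
                if mmd_sq(q, p_ref, K) <= eps ** 2 + 1e-12]
    return min(sum(qi * fi for qi, fi in zip(q, f_vals)) for q in feasible)

# Toy setup: three context values, a reference distribution, and the
# payoff f(x, c) at one fixed decision x for each context c (all made up).
contexts = [0.0, 1.0, 2.0]
K = [[rbf(a, b) for b in contexts] for a in contexts]
p_ref = [0.6, 0.3, 0.1]
f_vals = [1.0, 0.5, 0.2]

# Coarse grid over the probability simplex as the candidate set.
step = 0.1
grid = [round(i * step, 10) for i in range(11)]
candidates = [[a, b, round(1 - a - b, 10)]
              for a in grid for b in grid if a + b <= 1 + 1e-9]

nominal = sum(p * f for p, f in zip(p_ref, f_vals))
robust = robust_value(f_vals, p_ref, K, eps=0.3, candidates=candidates)
# The robust value is never above the nominal expectation: an adversary
# choosing a nearby distribution can only decrease the expected payoff.
```

In practice this inner problem is a convex program over the simplex (a quadratic MMD constraint and a linear objective) and would be solved with a convex solver rather than a grid; the grid here only keeps the sketch dependency-free.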

Original language: English
Pages (from-to): 2174-2184
Number of pages: 11
Journal: Proceedings of Machine Learning Research
Volume: 108
State: Published - 2020
Externally published: Yes
Event: 23rd International Conference on Artificial Intelligence and Statistics, AISTATS 2020 - Virtual, Online
Duration: 26 Aug 2020 - 28 Aug 2020
