Abstract
While Gaussian processes are a mainstay for various engineering and scientific applications, their uncertainty estimates do not satisfy frequentist guarantees and can be miscalibrated in practice. State-of-the-art approaches for designing calibrated models rely on inflating the Gaussian process posterior variance, which yields confidence intervals that are potentially too coarse. To remedy this, we present a calibration approach that generates predictive quantiles using a computation inspired by the vanilla Gaussian process posterior variance but using a different set of hyperparameters chosen to satisfy an empirical calibration constraint. This results in a calibration approach that is considerably more flexible than existing approaches, which we optimize to yield tight predictive quantiles. Our approach is shown to yield a calibrated model under reasonable assumptions. Furthermore, it outperforms existing approaches in sharpness when employed for calibrated regression.
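The sketch below is not the paper's algorithm; it is a minimal Python illustration, under assumed simplifications, of the idea stated in the abstract: keep the standard Gaussian process posterior mean, but compute predictive quantiles from a posterior-variance-style formula that uses a separate pair of hyperparameters, chosen on held-out data so that empirical coverage reaches the nominal level while the intervals stay as tight as possible. All function and variable names (`rbf_kernel`, `gp_posterior`, `calibrate_variance_hyperparams`, the grid ranges, the 95% target) are illustrative assumptions, not quantities from the paper.

```python
import numpy as np


def rbf_kernel(X1, X2, lengthscale, variance):
    """Squared-exponential kernel k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    d2 = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)


def gp_posterior(X_train, y_train, X_test, lengthscale, variance, noise):
    """Standard GP posterior mean and variance under the given hyperparameters."""
    K = rbf_kernel(X_train, X_train, lengthscale, variance) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test, lengthscale, variance)
    Kss = rbf_kernel(X_test, X_test, lengthscale, variance)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0) + noise
    return mean, var


def empirical_coverage(y, mean, var, z):
    """Fraction of targets falling inside the symmetric interval mean +/- z * sqrt(var)."""
    return np.mean(np.abs(y - mean) <= z * np.sqrt(var))


def calibrate_variance_hyperparams(X_tr, y_tr, X_cal, y_cal, mean_cal,
                                   target=0.95, z=1.96,
                                   lengthscales=np.logspace(-1, 1, 20),
                                   variances=np.logspace(-1, 1, 20),
                                   noise=1e-2):
    """Grid-search a second (lengthscale, variance) pair used only for the variance
    computation: keep candidates whose held-out coverage meets the target and return
    the one whose intervals are tightest (smallest average width)."""
    best, best_width = None, np.inf
    for ls in lengthscales:
        for sv in variances:
            _, var_cal = gp_posterior(X_tr, y_tr, X_cal, ls, sv, noise)
            cov = empirical_coverage(y_cal, mean_cal, var_cal, z)
            width = np.mean(2.0 * z * np.sqrt(var_cal))
            if cov >= target and width < best_width:
                best, best_width = (ls, sv), width
    return best


# Usage: fit the mean with the "vanilla" hyperparameters, then pick separate
# hyperparameters for the quantile computation on a held-out calibration split.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X[:, 0]) + 0.2 * rng.standard_normal(120)
X_tr, y_tr, X_cal, y_cal = X[:80], y[:80], X[80:], y[80:]
mean_cal, _ = gp_posterior(X_tr, y_tr, X_cal, lengthscale=1.0, variance=1.0, noise=0.04)
print(calibrate_variance_hyperparams(X_tr, y_tr, X_cal, y_cal, mean_cal))
```

A grid search is used here purely for transparency; any optimizer that enforces the empirical coverage constraint while minimizing average interval width would serve the same illustrative purpose.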
Original language | English
---|---
Journal | Advances in Neural Information Processing Systems
Volume | 36
Publication status | Published - 2023
Event | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States. Duration: 10 Dec 2023 → 16 Dec 2023