Predicting Ordinary Differential Equations with Transformers

Sören Becker, Michal Klein, Alexander Neitz, Giambattista Parascandolo, Niki Kilbertus

Publication: Contribution to journal › Conference article › Peer-reviewed

4 citations (Scopus)

Abstract

We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory. In extensive empirical evaluations, we demonstrate that our model performs on par with or better than existing methods in terms of accurate recovery across various settings. Moreover, our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
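The abstract describes a sequence-to-sequence pipeline: an observed trajectory is tokenized into an input sequence, and the pretrained transformer decodes a symbolic ODE. The paper's actual tokenization, vocabulary, and architecture are not given here, so the sketch below only illustrates the input/output interface; all function names are hypothetical, and the model call is replaced by a hard-coded plausible prediction.

```python
# Illustrative sketch only: tokenization scheme and names are assumptions,
# not the paper's actual implementation.

def encode_trajectory(ts, xs, precision=3):
    """Turn an irregularly sampled, noisy trajectory into input tokens,
    one (time, value) token pair per observation."""
    tokens = []
    for t, x in zip(ts, xs):
        tokens += [f"t:{round(t, precision)}", f"x:{round(x, precision)}"]
    return tokens

def decode_symbolic(output_tokens):
    """Join the model's predicted symbol tokens into an ODE expression."""
    return " ".join(output_tokens)

# Example: irregular, slightly noisy samples of x(t) = exp(t), i.e. dx/dt = x.
ts = [0.0, 0.13, 0.45, 0.9]
xs = [1.0, 1.14, 1.57, 2.46]
src = encode_trajectory(ts, xs)

# A pretrained model would map `src` to symbol tokens in a few forward
# passes; here we hard-code a plausible output for illustration.
pred_tokens = ["dx/dt", "=", "x"]
print(decode_symbolic(pred_tokens))  # dx/dt = x
```

In this framing, "a few forward passes" at inference time amounts to autoregressively decoding the short symbolic token sequence, which is why recovery is cheap once pretraining is done.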

Original language: English
Pages (from - to): 1978-2002
Number of pages: 25
Journal: Proceedings of Machine Learning Research
Volume: 202
Publication status: Published - 2023
Event: 40th International Conference on Machine Learning, ICML 2023 - Honolulu, United States
Duration: 23 July 2023 - 29 July 2023
