Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting

Stephan Thaler, Julija Zavadlav

Research output: Contribution to journal › Article › peer-review

37 Scopus citations

Abstract

In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have seen tremendous success recently. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables. Leveraging thermodynamic perturbation theory, we avoid exploding gradients and achieve around 2 orders of magnitude speed-up in gradient computation for top-down learning. We show the effectiveness of DiffTRe in learning NN potentials for an atomistic model of diamond and a coarse-grained model of water based on diverse experimental observables, including thermodynamic, structural and mechanical properties. Importantly, DiffTRe also generalizes bottom-up structural coarse-graining methods such as iterative Boltzmann inversion to arbitrary potentials. The presented method constitutes an important milestone towards enriching NN potentials with experimental data, particularly when accurate bottom-up data is unavailable.
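The core reweighting idea can be illustrated with a minimal sketch. This is not the authors' implementation; it is a toy one-dimensional harmonic "ensemble" with a hypothetical stiffness parameter `k`, showing how thermodynamic-perturbation reweighting lets one differentiate an ensemble-averaged observable with respect to potential parameters without backpropagating through any simulation: configurations are sampled once from a reference potential, then Boltzmann reweighted to the perturbed potential, and the gradient flows only through the weights.

```python
import jax
import jax.numpy as jnp

beta = 1.0  # inverse temperature (assumed units)

def potential(x, k):
    # Toy harmonic potential U(x; k) = 0.5 * k * x^2 standing in for an NN potential.
    return 0.5 * k * x ** 2

def reweighted_observable(k, xs, k_ref, observable):
    # Thermodynamic perturbation reweighting: configurations xs were sampled
    # under the reference potential (k_ref); weight each one by
    # w_i ∝ exp(-beta * (U_k(x_i) - U_ref(x_i))) to estimate the average
    # under the perturbed potential (k) without resampling.
    dU = potential(xs, k) - potential(xs, k_ref)
    w = jax.nn.softmax(-beta * dU)  # normalized Boltzmann weights
    return jnp.sum(w * observable(xs))

k_ref = 2.0
# Draw samples from the reference Boltzmann distribution, which for a
# harmonic well is Gaussian with variance 1 / (beta * k_ref).
key = jax.random.PRNGKey(0)
xs = jax.random.normal(key, (100_000,)) / jnp.sqrt(beta * k_ref)

# Gradient of the ensemble average <x^2> w.r.t. the stiffness k, taken
# at k = k_ref: differentiation goes through the reweighting formula only,
# never through an MD trajectory.
g = jax.grad(reweighted_observable, argnums=0)(k_ref, xs, k_ref, lambda x: x ** 2)
# Analytically <x^2> = 1/(beta*k), so d<x^2>/dk = -1/(beta*k^2) = -0.25 here.
```

Because the weights all equal 1/N when the parameters match the reference, the estimator reduces to the ordinary trajectory average at the reference point while still carrying a well-defined, bounded gradient, which is what avoids the exploding gradients of differentiating through the simulation itself.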

Original language: English
Article number: 6884
Journal: Nature Communications
Volume: 12
Issue number: 1
DOIs
State: Published - Dec 2021

