ΦFlow: Differentiable Simulations for PyTorch, TensorFlow and Jax

Philipp Holl, Nils Thuerey

Research output: Contribution to journal › Conference article › peer-review


Abstract

Differentiable processes have proven an invaluable tool for machine learning (ML) in scientific and engineering settings, but most ML libraries are not primarily designed for such applications. We present ΦFlow, a Python toolkit that seamlessly integrates with PyTorch, TensorFlow, Jax and NumPy, simplifying the process of writing differentiable simulation code at every step. ΦFlow provides many essential features that go beyond the capabilities of the base libraries, such as differential operators, boundary conditions, the ability to write dimensionality-agnostic code, floating-point precision management, fully differentiable preconditioned (sparse) linear solves, automatic matrix generation via function tracing, integration of SciPy optimizers, simulation vectorization, and visualization tools. At the same time, ΦFlow inherits all important traits of the base ML libraries, such as GPU/TPU support, just-in-time compilation, and automatic differentiation. Put together, these features drastically simplify scientific code like PDE or ODE solvers on grids or unstructured meshes, and ΦFlow even includes out-of-the-box support for fluid simulations. ΦFlow has been used in various publications and as a ground-truth solver in multiple scientific data sets. It is available at https://github.com/tum-pbs/PhiFlow.
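The following is a minimal, illustrative sketch (not taken from the paper) of how the dimensionality-agnostic, backend-agnostic style described in the abstract looks in practice. It assumes the ΦFlow 2.x-style API (CenteredGrid, Noise, extrapolation, diffuse.explicit); exact names and signatures may differ between ΦFlow versions, and the diffusion setup here is purely hypothetical.

```python
# Hedged sketch of ΦFlow-style code, assuming the 2.x API; names may vary by version.
from phi.flow import *  # NumPy backend; phi.torch.flow / phi.tf.flow / phi.jax.flow switch backends

# Dimensionality-agnostic grid with named spatial dimensions and periodic boundaries.
# Adding z=64 below would turn this into a 3D simulation without further code changes.
temperature = CenteredGrid(Noise(), extrapolation.PERIODIC, x=64, y=64, bounds=Box(x=100, y=100))

def diffuse_step(t, diffusivity=0.5, dt=1.0):
    # Explicit diffusion step built on ΦFlow's differential operators (hypothetical parameters).
    return diffuse.explicit(t, diffusivity, dt)

for _ in range(10):
    temperature = diffuse_step(temperature)

print(temperature.values)  # backend tensor: NumPy here, PyTorch/TF/Jax with the other imports
```

Because the grid carries named dimensions, boundary conditions, and bounds, the same step function can be vectorized over batch dimensions or differentiated through with the chosen ML backend, which is the workflow the abstract describes.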

Original language: English
Pages (from-to): 18515-18546
Number of pages: 32
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024
