PRISM: Probabilistic Real-Time Inference in Spatial World Models

Atanas Mirchev, Baris Kayalibay, Ahmed Agha, Patrick van der Smagt, Daniel Cremers, Justin Bayer

Research output: Conference article (peer-reviewed)

Abstract

We introduce PRISM, a method for real-time filtering in a probabilistic generative model of agent motion and visual perception. Previous approaches either lack uncertainty estimates for the map and agent state, do not run in real time, lack a dense scene representation, or do not model agent dynamics. Our solution reconciles all of these aspects. We start from a predefined state-space model which combines differentiable rendering and 6-DoF dynamics. Probabilistic inference in this model amounts to simultaneous localisation and mapping (SLAM) and is intractable. We use a series of approximations to Bayesian inference to arrive at probabilistic map and state estimates. We take advantage of well-established methods and closed-form updates, preserving accuracy and enabling real-time capability. The proposed solution runs in real time at 10 Hz and achieves accuracy comparable to state-of-the-art SLAM in small to medium-sized indoor environments, with high-speed UAV and handheld camera agents (Blackbird, EuRoC and TUM-RGBD).
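The abstract mentions recursive Bayesian filtering with closed-form updates as the backbone of real-time inference. As a purely illustrative sketch (not PRISM's actual model, which uses differentiable rendering and 6-DoF dynamics), the predict/update cycle of a linear-Gaussian Bayes filter — the classic Kalman filter — shows what such closed-form belief updates look like; all dynamics and noise parameters below are hypothetical:

```python
import numpy as np

def predict(mean, cov, F, Q):
    """Propagate the Gaussian belief through linear dynamics x' = F x + w."""
    return F @ mean, F @ cov @ F.T + Q

def update(mean, cov, z, H, R):
    """Condition the belief on a noisy observation z = H x + v (closed form)."""
    S = H @ cov @ H.T + R                 # innovation covariance
    K = cov @ H.T @ np.linalg.inv(S)      # Kalman gain
    mean = mean + K @ (z - H @ mean)
    cov = (np.eye(len(mean)) - K @ H) @ cov
    return mean, cov

# Hypothetical toy setup: 2-D state (position, velocity), position observed.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity dynamics
Q = 1e-3 * np.eye(2)                      # process noise
H = np.array([[1.0, 0.0]])                # observe position only
R = np.array([[1e-2]])                    # observation noise

mean, cov = np.zeros(2), np.eye(2)        # broad prior belief
for z in ([0.11], [0.22], [0.29]):        # a short stream of position readings
    mean, cov = predict(mean, cov, F, Q)
    mean, cov = update(mean, cov, np.array(z), H, R)

print(mean.shape, cov.shape)
```

Each iteration first propagates uncertainty through the dynamics, then shrinks it by conditioning on the observation; the posterior covariance after a few updates is much tighter than the prior. PRISM's contribution lies in making analogous updates tractable when the observation model is a differentiable renderer over a dense probabilistic map.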

Original language: English
Pages (from-to): 161-174
Number of pages: 14
Journal: Proceedings of Machine Learning Research
Volume: 205
State: Published - 2023
Event: 6th Conference on Robot Learning, CoRL 2022 - Auckland, New Zealand
Duration: 14 Dec 2022 - 18 Dec 2022

Keywords

  • Bayes filter
  • SLAM
  • diff. rendering
  • generative model
  • uncertainty
