Abstract
We introduce PRISM, a method for real-time filtering in a probabilistic generative model of agent motion and visual perception. Previous approaches either lack uncertainty estimates for the map and agent state, do not run in real time, lack a dense scene representation, or do not model agent dynamics. Our solution reconciles all of these aspects. We start from a predefined state-space model which combines differentiable rendering and 6-DoF dynamics. Probabilistic inference in this model amounts to simultaneous localisation and mapping (SLAM) and is intractable. We use a series of approximations to Bayesian inference to arrive at probabilistic map and state estimates. We take advantage of well-established methods and closed-form updates, preserving accuracy and enabling real-time operation. The proposed solution runs in real time at 10 Hz and matches the accuracy of state-of-the-art SLAM in small to medium-sized indoor environments, with high-speed UAV and handheld camera agents (Blackbird, EuRoC and TUM-RGBD).
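The closed-form Bayesian updates mentioned above can be illustrated with a minimal one-dimensional Kalman-style predict/update cycle. This is a generic sketch of Gaussian filtering in a state-space model, not PRISM's actual rendering-based 6-DoF model; all names and noise values here are hypothetical.

```python
def predict(mean, var, motion, process_noise):
    # Propagate a Gaussian state estimate through a linear motion model.
    return mean + motion, var + process_noise

def update(mean, var, measurement, meas_noise):
    # Closed-form Gaussian conditioning on a noisy observation
    # (the scalar Kalman update).
    gain = var / (var + meas_noise)
    return mean + gain * (measurement - mean), (1.0 - gain) * var

# One filtering cycle: predict with the dynamics, correct with an observation.
mean, var = 0.0, 1.0
mean, var = predict(mean, var, motion=1.0, process_noise=0.5)
mean, var = update(mean, var, measurement=1.2, meas_noise=0.5)
print(mean, var)  # posterior mean and variance after one cycle
```

Because both steps stay in the Gaussian family, each cycle is a constant-time closed-form computation, which is what makes this style of filtering compatible with real-time operation.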
| Original language | English |
| --- | --- |
| Pages (from-to) | 161-174 |
| Number of pages | 14 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 205 |
| State | Published - 2023 |
| Event | 6th Conference on Robot Learning, CoRL 2022 - Auckland, New Zealand. Duration: 14 Dec 2022 → 18 Dec 2022 |
Keywords
- Bayes filter
- SLAM
- differentiable rendering
- generative model
- uncertainty