Visual-Inertial Multi-Instance Dynamic SLAM with Object-level Relocalisation

Yifei Ren, Binbin Xu, Christopher L. Choi, Stefan Leutenegger

Publication: Chapter in book/report/conference proceedings › Conference contribution › Peer-reviewed

9 citations (Scopus)

Abstract

In this paper, we present a tightly-coupled, visual-inertial, object-level, multi-instance dynamic SLAM system. Even in extremely dynamic scenes, it robustly optimises for the camera pose, velocity, and IMU biases, and builds a dense, object-level 3D reconstruction of the environment. Thanks to its robust sensor and object tracking, the system tracks and reconstructs the geometry, semantics, and motion of arbitrary objects by incrementally fusing associated colour, depth, semantic, and foreground-object probabilities into each object model. In addition, when an object is lost or moves outside the camera's field of view, the system reliably recovers its pose upon re-observation. We demonstrate the robustness and accuracy of our method by testing it quantitatively and qualitatively on real-world data sequences.
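The abstract does not give implementation details, but the per-object integration it describes is in the spirit of weighted volumetric fusion. The following minimal Python sketch is a hypothetical illustration, not the authors' code: the names (Voxel, fuse, fg_prob) and the capped running-average rule are assumptions, showing how colour, a depth-derived signed distance, and a foreground-object probability could be incrementally fused into one voxel of an object model.

from dataclasses import dataclass

@dataclass
class Voxel:
    tsdf: float = 0.0              # truncated signed distance estimate
    colour: tuple = (0.0, 0.0, 0.0)
    fg_prob: float = 0.5           # probability that the voxel belongs to the object
    weight: float = 0.0            # accumulated fusion weight

def fuse(voxel: Voxel, sdf_meas: float, colour_meas: tuple, fg_meas: float,
         w_meas: float = 1.0, w_max: float = 100.0) -> Voxel:
    """Fuse one associated measurement into a voxel with a capped running average.
    (Illustrative sketch only; not the published system's fusion rule.)"""
    blend = w_meas / (voxel.weight + w_meas)
    voxel.tsdf = (1.0 - blend) * voxel.tsdf + blend * sdf_meas
    voxel.colour = tuple((1.0 - blend) * c0 + blend * c1
                         for c0, c1 in zip(voxel.colour, colour_meas))
    voxel.fg_prob = (1.0 - blend) * voxel.fg_prob + blend * fg_meas
    voxel.weight = min(voxel.weight + w_meas, w_max)  # cap so old data cannot dominate forever
    return voxel

# Example: three observations of the same voxel; the last one is masked as background.
v = Voxel()
for sdf, rgb, fg in [(0.02, (0.8, 0.2, 0.2), 0.9),
                     (0.01, (0.7, 0.3, 0.2), 0.8),
                     (0.05, (0.6, 0.3, 0.3), 0.1)]:
    fuse(v, sdf, rgb, fg)
print(v)

In a system of the kind described, each detected object instance would hold its own such volumetric model together with a pose, and the fused foreground probability would indicate which surfaces belong to the object as it is tracked or relocalised; the exact state estimation and data association are, of course, specified in the paper itself.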

Original language: English
Title: IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 11055-11062
Number of pages: 8
ISBN (electronic): 9781665479271
DOIs
Publication status: Published - 2022
Event: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022 - Kyoto, Japan
Duration: 23 Oct 2022 - 27 Oct 2022

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems
Volume: 2022-October
ISSN (print): 2153-0858
ISSN (electronic): 2153-0866

Conference

Conference: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2022
Country/Territory: Japan
City: Kyoto
Period: 23/10/22 - 27/10/22
