TY - GEN
T1 - Towards Robust Monocular Visual Odometry for Flying Robots on Planetary Missions
AU - Wudenka, M.
AU - Müller, M. G.
AU - Demmel, N.
AU - Wedler, A.
AU - Triebel, R.
AU - Cremers, D.
AU - Stürzl, W.
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - In the future, extraterrestrial expeditions will not only be conducted by rovers but also by flying robots. The technical demonstration drone Ingenuity, which just landed on Mars, will mark the beginning of a new era of exploration unhindered by terrain traversability. Robust self-localization is crucial for that. Cameras, which are lightweight, cheap, and information-rich sensors, are already used to estimate the ego-motion of vehicles. However, methods proven to work in man-made environments cannot simply be deployed on other planets. The highly repetitive textures present in the wastelands of Mars pose a huge challenge to descriptor-matching-based approaches. In this paper, we present an advanced robust monocular odometry algorithm that uses efficient optical flow tracking to obtain feature correspondences between images and a refined keyframe selection criterion. In contrast to most other approaches, our framework can also handle rotation-only motions that are particularly challenging for monocular odometry systems. Furthermore, we present a novel approach to estimate the current risk of scale drift based on a principal component analysis of the relative translation information matrix. This way we obtain an implicit measure of uncertainty. We evaluate the validity of our approach on all sequences of a challenging real-world dataset captured in a Mars-like environment and show that it outperforms state-of-the-art approaches. The source code is publicly available at: https://github.com/DLR-RM/granit.
AB - In the future, extraterrestrial expeditions will not only be conducted by rovers but also by flying robots. The technical demonstration drone Ingenuity, which just landed on Mars, will mark the beginning of a new era of exploration unhindered by terrain traversability. Robust self-localization is crucial for that. Cameras, which are lightweight, cheap, and information-rich sensors, are already used to estimate the ego-motion of vehicles. However, methods proven to work in man-made environments cannot simply be deployed on other planets. The highly repetitive textures present in the wastelands of Mars pose a huge challenge to descriptor-matching-based approaches. In this paper, we present an advanced robust monocular odometry algorithm that uses efficient optical flow tracking to obtain feature correspondences between images and a refined keyframe selection criterion. In contrast to most other approaches, our framework can also handle rotation-only motions that are particularly challenging for monocular odometry systems. Furthermore, we present a novel approach to estimate the current risk of scale drift based on a principal component analysis of the relative translation information matrix. This way we obtain an implicit measure of uncertainty. We evaluate the validity of our approach on all sequences of a challenging real-world dataset captured in a Mars-like environment and show that it outperforms state-of-the-art approaches. The source code is publicly available at: https://github.com/DLR-RM/granit.
UR - http://www.scopus.com/inward/record.url?scp=85124365025&partnerID=8YFLogxK
U2 - 10.1109/IROS51168.2021.9636844
DO - 10.1109/IROS51168.2021.9636844
M3 - Conference contribution
AN - SCOPUS:85124365025
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 8737
EP - 8744
BT - IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2021
Y2 - 27 September 2021 through 1 October 2021
ER -