Semi-dense visual odometry for a monocular camera

Jakob Engel, Jurgen Sturm, Daniel Cremers

Publication: Contribution to book/report › Conference contribution › Peer-reviewed

460 citations (Scopus)

Abstract

We propose a fundamentally novel approach to real-time visual odometry for a monocular camera. It allows us to benefit from the simplicity and accuracy of dense tracking, which does not depend on visual features, while running in real-time on a CPU. The key idea is to continuously estimate a semi-dense inverse depth map for the current frame, which in turn is used to track the motion of the camera using dense image alignment. More specifically, we estimate the depth of all pixels that have a non-negligible image gradient. Each estimate is represented as a Gaussian probability distribution over the inverse depth. We propagate this information over time and update it with new measurements as new images arrive. In terms of tracking accuracy and computational speed, the proposed method compares favorably to both state-of-the-art dense and feature-based visual odometry and SLAM algorithms. As our method runs in real-time on a CPU, it is of large practical value for robotics and augmented reality applications.
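The abstract describes each per-pixel inverse-depth estimate as a Gaussian that is propagated over time and updated with new measurements. The core of such an update is the product of two Gaussians, as in a scalar Kalman filter. The sketch below is illustrative only and assumes this standard fusion rule; the function name, variables, and example values are hypothetical and not taken from the paper's implementation.

```python
def fuse_inverse_depth(mu_prior, var_prior, mu_obs, var_obs):
    """Fuse a propagated inverse-depth hypothesis (mu_prior, var_prior)
    with a new per-frame observation (mu_obs, var_obs) by multiplying
    the two Gaussians, i.e. a scalar Kalman-style update.

    Illustrative sketch only; not the paper's exact update rule.
    """
    # Posterior variance is the harmonic combination of both variances,
    # so the fused estimate is always at least as certain as either input.
    var_post = (var_prior * var_obs) / (var_prior + var_obs)
    # Posterior mean is the variance-weighted average of the two means.
    mu_post = (var_obs * mu_prior + var_prior * mu_obs) / (var_prior + var_obs)
    return mu_post, var_post

# Hypothetical example: a prior inverse depth of 0.5 1/m is fused with a
# new observation of 0.6 1/m of equal uncertainty.
mu, var = fuse_inverse_depth(0.5, 0.04, 0.6, 0.04)  # -> (0.55, 0.02)
```

With equal variances the fused mean lands midway between prior and observation, and the variance halves, which matches the intuition that each new frame sharpens the semi-dense depth map.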

Original language: English
Title: Proceedings - 2013 IEEE International Conference on Computer Vision, ICCV 2013
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1449-1456
Number of pages: 8
ISBN (Print): 9781479928392
DOIs
Publication status: Published - 2013
Event: 2013 14th IEEE International Conference on Computer Vision, ICCV 2013 - Sydney, NSW, Australia
Duration: 1 Dec 2013 - 8 Dec 2013

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision

Conference

Conference: 2013 14th IEEE International Conference on Computer Vision, ICCV 2013
Country/Territory: Australia
City: Sydney, NSW
Period: 1/12/13 - 8/12/13
