Real-time visual odometry from dense RGB-D images

Frank Steinbrücker, Jürgen Sturm, Daniel Cremers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

322 Scopus citations

Abstract

We present an energy-based approach to visual odometry from RGB-D images of a Microsoft Kinect camera. To this end, we propose an energy function which aims at finding the best rigid body motion to map one RGB-D image into another one, assuming a static scene filmed by a moving camera. We then propose a linearization of the energy function which leads to a 6×6 normal equation for the twist coordinates representing the rigid body motion. To allow for larger motions, we solve this equation in a coarse-to-fine scheme. Extensive quantitative analysis on recently proposed benchmark datasets shows that the proposed solution is faster than a state-of-the-art implementation of the iterative closest point (ICP) algorithm by two orders of magnitude. While ICP is more robust to large camera motion, the proposed method gives better results in the regime of small displacements, which are common in camera tracking applications.
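The following is a minimal illustrative sketch, not the authors' implementation, of the core idea described in the abstract: photometric residuals between a warped first frame and a second frame are linearized around the current pose, yielding a 6×6 normal equation in twist coordinates that is solved with a Gauss-Newton step. The function names (e.g. gauss_newton_step, se3_exp), the pinhole intrinsics K, and the nearest-neighbor image lookup are assumptions made for this sketch.

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def se3_exp(xi):
    """Exponential map from a twist xi = (v, w) to a 4x4 rigid-body motion."""
    v, w = xi[:3], xi[3:]
    theta = np.linalg.norm(w)
    W = hat(w)
    if theta < 1e-8:
        R = np.eye(3) + W
        V = np.eye(3) + 0.5 * W
    else:
        R = (np.eye(3) + np.sin(theta) / theta * W
             + (1 - np.cos(theta)) / theta**2 * W @ W)
        V = (np.eye(3) + (1 - np.cos(theta)) / theta**2 * W
             + (theta - np.sin(theta)) / theta**3 * W @ W)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = V @ v
    return T

def gauss_newton_step(I0, D0, I1, K, T):
    """One linearized update: accumulate J^T J xi = -J^T r and solve the 6x6 system."""
    h, w = I0.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    gy, gx = np.gradient(I1)          # image gradients of the second frame
    A = np.zeros((6, 6))
    b = np.zeros(6)
    for v_px in range(h):
        for u_px in range(w):
            z = D0[v_px, u_px]
            if z <= 0:
                continue
            # Back-project the pixel of frame 0 and transform by the current estimate T.
            p = np.array([(u_px - cx) * z / fx, (v_px - cy) * z / fy, z, 1.0])
            q = T @ p
            X, Y, Z = q[0], q[1], q[2]
            if Z <= 0:
                continue
            # Project into frame 1 (nearest-neighbor lookup for simplicity).
            u1, v1 = fx * X / Z + cx, fy * Y / Z + cy
            ui, vi = int(round(u1)), int(round(v1))
            if not (0 <= ui < w and 0 <= vi < h):
                continue
            r = I1[vi, ui] - I0[v_px, u_px]            # photometric residual
            # Chain rule: image gradient * projection Jacobian * point-vs-twist Jacobian.
            Jw = np.array([[fx / Z, 0, -fx * X / Z**2],
                           [0, fy / Z, -fy * Y / Z**2]])
            Jg = np.array([gx[vi, ui], gy[vi, ui]])
            Jp = np.hstack([np.eye(3), -hat(q[:3])])
            J = Jg @ Jw @ Jp                           # one 1x6 row of the Jacobian
            A += np.outer(J, J)
            b += -J * r
    xi = np.linalg.solve(A + 1e-6 * np.eye(6), b)      # the 6x6 normal equation
    return se3_exp(xi) @ T
```

In a coarse-to-fine scheme, a step like this would be repeated on an image pyramid from the coarsest to the finest level, warm-starting each level with the estimate from the previous one; a practical implementation would also vectorize the per-pixel loop.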

Original language: English
Title of host publication: 2011 IEEE International Conference on Computer Vision Workshops, ICCV Workshops 2011
Pages: 719-722
Number of pages: 4
DOIs
State: Published - 2011
Event: 2011 IEEE International Conference on Computer Vision Workshops, ICCV Workshops 2011 - Barcelona, Spain
Duration: 6 Nov 2011 → 13 Nov 2011

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision

Conference

Conference: 2011 IEEE International Conference on Computer Vision Workshops, ICCV Workshops 2011
Country/Territory: Spain
City: Barcelona
Period: 6/11/11 → 13/11/11
