Efficient dense scene flow from sparse or dense stereo data

Andreas Wedel, Clemens Rabe, Tobi Vaudrey, Thomas Brox, Uwe Franke, Daniel Cremers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

95 Scopus citations

Abstract

This paper presents a technique for estimating the three-dimensional velocity vector field that describes the motion of each visible scene point (scene flow). The technique uses two consecutive image pairs from a stereo sequence. The main contribution is to decouple the position and velocity estimation steps, and to estimate dense velocities using a variational approach. We enforce the scene flow to yield consistent displacement vectors in the left and right images. The decoupling strategy has two main advantages: first, the disparity estimation technique can be chosen freely and may yield either sparse or dense correspondences; second, frame rates of 5 fps can be achieved on standard consumer hardware. The approach provides dense velocity estimates with accurate results at distances of up to 50 meters.
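
The decoupling idea can be illustrated outside the paper's variational framework. The sketch below is not the authors' method: it assumes OpenCV and NumPy, rectified 8-bit grayscale input, and hypothetical camera parameters (f, baseline, cx, cy), substitutes semi-global block matching for the interchangeable disparity step and Farnebäck optical flow for the variational velocity estimation, and omits the left/right consistency constraint. It only shows how per-pixel disparity and image motion combine into 3D velocity vectors.

```python
# Minimal sketch of the decoupling strategy (stand-ins, not the paper's solver):
# 1) estimate disparity at t0 and t1 with any stereo matcher (here dense SGBM),
# 2) estimate image motion in the left camera (here Farneback optical flow),
# 3) back-project both time steps and difference the 3D points (scene flow).
import cv2
import numpy as np

def scene_flow(left_t0, right_t0, left_t1, right_t1,
               f=800.0, baseline=0.12, cx=320.0, cy=240.0):  # hypothetical camera
    # Step 1: disparity maps (replaceable by any sparse or dense technique).
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disp_t0 = sgbm.compute(left_t0, right_t0).astype(np.float32) / 16.0
    disp_t1 = sgbm.compute(left_t1, right_t1).astype(np.float32) / 16.0

    # Step 2: dense 2D motion in the left image between t0 and t1.
    flow = cv2.calcOpticalFlowFarneback(left_t0, left_t1, None,
                                        0.5, 3, 21, 3, 5, 1.1, 0)

    h, w = left_t0.shape
    u, v = np.meshgrid(np.arange(w, dtype=np.float32),
                       np.arange(h, dtype=np.float32))

    def backproject(u_px, v_px, disp):
        # Pinhole back-projection; invalid disparities become NaN.
        z = np.where(disp > 0, f * baseline / np.maximum(disp, 1e-6), np.nan)
        x = (u_px - cx) * z / f
        y = (v_px - cy) * z / f
        return np.stack([x, y, z], axis=-1)

    # Step 3: position at t0, position of the same point at t1 (follow the flow,
    # sample the t1 disparity at the displaced pixel), then take the difference.
    p_t0 = backproject(u, v, disp_t0)
    u1 = np.clip(u + flow[..., 0], 0, w - 1)
    v1 = np.clip(v + flow[..., 1], 0, h - 1)
    disp_at_t1 = disp_t1[v1.round().astype(int), u1.round().astype(int)]
    p_t1 = backproject(u1, v1, disp_at_t1)

    return p_t1 - p_t0  # per-pixel 3D velocity (meters per frame)
```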

Original language: English
Title of host publication: Computer Vision - ECCV 2008 - 10th European Conference on Computer Vision, Proceedings
Publisher: Springer Verlag
Pages: 739-751
Number of pages: 13
Edition: PART 1
ISBN (Print): 3540886818, 9783540886815
DOIs
State: Published - 2008
Externally published: Yes
Event: 10th European Conference on Computer Vision, ECCV 2008 - Marseille, France
Duration: 12 Oct 2008 - 18 Oct 2008

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 5302 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 10th European Conference on Computer Vision, ECCV 2008
Country/Territory: France
City: Marseille
Period: 12/10/08 - 18/10/08
