Visual odometry based on random finite set statistics in urban environment

Feihu Zhang, Guang Chen, Hauke Stähle, Christian Buckl, Alois Knoll

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Scopus citations

Abstract

This paper presents a novel approach to estimating a vehicle's trajectory in complex urban environments. In previous work, we presented a visual odometry solution that estimates frame-to-frame motion from a single camera based on Random Finite Set (RFS) statistics. This paper extends that work by combining stereo cameras with a gyroscope sensor. We are among the first to apply RFS statistics to visual odometry in real traffic scenes. The method consists of two phases: a preprocessing phase that extracts features from the image and transforms their coordinates from image space to vehicle coordinates, and a tracking phase that estimates the egomotion vector of the camera. We treat the features as a group target and use the Probability Hypothesis Density (PHD) filter to update the overall group state, which represents the motion vector. Compared to other approaches, our method provides a recursive filtering algorithm that dynamically estimates multiple-target states in the presence of clutter and high association uncertainty. Experimental results show that the method remains robust across a variety of scenarios.
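The abstract describes the tracking phase only at a high level. Purely as an illustration (not the authors' implementation, which uses a full PHD filter over RFS statistics with calibrated stereo and gyroscope inputs), a minimal Python sketch of the underlying idea might look as follows: matched feature displacements are treated as one set-valued observation of a single frame-to-frame motion vector, and a particle-style recursive filter weights motion hypotheses by how well they explain the whole set while tolerating clutter. All function names, noise levels, and the synthetic data are assumptions made only for this sketch.

# Minimal illustrative sketch (NOT the paper's PHD-filter implementation): estimate a
# planar frame-to-frame egomotion vector (dx, dy) from a set of matched feature
# displacements with a simple particle-style recursive filter. All parameter values
# and the synthetic data below are assumptions chosen only for illustration.
import numpy as np

rng = np.random.default_rng(0)

def propose_particles(prev_motion, spread, n=500):
    # Spread motion hypotheses around the previous estimate (a crude motion prior).
    return prev_motion + rng.normal(scale=spread, size=(n, 2))

def log_weights(particles, displacements, noise=0.1, clutter_ratio=0.2):
    # Score each hypothesis by how well it explains the whole feature set;
    # the uniform clutter term keeps weights robust to outlier matches.
    resid = displacements[None, :, :] - particles[:, None, :]      # (n, m, 2)
    lik = np.exp(-0.5 * np.sum(resid ** 2, axis=2) / noise ** 2)   # (n, m)
    lik = (1.0 - clutter_ratio) * lik + clutter_ratio
    return np.sum(np.log(lik), axis=1)                             # (n,)

def estimate_motion(prev_motion, displacements, spread):
    particles = propose_particles(prev_motion, spread)
    logw = log_weights(particles, displacements)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return w @ particles                                           # weighted mean

# Synthetic demo: true motion (0.8, -0.3) per frame, 40 matched features, 8 clutter matches.
true_motion = np.array([0.8, -0.3])
displacements = true_motion + rng.normal(scale=0.1, size=(40, 2))
displacements[:8] = rng.uniform(-2.0, 2.0, size=(8, 2))
estimate = np.zeros(2)
for spread in (1.0, 0.2, 0.05):                                    # coarse-to-fine refinement
    estimate = estimate_motion(estimate, displacements, spread)
print(estimate)                                                    # close to [0.8, -0.3]

Unlike this simplified sketch, the paper's PHD filter propagates an intensity over the whole feature set and handles feature appearance and disappearance; the sketch only illustrates the recursive, clutter-tolerant weighting idea described in the abstract.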

Original language: English
Title of host publication: 2012 IEEE Intelligent Vehicles Symposium, IV 2012
Pages: 69-74
Number of pages: 6
State: Published - 2012
Event: 2012 IEEE Intelligent Vehicles Symposium, IV 2012 - Alcalá de Henares, Madrid, Spain
Duration: 3 Jun 2012 - 7 Jun 2012

Publication series

Name: IEEE Intelligent Vehicles Symposium, Proceedings

Conference

Conference: 2012 IEEE Intelligent Vehicles Symposium, IV 2012
Country/Territory: Spain
City: Alcalá de Henares, Madrid
Period: 3/06/12 - 7/06/12
