Omnidirectional DSO: Direct Sparse Odometry with Fisheye Cameras

Hidenobu Matsuki, Lukas Von Stumberg, Vladyslav Usenko, Jorg Stuckler, Daniel Cremers

Research output: Contribution to journal › Article › peer-review

76 Scopus citations


We propose a novel real-time direct monocular visual odometry method for omnidirectional cameras. Our method extends direct sparse odometry (DSO) by using the unified omnidirectional model as the projection function, which applies to fisheye cameras with a field of view (FoV) well above 180°. This formulation allows the full area of the input image to be used even under strong distortion, whereas most existing visual odometry methods can only use a rectified and cropped part of it. Model parameters within an active keyframe window are jointly optimized, including the intrinsic/extrinsic camera parameters, the three-dimensional positions of points, and affine brightness parameters. Thanks to the wide FoV, the image overlap between frames is larger and points are more widely distributed spatially. Our results demonstrate that our method provides increased accuracy and robustness over state-of-the-art visual odometry algorithms.
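As a rough illustration of the projection function named in the abstract, the following is a minimal sketch of the unified omnidirectional camera model: a 3-D point is normalized onto the unit sphere, shifted along the optical axis by a distortion parameter (commonly written ξ), and then projected with a pinhole model. The function name and parameter values below are illustrative, not taken from the paper.

```python
import math

def project_unified(point, fx, fy, cx, cy, xi):
    """Project a 3-D point (x, y, z) to pixel coordinates using the
    unified omnidirectional model.

    fx, fy, cx, cy are pinhole intrinsics; xi shifts the projection
    center along the optical axis. For sufficiently large xi, points
    with z < 0 (i.e., behind the pinhole image plane) still project
    to finite pixel coordinates, which is what enables a usable
    field of view above 180 degrees.
    """
    x, y, z = point
    d = math.sqrt(x * x + y * y + z * z)  # distance to the point
    denom = z + xi * d                    # depth shifted by xi * ||p||
    u = fx * x / denom + cx
    v = fy * y / denom + cy
    return u, v

# A point on the optical axis projects to the principal point:
print(project_unified((0.0, 0.0, 1.0), 300.0, 300.0, 320.0, 240.0, 0.6))
# A point slightly behind the image plane (z < 0) still projects finitely:
print(project_unified((1.0, 0.0, -0.1), 300.0, 300.0, 320.0, 240.0, 1.0))
```

In the paper's pipeline, ξ is optimized jointly with the other model parameters inside the active keyframe window rather than fixed as above.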

Original language: English
Article number: 8410468
Pages (from-to): 3693-3700
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Issue number: 4
State: Published - Oct 2018


  • SLAM
  • omnidirectional vision
  • visual-based navigation


