TY - GEN
T1 - Large-scale direct SLAM for omnidirectional cameras
AU - Caruso, David
AU - Engel, Jakob
AU - Cremers, Daniel
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/12/11
Y1 - 2015/12/11
N2 - We propose a real-time, direct monocular SLAM method for omnidirectional or wide field-of-view fisheye cameras. Both tracking (direct image alignment) and mapping (pixel-wise distance filtering) are directly formulated for the unified omnidirectional model, which can model central imaging devices with a field of view above 180°. This is in contrast to existing direct mono-SLAM approaches like DTAM or LSD-SLAM, which operate on rectified images, in practice limiting the field of view to around 130° diagonally. Not only does this allow us to observe - and reconstruct - a larger portion of the surrounding environment, but it also makes the system more robust to degenerate (rotation-only) movement. The two main contributions are (1) the formulation of direct image alignment for the unified omnidirectional model, and (2) a fast yet accurate approach to incremental stereo directly on distorted images. We evaluate our framework on real-world sequences taken with a 185° fisheye lens, and compare it to a rectified and a piecewise-rectified approach.
AB - We propose a real-time, direct monocular SLAM method for omnidirectional or wide field-of-view fisheye cameras. Both tracking (direct image alignment) and mapping (pixel-wise distance filtering) are directly formulated for the unified omnidirectional model, which can model central imaging devices with a field of view above 180°. This is in contrast to existing direct mono-SLAM approaches like DTAM or LSD-SLAM, which operate on rectified images, in practice limiting the field of view to around 130° diagonally. Not only does this allow us to observe - and reconstruct - a larger portion of the surrounding environment, but it also makes the system more robust to degenerate (rotation-only) movement. The two main contributions are (1) the formulation of direct image alignment for the unified omnidirectional model, and (2) a fast yet accurate approach to incremental stereo directly on distorted images. We evaluate our framework on real-world sequences taken with a 185° fisheye lens, and compare it to a rectified and a piecewise-rectified approach.
KW - Cameras
KW - Computational modeling
KW - Lenses
KW - Nonlinear distortion
KW - Simultaneous localization and mapping
KW - Three-dimensional displays
UR - http://www.scopus.com/inward/record.url?scp=84958184174&partnerID=8YFLogxK
U2 - 10.1109/IROS.2015.7353366
DO - 10.1109/IROS.2015.7353366
M3 - Conference contribution
AN - SCOPUS:84958184174
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 141
EP - 148
BT - IROS Hamburg 2015 - Conference Digest
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2015
Y2 - 28 September 2015 through 2 October 2015
ER -