TY - GEN
T1 - A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM
AU - Nikolic, Janosch
AU - Rehder, Joern
AU - Burri, Michael
AU - Gohl, Pascal
AU - Leutenegger, Stefan
AU - Furgale, Paul T.
AU - Siegwart, Roland
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/9/22
Y1 - 2014/9/22
N2 - Robust, accurate pose estimation and mapping in real time in six dimensions is a primary need of mobile robots, in particular flying Micro Aerial Vehicles (MAVs), which still perform their impressive maneuvers mostly in controlled environments. This work presents a visual-inertial sensor unit aimed at effortless deployment on robots in order to equip them with robust real-time Simultaneous Localization and Mapping (SLAM) capabilities, and to facilitate research on this important topic at a low entry barrier. Up to four cameras are interfaced through a modern ARM-FPGA system, along with an Inertial Measurement Unit (IMU) providing high-quality rate gyro and accelerometer measurements, calibrated and hardware-synchronized with the images. This facilitates a tight fusion of visual and inertial cues that leads to a level of robustness and accuracy which is difficult to achieve with purely visual SLAM systems. In addition to raw data, the sensor head provides FPGA-pre-processed data such as visual keypoints, reducing the computational complexity of SLAM algorithms significantly and enabling deployment on resource-constrained platforms. Sensor selection, hardware and firmware design, as well as intrinsic and extrinsic calibration are addressed in this work. Results from a tightly coupled reference visual-inertial motion estimation framework demonstrate the capabilities of the presented system.
AB - Robust, accurate pose estimation and mapping in real time in six dimensions is a primary need of mobile robots, in particular flying Micro Aerial Vehicles (MAVs), which still perform their impressive maneuvers mostly in controlled environments. This work presents a visual-inertial sensor unit aimed at effortless deployment on robots in order to equip them with robust real-time Simultaneous Localization and Mapping (SLAM) capabilities, and to facilitate research on this important topic at a low entry barrier. Up to four cameras are interfaced through a modern ARM-FPGA system, along with an Inertial Measurement Unit (IMU) providing high-quality rate gyro and accelerometer measurements, calibrated and hardware-synchronized with the images. This facilitates a tight fusion of visual and inertial cues that leads to a level of robustness and accuracy which is difficult to achieve with purely visual SLAM systems. In addition to raw data, the sensor head provides FPGA-pre-processed data such as visual keypoints, reducing the computational complexity of SLAM algorithms significantly and enabling deployment on resource-constrained platforms. Sensor selection, hardware and firmware design, as well as intrinsic and extrinsic calibration are addressed in this work. Results from a tightly coupled reference visual-inertial motion estimation framework demonstrate the capabilities of the presented system.
KW - Calibration
KW - Camera
KW - FPGA
KW - IMU
KW - SLAM
KW - Sensor Fusion
KW - Visual-Inertial Motion Estimation
UR - http://www.scopus.com/inward/record.url?scp=84929172132&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2014.6906892
DO - 10.1109/ICRA.2014.6906892
M3 - Conference contribution
AN - SCOPUS:84929172132
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 431
EP - 437
BT - Proceedings - IEEE International Conference on Robotics and Automation
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 IEEE International Conference on Robotics and Automation, ICRA 2014
Y2 - 31 May 2014 through 7 June 2014
ER -