Abstract
Enhanced types of active range imaging devices are now available for capturing dynamic scenes. By using both intensity and range images, data acquired with the same or with different range imaging devices can be fused. In this paper, an automatic image-based co-registration methodology is presented which uses a RANSAC-based scheme for the Efficient Perspective-n-Point (EPnP) algorithm. To evaluate the methodology, two different types of range imaging devices have been investigated, namely the Microsoft Kinect and the PMD [vision] CamCube 2.0. The data sets captured with the test devices have been compared to those of a reference device with respect to absolute and relative accuracy. As the presented methodology can cope with different configurations concerning measurement principle, point density and range accuracy, it shows a high potential for automated data fusion of range imaging devices.
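The co-registration described in the abstract rests on RANSAC-wrapped EPnP pose estimation from 2D-3D correspondences. The snippet below is a minimal sketch of that general step, not the authors' implementation: it assumes 2D image points (e.g. keypoints in one device's intensity image) and their corresponding 3D object points (e.g. from the other device's range data) are already matched, and it uses OpenCV's `solvePnPRansac` with the EPnP solver. The camera intrinsics, RANSAC settings and synthetic test data are illustrative placeholders only.

```python
# Sketch: estimate the pose aligning one device's 3D points to another
# device's camera frame via RANSAC + EPnP (OpenCV). Not the authors' code.
import numpy as np
import cv2


def coregister_epnp_ransac(object_points, image_points, camera_matrix, dist_coeffs=None):
    """Return rotation matrix, translation vector and inlier indices."""
    if dist_coeffs is None:
        dist_coeffs = np.zeros((4, 1))          # assume negligible lens distortion
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_points.astype(np.float32),
        image_points.astype(np.float32),
        camera_matrix,
        dist_coeffs,
        flags=cv2.SOLVEPNP_EPNP,                # EPnP as the solver inside RANSAC
        reprojectionError=3.0,                  # inlier threshold in pixels (assumed)
        iterationsCount=200,                    # RANSAC iterations (assumed)
    )
    if not ok:
        raise RuntimeError("Pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                  # rotation vector -> 3x3 matrix
    return R, tvec, inliers


if __name__ == "__main__":
    # Hypothetical usage with synthetic, noise-free correspondences.
    K = np.array([[525.0, 0.0, 320.0],          # assumed Kinect-like intrinsics
                  [0.0, 525.0, 240.0],
                  [0.0, 0.0, 1.0]])
    pts3d = np.random.uniform(-1.0, 1.0, (30, 3)) + np.array([0.0, 0.0, 3.0])
    rvec_true = np.array([[0.1], [0.2], [0.05]])
    tvec_true = np.array([[0.1], [0.0], [0.2]])
    pts2d, _ = cv2.projectPoints(pts3d, rvec_true, tvec_true, K, np.zeros((4, 1)))
    R, t, inliers = coregister_epnp_ransac(pts3d, pts2d.reshape(-1, 2), K)
    print("Estimated translation:", t.ravel())
```

The recovered rotation and translation map the 3D points of one device into the camera frame of the other, which is the transformation needed to fuse the two data sets; RANSAC makes the estimate robust to mismatched correspondences.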
| Original language | English |
|---|---|
| Pages (from-to) | 119-124 |
| Number of pages | 6 |
| Journal | International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives |
| Volume | 38 |
| Issue number | 3W22 |
| State | Published - 26 Apr 2011 |
| Externally published | Yes |
| Event | 2011 ISPRS Workshop on Photogrammetric Image Analysis, PIA 2011 - Munich, Germany; Duration: 5 Oct 2011 → 7 Oct 2011 |
Keywords
- Accuracy
- Active sensing
- Automatic
- Close-range
- Evaluation
- Range imaging
- Structured light