RGB-D sensor data correction and enhancement by introduction of an additional RGB view

Artashes Mkhitaryan, Darius Burschka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

RGB-D sensors are becoming more and more vital to robotics. Sensors such as the Microsoft Kinect and time-of-flight cameras provide 3D colored point-clouds in real time and can play a crucial role in robot vision. However, these sensors suffer from precision deficiencies, and the density of the point-clouds they provide is often insufficient. In this paper, we present a multi-camera system for correcting and enhancing the data acquired from an RGB-D sensor. Our system consists of two sensors: the RGB-D sensor (main sensor) and a regular RGB camera (auxiliary sensor). We correct and enhance the data acquired from the RGB-D sensor by placing the auxiliary sensor in close proximity to the target object and exploiting the established epipolar geometry. We have managed to reduce the relative error of the raw point-cloud from a Microsoft Kinect RGB-D sensor by 74.5% and increase its density up to 2.5 times.
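
The abstract does not give implementation details, but the core operation it describes, using the epipolar geometry between the RGB-D sensor and a nearby auxiliary RGB camera to re-estimate a scene point, amounts to two-view triangulation. The sketch below is a minimal, hypothetical illustration of that step (not the authors' implementation): it assumes the intrinsics K_main and K_aux and the relative pose (R, t) come from a standard stereo calibration, and that a correspondence between the two views has already been found (e.g., by searching along the epipolar line).

```python
# Minimal sketch of two-view triangulation between the RGB-D sensor's RGB
# camera (main) and an auxiliary RGB camera, as a stand-in for the kind of
# correction the abstract describes. All names are illustrative assumptions.

import numpy as np
import cv2


def refine_point(p_main, p_aux, K_main, K_aux, R, t):
    """Triangulate one pixel correspondence (p_main, p_aux) and return a
    3D point in the main camera's coordinate frame."""
    # Projection matrices: main camera at the origin, auxiliary camera at (R, t).
    P_main = K_main @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_aux = K_aux @ np.hstack([R, np.asarray(t, float).reshape(3, 1)])

    # cv2.triangulatePoints expects 2xN arrays of pixel coordinates and
    # returns 4xN homogeneous points.
    X_h = cv2.triangulatePoints(
        P_main, P_aux,
        np.asarray(p_main, float).reshape(2, 1),
        np.asarray(p_aux, float).reshape(2, 1),
    )
    return (X_h[:3] / X_h[3]).ravel()  # dehomogenize to (x, y, z)
```

Such a triangulated point can then replace or densify the corresponding raw Kinect depth sample; how the paper fuses the two estimates is not stated in the abstract.
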

Original language: English
Title of host publication: IROS 2013
Subtitle of host publication: New Horizon, Conference Digest - 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems
Pages: 1077-1083
Number of pages: 7
DOIs
State: Published - 2013
Event: 2013 26th IEEE/RSJ International Conference on Intelligent Robots and Systems: New Horizon, IROS 2013 - Tokyo, Japan
Duration: 3 Nov 2013 - 8 Nov 2013

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems
ISSN (Print): 2153-0858
ISSN (Electronic): 2153-0866

Conference

Conference: 2013 26th IEEE/RSJ International Conference on Intelligent Robots and Systems: New Horizon, IROS 2013
Country/Territory: Japan
City: Tokyo
Period: 3/11/13 - 8/11/13
