TY - GEN
T1 - Illumination-invariant image-based novelty detection in a cognitive mobile robot's environment
AU - Maier, Werner
AU - Bao, Fengqing
AU - Mair, Elmar
AU - Steinbach, Eckehard
AU - Burschka, Darius
PY - 2010
Y1 - 2010
N2 - Image-based scene representations enable a mobile robot to make a realistic prediction of its environment. Hence, it is able to rapidly detect changes in its surroundings by comparing its current observation with a virtual image generated from previously acquired reference images. This facilitates attentional control toward novel events. However, illumination effects can impair attentional control if the robot does not take them into account. To address this issue, we present in this paper an approach for the acquisition of illumination-invariant scene representations. Using multiple spatial image sequences captured under varying illumination conditions, the robot computes an illumination-invariant image-based environment model. With this representation and statistical models of the illumination behavior, the robot is able to robustly detect texture changes in its environment under different lighting. Experimental results show high-quality images free of illumination effects, as well as more robust novelty detection compared to state-of-the-art methods.
AB - Image-based scene representations enable a mobile robot to make a realistic prediction of its environment. Hence, it is able to rapidly detect changes in its surroundings by comparing its current observation with a virtual image generated from previously acquired reference images. This facilitates attentional control toward novel events. However, illumination effects can impair attentional control if the robot does not take them into account. To address this issue, we present in this paper an approach for the acquisition of illumination-invariant scene representations. Using multiple spatial image sequences captured under varying illumination conditions, the robot computes an illumination-invariant image-based environment model. With this representation and statistical models of the illumination behavior, the robot is able to robustly detect texture changes in its environment under different lighting. Experimental results show high-quality images free of illumination effects, as well as more robust novelty detection compared to state-of-the-art methods.
UR - http://www.scopus.com/inward/record.url?scp=77955807395&partnerID=8YFLogxK
U2 - 10.1109/ROBOT.2010.5509354
DO - 10.1109/ROBOT.2010.5509354
M3 - Conference contribution
AN - SCOPUS:77955807395
SN - 9781424450381
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 5029
EP - 5034
BT - 2010 IEEE International Conference on Robotics and Automation, ICRA 2010
T2 - 2010 IEEE International Conference on Robotics and Automation, ICRA 2010
Y2 - 3 May 2010 through 7 May 2010
ER -