Visual homing and surprise detection for cognitive mobile robots using image-based environment representations

Werner Maier, Elmar Mair, Darius Burschka, Eckehard Steinbach

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Scopus citations

Abstract

One important feature of a cognitive system is to perceive and understand its environment and to adapt its actions to changes and unforeseen situations. In this paper, we propose a scheme for visual surprise detection in cognitive mobile robots. From the robot's observation and a set of reference images previously acquired near its current viewpoint, a pixelwise surprise trigger is computed using Bayesian probabilistic inference techniques. With appropriate mathematical approximations, this algorithm can be implemented on modern graphics hardware, which allows for near real-time surprise detection. In order to refer to prior observations, a mobile robot has to be able to re-localize itself with respect to its environment. Thus, we also present two online image-based homing algorithms, both of which facilitate the computation of location-independent surprise triggers. Experiments show acceptable results in terms of robust and fast detection of unexpected changes in the environment.
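The abstract outlines a pixelwise Bayesian surprise trigger computed from the current observation and nearby reference views. As a rough illustration of the general idea only (not the authors' exact formulation), the sketch below assumes Itti/Baldi-style Bayesian surprise: per-pixel Gaussian beliefs built from the reference images, a conjugate update with the new observation, and surprise measured as the KL divergence between posterior and prior. The function name `pixelwise_surprise` and the `obs_var` noise parameter are hypothetical.

```python
import numpy as np

def pixelwise_surprise(reference_stack, observation, obs_var=25.0):
    """Per-pixel Bayesian surprise, sketched as the KL divergence
    between posterior and prior Gaussian pixel beliefs.
    NOTE: this is an assumed Itti/Baldi-style model, not necessarily
    the formulation used in the paper.

    reference_stack : (N, H, W) grayscale reference views near the
                      current viewpoint
    observation     : (H, W) current camera image
    obs_var         : assumed observation-noise variance (hypothetical)
    """
    # Prior belief per pixel, estimated from the reference images.
    mu0 = reference_stack.mean(axis=0)
    var0 = reference_stack.var(axis=0) + 1e-6  # guard against zero variance

    # Conjugate Gaussian update with the new observation.
    var1 = 1.0 / (1.0 / var0 + 1.0 / obs_var)
    mu1 = var1 * (mu0 / var0 + observation / obs_var)

    # Closed-form KL( N(mu1, var1) || N(mu0, var0) ) per pixel.
    kl = 0.5 * (np.log(var0 / var1) + (var1 + (mu1 - mu0) ** 2) / var0 - 1.0)
    return kl

# Example: flag pixels whose surprise exceeds a threshold.
refs = np.random.default_rng(0).normal(128.0, 5.0, size=(10, 48, 64))
obs = refs.mean(axis=0)
obs[20:30, 30:40] += 80.0          # simulate an unexpected object
mask = pixelwise_surprise(refs, obs) > 1.0
print(mask.sum(), "surprising pixels")
```

Because the update and KL divergence are independent per pixel, a computation of this shape maps naturally onto graphics hardware, consistent with the GPU implementation the abstract describes.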

Original language: English
Title of host publication: 2009 IEEE International Conference on Robotics and Automation, ICRA '09
Pages: 807-812
Number of pages: 6
DOIs
State: Published - 2009
Event: 2009 IEEE International Conference on Robotics and Automation, ICRA '09 - Kobe, Japan
Duration: 12 May 2009 – 17 May 2009

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Conference

Conference: 2009 IEEE International Conference on Robotics and Automation, ICRA '09
Country/Territory: Japan
City: Kobe
Period: 12/05/09 – 17/05/09
