Yielding Self-Perception in Robots Through Sensorimotor Contingencies

Pablo Lanillos, Emmanuel Dean-Leon, Gordon Cheng

Research output: Contribution to journal › Article › peer-review

26 Scopus citations

Abstract

We address self-perception in robots as the key to world understanding and causality interpretation. We present a self-perception mechanism that enables a humanoid robot to understand certain sensory changes caused by naive actions during interaction with objects. Visual, proprioceptive, and tactile cues are combined via artificial attention and probabilistic reasoning, allowing the robot to discern between in-body and out-body sources in the scene. With that support, and exploiting intermodal sensorimotor contingencies, the robot can infer simple concepts such as discovering potentially 'usable' objects. Theoretically and through experimentation with a real humanoid robot, we show that self-perception is a foundational ability for higher-order cognitive skills. Moreover, we present a novel model for self-detection that does not need to track the body parts. Finally, results show that the proposed approach successfully discovers objects in the reaching space, improving scene understanding by discriminating real objects from visual artifacts.
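The core intuition behind self-detection via sensorimotor contingencies is that visual regions whose motion correlates with the robot's own motor commands are likely part of its body, while uncorrelated regions belong to the external world. The sketch below illustrates that general idea only; it is not the authors' model, and the signals, correlation measure, and threshold are assumptions for illustration.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def classify_regions(motor_signal, visual_motion, threshold=0.8):
    # Label a region 'self' (in-body) when its observed motion correlates
    # strongly with the robot's own motor signal, else 'other' (out-body).
    # Threshold and signal model are illustrative assumptions.
    return ["self" if pearson(motor_signal, region) > threshold else "other"
            for region in visual_motion]

# Toy example: one region driven by the arm motion, one unrelated.
random.seed(0)
t = [i * 0.1 for i in range(100)]
motor = [math.sin(v) for v in t]                                   # issued motor command
arm_region = [math.sin(v) + 0.05 * random.gauss(0, 1) for v in t]  # moves with the arm
clutter = [random.gauss(0, 1) for _ in t]                          # unrelated scene motion
print(classify_regions(motor, [arm_region, clutter]))
```

The paper's actual mechanism additionally integrates tactile cues and artificial attention under probabilistic reasoning, and avoids explicit body-part tracking; this toy correlation test only conveys why self-generated motion is a usable discriminative signal.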

Original language: English
Article number: 7740933
Pages (from-to): 100-112
Number of pages: 13
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 9
Issue number: 2
DOIs
State: Published - Jun 2017

Keywords

  • Conceptual inference
  • embodied cognition
  • multimodal integration
  • self-detection
  • self-perception
  • sensorimotor contingencies (SMCs)
