A Self-Verifying Cognitive Architecture for Robust Bootstrapping of Sensory-Motor Skills via Multipurpose Predictors

Erhard Wieser, Gordon Cheng

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

The autonomous acquisition of sensory-motor skills across multiple developmental stages is one of the current challenges in robotics. To this end, we propose a new developmental cognitive architecture that combines multipurpose predictors with principles of self-verification for the robust bootstrapping of sensory-motor skills. Our architecture operates in loops that pair mental simulation of sensory-motor sequences with their subsequent physical trial on a robot. During these loops, verification algorithms compare the predicted sensory-motor data with the physically observed data. Multiple types of predictors are acquired through several developmental stages. As a result, the architecture can select and plan actions, adapt to various robot platforms by adjusting proprioceptive feedback, predict the risk of self-collision, learn from a previous interaction stage by validating and extracting sensory-motor data for training the predictor of a subsequent stage, and finally acquire an internal representation for evaluating the performance of its predictors. Together, these cognitive capabilities bootstrap early hand-eye coordination and progressively improve it. We validate the cognitive capabilities experimentally and, in particular, demonstrate the improvement of reaching as an example skill.
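The core loop described above (predict via mental simulation, execute on the robot, verify by comparing predicted against observed data, and retrain on mismatch) can be illustrated with a minimal sketch. All names here (Predictor, robot_step, verified_loop, TOLERANCE) are hypothetical stand-ins for illustration only; the paper's actual predictors, verification algorithms, and interfaces differ.

```python
import numpy as np

TOLERANCE = 0.05  # assumed verification threshold on prediction error

class Predictor:
    """Toy forward model: predicts the next sensory state from state and motor command."""
    def __init__(self, n_dims):
        self.W = np.zeros((n_dims, 2 * n_dims))  # linear weights, learned online

    def predict(self, state, command):
        x = np.concatenate([state, command])
        return self.W @ x

    def update(self, state, command, observed, lr=0.1):
        x = np.concatenate([state, command])
        error = observed - self.W @ x
        self.W += lr * np.outer(error, x)  # simple delta-rule update

def robot_step(state, command):
    """Stand-in for the physical trial on the robot; here a noisy toy dynamics."""
    return state + command + np.random.normal(0.0, 0.01, size=state.shape)

def verified_loop(predictor, state, commands):
    """Mentally simulate each command, execute it, and verify the prediction."""
    for command in commands:
        predicted = predictor.predict(state, command)   # mental simulation
        observed = robot_step(state, command)           # physical trial
        error = np.linalg.norm(predicted - observed)    # verification monitor
        if error > TOLERANCE:
            predictor.update(state, command, observed)  # bootstrap from mismatch
        state = observed
    return state

# Example: bootstrap a 2-D toy "reaching" predictor from random motor babbling.
rng = np.random.default_rng(0)
p = Predictor(n_dims=2)
final_state = verified_loop(p, np.zeros(2), rng.normal(0.0, 0.1, size=(50, 2)))
```

The sketch only retrains when verification fails, mirroring the abstract's idea that mismatches between simulated and observed data drive learning; the paper's architecture additionally uses such verified data to train the predictor of the next developmental stage.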

Original language: English
Article number: 8470985
Pages (from-to): 1081-1095
Number of pages: 15
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 10
Issue number: 4
DOI:
State: Published - Dec 2018

Keywords

  • Bootstrapping
  • Cognitive architecture
  • Prediction
  • Self-verification
