Realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot

P. Mittendorfer, E. Yoshida, G. Cheng

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, we present a new approach to realize whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot. To this end, we equipped the whole upper body of the humanoid HRP-2 with various patches of CellulARSkin, a modular artificial skin. In order to automatically handle a potentially high number of tactile sensor cells and motor units, the robot uses open-loop exploration motions and distributed accelerometers in the artificial skin cells to acquire its self-centered sensory-motor knowledge. This body self-knowledge is then utilized to transfer multi-modal tactile stimulations into reactive body motions. Tactile events provide feedback on changes of contact on the whole-body surface. We demonstrate the feasibility of our approach on a humanoid, here HRP-2, grasping large and unknown objects only via tactile feedback. Kinesthetically taught grasping trajectories are reactively adapted to the size and stiffness of different test objects. Our paper contributes the first realization of a self-organizing tactile sensor-behavior mapping on a full-sized humanoid robot, enabling a position-controlled robot to compliantly handle objects.
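The sketch below illustrates, in simplified form, the kind of self-organization and tactile reaction loop the abstract describes: open-loop exploration motions excite accelerometers in the skin cells, each cell is assigned to the joint whose motion excites it most, and a contact event on a cell then triggers a compliant retreat of the associated joint. This is a minimal illustration under assumed interfaces (the `SimulatedRobot` stub and its methods are hypothetical), not the authors' actual CellulARSkin or HRP-2 implementation.

```python
# Minimal sketch (not the authors' CellulARSkin code): self-organizing
# cell-to-joint assignment via open-loop exploration, followed by a toy
# tactile-to-motion reaction. The robot/skin interface is a hypothetical stub.
import numpy as np


class SimulatedRobot:
    """Stand-in for the real robot and skin interface (hypothetical)."""

    def __init__(self, n_joints=5, n_cells=20, seed=0):
        rng = np.random.default_rng(seed)
        self.n_joints, self.n_cells = n_joints, n_cells
        # Ground-truth mapping, used only to generate plausible sensor data.
        self._cell_joint = rng.integers(0, n_joints, size=n_cells)

    def wiggle_joint(self, j, amplitude=0.05):
        """Open-loop exploration motion; returns per-cell acceleration energy."""
        rng = np.random.default_rng(j)
        noise = 0.01 * rng.random(self.n_cells)
        response = np.where(self._cell_joint == j, 1.0, 0.0) * amplitude
        return response + noise


def self_organize(robot):
    """Assign each skin cell to the joint whose exploration motion excites it most."""
    energy = np.stack([robot.wiggle_joint(j) for j in range(robot.n_joints)])
    return energy.argmax(axis=0)          # shape (n_cells,): joint index per cell


def react_to_contact(cell_to_joint, cell_id, joint_positions, gain=0.1):
    """Retreat the joint associated with a touched cell (toy compliance)."""
    q = joint_positions.copy()
    q[cell_to_joint[cell_id]] -= gain     # back off along the affected joint
    return q


if __name__ == "__main__":
    robot = SimulatedRobot()
    mapping = self_organize(robot)
    q = np.zeros(robot.n_joints)
    q_new = react_to_contact(mapping, cell_id=3, joint_positions=q)
    print("cell -> joint map:", mapping)
    print("reactive joint update:", q_new)
```

In the paper, the same idea is realized at much larger scale with multi-modal skin cells and kinesthetically taught trajectories; the sketch only conveys the exploration-then-reaction structure.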

Original language: English
Pages (from-to): 51-67
Number of pages: 17
Journal: Advanced Robotics
Volume: 29
Issue number: 1
DOIs
State: Published - 2 Jan 2015

Keywords

  • artificial skin
  • humanoid robots
  • self-organization
  • whole-body tactile interaction
