Humanoids learn touch modalities identification via multi-modal robotic skin and robust tactile descriptors

Mohsen Kaboli, Alex Long, Gordon Cheng

Research output: Contribution to journal › Article › peer-review

45 Scopus citations

Abstract

In this paper, we present a novel approach for touch modality identification via tactile sensing on a humanoid. To this end, we equipped a NAO humanoid with whole upper body coverage of multi-modal artificial skin. We propose a set of biologically inspired feature descriptors that provide robust and abstract tactile information for use in touch classification. These features are demonstrated to be invariant to the location of contact and the movement of the humanoid, and to handle both single-touch and multi-touch actions. To provide a comparison with our method, existing approaches were reimplemented and evaluated. The experimental results show that the humanoid can distinguish different single-touch modalities with a recognition rate of 96.79% using the proposed feature descriptors and an SVM classifier. Furthermore, it can recognize multiple touch actions with a 93.03% recognition rate.
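As a rough illustration of the classification stage described above, the sketch below trains an SVM on tactile feature vectors using scikit-learn. The descriptor dimensionality, dataset, class labels, and kernel settings are placeholder assumptions for illustration only; they are not the paper's actual feature descriptors, data, or hyperparameters.

```python
# Minimal sketch of SVM-based touch modality classification, assuming
# each touch event has already been summarized into a fixed-length
# vector of tactile descriptors. All names and shapes are illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical dataset: one row per touch event, columns are
# contact-location-invariant tactile descriptors (e.g. statistics
# over pressure and vibration channels of the artificial skin).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))      # 300 touch samples, 12 descriptors
y = rng.integers(0, 4, size=300)    # 4 example touch modalities

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM; the kernel and regularization values are assumptions,
# since the abstract only states that an SVM classifier was used.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice the feature extraction step (the paper's tactile descriptors) does most of the work; the classifier itself is a standard multi-class SVM.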

Original language: English
Pages (from-to): 1411-1425
Number of pages: 15
Journal: Advanced Robotics
Volume: 29
Issue number: 21
DOIs
State: Published - 2 Nov 2015

Keywords

  • artificial robotic skin
  • humanoid robots
  • tactile data processing
  • tactile feature descriptors
  • tactile learning
  • touch classification
