Humanoids learn touch modalities identification via multi-modal robotic skin and robust tactile descriptors

Mohsen Kaboli, Alex Long, Gordon Cheng

Publication: Contribution to journal › Article › peer-review

45 citations (Scopus)

Abstract

In this paper, we present a novel approach for touch modality identification via tactile sensing on a humanoid. To this end, we equipped a NAO humanoid with full upper-body coverage of multi-modal artificial skin. We propose a set of biologically inspired feature descriptors that provide robust and abstract tactile information for touch classification. These features are shown to be invariant to the location of contact and to the movement of the humanoid, and to support both single- and multi-touch actions. For comparison, existing approaches were reimplemented and evaluated. The experimental results show that the humanoid can distinguish different single-touch modalities with a recognition rate of 96.79% using the proposed feature descriptors and an SVM classifier. Furthermore, it can recognize multiple touch actions with a 93.03% recognition rate.
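As a rough illustration of the pipeline the abstract describes (tactile feature descriptors fed to an SVM classifier), the Python sketch below computes simple summary statistics from synthetic taxel pressure signals and trains an SVM on them. The signal shapes, the modality names ("poke", "stroke"), and the descriptor itself are hypothetical stand-ins; the paper's biologically inspired, contact-location-invariant descriptors are not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def tactile_descriptor(p):
    """Stand-in descriptor: summary statistics of one taxel's pressure
    time series. The paper's actual descriptors are biologically inspired
    and invariant to contact location; these simple moments only
    illustrate the feature-vector -> SVM classification step."""
    return np.array([p.mean(), p.std(), p.max(), np.abs(np.diff(p)).mean()])

def synth_touch(kind, n=100):
    """Toy pressure signals for two hypothetical touch modalities."""
    t = np.linspace(0.0, 1.0, n)
    if kind == "poke":   # short, sharp pressure burst
        return np.exp(-((t - 0.5) ** 2) / 0.002) + 0.05 * rng.standard_normal(n)
    else:                # "stroke": slow, sustained pressure
        return np.sin(np.pi * t) + 0.05 * rng.standard_normal(n)

# Build a small synthetic training set and fit an RBF-kernel SVM.
labels = ["poke", "stroke"] * 50
X = np.stack([tactile_descriptor(synth_touch(k)) for k in labels])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)

# Classify a new synthetic touch; expected output: ['poke']
print(clf.predict([tactile_descriptor(synth_touch("poke"))]))
```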

Original language: English
Pages (from - to): 1411-1425
Number of pages: 15
Journal: Advanced Robotics
Volume: 29
Issue number: 21
Publication status: Published - 2 Nov 2015
