A Tactile-Based Framework for Active Object Learning and Discrimination using Multimodal Robotic Skin

Mohsen Kaboli, Di Feng, Kunpeng Yao, Pablo Lanillos, Gordon Cheng

Research output: Contribution to journal › Article › peer-review

51 Scopus citations

Abstract

In this letter, we propose a complete probabilistic tactile-based framework that enables robots to autonomously explore unknown workspaces and recognize objects based on their physical properties. Our framework consists of three components: 1) an active pretouch strategy to efficiently explore unknown workspaces; 2) an active touch learning method to learn about unknown objects from their physical properties (surface texture, stiffness, and thermal conductivity) with the fewest possible training samples; and 3) an active touch algorithm for object discrimination, which selects the most informative exploratory action to apply to the object so that the robot can efficiently distinguish between objects with only a few actions. Our proposed framework was experimentally evaluated using a robotic arm equipped with multimodal artificial skin. With the active pretouch method, the robot reduced the uncertainty of the workspace by up to 30% and 70% compared to uniform and random strategies, respectively. By means of the active touch learning algorithm, the robot used 50% fewer samples to achieve the same learning accuracy as the baseline methods. By exploiting the prior knowledge obtained during the learning process, the robot actively discriminated objects with a 10% improvement in recognition accuracy compared to the random action selection approach.
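The third component — choosing the exploratory action expected to be most informative — can be illustrated with a toy Bayesian sketch. Everything below (the action names, the Gaussian measurement models, and all numbers) is invented for illustration and is not the paper's algorithm: the idea is simply to pick the action that minimizes the expected posterior entropy over object identities.

```python
import numpy as np

# Hypothetical per-action measurement models: (mean, std) of the sensor
# reading each action yields for each object. All values are invented.
ACTIONS = ["slide", "press", "static_contact"]  # texture, stiffness, thermal
MODELS = {
    "slide":          {"A": (0.2, 0.05), "B": (0.8, 0.05)},  # textures differ a lot
    "press":          {"A": (0.5, 0.30), "B": (0.6, 0.30)},  # stiffness barely separates
    "static_contact": {"A": (0.3, 0.10), "B": (0.7, 0.10)},  # thermal separates moderately
}
OBJECTS = ["A", "B"]

def entropy(p):
    """Shannon entropy of a discrete belief vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def posterior(prior, action, reading):
    """Bayes update of the belief over objects given one sensor reading."""
    likes = np.array([
        np.exp(-0.5 * ((reading - MODELS[action][o][0]) / MODELS[action][o][1]) ** 2)
        / MODELS[action][o][1]
        for o in OBJECTS
    ])
    post = prior * likes
    return post / post.sum()

def expected_entropy(prior, action, n_samples=2000, seed=0):
    """Monte-Carlo estimate of the expected posterior entropy after `action`."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for i, o in enumerate(OBJECTS):
        mu, sigma = MODELS[action][o]
        readings = rng.normal(mu, sigma, n_samples)
        total += prior[i] * np.mean(
            [entropy(posterior(prior, action, r)) for r in readings]
        )
    return total

prior = np.array([0.5, 0.5])
best = min(ACTIONS, key=lambda a: expected_entropy(prior, a))
print(best)  # → slide (its readings separate the two objects most cleanly)
```

With these made-up models, sliding (texture) separates the two objects by many standard deviations, so it drives the expected posterior entropy toward zero and is selected first; pressing is nearly uninformative. The paper's framework applies this kind of information-driven selection with real tactile measurements from the multimodal skin.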

Original language: English
Article number: 7961193
Pages (from-to): 2143-2150
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 2
Issue number: 4
DOIs
State: Published - Oct 2017

Keywords

  • Active tactile learning
  • artificial robotic skin
  • force and tactile sensing
  • tactile object recognition

