Understanding Human Manipulation with the Environment: A Novel Taxonomy for Video Labelling

Visar Arapi, Cosimo Della Santina, Giuseppe Averta, Antonio Bicchi, Matteo Bianchi

Research output: Contribution to journal · Article · peer-review

10 Scopus citations

Abstract

In recent years, the spread of data-driven approaches for robotic grasp synthesis has come with an increasing need for reliable datasets, which can be built e.g. through video labelling. To this end, it is important to define suitable rules to characterize the main human grasp types, so that they can be easily identified in video streams. In this work, we present a novel taxonomy that builds upon the related state of the art but is specifically designed for video labelling. It focuses on the interaction of the hand with the environment and accounts for pre-contact phases, bi-manual grasps, as well as non-prehensile strategies. This study is complemented with a dataset of labelled videos of subjects performing activities of daily living, for a total of nine hours, and with a description of MATLAB tools for labelling new videos. Both hands were labelled at all times. We used these labelled data to perform a preliminary statistical description of the occurrences of the class types proposed here.

Original language: English
Article number: 9472990
Pages (from-to): 6537-6544
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 6
Issue number: 4
DOIs
State: Published - Oct 2021
Externally published: Yes

Keywords

  • Datasets for human motion
  • bimanual manipulation
  • deep learning
  • dexterous manipulation

