Understanding the intention of human activities through semantic perception: Observation, understanding and execution on a humanoid robot

Karinne Ramirez-Amaro, Michael Beetz, Gordon Cheng

Research output: Contribution to journal › Article › peer-review

38 Scopus citations

Abstract

In this work, we demonstrate that an appropriate semantic representation, even combined with a very naive perception system, is sufficient to infer human activities from observations. First, we present a method to extract the semantic rules of human everyday activities: we extract low-level information from the sensor data and then infer the high-level activity by reasoning about the intended human behavior. The advantage of this abstract representation is that it yields more generic models of human behavior, even when the information is obtained from different scenarios. Another important aspect of our system is its scalability and adaptability toward new activities, which can be learned on demand. Our system has been fully implemented on a humanoid robot, the iCub, to experimentally validate its performance and robustness during on-line execution within the control loop of the robot. The results show that the robot is able to make a decision about the inferred human behavior in 0.12 s, with a recognition accuracy of 85%.
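The two-level pipeline described in the abstract — low-level perceptual attributes feeding a small set of semantic rules that name the high-level activity — can be sketched as follows. This is an illustrative approximation, not the paper's actual rule set; the attribute names (`hand_moving`, `object_in_hand`, `object_acted_on`) and activity labels are assumptions.

```python
# Hypothetical sketch of semantic rule-based activity inference.
# Low-level perception is abstracted into three attributes; the rules
# below are illustrative stand-ins for the learned semantic rules.

def infer_activity(hand_moving, object_in_hand, object_acted_on):
    """Map low-level observations to a high-level activity label.

    hand_moving:     bool, whether the hand is currently in motion
    object_in_hand:  name of the held object, or None
    object_acted_on: name of the object being acted upon, or None
    """
    if not hand_moving:
        return "idle" if object_in_hand is None else "hold"
    if object_in_hand is None:
        return "reach"
    if object_acted_on is None:
        return "move"  # transporting the held object
    # Held object interacting with another object, e.g. knife on bread.
    return "use {} on {}".format(object_in_hand, object_acted_on)

print(infer_activity(True, None, None))        # → reach
print(infer_activity(True, "knife", "bread"))  # → use knife on bread
```

Because the rules operate on abstract attributes rather than raw sensor values, the same mapping can be reused across scenarios, and a new activity amounts to adding one more rule — consistent with the scalability claim in the abstract.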

Original language: English
Pages (from-to): 345-362
Number of pages: 18
Journal: Advanced Robotics
Volume: 29
Issue number: 5
DOIs
State: Published - 4 Mar 2015

Keywords

  • automatic segmentation
  • human activity recognition
  • meaningful robot learning
  • semantic reasoning
