Abstract
In this work, we demonstrate that an appropriate semantic representation, even when paired with a very naive perception system, is sufficient to infer human activities from observations. First, we present a method to extract the semantic rules of human everyday activities: we extract low-level information from the sensor data and then infer the high-level activity by reasoning about the intended human behavior. The advantage of this abstract representation is that it yields more generic models of human behavior, even when the information is obtained from different scenarios. Another important aspect of our system is its scalability and adaptability toward new activities, which can be learned on demand. The system has been fully implemented on a humanoid robot, the iCub, to experimentally validate its performance and robustness during on-line execution within the robot's control loop. The results show that the robot can make a decision about the inferred human behavior in 0.12 s, with a recognition accuracy of 85%.
| Original language | English |
|---|---|
| Pages (from-to) | 345-362 |
| Number of pages | 18 |
| Journal | Advanced Robotics |
| Volume | 29 |
| Issue number | 5 |
| DOIs | |
| State | Published - 4 Mar 2015 |
Keywords
- automatic segmentation
- human activity recognition
- meaningful robot learning
- semantic reasoning
Title: Understanding the intention of human activities through semantic perception: Observation, understanding and execution on a humanoid robot