TY - JOUR
T1 - Understanding the intention of human activities through semantic perception
T2 - Observation, understanding and execution on a humanoid robot
AU - Ramirez-Amaro, Karinne
AU - Beetz, Michael
AU - Cheng, Gordon
N1 - Publisher Copyright:
© 2015 Taylor & Francis and The Robotics Society of Japan.
PY - 2015/3/4
Y1 - 2015/3/4
N2 - In this work, we present and demonstrate that an appropriate semantic representation, even combined with a very naive perception system, is sufficient to infer human activities from observations. First, we present a method to extract the semantic rules of everyday human activities: we extract low-level information from the sensor data and then infer the high-level activities by reasoning about the intended human behaviors. The advantage of this abstract representation is that it yields more generic models of human behavior, even when the information is obtained from different scenarios. Another important aspect of our system is its scalability and adaptability toward new activities, which can be learned on demand. Our system has been fully implemented on a humanoid robot, the iCub, to experimentally validate its performance and robustness during on-line execution within the robot's control loop. The results show that the robot is able to make a decision about the inferred human behaviors in 0.12 s with a recognition accuracy of 85%.
AB - In this work, we present and demonstrate that an appropriate semantic representation, even combined with a very naive perception system, is sufficient to infer human activities from observations. First, we present a method to extract the semantic rules of everyday human activities: we extract low-level information from the sensor data and then infer the high-level activities by reasoning about the intended human behaviors. The advantage of this abstract representation is that it yields more generic models of human behavior, even when the information is obtained from different scenarios. Another important aspect of our system is its scalability and adaptability toward new activities, which can be learned on demand. Our system has been fully implemented on a humanoid robot, the iCub, to experimentally validate its performance and robustness during on-line execution within the robot's control loop. The results show that the robot is able to make a decision about the inferred human behaviors in 0.12 s with a recognition accuracy of 85%.
KW - automatic segmentation
KW - human activity recognition
KW - meaningful robot learning
KW - semantic reasoning
UR - http://www.scopus.com/inward/record.url?scp=84925585380&partnerID=8YFLogxK
U2 - 10.1080/01691864.2014.1003096
DO - 10.1080/01691864.2014.1003096
M3 - Article
AN - SCOPUS:84925585380
SN - 0169-1864
VL - 29
SP - 345
EP - 362
JO - Advanced Robotics
JF - Advanced Robotics
IS - 5
ER -