TY - GEN
T1 - Invariant representation for user independent motion recognition
AU - Saveriano, Matteo
AU - Lee, Dongheui
PY - 2013
Y1 - 2013
N2 - Human gesture recognition is important for smooth and efficient human-robot interaction. One difficulty in gesture recognition is that different actors perform even the same gestures in different styles. To move towards more realistic scenarios, a robot must handle not only different users, but also different viewpoints and noisy, incomplete data from its onboard sensors. Facing these challenges, we propose a new invariant representation of rigid body motions that is invariant to translation, rotation, and scaling. For classification, a Hidden Markov Model based approach and a Dynamic Time Warping based approach are modified by weighting the importance of body parts. The proposed method is tested on two Kinect datasets and compared with another invariant representation and a typical non-invariant representation. The experimental results show good recognition performance of the proposed approach.
UR - http://www.scopus.com/inward/record.url?scp=84889568152&partnerID=8YFLogxK
U2 - 10.1109/ROMAN.2013.6628422
DO - 10.1109/ROMAN.2013.6628422
M3 - Conference contribution
AN - SCOPUS:84889568152
SN - 9781479905072
T3 - Proceedings - IEEE International Workshop on Robot and Human Interactive Communication
SP - 650
EP - 655
BT - 22nd IEEE International Symposium on Robot and Human Interactive Communication
T2 - 22nd IEEE International Symposium on Robot and Human Interactive Communication: "Living Together, Enjoying Together, and Working Together with Robots!", IEEE RO-MAN 2013
Y2 - 26 August 2013 through 29 August 2013
ER -