TY - GEN
T1 - Learning deep movement primitives using convolutional neural networks
AU - Pervez, Affan
AU - Mao, Yuecheng
AU - Lee, Dongheui
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/12/22
Y1 - 2017/12/22
N2 - Dynamic Movement Primitives (DMPs) are widely used for encoding motion data. Task-parameterized DMPs (TP-DMPs) can adapt a learned skill to different situations. Typically, a customized vision system is used to extract task-specific variables, which limits the use of such systems in real-world scenarios. This paper proposes a method for combining a DMP with a Convolutional Neural Network (CNN). Our approach preserves the generalization properties of a DMP, while the CNN learns task-specific features from camera images. This eliminates the need to extract task parameters by directly utilizing the camera image during motion reproduction. The performance of the developed approach is demonstrated through a trash-cleaning task executed with a real robot. We also show that, by using data augmentation, the learned sweeping skill can be generalized to arbitrary objects. The experiments show the robustness of our approach in several different settings.
AB - Dynamic Movement Primitives (DMPs) are widely used for encoding motion data. Task-parameterized DMPs (TP-DMPs) can adapt a learned skill to different situations. Typically, a customized vision system is used to extract task-specific variables, which limits the use of such systems in real-world scenarios. This paper proposes a method for combining a DMP with a Convolutional Neural Network (CNN). Our approach preserves the generalization properties of a DMP, while the CNN learns task-specific features from camera images. This eliminates the need to extract task parameters by directly utilizing the camera image during motion reproduction. The performance of the developed approach is demonstrated through a trash-cleaning task executed with a real robot. We also show that, by using data augmentation, the learned sweeping skill can be generalized to arbitrary objects. The experiments show the robustness of our approach in several different settings.
UR - http://www.scopus.com/inward/record.url?scp=85044466146&partnerID=8YFLogxK
U2 - 10.1109/HUMANOIDS.2017.8246874
DO - 10.1109/HUMANOIDS.2017.8246874
M3 - Conference contribution
AN - SCOPUS:85044466146
T3 - IEEE-RAS International Conference on Humanoid Robots
SP - 191
EP - 197
BT - 2017 IEEE-RAS 17th International Conference on Humanoid Robotics, Humanoids 2017
PB - IEEE Computer Society
T2 - 17th IEEE-RAS International Conference on Humanoid Robotics, Humanoids 2017
Y2 - 15 November 2017 through 17 November 2017
ER -