TY - GEN
T1 - Mirror my emotions! Combining facial expression analysis and synthesis on a robot
AU - Sosnowski, Stefan
AU - Mayer, Christoph
AU - Kühnlenz, Kolja
AU - Radig, Bernd
PY - 2010
Y1 - 2010
AB - Everyday human communication relies on a large number of different communication mechanisms, such as spoken language, facial expressions, body pose, and gestures. Facial expressions are one of the main communication mechanisms and convey large amounts of information between human dialogue partners [22]. Therefore, the analysis and synthesis of facial expressions are important steps towards intuitive human-machine interaction and form valuable research targets. We present a system that tackles both challenges. It relies on a fully automated, model-based, real-time-capable approach to distinguish universal facial expressions and their intensities in camera images. Facial expression synthesis is conducted via the robot head EDDIE, a flexible, low-cost emotion display with 23 degrees of freedom. The system supports static facial expressions at continuous intensities as well as smooth transitions based on the circumplex model of affect. Miniature off-the-shelf mechatronic components provide high functionality at low cost. Evaluations conducted in a user study show that humans recognize the emotions displayed by EDDIE very well. By combining facial expression recognition and display on the robot, we present a first demonstration in which the robot mirrors the human's emotions, providing a basis for further research on emotional closed-loop systems.
UR - http://www.scopus.com/inward/record.url?scp=84863926598&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84863926598
SN - 1902956877
SN - 9781902956879
T3 - Proceedings of the 2nd International Symposium on New Frontiers in Human-Robot Interaction - A Symposium at the AISB 2010 Convention
SP - 108
EP - 112
BT - Proceedings of the 2nd International Symposium on New Frontiers in Human-Robot Interaction - A Symposium at the AISB 2010 Convention
T2 - 2nd International Symposium on New Frontiers in Human-Robot Interaction - A Symposium at the AISB 2010 Convention
Y2 - 29 March 2010 through 1 April 2010
ER -