TY - GEN
T1 - Embracing and Exploiting Annotator Emotional Subjectivity
T2 - 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2021
AU - Stappen, Lukas
AU - Schumann, Lea
AU - Batliner, Anton
AU - Schuller, Björn W.
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Automated recognition of continuous emotions in audio-visual data is a growing area of study that aids in understanding human-machine interaction. Training such systems presupposes human annotation of the data. The annotation process, however, is laborious and expensive, given that several human ratings are required for every data sample to compensate for the subjectivity of emotion perception. As a consequence, labelled data for emotion recognition are rare, and the existing corpora are limited when compared to other state-of-the-art deep learning datasets. In this study, we explore different ways in which existing emotion annotations can be utilised more effectively to exploit the available labelled information to the fullest. To reach this objective, we exploit individual raters' opinions by employing an ensemble of rater-specific models, one for each annotator, thereby reducing the loss of information that is a by-product of annotation aggregation; we find that individual models can indeed infer subjective opinions. Furthermore, we explore the fusion of such ensemble predictions using different fusion techniques. Our ensemble model with only two annotators outperforms the regular arousal baseline on the test set of the MuSe-CaR corpus. While no considerable improvements on valence could be obtained, using all annotators increases the prediction performance of arousal by up to .07 Concordance Correlation Coefficient absolute improvement on the test set, solely trained on rater-specific models and fused by an attention-enhanced Long Short-Term Memory Recurrent Neural Network.
AB - Automated recognition of continuous emotions in audio-visual data is a growing area of study that aids in understanding human-machine interaction. Training such systems presupposes human annotation of the data. The annotation process, however, is laborious and expensive, given that several human ratings are required for every data sample to compensate for the subjectivity of emotion perception. As a consequence, labelled data for emotion recognition are rare, and the existing corpora are limited when compared to other state-of-the-art deep learning datasets. In this study, we explore different ways in which existing emotion annotations can be utilised more effectively to exploit the available labelled information to the fullest. To reach this objective, we exploit individual raters' opinions by employing an ensemble of rater-specific models, one for each annotator, thereby reducing the loss of information that is a by-product of annotation aggregation; we find that individual models can indeed infer subjective opinions. Furthermore, we explore the fusion of such ensemble predictions using different fusion techniques. Our ensemble model with only two annotators outperforms the regular arousal baseline on the test set of the MuSe-CaR corpus. While no considerable improvements on valence could be obtained, using all annotators increases the prediction performance of arousal by up to .07 Concordance Correlation Coefficient absolute improvement on the test set, solely trained on rater-specific models and fused by an attention-enhanced Long Short-Term Memory Recurrent Neural Network.
KW - annotation optimisation
KW - emotion recognition
KW - ensemble models
UR - http://www.scopus.com/inward/record.url?scp=85124952438&partnerID=8YFLogxK
U2 - 10.1109/ACIIW52867.2021.9666407
DO - 10.1109/ACIIW52867.2021.9666407
M3 - Conference contribution
AN - SCOPUS:85124952438
T3 - 2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2021
BT - 2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 28 September 2021 through 1 October 2021
ER -