Speech emotion recognition combining acoustic features and linguistic information in a hybrid support vector machine - Belief network architecture

Publication: Contribution to journal › Conference article › Peer-reviewed

362 citations (Scopus)

Abstract

In this contribution we introduce a novel approach to combining acoustic features and language information for more robust automatic recognition of a speaker's emotion. Seven discrete emotional states are classified throughout the work. First, a model for the recognition of emotion from acoustic features is presented. Features derived from the signal, pitch, energy, and spectral contours are ranked by their quantitative contribution to the estimation of an emotion. Several classification methods, including linear classifiers, Gaussian Mixture Models, Neural Nets, and Support Vector Machines, are compared by their performance on this task. Second, an approach to emotion recognition from the spoken content is introduced, applying Belief Network based spotting for emotional key-phrases. Finally, the two information sources are integrated in a soft decision fusion using a Neural Net. The gain is evaluated and compared to other approaches. Two emotional speech corpora used for training and evaluation are described in detail, and the results achieved by applying the proposed novel approach to speaker emotion recognition are presented and discussed.
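The soft decision fusion described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the input score vectors, the single-layer fusion net, and its randomly initialized weights are all hypothetical stand-ins for the trained acoustic classifier, the Belief Network key-phrase spotter, and the fusion Neural Net described above.

```python
import numpy as np

rng = np.random.default_rng(0)
N_EMOTIONS = 7  # seven discrete emotional states, as in the paper

# Hypothetical per-utterance inputs: posterior estimates from an acoustic
# classifier and confidence scores from a linguistic key-phrase spotter.
acoustic_scores = rng.dirichlet(np.ones(N_EMOTIONS))
linguistic_scores = rng.dirichlet(np.ones(N_EMOTIONS))

def softmax(z):
    """Numerically stable softmax over a score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def fuse(acoustic, linguistic, W, b):
    """Soft decision fusion: a single-layer net maps the concatenated
    acoustic and linguistic score vectors to a fused emotion posterior."""
    x = np.concatenate([acoustic, linguistic])
    return softmax(W @ x + b)

# Illustrative weights; in practice these would be trained on a labelled
# emotional speech corpus.
W = rng.normal(scale=0.1, size=(N_EMOTIONS, 2 * N_EMOTIONS))
b = np.zeros(N_EMOTIONS)

fused = fuse(acoustic_scores, linguistic_scores, W, b)
predicted_emotion = int(np.argmax(fused))
```

Because both modalities contribute continuous scores rather than hard labels, the fusion stage can weight a confident linguistic cue against an ambiguous acoustic one, which is the motivation for soft rather than hard decision fusion.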

Original language: English
Pages (from–to): I577–I580
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 1
Publication status: Published - 2004
Event: Proceedings - IEEE International Conference on Acoustics, Speech, and Signal Processing - Montreal, Que., Canada
Duration: 17 May 2004 – 21 May 2004

