Hidden Markov model-based speech emotion recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

258 Scopus citations

Abstract

In this contribution we introduce speech emotion recognition by means of continuous hidden Markov models. Two methods are proposed and compared throughout the paper. In the first method, global statistics of an utterance, derived from the raw pitch and energy contours of the speech signal, are classified by Gaussian mixture models. The second method adds temporal modelling by applying continuous hidden Markov models with several states to low-level instantaneous features instead of global statistics. The paper addresses the design of working recognition engines and the results achieved with these two alternatives. A speech corpus consisting of acted and spontaneous emotion samples in German and English is described in detail. Both engines were trained and tested on this same speech corpus. Recognition of seven discrete emotions exceeded an 86% recognition rate. As a basis of comparison, human raters classifying the same corpus achieved a 79.8% recognition rate.
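As an illustration of the two engines compared in the abstract, the Python sketch below trains one Gaussian mixture model per emotion on utterance-level global statistics and one multi-state continuous HMM per emotion on frame-level features, deciding by maximum likelihood over the per-emotion models. The feature set computed by global_statistics, the model sizes, and the use of scikit-learn and hmmlearn are illustrative assumptions; this is not the original implementation.

# Sketch of the two schemes described in the abstract:
# (1) one GMM per emotion over utterance-level global statistics of the
#     pitch and energy contours, and
# (2) one multi-state continuous (Gaussian-emission) HMM per emotion over
#     frame-level, low-level instantaneous features.
# Feature extraction and corpus handling are assumed; the exact feature set
# and model topologies of the paper are not reproduced here.
import numpy as np
from sklearn.mixture import GaussianMixture
from hmmlearn import hmm


def global_statistics(contour_frames):
    """Derive utterance-level statistics (mean, std, min, max, range)
    from a (n_frames, n_features) matrix of pitch/energy values."""
    c = np.asarray(contour_frames)
    return np.concatenate([c.mean(0), c.std(0), c.min(0), c.max(0),
                           c.max(0) - c.min(0)])


def train_gmm_classifier(utterances_by_emotion, n_mixtures=4):
    """One GMM per emotion, fitted on global-statistics vectors."""
    models = {}
    for emotion, utterances in utterances_by_emotion.items():
        X = np.vstack([global_statistics(u) for u in utterances])
        models[emotion] = GaussianMixture(
            n_components=n_mixtures, covariance_type='diag').fit(X)
    return models


def train_hmm_classifier(utterances_by_emotion, n_states=5):
    """One continuous HMM per emotion, fitted on concatenated
    frame-level feature sequences."""
    models = {}
    for emotion, utterances in utterances_by_emotion.items():
        X = np.vstack(utterances)               # all frames stacked
        lengths = [len(u) for u in utterances]  # per-utterance frame counts
        models[emotion] = hmm.GaussianHMM(
            n_components=n_states, covariance_type='diag',
            n_iter=20).fit(X, lengths)
    return models


def classify(models, features):
    """Maximum-likelihood decision over the per-emotion models."""
    return max(models, key=lambda e: models[e].score(features))

For the GMM engine, classify would be called with global_statistics(utterance).reshape(1, -1); for the HMM engine it is called with the raw (n_frames, n_features) sequence of an utterance.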

Original language: English
Title of host publication: Proceedings - 2003 International Conference on Multimedia and Expo, ICME
Publisher: IEEE Computer Society
Pages: I401-I404
ISBN (Electronic): 0780379659
DOIs
State: Published - 2003
Event: 2003 International Conference on Multimedia and Expo, ICME 2003 - Baltimore, United States
Duration: 6 Jul 2003 – 9 Jul 2003

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
Volume: 1
ISSN (Print): 1945-7871
ISSN (Electronic): 1945-788X

Conference

Conference: 2003 International Conference on Multimedia and Expo, ICME 2003
Country/Territory: United States
City: Baltimore
Period: 6/07/03 – 9/07/03
