Sincerity and deception in speech: Two sides of the same coin? A transfer- and multi-task learning perspective

Yue Zhang, Felix Weninger, Zhao Ren, Björn Schuller

Research output: Contribution to journal › Conference article › peer-review


Abstract

In this work, we investigate the coherence between inferable deception and perceived sincerity in speech, as featured in the Deception and Sincerity tasks of the INTERSPEECH 2016 Computational Paralinguistics Challenge (ComParE). We demonstrate an effective approach that combines the corpora of both Challenge tasks to achieve higher classification accuracy. We show that the naïve label mapping method based on the assumption that sincerity and deception are just 'two sides of the same coin', i.e., taking deceptive speech as equivalent to non-sincere speech and vice versa, does not yield satisfactory results. However, we can exploit the interplay and synergies between these characteristics. To achieve this, we combine our previously introduced approach for data aggregation by semi-supervised cross-task label completion with multi-task learning and knowledge-based instance selection. As a result, our approach achieves significant error rate reductions compared to the official Challenge baseline.
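To make the data-aggregation idea concrete, the following is a minimal sketch of semi-supervised cross-task label completion with confidence-based instance selection, using synthetic data and scikit-learn's logistic regression as a stand-in for the actual Challenge classifiers. The variable names, feature dimensions, and the confidence threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two ComParE corpora: rows are utterances,
# columns are acoustic features (hypothetical dimensions; the Challenge
# tasks use the official ComParE acoustic feature set).
X_dec = rng.normal(size=(200, 20))   # Deception corpus features
y_dec = rng.integers(0, 2, 200)      # deceptive (1) vs. non-deceptive (0)
X_sin = rng.normal(size=(150, 20))   # Sincerity corpus features
y_sin = rng.integers(0, 2, 150)      # sincere (1) vs. non-sincere (0)

# Step 1: train one model per task on its own labelled corpus.
clf_dec = LogisticRegression(max_iter=1000).fit(X_dec, y_dec)
clf_sin = LogisticRegression(max_iter=1000).fit(X_sin, y_sin)

# Step 2: cross-task label completion -- predict the missing task label
# for the other corpus, so every instance carries labels for both tasks.
p_dec_on_sin = clf_dec.predict_proba(X_sin)[:, 1]

# Step 3: instance selection -- keep only instances whose completed label
# is confident (the 0.25 margin is an assumed threshold, not the paper's).
keep = np.abs(p_dec_on_sin - 0.5) > 0.25

# Step 4: retrain on the aggregated, fully labelled pool; a multi-task
# learner would instead share parameters across both label sets.
X_pool = np.vstack([X_dec, X_sin[keep]])
y_pool = np.concatenate([y_dec, (p_dec_on_sin[keep] > 0.5).astype(int)])
clf_final = LogisticRegression(max_iter=1000).fit(X_pool, y_pool)
```

In this sketch, the naïve 'two sides of the same coin' mapping would correspond to setting the deception label to the inverse of the sincerity label directly; the completion step above replaces that hard assumption with model-predicted labels filtered by confidence.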

Original language: English
Pages (from-to): 2041-2045
Number of pages: 5
Journal: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
Volume: 08-12-September-2016
DOIs
State: Published - 2016
Externally published: Yes
Event: 17th Annual Conference of the International Speech Communication Association, INTERSPEECH 2016 - San Francisco, United States
Duration: 8 Sep 2016 – 16 Sep 2016

Keywords

  • Computational paralinguistics
  • Cross-task labelling
  • Deception and sincerity identification
  • Multi-task learning
  • Transfer learning
