Crowdsourcing vs. laboratory experiments - QoE evaluation of binaural playback in a teleconference scenario

Thomas Volk, Christian Keimel, Michael Moosmeier, Klaus Diepold

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

Experiments for the subjective evaluation of multimedia presentations and content are traditionally conducted in a laboratory environment, and common procedures for the evaluation of teleconference systems are no different. The strictly controlled laboratory environment, however, often gives a rather poor representation of the actual use case. In this study, we therefore crowdsourced the evaluation of a teleconference system in order to perform the evaluation in a real-life environment. Moreover, we used the unique possibilities of crowdsourcing to employ two different demographics, hiring workers from Germany on the one hand, and from the US and Great Britain on the other. The goal of this experiment was to assess the perceived Quality of Experience (QoE) during a listening test and to compare the results with those of a similar listening test conducted in a controlled laboratory environment. In doing so, we observed intriguing differences not only in the collected QoE ratings between the laboratory and crowdsourcing experiments, but also between the two worker demographics in terms of reliability, availability, and efficiency.

Original language: English
Pages (from-to): 99-109
Number of pages: 11
Journal: Computer Networks
Volume: 90
DOIs
State: Published - 29 Oct 2015

Keywords

  • Binaural audio
  • Crowdsourcing
  • QoE
  • Subjective evaluation
  • Teleconference
