An investigation of techniques that aim to improve the quality of labels provided by the crowd

Jonathon Hare, Anna Weston, Elena Simperl, Sina Samangooei, David Dupplaw, Paul Lewis, Maribel Acosta

Research output: Contribution to journal › Conference article › peer-review

Abstract

The 2013 MediaEval Crowdsourcing task looked at the problem of working with noisy crowdsourced annotations of image data. The aim of the task was to investigate possible techniques for estimating the true labels of an image from the set of noisy crowdsourced labels, possibly together with content and metadata from the image itself. For the runs in this paper, we applied a shotgun approach and tried a number of existing techniques, including generative probabilistic models and further crowdsourcing.
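
The abstract refers to generative probabilistic models for recovering true labels from noisy crowd votes. The sketch below is not the authors' implementation; it is a minimal, hedged illustration of one such technique, a Dawid-Skene-style EM aggregator for binary labels, in which each worker's unknown accuracy and each item's true label are estimated jointly. All function and variable names are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): EM aggregation of noisy
# binary crowd labels with per-worker accuracy, in the spirit of Dawid-Skene.
from collections import defaultdict


def aggregate_labels(annotations, n_iter=20):
    """annotations: list of (item_id, worker_id, label) with label in {0, 1}.
    Returns {item_id: estimated probability that the true label is 1}."""
    items = defaultdict(list)
    workers = set()
    for item, worker, label in annotations:
        items[item].append((worker, label))
        workers.add(worker)

    # Initialise soft labels with the per-item fraction of positive votes
    # (i.e. a soft majority vote).
    p_true = {i: sum(l for _, l in votes) / len(votes)
              for i, votes in items.items()}

    for _ in range(n_iter):
        # M-step: re-estimate each worker's accuracy against the soft labels.
        correct, total = defaultdict(float), defaultdict(float)
        for i, votes in items.items():
            for w, l in votes:
                correct[w] += p_true[i] if l == 1 else (1.0 - p_true[i])
                total[w] += 1.0
        accuracy = {w: correct[w] / total[w] for w in workers}

        # E-step: recompute the posterior over true labels (uniform prior).
        for i, votes in items.items():
            p1, p0 = 1.0, 1.0
            for w, l in votes:
                a = accuracy[w]
                p1 *= a if l == 1 else (1.0 - a)
                p0 *= (1.0 - a) if l == 1 else a
            p_true[i] = p1 / (p1 + p0) if (p1 + p0) > 0 else 0.5

    return p_true


if __name__ == "__main__":
    toy = [("img1", "w1", 1), ("img1", "w2", 1), ("img1", "w3", 0),
           ("img2", "w1", 0), ("img2", "w2", 0), ("img2", "w3", 1)]
    print(aggregate_labels(toy))
```

On the toy input above, the aggregator pushes each image's label toward the majority opinion while down-weighting the worker who disagrees most often; plain majority voting is the special case of a single EM iteration with equal worker accuracies.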

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 1043
State: Published - 2013
Externally published: Yes
Event: 2013 Multimedia Benchmark Workshop, MediaEval 2013, Barcelona, Spain
Duration: 18 Oct 2013 – 19 Oct 2013
