Journal article
Optimising crowdsourcing efficiency: Amplifying human computation with validation
Document type
Text/Journal Article
Additional information
Date
2018
Publisher
De Gruyter
Abstract
Crowdsourcing has revolutionised the way tasks can be completed, but the process is frequently inefficient, costing practitioners time and money. This research investigates whether crowdsourcing can be optimised with a validation process, as measured by four criteria: quality, cost, noise, and speed. A validation model is described, simulated, and tested on real data from an online crowdsourcing game that collects data about human language. Results show that by adding an agreement-validation (or like/upvote) step, fewer annotations are required, noise and collection time are reduced, and quality may be improved.
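The abstract does not detail the mechanics of the validation step, but the underlying trade-off can be illustrated with a small simulation: instead of collecting a fixed batch of independent annotations per item and taking a majority vote, a candidate annotation is shown to subsequent workers who simply agree or disagree, and the item is accepted once it gathers enough agreements. The sketch below is a hypothetical Python illustration of that idea only; the agreement threshold, annotator accuracy, and acceptance rule are illustrative assumptions, not parameters or code from the paper.

```python
import random

def majority_vote(p_correct, n_annotators=5):
    """Baseline: collect n independent labels and take the majority."""
    votes = [random.random() < p_correct for _ in range(n_annotators)]
    is_correct = sum(votes) > n_annotators / 2
    return is_correct, n_annotators  # (accepted label is correct?, judgments used)

def agreement_validation(p_correct, agreements_needed=2, max_judgments=5):
    """Validation: seed one candidate label, then ask workers to agree/disagree.
    Accept once it gathers enough agreements; replace the candidate on disagreement.
    (Hypothetical acceptance rule, not the paper's model.)"""
    candidate = random.random() < p_correct  # True means the candidate label is correct
    used, agreements = 1, 0
    while used < max_judgments:
        used += 1
        judgment = random.random() < p_correct
        if judgment == candidate:  # validator's own judgment matches the candidate
            agreements += 1
            if agreements >= agreements_needed:
                return candidate, used
        else:
            candidate, agreements = judgment, 0  # new candidate, restart count
    return candidate, used

def simulate(strategy, trials=100_000, p_correct=0.8):
    """Average accuracy and judgments per item over many simulated items."""
    correct = judgments = 0
    for _ in range(trials):
        label_ok, used = strategy(p_correct)
        correct += label_ok
        judgments += used
    return correct / trials, judgments / trials

if __name__ == "__main__":
    for name, strategy in [("majority vote", majority_vote),
                           ("validation", agreement_validation)]:
        acc, cost = simulate(strategy)
        print(f"{name:>13}: accuracy={acc:.3f}, judgments/item={cost:.2f}")
```

Under these assumptions the validation strategy typically accepts an item after roughly three judgments rather than a fixed five, which mirrors the direction of the abstract's claim that an agreement step reduces the number of annotations required; the actual model, data, and results are those reported in the article.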