
How (not) to Incent Crowd Workers

dc.contributor.author: Straub, Tim
dc.contributor.author: Gimpel, Henner
dc.contributor.author: Teschner, Florian
dc.contributor.author: Weinhardt, Christof
dc.date.accessioned: 2018-01-08T07:44:37Z
dc.date.available: 2018-01-08T07:44:37Z
dc.date.issued: 2015
dc.description.abstract: Crowdsourcing gains momentum: In digital workplaces such as Amazon Mechanical Turk, oDesk, Clickworker, 99designs, or InnoCentive it is easy to distribute human work to hundreds or thousands of freelancers. In these crowdsourcing settings, one challenge is to properly incent worker effort to create value. Common incentive schemes are piece rate payments and rank-order tournaments among workers. Tournaments might or might not disclose a worker’s current competitive position via a leaderboard. Following an exploratory approach, we derive a model of worker performance in rank-order tournaments and present a series of real-effort studies using experimental techniques on an online labor market to test the model and to compare dyadic tournaments to piece rate payments. The data suggest that, on average, dyadic tournaments do not improve performance compared to a simple piece rate for simple and short crowdsourcing tasks. Furthermore, giving feedback on the competitive position in such tournaments tends to be negatively related to workers’ performance. This relation is partially mediated by task completion and moderated by the competitors’ strength: When playing against strong competitors, feedback is associated with workers quitting the task altogether and, thus, showing lower performance. When the competitors are weak, workers tend to complete the task but with reduced effort. Overall, individual piece rate payments are the simplest to communicate and implement, while the performance they incent is on par with that of more complex dyadic tournaments.
dc.identifier.pissn: 1867-0202
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/10637
dc.publisher: Springer
dc.relation.ispartof: Business & Information Systems Engineering: Vol. 57, No. 3
dc.relation.ispartofseries: Business & Information Systems Engineering
dc.subject: Crowdsourcing
dc.subject: Experimental techniques
dc.subject: Exploratory study
dc.subject: Feedback
dc.subject: Incentives
dc.subject: Online labor
dc.subject: Piece rate
dc.subject: Rank-order tournament
dc.subject: Real effort task
dc.title: How (not) to Incent Crowd Workers
dc.type: Text/Journal Article
gi.citation.endPage: 179
gi.citation.startPage: 167
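
As a purely illustrative aside (not part of the record or the article), the sketch below contrasts the two incentive schemes named in the abstract: a piece rate, where pay scales with output, and a dyadic rank-order tournament, where only the relative rank against one competitor determines pay. All function names, rates, and prize values are hypothetical.

```python
def piece_rate_payout(units_completed: int, rate_per_unit: float) -> float:
    """Piece rate: pay a fixed amount for every completed unit of work."""
    return units_completed * rate_per_unit


def dyadic_tournament_payout(own_units: int, opponent_units: int,
                             winner_prize: float, loser_prize: float) -> float:
    """Dyadic rank-order tournament: two workers compete; the relative rank,
    not the absolute output, determines the payout. Ties split the prizes."""
    if own_units > opponent_units:
        return winner_prize
    if own_units < opponent_units:
        return loser_prize
    return (winner_prize + loser_prize) / 2


if __name__ == "__main__":
    # Hypothetical example: 40 completed units at $0.05 each,
    # versus a $3.00 / $1.00 prize spread against a stronger competitor.
    print(piece_rate_payout(40, 0.05))                  # 2.0
    print(dyadic_tournament_payout(40, 55, 3.0, 1.0))   # 1.0 (lost the tournament)
```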
