How (not) to Incent Crowd Workers
dc.contributor.author | Straub, Tim | |
dc.contributor.author | Gimpel, Henner | |
dc.contributor.author | Teschner, Florian | |
dc.contributor.author | Weinhardt, Christof | |
dc.date.accessioned | 2018-01-08T07:44:37Z | |
dc.date.available | 2018-01-08T07:44:37Z | |
dc.date.issued | 2015 | |
dc.description.abstract | Crowdsourcing is gaining momentum: in digital workplaces such as Amazon Mechanical Turk, oDesk, Clickworker, 99designs, or InnoCentive, it is easy to distribute human work to hundreds or thousands of freelancers. In these crowdsourcing settings, one challenge is to properly incent worker effort to create value. Common incentive schemes are piece rate payments and rank-order tournaments among workers. Tournaments may or may not disclose a worker’s current competitive position via a leaderboard. Following an exploratory approach, we derive a model of worker performance in rank-order tournaments and present a series of real effort studies using experimental techniques on an online labor market to test the model and to compare dyadic tournaments to piece rate payments. The data suggest that, on average, dyadic tournaments do not improve performance compared to a simple piece rate for simple and short crowdsourcing tasks. Furthermore, giving feedback on the competitive position in such tournaments tends to be negatively related to workers’ performance. This relation is partially mediated by task completion and moderated by the strength of the competitor: when playing against strong competitors, feedback is associated with workers quitting the task altogether and thus showing lower performance. When competitors are weak, workers tend to complete the task but with reduced effort. Overall, individual piece rate payments are the simplest to communicate and implement while incenting performance on par with more complex dyadic tournaments. | |
dc.identifier.pissn | 1867-0202 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/10637 | |
dc.publisher | Springer | |
dc.relation.ispartof | Business & Information Systems Engineering: Vol. 57, No. 3 | |
dc.relation.ispartofseries | Business & Information Systems Engineering | |
dc.subject | Crowdsourcing | |
dc.subject | Experimental techniques | |
dc.subject | Exploratory study | |
dc.subject | Feedback | |
dc.subject | Incentives | |
dc.subject | Online labor | |
dc.subject | Piece rate | |
dc.subject | Rank-order tournament | |
dc.subject | Real effort task | |
dc.title | How (not) to Incent Crowd Workers | |
dc.type | Text/Journal Article | |
gi.citation.endPage | 179 | |
gi.citation.startPage | 167 |