Listing by keyword "Empirical Evaluation"
1 - 1 of 1
- Conference paper: Accelerating Deductive Coding of Qualitative Data: An Experimental Study on the Applicability of Crowdsourcing (Mensch und Computer 2021 - Tagungsband, 2021). Haug, Saskia; Rietz, Tim; Mädche, Alexander.
  While qualitative research can produce a rich understanding of people's minds, it requires an essential and strenuous data annotation process known as coding. Coding can be repetitive and time-consuming, particularly for large datasets. Crowdsourcing provides flexible access to workers all around the world; however, researchers remain doubtful about its applicability for coding. In this study, we present an interactive coding system to support crowdsourced deductive coding of semi-structured qualitative data. Through an empirical evaluation on Amazon Mechanical Turk, we assess both the quality and the reliability of crowd support for coding. Our results show that non-expert coders provide reliable results using our system. The crowd reached a substantial agreement of up to 91% with the coding provided by experts. Our results indicate that crowdsourced coding is an applicable strategy for accelerating a strenuous task. Additionally, we present implications of crowdsourcing for reducing biases in the interpretation of qualitative data.