Conference Paper

Accelerating Deductive Coding of Qualitative Data: An Experimental Study on the Applicability of Crowdsourcing



Document type

Text/Conference Paper

Additional information

Date

2021

Publisher

ACM

Abstract

While qualitative research can produce a rich understanding of people's minds, it requires an essential and strenuous data annotation process known as coding. Coding can be repetitive and time-consuming, particularly for large datasets. Crowdsourcing provides flexible access to workers all around the world; however, researchers remain doubtful about its applicability for coding. In this study, we present an interactive coding system to support crowdsourced deductive coding of semi-structured qualitative data. Through an empirical evaluation on Amazon Mechanical Turk, we assess both the quality and the reliability of crowd support for coding. Our results show that non-expert coders provide reliable results using our system. The crowd reached a substantial agreement of up to 91% with the coding provided by experts. Our results indicate that crowdsourced coding is an applicable strategy for accelerating a strenuous task. Additionally, we present implications of crowdsourcing for reducing biases in the interpretation of qualitative data.

Description

Haug, Saskia; Rietz, Tim; Mädche, Alexander (2021): Accelerating Deductive Coding of Qualitative Data: An Experimental Study on the Applicability of Crowdsourcing. Mensch und Computer 2021 - Tagungsband. New York: ACM. pp. 461-472. MCI-SE07. Ingolstadt, 5.-8. September 2021. DOI: 10.1145/3473856.3473873.
