Conference Paper

Inter-Rater Agreement and Usability: A Comparative Evaluation of Annotation Tools for Sentiment Annotation

Abstract

We present the results of a comparative evaluation study of five annotation tools with 50 participants in the context of sentiment and emotion annotation of literary texts. Ten participants per tool annotated 50 speeches from the play Emilia Galotti by G. E. Lessing. We evaluate the tools via standard usability and user experience questionnaires, by measuring the time needed for the annotation, and via semi-structured interviews. Based on the results, we formulate a tool recommendation. In addition, we discuss and compare the usability metrics and methods in order to develop best practices for tool selection in similar contexts. Furthermore, we highlight the relationship between inter-rater agreement and usability metrics, as well as the effect of the chosen tool on annotation behavior.

Description

Schmidt, Thomas; Winterl, Brigitte; Maul, Milena; Schark, Alina; Vlad, Andrea; Wolff, Christian (2019): Inter-Rater Agreement and Usability: A Comparative Evaluation of Annotation Tools for Sentiment Annotation. In: INFORMATIK 2019: 50 Jahre Gesellschaft für Informatik – Informatik für Gesellschaft (Workshop-Beiträge). Bonn: Gesellschaft für Informatik e.V., pp. 121-133. DOI: 10.18420/inf2019_ws12. PISSN: 1617-5468. ISBN: 978-3-88579-689-3. Workshop: Software Engineering in den Digital Humanities. Kassel, 23.-26. September 2019.
