Inter-Rater Agreement and Usability: A Comparative Evaluation of Annotation Tools for Sentiment Annotation
dc.contributor.author | Schmidt, Thomas | |
dc.contributor.author | Winterl, Brigitte | |
dc.contributor.author | Maul, Milena | |
dc.contributor.author | Schark, Alina | |
dc.contributor.author | Vlad, Andrea | |
dc.contributor.author | Wolff, Christian | |
dc.contributor.editor | Draude, Claude | |
dc.contributor.editor | Lange, Martin | |
dc.contributor.editor | Sick, Bernhard | |
dc.date.accessioned | 2019-08-27T13:00:15Z | |
dc.date.available | 2019-08-27T13:00:15Z | |
dc.date.issued | 2019 | |
dc.description.abstract | We present the results of a comparative evaluation study of five annotation tools with 50 participants in the context of sentiment and emotion annotation of literary texts. Ten participants per tool annotated 50 speeches of the play Emilia Galotti by G. E. Lessing. We evaluate the tools via standard usability and user experience questionnaires, by measuring the time needed for the annotation, and via semi-structured interviews. Based on the results, we formulate a recommendation. In addition, we discuss and compare the usability metrics and methods to develop best practices for tool selection in similar contexts. Furthermore, we highlight the relationship between inter-rater agreement and usability metrics as well as the effect of the chosen tool on annotation behavior. | en
dc.identifier.doi | 10.18420/inf2019_ws12 | |
dc.identifier.isbn | 978-3-88579-689-3 | |
dc.identifier.pissn | 1617-5468 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/25044 | |
dc.language.iso | en | |
dc.publisher | Gesellschaft für Informatik e.V. | |
dc.relation.ispartof | INFORMATIK 2019: 50 Jahre Gesellschaft für Informatik – Informatik für Gesellschaft (Workshop-Beiträge) | |
dc.relation.ispartofseries | Lecture Notes in Informatics (LNI) - Proceedings, Volume P-295 | |
dc.subject | Sentiment Annotation | |
dc.subject | Usability Engineering | |
dc.subject | Usability | |
dc.subject | Inter-Rater Agreement | |
dc.subject | Annotation | |
dc.subject | Sentiment Analysis | |
dc.subject | Emotion Analysis | |
dc.subject | Annotation Tools | |
dc.title | Inter-Rater Agreement and Usability: A Comparative Evaluation of Annotation Tools for Sentiment Annotation | en |
dc.type | Text/Conference Paper | |
gi.citation.endPage | 133 | |
gi.citation.publisherPlace | Bonn | |
gi.citation.startPage | 121 | |
gi.conference.date | 23.–26. September 2019 | |
gi.conference.location | Kassel | |
gi.conference.sessiontitle | Software Engineering in den Digital Humanities |