Inter-Rater Agreement and Usability: A Comparative Evaluation of Annotation Tools for Sentiment Annotation

dc.contributor.author: Schmidt, Thomas
dc.contributor.author: Winterl, Brigitte
dc.contributor.author: Maul, Milena
dc.contributor.author: Schark, Alina
dc.contributor.author: Vlad, Andrea
dc.contributor.author: Wolff, Christian
dc.contributor.editor: Draude, Claude
dc.contributor.editor: Lange, Martin
dc.contributor.editor: Sick, Bernhard
dc.date.accessioned: 2019-08-27T13:00:15Z
dc.date.available: 2019-08-27T13:00:15Z
dc.date.issued: 2019
dc.description.abstract: We present the results of a comparative evaluation study of five annotation tools with 50 participants in the context of sentiment and emotion annotation of literary texts. Ten participants per tool annotated 50 speeches of the play Emilia Galotti by G. E. Lessing. We evaluate the tools via standard usability and user experience questionnaires, by measuring the time needed for the annotation, and via semi-structured interviews. Based on the results, we formulate a recommendation. In addition, we discuss and compare the usability metrics and methods to develop best practices for tool selection in similar contexts. Furthermore, we highlight the relationship between inter-rater agreement and usability metrics as well as the effect of the chosen tool on annotation behavior.
dc.identifier.doi: 10.18420/inf2019_ws12
dc.identifier.isbn: 978-3-88579-689-3
dc.identifier.pissn: 1617-5468
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/25044
dc.language.iso: en
dc.publisher: Gesellschaft für Informatik e.V.
dc.relation.ispartof: INFORMATIK 2019: 50 Jahre Gesellschaft für Informatik – Informatik für Gesellschaft (Workshop-Beiträge)
dc.relation.ispartofseries: Lecture Notes in Informatics (LNI) - Proceedings, Volume P-295
dc.subject: Sentiment Annotation
dc.subject: Usability Engineering
dc.subject: Usability
dc.subject: Inter-rater agreement
dc.subject: Annotation
dc.subject: Sentiment Analysis
dc.subject: Emotion Analysis
dc.subject: Annotation Tools
dc.title: Inter-Rater Agreement and Usability: A Comparative Evaluation of Annotation Tools for Sentiment Annotation
dc.type: Text/Conference Paper
gi.citation.endPage: 133
gi.citation.publisherPlace: Bonn
gi.citation.startPage: 121
gi.conference.date: 23.-26. September 2019
gi.conference.location: Kassel
gi.conference.sessiontitle: Software Engineering in den Digital Humanities
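
The abstract above relates inter-rater agreement to usability metrics. As an illustrative sketch only (the record does not include the paper's code, and the specific agreement measure used in the study is an assumption here), Cohen's kappa for two annotators assigning sentiment labels to the same speeches can be computed as follows in Python; the function name and the label data are hypothetical:

from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two annotators."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement, from each annotator's marginal label distribution.
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical sentiment labels for ten speeches from two annotators.
ann1 = ["pos", "neg", "neg", "neu", "pos", "neg", "pos", "neu", "neg", "pos"]
ann2 = ["pos", "neg", "neu", "neu", "pos", "neg", "pos", "pos", "neg", "pos"]
print(f"kappa = {cohens_kappa(ann1, ann2):.2f}")  # kappa = 0.69

For groups of more than two annotators, such as the ten participants per tool in the study, a multi-rater generalization like Fleiss' kappa or Krippendorff's alpha would be the usual choice.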

Files

Original bundle (1 of 1)
Name: paper03_05.pdf
Size: 227.65 KB
Format: Adobe Portable Document Format