Conference Paper
Automated assessment of process modeling exams: basic ideas and prototypical implementation
Full-text URI
Document Type
Text/Conference Paper
Files
Additional Information
Date
2016
Authors
Journal Title
Journal ISSN
Volume Title
Publisher
Gesellschaft für Informatik e.V.
Abstract
The assessment of process modeling exams is a time-consuming and complex task. It is desirable to give each student detailed feedback on their solution in terms of syntactic, semantic, and pragmatic quality. Especially in mass courses with hundreds of participants, individual grading of modeling exams by humans is challenging: besides reliability, consistency, and validity, the efficiency of the grading process must be guaranteed. Against this background, this paper develops first ideas for an automated assessment of process modeling exams. The goal is to improve modeling education by teaching students not only to model correctly but also to develop good models. Our ideas were prototypically implemented and applied in an exemplary scenario with promising results. It was possible to identify limitations but also to derive reliable semi-automated approaches for the assessment of process modeling exams.