
Evaluating Task-Level Struggle Detection Methods in Intelligent Tutoring Systems for Programming

dc.contributor.author: Dannath, Jesper
dc.contributor.author: Deriyeva, Alina
dc.contributor.author: Paaßen, Benjamin
dc.contributor.editor: Schulz, Sandra
dc.contributor.editor: Kiesler, Natalie
dc.date.accessioned: 2024-09-03T16:26:21Z
dc.date.available: 2024-09-03T16:26:21Z
dc.date.issued: 2024
dc.description.abstract: Intelligent Tutoring Systems require student modeling in order to make pedagogical decisions, such as individualized feedback or task selection. Typically, student modeling is based on the eventual correctness of task solutions. However, for multi-step or iterative learning tasks, as in programming, the intermediate states on the way to a correct solution also carry crucial information about learner skill. We investigate how to detect learners who struggle on their path towards a correct solution to a task. Prior work addressed struggle detection in programming environments at different granularity levels, but has mostly focused on preventing course dropout. We conducted a pilot study with our programming learning environment and evaluated different approaches for struggle detection at the task level. To evaluate these measures, we use downstream Item Response Theory competency models. We find that detecting struggle based on large language model text embeddings outperforms the chosen baselines in terms of correlation with a programming competency proxy.
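
As an illustration of the evaluation idea described in the abstract, the following is a minimal, hypothetical sketch (not the authors' implementation): each intermediate submission is mapped to a text embedding, a struggle score is derived from how far the submissions stay from a reference solution, and the score is correlated with an Item Response Theory ability estimate serving as the competency proxy. The embed function and the specific score definition below are placeholder assumptions.

# Hypothetical sketch of an embedding-based task-level struggle score,
# checked against an IRT ability estimate (not the paper's actual code).
import numpy as np
from scipy.stats import spearmanr

def embed(code_text: str) -> np.ndarray:
    """Placeholder for a large language model text embedding of source code."""
    rng = np.random.default_rng(abs(hash(code_text)) % (2**32))
    return rng.normal(size=384)  # stand-in vector; swap in a real embedding model

def struggle_score(intermediate_solutions: list[str], reference_solution: str) -> float:
    """Mean cosine distance of intermediate submissions to the reference solution.
    Higher values mean the submissions stay far from a correct solution."""
    ref = embed(reference_solution)
    dists = []
    for code in intermediate_solutions:
        vec = embed(code)
        cos = vec @ ref / (np.linalg.norm(vec) * np.linalg.norm(ref))
        dists.append(1.0 - cos)
    return float(np.mean(dists))

# Evaluation idea: correlate per-learner struggle scores with an IRT ability
# estimate (theta) from a downstream competency model (values illustrative).
scores = np.array([0.62, 0.35, 0.48, 0.20])
theta = np.array([-0.8, 0.5, -0.1, 1.2])
rho, p = spearmanr(scores, theta)
print(f"Spearman correlation with competency proxy: {rho:.2f} (p={p:.3f})")

Under this framing one would expect a negative correlation, since learners who struggle more should receive lower competency estimates; the paper's actual measures and evaluation protocol may differ.
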
dc.identifier.doi: 10.18420/delfi2024_07
dc.identifier.eissn: 2944-7682
dc.identifier.issn: 2944-7682
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/44545
dc.language.iso: en
dc.pubPlace: Bonn
dc.publisher: Gesellschaft für Informatik e.V.
dc.relation.ispartof: Proceedings of DELFI 2024
dc.relation.ispartofseries: DELFI
dc.subject: Intelligent Tutoring Systems
dc.subject: Item Response Theory
dc.subject: Struggle
dc.subject: Large Language Models
dc.title: Evaluating Task-Level Struggle Detection Methods in Intelligent Tutoring Systems for Programming
dc.type: Text/Conference paper
mci.conference.date: 9-11 September 2024
mci.conference.location: Fulda
mci.conference.sessiontitle: Best Paper Candidates (Short Papers)
mci.document.quality: digidoc
mci.reference.pages: 97-105

Files

Original bundle
Name: Jesper-Dannath.pdf
Size: 829.21 KB
Format: Adobe Portable Document Format