Automated interpretation of eye–hand coordination in mobile eye tracking recordings

dc.contributor.author: Mussgnug, Moritz
dc.contributor.author: Singer, Daniel
dc.contributor.author: Lohmeyer, Quentin
dc.contributor.author: Meboldt, Mirko
dc.date.accessioned: 2018-01-08T08:13:27Z
dc.date.available: 2018-01-08T08:13:27Z
dc.date.issued: 2017
dc.description.abstract: Mobile eye tracking is beneficial for the analysis of human–machine interactions with tangible products, as it tracks eye movements reliably in natural environments and allows insights into human behaviour and the associated cognitive processes. However, current methods require manual screening of the video footage, which is time-consuming and subjective. This work aims to automatically detect cognitively demanding phases in mobile eye tracking recordings. The presented approach combines the user's perception (gaze) and action (hand) to isolate demanding interactions based on a multi-modal feature-level fusion (see the sketch after the metadata fields). It was validated in a usability study of a 3D printer with 40 participants by comparing the usability problems found against a thorough manual analysis. The new approach detected 17 out of 19 problems while reducing the time for manual analysis by 63%. Adding hand information enriches the insights into human behaviour beyond what eye tracking alone provides. The field of AI could significantly advance this approach by improving hand tracking through region proposal CNNs, by detecting the parts of a product and mapping the demanding interactions to those parts, or even by a fully automated end-to-end detection of demanding interactions via deep learning. This could lay the foundation for machines that provide real-time assistance to their users when they are struggling.
dc.identifier.pissn: 1610-1987
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/11083
dc.publisher: Springer
dc.relation.ispartof: KI - Künstliche Intelligenz: Vol. 31, No. 4
dc.relation.ispartofseries: KI - Künstliche Intelligenz
dc.subject: Cognitive processes
dc.subject: Event interpretation
dc.subject: Eye-hand coordination
dc.subject: Human–machine interaction
dc.subject: Mobile eye tracking
dc.subject: Usability testing
dc.title: Automated interpretation of eye–hand coordination in mobile eye tracking recordings
dc.type: Text/Journal Article
gi.citation.endPage: 337
gi.citation.startPage: 331
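The abstract describes a multi-modal feature-level fusion of gaze and hand positions to isolate demanding interaction phases. The following Python sketch illustrates one plausible form of such a fusion; it is not the authors' implementation, and the function name, the gaze-hand distance threshold, and the minimum phase length are illustrative assumptions.

    import numpy as np

    def detect_demanding_phases(gaze_xy, hand_xy, dist_thresh=80.0, min_len=30):
        # gaze_xy, hand_xy: (N, 2) arrays of per-frame scene-camera
        # coordinates in pixels. Hypothetical fused feature: gaze resting
        # near the hand for a sustained stretch is flagged as a demanding
        # interaction phase.
        dist = np.linalg.norm(gaze_xy - hand_xy, axis=1)  # per-frame gaze-hand distance
        close = dist < dist_thresh                        # binary fused gaze+hand feature
        phases, start = [], None
        for i, c in enumerate(close):
            if c and start is None:
                start = i                                 # phase opens
            elif not c and start is not None:
                if i - start >= min_len:                  # keep only sustained phases
                    phases.append((start, i))
                start = None
        if start is not None and len(close) - start >= min_len:
            phases.append((start, len(close)))            # phase still open at clip end
        return phases                                     # list of (start_frame, end_frame)

In practice the hand position would come from a hand-tracking model (for instance a region proposal CNN, as the abstract suggests), and both thresholds would be tuned on annotated recordings; at 25 fps, min_len=30 corresponds to phases of at least 1.2 seconds.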
