Authors: Drewes, Heiko; Fanger, Yara; Mayer, Sven
Date: 2024-10-08
Year: 2024
URI: https://dl.gi.de/handle/20.500.12116/44841
Abstract: While desktops and smartphones have established user interface standards, such standards are still lacking for virtual and augmented reality devices. Hands-free interaction is desirable for these devices. This paper explores the use of eye and head tracking for interaction beyond buttons, in particular for selection in scroll lists. We conducted a user study with three interaction methods based on eye and head movements: gaze-based dwell-time, gaze-head offset, and gaze-based head gestures. We compared them with state-of-the-art hand-based interaction. The evaluation of quantitative and qualitative measurements provides insights into the trade-off between physical and mental demands for augmented reality interfaces.
Language: en
Keywords: eye tracking; hands-free interaction; head gestures; head tracking; selection in list boxes
Title: Hands-free Selection in Scroll Lists for AR Devices
Type: Text/Conference Paper
DOI: 10.1145/3670653.3670671
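
Note: For readers unfamiliar with dwell-time selection, the first of the three hands-free methods named in the abstract, the sketch below illustrates the general idea only; it is not the implementation evaluated in the paper. It selects a list item once the gaze has rested on it for a fixed duration. The 0.8 s threshold, the DwellTimeSelector class, and the hit_test/activate helpers in the usage comment are assumptions made for illustration.

import time

# Illustrative sketch of gaze-based dwell-time selection (not the paper's
# implementation). An item is selected when the gaze stays on it for at
# least `threshold` seconds.

class DwellTimeSelector:
    def __init__(self, threshold=0.8):  # assumed threshold; real systems tune this
        self.threshold = threshold
        self.current_item = None   # item the gaze currently rests on
        self.dwell_start = None    # time the gaze entered that item

    def update(self, gazed_item, now=None):
        """Feed the item currently under the gaze (or None).
        Returns the item to select, or None if no selection yet."""
        now = time.monotonic() if now is None else now
        if gazed_item != self.current_item:
            # Gaze moved to a different item (or off the list): restart the timer.
            self.current_item = gazed_item
            self.dwell_start = now if gazed_item is not None else None
            return None
        if gazed_item is not None and now - self.dwell_start >= self.threshold:
            # Reset the timer so the same item is not re-selected every frame.
            self.dwell_start = now
            return gazed_item
        return None

# Usage sketch: call update() once per frame with the list item hit by the gaze ray.
selector = DwellTimeSelector()
# selected = selector.update(hit_test(gaze_ray, scroll_list))  # hypothetical helpers
# if selected is not None: activate(selected)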