Listing by keyword "eye tracking"
1 - 10 of 20
- Journal article: AnswerTruthDetector: a combined cognitive load approach for separating truthful from deceptive answers in computer-administered questionnaires (i-com: Vol. 22, No. 3, 2023). Maleck, Moritz; Gross, Tom. In human-computer interaction, there is a large body of empirical research in which online questionnaires play an increasingly important role. The quality of the results depends strongly on the quality of the given answers, so it is essential to distinguish truthful from deceptive answers. The literature offers elegant single-modality approaches to deception detection, such as mouse tracking and eye tracking (in this paper, the latter measuring pupil diameter), yet no combination of these two modalities is available. This paper presents an approach that combines these two cognitive-load-based lie detection methods. We address study administrators who conduct questionnaires in HCI and want to improve their validity.
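The combined approach this entry describes lends itself to a small illustration. Below is a minimal, hypothetical sketch of fusing mouse-tracking and pupil-diameter features into one cognitive-load score; the feature names and the z-score fusion rule are assumptions for illustration, not the paper's actual method.

```python
# Hypothetical sketch: fusing mouse-tracking and pupil-diameter features
# into a single cognitive-load score, in the spirit of (not identical to)
# the AnswerTruthDetector approach. All feature names and the fusion rule
# are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class AnswerFeatures:
    mouse_path_deviation: float   # extra distance vs. straight path (px)
    mouse_hesitation_ms: float    # pause before committing an answer
    pupil_diameter_delta: float   # dilation vs. per-person baseline (mm)

def zscores(values):
    """Standardize one feature column across all answers."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s if s else 0.0 for v in values]

def deception_scores(answers: list[AnswerFeatures]) -> list[float]:
    """Average the standardized load indicators; higher = more cognitive
    load, which load-based lie detection links to deceptive answering."""
    cols = [
        zscores([a.mouse_path_deviation for a in answers]),
        zscores([a.mouse_hesitation_ms for a in answers]),
        zscores([a.pupil_diameter_delta for a in answers]),
    ]
    return [mean(col[i] for col in cols) for i in range(len(answers))]

if __name__ == "__main__":
    sample = [
        AnswerFeatures(12.0, 150.0, 0.02),   # fluent answer
        AnswerFeatures(85.0, 900.0, 0.31),   # hesitant, dilated pupils
        AnswerFeatures(20.0, 210.0, 0.05),
    ]
    for i, s in enumerate(deception_scores(sample)):
        print(f"answer {i}: load score {s:+.2f}")
```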
- Workshop contribution: Assisting Mouse Pointer Recovery in Multi-Display Environments (Mensch und Computer 2015 – Proceedings, 2015). Fortmann, Florian; Nowak, Dennis; Bruns, Kristian; Milster, Mark; Boll, Susanne. Recovering the mouse pointer in a multi-display environment after it has been lost, e.g., because the user shifted his or her visual attention to another task, can be very annoying and frustrating. The reason is that the user has to scan a large display space, requiring sequential sampling actions. These actions consume psycho-physiological effort for moving the eyes and head around and increase the time needed to recover the mouse pointer. In this paper, we present an assistant system that supports users in recovering their mouse pointers in multi-display environments. A preliminary user study showed no significant improvement in mouse pointer recovery time, but usability and satisfaction were rated positively. This work was carried out as part of a course on human-machine interaction at the University of Oldenburg.
- Journal article: Cognitive state detection with eye tracking in the field: an experience sampling study and its lessons learned (i-com: Vol. 23, No. 1, 2024). Langner, Moritz; Toreini, Peyman; Maedche, Alexander. In the future, cognitive activity will be tracked in the same way that physical activity is tracked today. Eye tracking is a promising off-body technology that provides access to relevant data for cognitive activity tracking. For building cognitive state models, continuous and longitudinal collection of eye-tracking data and self-reported cognitive state labels is critical. In a field study with 11 students, we use experience sampling and our data collection system esmLoop to collect both cognitive state labels and eye-tracking data. We report descriptive results of the field study and develop supervised machine learning models for the detection of two eye-based cognitive states: cognitive load and flow. In addition, we articulate the lessons learned during data collection and cognitive state model development to address the challenges of building generalizable and robust user models in the future. With this study, we contribute knowledge to bring eye-based cognitive state detection closer to real-world applications.
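As a rough illustration of the kind of supervised model this study builds, the sketch below trains a classifier on synthetic eye-tracking features with experience-sampled labels and evaluates it with participant-grouped cross-validation (the hard part of generalizable user models); the feature set, classifier choice, and data are assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only: supervised cognitive-state detection from
# eye-tracking features with experience-sampled labels. Features, model,
# and data are assumptions; the paper does not prescribe this pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)

# One row per experience-sampling probe: fixation duration (ms),
# saccade rate (1/s), mean pupil diameter (mm) -- all hypothetical.
X = rng.normal(size=(110, 3))
y = rng.integers(0, 2, size=110)        # self-reported label: high load yes/no
groups = np.repeat(np.arange(11), 10)   # 11 students, 10 probes each

# Fold by participant so the model is always evaluated on unseen users.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=groups)
print(f"leave-users-out accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```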
- Dissertation: Designing gaze-based interaction for pervasive public displays (2018). Khamis, Mohamed. The last decade witnessed an increasing adoption of interactive public displays. Displays can now be seen in many public areas, such as shopping malls and train stations. There is also a growing trend towards using large public displays, especially in airports, urban areas, universities, and libraries. Meanwhile, advances in eye tracking and visual computing promise straightforward integration of eye tracking on these displays for two purposes: 1) monitoring the user's visual behavior to evaluate different aspects of the display, such as measuring the visual attention of passersby, and 2) interaction, such as allowing users to provide input, retrieve content, or transfer data using their eye movements. Gaze is particularly useful for pervasive public displays. In addition to being natural and intuitive, eye gaze can be detected from a distance, bringing interactivity to displays that are physically unreachable. Gaze reflects the user's intention and visual interests, and its subtle nature makes it well suited for public interactions where social embarrassment and privacy concerns might hinder the experience. On the downside, eye tracking technologies have traditionally been developed for desktop settings, where a user interacts from a stationary position and for a relatively long period of time. Interaction with public displays is fundamentally different and hence poses unique challenges when employing eye tracking. First, users of public displays are dynamic; they could approach the display from different directions and interact from different positions or even while moving. This means that gaze-enabled displays should not expect users to be stationary at a specific position but should instead adapt to users' ever-changing position in front of the display. Second, users of public displays typically interact for short durations, often for a few seconds only. This means that, contrary to desktop settings, public displays cannot afford to require users to perform time-consuming calibration prior to interaction. In this publication-based dissertation, we first report on a review of the challenges of interactive public displays and discuss the potential of gaze in addressing these challenges. We then showcase the implementation and in-depth evaluation of two applications where gaze is leveraged to address core problems in today's public displays. The first presents an eye-based solution, EyePACT, that tackles the parallax effect which is often experienced on today's touch-based public displays. We found that EyePACT significantly improves accuracy even with varying degrees of parallax. The second is a novel multimodal system, GTmoPass, that combines gaze and touch input for secure user authentication on public displays. GTmoPass was found to be highly resilient to shoulder surfing, thermal attacks, and smudge attacks, thereby offering a secure solution to an important problem on public displays. The second part of the dissertation explores specific challenges of gaze-based interaction with public displays. First, we address the user positioning problem by means of active eye tracking. More specifically, we built a novel prototype, EyeScout, that dynamically moves the eye tracker based on the user's position without augmenting the user. This, in turn, allowed us to study and understand gaze-based interaction with public displays while walking and when approaching the display from different positions. An evaluation revealed that EyeScout is well perceived by users and improves the time needed to initiate gaze interaction by 62% compared to the state of the art. Second, we propose a system, Read2Calibrate, for calibrating eye trackers implicitly while users read text on displays. We found that although text-based calibration is less accurate than traditional methods, it integrates smoothly with reading and is thereby more suitable for public displays. Finally, through our prototype system, EyeVote, we show how to allow users to select textual options on public displays via gaze without calibration. In a field deployment of EyeVote, we studied the trade-off between accuracy and selection speed when using calibration-free selection techniques. We found that users of public displays value faster interactions over accurate ones and are willing to correct system errors in case of inaccuracies. We conclude by discussing the implications of our findings for the design of gaze-based interaction for public displays and how our work can be adapted for domains other than public displays, such as handheld mobile devices.
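The parallax effect that EyePACT tackles can be illustrated geometrically: the touch lands on the glass, but the target sits on the display panel behind it, so the intended point lies where the eye-to-touch ray meets the panel. Below is a minimal sketch under assumptions (a simplified flat-panel setup and a known eye position); the dissertation's actual method may differ.

```python
# Minimal geometric sketch of parallax correction in the spirit of EyePACT:
# project the touch point along the ray from the user's eye through the
# touch surface onto the display panel behind it. The gap value and the
# flat-panel simplification are assumptions for illustration.

def correct_touch(eye_xyz, touch_xy, gap_mm):
    """eye_xyz: (x, y, z) eye position in mm, z = distance above the touch
    glass; touch_xy: point touched on the glass; gap_mm: distance between
    glass and display panel. Returns the point the user actually aimed at."""
    ex, ey, ez = eye_xyz
    tx, ty = touch_xy
    # Extend the eye->touch ray by the glass/panel gap.
    scale = (ez + gap_mm) / ez
    return (ex + (tx - ex) * scale, ey + (ty - ey) * scale)

# A user looking from the left touches slightly off the intended target:
print(correct_touch(eye_xyz=(-300.0, 0.0, 600.0), touch_xy=(0.0, 0.0), gap_mm=10.0))
# -> (5.0, 0.0): the corrected point shifts to where the gaze ray lands.
```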
- Workshop contribution: Designing Tools To Improve Collaborative Interaction in a VR Environment for Teaching Geosciences Interpretation (Mensch und Computer 2020 - Workshopband, 2020). Woodworth, Jason; Broussard, David; Borst, Christoph. We discuss practical and theoretical solutions to problems that arose during the development of a collaborative VR application in which a teacher guides students through visualization and interactive interpretation of a geological dataset. To provide access to a large number of tools, we introduced a dashboard-style menu that rotates and moves to follow the user through the environment. We expect users to need good awareness of each other in the virtual environment, and especially to understand each other's attention to specific terrain surface features or annotations. For this, we display an eye gaze cue on the visualized terrain and visually tether a nametag widget on the dashboard to each user's avatar. Results of an initial usability review, involving an expert geologist guiding students, show promise for sharing eye gaze with a gaze trail as a basic method for understanding attention. Other tested indicators of avatar location or view appeared less important during the terrain feature presentation and interpretation. We additionally summarize ongoing work to enhance collaborative awareness through other eye tracking metrics and physiological data.
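As a loose illustration of the gaze-trail cue mentioned in this entry (not the authors' implementation), the sketch below keeps each user's recent gaze hits on the terrain in a ring buffer and fades older points out; the buffer size and linear fade are arbitrary assumptions.

```python
# Hypothetical sketch of a shared "gaze trail": store the most recent points
# where a user's gaze ray hit the terrain and render older ones fainter.
from collections import deque

class GazeTrail:
    def __init__(self, max_points: int = 30):
        self.points = deque(maxlen=max_points)  # oldest points drop off

    def add_hit(self, terrain_xyz):
        """Record where this user's gaze ray hit the terrain this frame."""
        self.points.append(terrain_xyz)

    def render_alphas(self):
        """Opacity per point: newest fully opaque, oldest nearly invisible."""
        n = len(self.points)
        return [(p, (i + 1) / n) for i, p in enumerate(self.points)]

trail = GazeTrail(max_points=3)
for hit in [(0, 0, 1), (1, 0, 1), (2, 1, 1), (3, 1, 2)]:
    trail.add_hit(hit)
print(trail.render_alphas())   # oldest of the kept 3 points is faintest
```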
- Workshop contribution: Evaluation of a smart public display in public transport (Mensch und Computer 2019 - Workshopband, 2019). Keller, Christine; Titov, Waldemar; Sawilla, Swenja; Schlegel, Thomas. In this paper, we present the iterative evaluation of a smart public display prototype for public transport. In our research project, we developed a working prototype of a smart and mobile public display and iteratively evaluated several facets of it, following a user-centered design approach. We describe challenges and experiences during the development and evaluation of this smart and mobile public display, present the results of our studies so far, and discuss our evaluation steps as best-practice examples.
- Journal article: Eye tracking and its application in usability and media research (MMI Interaktiv - Eye Tracking: Vol. 1, No. 06, 2003). Schiessl, Michael; Duda, Sabrina; Thölke, Andreas; Fischer, Rico. In recent years, eye tracking has become an important method for studying human-computer interaction. In this article, we show how the eye tracking method can be successfully applied in usability studies. Using three examples, we demonstrate the advantages of eye tracking over conventional usability methods. We discuss why relying solely on traditional usability methods, which depend heavily on introspection and verbalization, can lead to biased or poorly applicable results.
- Conference paper: Eye tracking experiments in business process modeling: agenda setting and proof of concept (Enterprise modelling and information systems architectures (EMISA 2011), 2011). Hogrebe, Frank; Gehrke, Nick; Nüttgens, Markus. For almost all applications, there is a need to judge business process models from the users' perspective. This paper presents first facts and findings on how the eye tracking method can contribute to a deeper empirical understanding and evaluation of user satisfaction in business process modeling. Eye tracking is used to check the subjective perceptions of users against objective measurements. The experimental investigation uses two variants of the widespread business process modelling notation "Event-driven Process Chain (EPC)".
- Conference paper: EyePointing: A Gaze-Based Selection Technique (Mensch und Computer 2019 - Tagungsband, 2019). Schweigert, Robin; Schwind, Valentin; Mayer, Sven. Interacting with objects from a distance is not only challenging in the real world but also a common problem in virtual reality (VR). One issue concerns the distinction between attention for exploration and attention for selection, also known as the Midas-touch problem. Researchers have proposed numerous approaches to overcome that challenge using additional devices, gaze-input cascaded pointing, and eye blinks to select the remote object. While techniques such as MAGIC pointing still require additional input for confirming a selection made with eye gaze, and thus force the user to perform unnatural behavior, there is still no solution enabling truly natural, unobtrusive, and device-free selection. In this paper, we propose EyePointing, a technique that combines MAGIC pointing with the referential mid-air pointing gesture to select objects at a distance. While the eye gaze is used for referencing the object, the pointing gesture is used as the trigger. Our technique counteracts the Midas-touch problem.
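The division of labor in EyePointing (gaze references, gesture triggers) can be sketched as simple event-handling logic. The following is a minimal, assumed structure, not the authors' code; the event names and the edge-triggered gesture check are illustrative.

```python
# Hedged sketch of the EyePointing idea: gaze continuously references an
# object, but a selection fires only when a mid-air pointing gesture begins,
# so looking alone never selects anything (avoiding the Midas-touch problem).
from typing import Optional

class EyePointingSelector:
    def __init__(self):
        self.gazed_object: Optional[str] = None
        self.pointing: bool = False

    def on_gaze(self, object_id: Optional[str]) -> None:
        """Called by the eye tracker with the object under the gaze (or None)."""
        self.gazed_object = object_id

    def on_pointing_gesture(self, extended: bool) -> Optional[str]:
        """Called by the hand tracker; returns a selection only when the
        gesture starts while an object is being gazed at."""
        was_pointing, self.pointing = self.pointing, extended
        if extended and not was_pointing and self.gazed_object:
            return self.gazed_object   # gaze references, gesture triggers
        return None

selector = EyePointingSelector()
selector.on_gaze("button_far_left")
selector.on_pointing_gesture(False)         # no gesture -> no selection
print(selector.on_pointing_gesture(True))   # -> button_far_left
```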
- muc: short paper (poster): Gaze-based Landmarks to Support Re-finding Information on the Web (Mensch & Computer 2013: Interaktive Vielfalt, 2013). Hempel, Julia; Nitsche, Marcus; Haun, Stefan; Nürnberger, Andreas. Re-finding information is a frequently performed task on the WWW, which requires both re-locating the page and re-finding the specific information within the page. However, current browsers rarely support the second task. In this paper, we present a gaze-based approach to creating marks within web pages in order to support re-finding information. To this end, eye tracking is used to identify information relevant to the user. To inform the ongoing design process, an informal user study was conducted. The results suggest that landmarks should be created based on a combination of different measures (e.g., gaze and mouse data) and presented in the user's peripheral visual field.
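The study's suggestion to combine gaze and mouse measures could look roughly like the sketch below, which marks a page element as a landmark only when accumulated gaze dwell and mouse proximity both indicate relevance; the thresholds and the conjunction rule are invented for illustration.

```python
# Illustrative sketch (not the authors' implementation): flag an element as
# a re-finding landmark when gaze dwell time and mouse proximity agree.
from collections import defaultdict

DWELL_THRESHOLD_MS = 1500   # assumed: total gaze time marking relevance
MOUSE_NEAR_PX = 80          # assumed: cursor distance counting as "near"

def find_landmarks(gaze_samples, mouse_samples, elements):
    """gaze_samples: (element_id, duration_ms) fixations; mouse_samples:
    (element_id, distance_px) cursor readings; returns ids worth a landmark."""
    dwell = defaultdict(float)
    for elem, ms in gaze_samples:
        dwell[elem] += ms
    near = {elem for elem, d in mouse_samples if d <= MOUSE_NEAR_PX}
    return [e for e in elements
            if dwell[e] >= DWELL_THRESHOLD_MS and e in near]

gaze = [("para_3", 900), ("para_3", 800), ("nav", 300)]
mouse = [("para_3", 25), ("nav", 200)]
print(find_landmarks(gaze, mouse, ["para_3", "nav"]))   # -> ['para_3']
```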