Listing by keyword "Eye tracking"
1 - 7 of 7
- Text document: Analysis and Comparison of the Gaze Behavior of E-Scooter Drivers and Cyclists (INFORMATIK 2020, 2021) Trefzger, Mathias; Titov, Waldemar; Schlegel, Thomas. In this paper, we contribute an eye tracking study that evaluates the gaze behavior of e-scooter drivers and cyclists on high- and low-quality road surfaces. We recorded the surface quality with sensors and related the different surfaces to the gaze behavior. We recorded the participants' eye movements and performed an Area of Interest (AOI) sequence analysis to identify gaze patterns. The sequences found show that on the high-quality surface, participants most commonly focused on the distant road section and then shifted to nearer sections. Advantageous individual gaze sequences are omitted when the surface is poor. We found a significant difference in the attention distribution between the two means of transport. In addition, we can confirm previous results showing that low-quality road surfaces cause the gaze to shift forward. However, the participants did not adapt their speed to the worse surface.
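The AOI sequence analysis described in this abstract reduces a stream of fixations to an ordered sequence of AOI dwells and then examines transitions between AOIs for recurring patterns. A minimal sketch of that reduction step, assuming fixations have already been mapped to AOI labels (the AOI names here are hypothetical, not those of the study):

```python
from collections import Counter

def collapse_dwells(aoi_hits):
    """Collapse consecutive fixations on the same AOI into a single dwell."""
    seq = []
    for aoi in aoi_hits:
        if not seq or seq[-1] != aoi:
            seq.append(aoi)
    return seq

def transition_counts(seq):
    """Count AOI-to-AOI transitions (bigrams) in a dwell sequence."""
    return Counter(zip(seq, seq[1:]))

# Hypothetical fixation-to-AOI labels for one ride segment
hits = ["far_road", "far_road", "near_road", "near_road", "far_road", "near_road"]
seq = collapse_dwells(hits)
counts = transition_counts(seq)
```

Frequent transitions such as distant-to-near road sections would surface here as high bigram counts.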
- Journal article: Automatic Detection of Visual Search for the Elderly using Eye and Head Tracking Data (KI - Künstliche Intelligenz: Vol. 31, No. 4, 2017) Dietz, Michael; Schork, Daniel; Damian, Ionut; Steinert, Anika; Haesner, Marten; André, Elisabeth. With increasing age we often find ourselves in situations where we search for certain items, such as keys or wallets, but cannot remember where we left them. Since finding these objects usually turns into a lengthy and frustrating process, we propose an approach for the automatic detection of visual search in older adults to identify the point in time when users need assistance. In order to collect the necessary sensor data for the recognition of visual search, we develop a completely mobile eye and head tracking device specifically tailored to the requirements of older adults. Using this device, we conduct a user study with 30 participants aged between 65 and 80 years (avg = 71.7, 50% female) to collect training and test data. During the study, each participant is asked to perform several activities, including the visual search for objects in a real-world setting. We use the recorded data to train a support vector machine (SVM) classifier and achieve a recognition rate of 97.55% with the leave-one-user-out evaluation method. The results indicate the feasibility of an approach towards the automatic detection of visual search in the wild.
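The leave-one-user-out evaluation mentioned in this abstract tests the classifier on each participant's data after training only on the remaining participants, so the reported accuracy reflects performance on unseen users. A minimal sketch of the split logic (user IDs hypothetical; the study's actual pipeline trains an SVM on top of such splits, e.g. with scikit-learn):

```python
def leave_one_user_out(users):
    """Yield (held_out_user, train_idx, test_idx): each fold holds out
    every sample belonging to exactly one user."""
    for held_out in sorted(set(users)):
        train = [i for i, u in enumerate(users) if u != held_out]
        test = [i for i, u in enumerate(users) if u == held_out]
        yield held_out, train, test

# Hypothetical per-sample user IDs (3 participants, 2 samples each)
users = ["p01", "p01", "p02", "p02", "p03", "p03"]
folds = list(leave_one_user_out(users))
# 3 folds; e.g. the first fold tests on p01's samples and trains on the rest
```

This scheme is stricter than a random split, since samples from the same person never appear in both the training and the test set.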
- Journal article: The Role of Focus in Advanced Visual Interfaces (KI - Künstliche Intelligenz: Vol. 30, No. 0, 2016) Orlosky, Jason; Toyama, Takumi; Sonntag, Daniel; Kiyokawa, Kiyoshi. Developing more natural and intelligent interaction methods for head mounted displays (HMDs) has been an important goal in augmented reality for many years. Recently, small form factor eye tracking interfaces and wearable displays have become small enough to be used simultaneously and for extended periods of time. In this paper, we describe the combination of monocular HMDs with an eye tracking interface and show how they can be used to automatically reduce interaction requirements for displays with both single and multiple focal planes. We then present the results of preliminary and primary experiments that test the accuracy of eye tracking for a number of different displays, such as Google Glass and Brother's AiRScouter. Results show that our focal plane classification algorithm works with over 98% accuracy for classifying the correct distance of virtual objects in our multi-focal-plane display prototype and with over 90% accuracy for classifying physical and virtual objects in commercial monocular displays. Additionally, we describe a methodology for integrating our system into augmented reality applications and attentive interfaces.
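At a high level, focal plane classification as described in this abstract amounts to assigning an estimated gaze depth to the nearest of the display's focal planes. A minimal sketch of that assignment, with entirely hypothetical plane depths (the paper's actual algorithm and thresholds are not specified in the abstract):

```python
def classify_focal_plane(gaze_depth_m, plane_depths_m):
    """Assign an estimated gaze depth (metres) to the nearest focal plane."""
    return min(plane_depths_m, key=lambda p: abs(p - gaze_depth_m))

# Hypothetical focal planes of a multi-focal-plane HMD, in metres
planes = [0.5, 1.0, 3.0]
classify_focal_plane(0.8, planes)  # nearest plane is at 1.0 m
```

In practice the gaze depth estimate would come from the eye tracker (e.g. via vergence), which is where most of the classification error arises.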
- Journal article: Towards pervasive eye tracking (it - Information Technology: Vol. 59, No. 5, 2017) Kasneci, Enkelejda. The human gaze provides paramount cues for communication and interaction. Following this insight, gaze-based interfaces have been proposed for human-computer interaction (HCI) since the early 90s, with some believing that such interfaces will revolutionize the way we interact with our devices. Since then, gaze-based HCI in stationary scenarios (e.g., desktop computing) has been maturing rapidly, and the production costs of mainstream eye trackers have been steadily decreasing. As a consequence, a variety of new applications with the ambitious goal of applying eye tracking to dynamic, real-world HCI tasks and scenarios have emerged. This article gives an overview of the research conducted by the Perception Engineering Group at the University of Tübingen.
- Journal article: URWalking: Indoor Navigation for Research and Daily Use (KI - Künstliche Intelligenz: Vol. 37, No. 1, 2023) Ludwig, Bernd; Donabauer, Gregor; Ramsauer, Dominik; Subari, Karema al. In this report, we present the project URWalking conducted at the University of Regensburg. We describe its major outcomes. Firstly, an indoor navigation system for pedestrians, available as a web application and as an Android app, with position tracking of users in indoor and outdoor environments. Our implementation showcases that a variant of the A*-algorithm by Ullmann (Datengetriebene Optimierung präferenzadaptiver Fußwegrouten durch Gebäudekomplexe, https://epub.uni-regensburg.de/43697/, 2020) can handle the routing problem in large, levelled indoor environments efficiently. Secondly, the apps have been used in several studies for a deeper understanding of human wayfinding. We collected eye tracking and synchronized video data, think-aloud protocols, and log data of users interacting with the apps. We applied state-of-the-art deep learning models for gaze tracking and automatic classification of landmarks. Our results indicate that even the most recent version of the YOLO image classifier by Redmon and Farhadi (YOLOv3: An Incremental Improvement. arXiv, 2018) needs fine-tuning to recognize everyday objects in indoor environments. Furthermore, we provide empirical evidence that appropriate machine learning models are helpful to bridge behavioural data from users during wayfinding and conceptual models for the salience of objects and landmarks. However, simplistic models are insufficient to reasonably explain wayfinding behaviour in real time, an open issue in GeoAI.
We conclude that the GeoAI community should collect more naturalistic log data of wayfinding activities in order to build efficient machine learning models capable of predicting user reactions to routing instructions and of explaining how humans integrate stimuli from the environment as essential information into routing instructions while solving wayfinding tasks. Such models form the basis for real-time wayfinding assistance.
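The routing component described in this entry builds on the A*-algorithm, which expands nodes in order of the path cost accumulated so far plus an admissible heuristic estimate of the remaining cost. A minimal, self-contained sketch on a hypothetical indoor graph (the project uses a preference-adaptive variant; the graph, costs, and heuristic below are illustrative only):

```python
import heapq

def a_star(graph, h, start, goal):
    """A* search. graph maps node -> [(neighbour, edge_cost)]; h(node) is an
    admissible heuristic (never overestimates the remaining cost to goal)."""
    frontier = [(h(start), 0, start, [start])]  # (f = g + h, g, node, path)
    best_g = {start: 0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        for nxt, step in graph.get(node, []):
            new_g = g + step
            if new_g < best_g.get(nxt, float("inf")):
                best_g[nxt] = new_g
                heapq.heappush(frontier, (new_g + h(nxt), new_g, nxt, path + [nxt]))
    return None  # goal unreachable

# Hypothetical indoor graph: edge costs could encode distance or user preference
graph = {
    "entrance": [("corridor", 2)],
    "corridor": [("stairs", 3), ("lift", 4)],
    "stairs":   [("room_101", 2)],
    "lift":     [("room_101", 2)],
}
estimates = {"entrance": 5, "corridor": 3, "stairs": 2, "lift": 1, "room_101": 0}
h = lambda n: estimates.get(n, 0)

cost, path = a_star(graph, h, "entrance", "room_101")  # cheapest route via the stairs
```

A preference-adaptive variant would re-weight the edge costs (e.g. penalizing stairs for some users) while leaving the search itself unchanged.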
- Conference paper: What does it Take to Test a Bicycle Simulator for Realism? A Discussion of the Challenges and Possible Evaluation Methods (Proceedings of Mensch und Computer 2024, 2024) Trefzger, Mathias; Raschke, Michael; Fath, Michael; Eckart, Jochen. In recent years, bicycle simulators have become increasingly important as a research tool, resulting in a large number of new prototypes at various academic institutions. For the results of studies to transfer from a simulator to reality, it is necessary to ensure that simulators reproduce realistic traffic behaviour. Most bicycle simulators are evaluated using cycling parameters. However, the (combined) use of other methods is necessary to ensure realism. In this paper, we present a variety of evaluation methods and discuss why we consider them relevant for evaluating realism. We present the challenges and potential, as well as the study design, of our current study comparing cycling behaviour in reality and in simulation. Our study design should serve as a reference for other researchers and facilitate their own research. This paper aims to motivate new and future work in the still young field of cycling simulation.
- Journal article: Why should we use Eye Tracking for Hypertext Design? (MMI Interaktiv - Eye Tracking: Vol. 1, No. 06, 2003) Waniek, Jacqueline; Brunstein, Angela; Naumann, Anja. This article discusses the analysis of eye movements as a means of improving hypertext design. A problem for the design of hypertexts is that the role of cognitive processes in working with these texts is still relatively unclear. The level of demands that a hypertext system places on the user also influences the user's performance. When designing hypertexts, care should be taken to strike a balance between the demands of the system and the cognitive abilities of the user. Eye movements should be analyzed in order to better understand how users deal with the presented information and to adapt the design to the users' needs. The analysis of eye movements allows conclusions to be drawn about the visual and cognitive processing of information during hypertext use. Changes in parameters during information intake can be recorded. Moreover, problems that arise in the analysis of offline data can be avoided.