
VR/AR Workshop 2024



1 - 10 of 16
  • Workshop paper
    An immersive VR test tool for multimodal human robot interaction
    (GI VR / AR Workshop, 2024) Milde, Sven; Friesen, Stefan; Runzheimer, Tabea; Milde, Jan-Torsten; Blum, Rainer
    Multimodal interaction concepts for robotics are getting more sophisticated by the day and need in-depth testing and evaluation, often with prototypes that are not ready to be tested by humans in the real world because of security concerns and low production numbers. As an alternative, this paper describes a virtual reality (VR) application of a multimodal Human-Robot-Interaction (HRI/HCI) interface, designed to work with standalone VR headsets, that is able to track gesture and speech input in order to control an autonomous robot vehicle. The data collected from the VR tests is evaluated and compared to the results from tests with the real vehicle, showing no serious differences between the results.
  • Workshop paper
    Stereoscopic Depth and Object Size Effects on Fixations and Subjective Saliency in Autostereoscopic Displays
    (GI VR / AR Workshop, 2024) Tasliarmut, Melisa; Dettmann, André; Bullinger, Angelika C.
    For the purpose of effectively directing visual attention in monitoring or teleoperation tasks, such as highly visualized control rooms or assistance systems in vehicles, it is crucial to understand which attributes contribute to salient perception, since the use of attention-directing designs can ultimately affect response times. To date, research literature (e.g., on displays) and norms have focused on directing attention through 2D attributes such as color. With the increasing relevance of three-dimensional environments (VR, AR, 3D displays), this paper investigates stereoscopic depth and object size as saliency-guiding attributes to direct attention. To control for the effects of perspective, object size is divided into the viewing angle and the actual size. To estimate which variations of depth and object size attract more attention in a pairwise comparison, we conducted a within-subjects design with subjective saliency selection. The study involved N = 15 participants using an autostereoscopic 3D display. Measurements included recordings of the initial fixation with eye-tracking, the selection of the subjectively more salient item, and the reaction time. The results show that both the visual angle and the stereoscopic depth have independent effects on subjective saliency perception. In addition, the effect of visual angle is stronger when the item is farther away. A clear distinction between object size and visual angle is therefore suggested for future studies.
  • Workshop paper
    Comparing Information Visualization Modalities for 2D Diagrams on Handheld Devices for Industrial Augmented Reality Applications
    (GI VR / AR Workshop, 2024) Brüggemann, Tim; Rudolph, Linda; Klusmann, Jack; Khan, Towsif
    Despite leaving the Gartner Hype Cycle for emerging technologies in 2019, indicating the technology’s maturity, Augmented Reality (AR) still finds relatively few applications in modern industry. To investigate the value of AR-based 3D representation for abstract data visualization, we developed and evaluated two distinct visualization methods for presenting informational diagrams. One method involves a conventional 2D representation on the screen of a mobile device, while the other integrates AR by placing an image plane in a 3D immersive scene. Subsequently, we conducted a user study, in which we assessed the feasibility and usability of both approaches, showing that both methods offer distinct advantages. The AR-based visualization provides great overall orientation and navigation, while the 2D visualization offers more precise manipulation and spatial flexibility. As users of mobile devices encounter AR more frequently and at increasingly younger ages, this topic remains intriguing and warrants further investigation. Our work investigates whether an AR visualization of complex diagrams is advantageous over display-based representations due to the poster-like spatial anchoring.
  • Workshop paper
    Anti-aliasing Techniques in Virtual Reality: A User Study with Perceptual Pairwise Comparison Ranking Scheme
    (GI VR / AR Workshop, 2024) Waldow, Kristoffer; Scholz, Jonas; Misiak, Martin; Fuhrmann, Arnulph; Roth, Daniel; Latoschik, Marc Erich
    Anti-aliasing is essential for Virtual Reality (VR) applications, as the pixels of current VR displays subtend a large field of view. This makes various undersampling artifacts particularly noticeable. Understanding state-of-the-art anti-aliasing techniques and their trade-offs is therefore crucial for optimizing VR experiences and developing high-quality VR applications. This paper investigates multiple anti-aliasing techniques through a user study with pairwise comparisons to determine the best method for image quality in VR, considering both static and moving objects in four different plausible environments. Results indicate that the ranking of methods does not differ significantly between moving and static scenes. While naive Supersampling Anti-Aliasing provides the best image quality of the tested methods and Fast Approximate Anti-Aliasing the worst, Temporal Anti-Aliasing and Multisample Anti-Aliasing achieved similar results in terms of image quality.
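    The abstract describes a pairwise-comparison ranking scheme but does not spell out the aggregation step, so the paper's exact method is not reproduced here. A common way to turn pairwise preference counts into a ranking is a Bradley-Terry model fitted by the standard minorize-maximize iteration; the sketch below uses the four method names from the abstract with invented win counts (illustrative only, not the study's data):

    ```python
    def bradley_terry(wins, n_iter=200):
        """Fit Bradley-Terry strengths from pairwise win counts.
        wins[(a, b)] = number of times a was preferred over b."""
        items = sorted({k for pair in wins for k in pair})
        s = {i: 1.0 for i in items}
        for _ in range(n_iter):
            new = {}
            for i in items:
                # total wins of i, and MM denominator over all opponents
                w_i = sum(c for (a, b), c in wins.items() if a == i)
                denom = 0.0
                for j in items:
                    if j == i:
                        continue
                    n_ij = wins.get((i, j), 0) + wins.get((j, i), 0)
                    if n_ij:
                        denom += n_ij / (s[i] + s[j])
                new[i] = w_i / denom if denom else s[i]
            total = sum(new.values())  # normalize so strengths stay bounded
            s = {i: v * len(items) / total for i, v in new.items()}
        return s

    # Hypothetical win counts (illustrative only, not the paper's data):
    wins = {
        ("SSAA", "FXAA"): 18, ("FXAA", "SSAA"): 2,
        ("SSAA", "TAA"): 14,  ("TAA", "SSAA"): 6,
        ("SSAA", "MSAA"): 13, ("MSAA", "SSAA"): 7,
        ("TAA", "FXAA"): 15,  ("FXAA", "TAA"): 5,
        ("MSAA", "FXAA"): 16, ("FXAA", "MSAA"): 4,
        ("TAA", "MSAA"): 10,  ("MSAA", "TAA"): 10,
    }
    strengths = bradley_terry(wins)
    ranking = sorted(strengths, key=strengths.get, reverse=True)
    print(ranking)
    ```

    With win counts shaped like the abstract's conclusion, the fitted strengths place SSAA first and FXAA last, with TAA and MSAA close together in between.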
  • Workshop paper
    Respiration VR - Breathwork therapy in a real-time Virtual Reality environment on the Oculus Quest using a low-cost strain sensor
    (GI VR / AR Workshop, 2024) Beisiegel, Thomas; Grimm, Paul; Hergenröther, Elke; Thesen, Philipp; Suchoroschenko, Stefan; Dienes, Maikel
    This paper presents our Respiration Virtual Reality (VR) Environment, a prototype for virtual reality breathwork therapy that combines a low-cost, easy-to-manufacture sensor, standalone virtual reality (Meta Quest without additional hardware), and a corresponding therapeutic application with a gamified approach. The aim of the project was to realize an interactive form of therapy that achieves a high degree of effectiveness. Our framework condition was the development of a low-cost sensor that could be set up quickly, works in a standing, sitting, or lying position of the user, and provides data on the user’s breathing in real time. The challenge was to integrate external real-time sensors as seamlessly as possible into a virtual reality application and to offer the user the most entertaining and motivating experience possible.
  • Workshop paper
    Make Me Run: The Effects of an Immersive Learning Experience on Physical Running Exercise
    (GI VR / AR Workshop, 2024) Ertugrul, Nazife; Kruse, Lucie; Steinicke, Frank
    Regular physical activity is crucial for physical and mental health, but many people do not meet the recommendations of the World Health Organization for physical activity due to lack of motivation and time constraints. This exploratory study aims to examine how an immersive learning experience using Virtual Reality (VR) affects running behavior, compared to a traditional approach that uses presentation slides. The results of our study indicate an increase in exercise enjoyment among active participants, with three participants even starting new workout routines in the week after the study. Approximately 60% of participants expressed a preference for the immersive intervention in VR due to its interactive 3D models. Moreover, VR was observed to enhance the feeling of flow during the running exercise in comparison to the 2D slides. Nonetheless, no statistically significant differences were detected in immediate enjoyment of physical activity, intrinsic motivation, or knowledge retention. This paper provides first insights into the effect of a VR learning intervention on sports behavior and points out future research directions.
  • Workshop paper
    Human Perception of Robots in Public Spaces: A Comparison between Virtual and Augmented Reality
    (GI VR / AR Workshop, 2024) Huhn, Calvin; Tluk, Niklas; Säger, Mitja; Schweidler, Paul; Geiger, Christian
    This paper explores the perception of robots in public spaces through Virtual Reality (VR) and Augmented Reality (AR) simulations. As robots become increasingly integrated into daily life, understanding how the public perceives them is crucial for their successful deployment. This study aims to compare the effectiveness of VR and AR in simulating public spaces to study these perceptions. Using a mixed-methods approach, we conducted surveys and interviews within VR and AR settings to gather quantitative and qualitative data on participants’ attitudes and reactions toward robots. Our findings reveal significant differences in perception between the two simulation methods, highlighting the strengths and limitations of each. The study provides insights into the implications of robotics in public spaces and offers recommendations for future research and practical applications.
  • Workshop paper
    Evaluation von Gesten- und Controller-Interaktionen im virtuellen Raum
    (GI VR / AR Workshop, 2024) Pilz, Vinzent; Langbehn, Eike
    In this work, the two input methods "hand tracking" and "controller" are compared and assessed with regard to their possible uses for interaction in virtual space. While comparative studies on simple grab-and-place tasks already exist, studies on other common tasks that are conventionally performed with controller input are still lacking. To this end, the study participants complete a test course in virtual reality once with each input method, in alternating order, and rate their experiences via questionnaire. In addition to applying the System Usability Scale (SUS) to both input methods and recording performance statistics from the application, the hand poses are additionally rated for naturalness and fun. Participants also justify the choice of their favored input method and comment on any problems they experienced. For the evaluation, the participants are grouped and their results are compared with one another. The results show that hand tracking is on average rated worse than controller use, mainly due to problems attributable to the state of the art of the hardware used.
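    The System Usability Scale used in this study has a fixed, well-known scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch (the sample responses are invented for illustration, not the study's data):

    ```python
    def sus_score(responses):
        """Compute a System Usability Scale score from ten 1-5 Likert
        responses. Odd items (1st, 3rd, ...) score response - 1; even
        items score 5 - response; the sum is scaled by 2.5 to 0-100."""
        if len(responses) != 10:
            raise ValueError("SUS uses exactly 10 items")
        total = 0
        for idx, r in enumerate(responses, start=1):
            total += (r - 1) if idx % 2 == 1 else (5 - r)
        return total * 2.5

    # Invented example responses for one participant:
    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # → 85.0
    ```

    A score around 68 is commonly treated as the average benchmark, which is why SUS results for the two input methods can be compared directly on the same 0-100 scale.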
  • Workshop paper
    Mixed Reality Theatre – Towards a Theatre of Extended Realities
    (GI VR / AR Workshop, 2024) Geiger, Chris; Triebus, Charlotte; Harth, Jonathan
    This paper explores the emerging field of Mixed Reality Theatre [1], examining the intersection of traditional theatrical practices with cutting-edge Extended Reality (XR) technologies. We draw on the ongoing research project "Theater of Extended Realities." Our investigation focuses on the integration of Virtual Reality (VR), Augmented Reality (AR), and other XR technologies into theatrical productions. We explore how these technologies enable novel forms of immersive and interactive storytelling. The project employs Milgram and Kishino's Reality-Virtuality Continuum as a theoretical framework, situating various forms of mixed reality experiences within the theatrical context. Through this examination, we aim to contribute to several ongoing dialogues. We address the balance between technological innovation and artistic integrity. We explore the potential for creating personalized and participatory experiences. Additionally, we discuss the future of theatre in the digital age. Finally, we provide insights into how XR technologies can enhance, rather than replace, the fundamental qualities of live performance.
  • Workshop paper
    Vergleich von Neural Radiance Fields und Photogrammetrie für 3D-Asset Creation
    (GI VR / AR Workshop, 2024) Kruse, Leo; Langbehn, Eike
    Photogrammetry can automatically generate detailed, textured 3D meshes from photos or videos. This study examines whether Neural Radiance Fields can likewise produce 3D meshes of equal or better quality. For the assessment, both methods are applied to a self-made aerial/close-up image dataset and compared with each other. The polygon meshes produced by the two methods are analyzed with specialized software and examined with regard to their surface structure and texture. The evaluation of the two reconstruction approaches shows that the subjective assessments of the meshes are broadly consistent with the objective measurement results. However, the 3D meshes from the photogrammetry software remain of higher quality, both qualitatively and quantitatively, and are simpler to produce.