Listing by keyword "multimodal interaction"
1 - 6 of 6
- Journal article: Adapting visualizations and interfaces to the user (it - Information Technology: Vol. 64, No. 4-5, 2022). Chiossi, Francesco; Zagermann, Johannes; Karolus, Jakob; Rodrigues, Nils; Balestrucci, Priscilla; Weiskopf, Daniel; Ehinger, Benedikt; Feuchtner, Tiare; Reiterer, Harald; Chuang, Lewis L.; Ernst, Marc; Bulling, Andreas; Mayer, Sven; Schmidt, Albrecht. Adaptive visualizations and interfaces pervade our everyday tasks, improving interaction in terms of user performance and experience. This approach draws on several kinds of user input, whether physiological, behavioral, qualitative, or multimodal combinations thereof, to enhance the interaction. Given the multitude of approaches, we outline current research trends in the inputs used to adapt visualizations and user interfaces. Moreover, we discuss methodological approaches used in mixed reality, physiological computing, visual analytics, and proficiency-aware systems. With this work, we provide an overview of current research in adaptive systems.
- Journal article: Complementary interfaces for visual computing (it - Information Technology: Vol. 64, No. 4-5, 2022). Zagermann, Johannes; Hubenschmid, Sebastian; Balestrucci, Priscilla; Feuchtner, Tiare; Mayer, Sven; Ernst, Marc O.; Schmidt, Albrecht; Reiterer, Harald. With the increasing complexity of visual computing tasks, a single device may not be sufficient to adequately support the user's workflow. Here, we can employ multi-device ecologies such as cross-device interaction, where a workflow is split across multiple devices, each dedicated to a specific role. But what makes these multi-device ecologies compelling? Based on insights from our research, each device or interface component must contribute a complementary characteristic to increase the quality of interaction and further support users in their current activity. We establish the term complementary interfaces for such meaningful combinations of devices and modalities and provide an initial set of challenges. In addition, we demonstrate the value of complementarity with examples from our own research.
- MuC short paper (poster): Enhancing Medical Needle Placement with Auditory Display (Mensch & Computer 2013: Interaktive Vielfalt, 2013). Black, David; Al Issawi, Jumana; Rieder, Christian; Hahn, Horst. Radiofrequency ablation is a minimally invasive procedure that treats a tumor by applying local radiofrequency energy through a needle inserted into the patient through the skin. Current methods for guiding needle placement require the radiologist to look away from the patient and instead follow a computer screen. We present two auditory display methods for guiding needle placement that allow visual attention to remain on the patient. Initial results indicate that the needle placement task can be accomplished almost solely with auditory support, increasing user attention on the patient and reducing head and neck movements.
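The abstract above leaves the sonification design open; below is a minimal sketch of one plausible auditory guidance mapping, in which the needle tip's deviation from the planned path drives the pitch of a guidance tone (the function name, parameters, and linear mapping are illustrative assumptions, not the authors' method):

```python
def guidance_pitch(deviation_mm: float,
                   base_hz: float = 220.0,
                   max_hz: float = 880.0,
                   max_deviation_mm: float = 20.0) -> float:
    """Map needle-tip deviation from the planned path to a tone frequency.

    Hypothetical mapping: on target, a low steady base tone; larger
    deviation raises the pitch proportionally, capped at max_hz.
    """
    # Clamp so the pitch stays within an audible, bounded range.
    frac = min(abs(deviation_mm) / max_deviation_mm, 1.0)
    return base_hz + frac * (max_hz - base_hz)

# Example: a 5 mm deviation yields a tone a quarter of the way up the range.
print(guidance_pitch(5.0))  # 385.0 Hz
```

A continuous mapping of this kind would let the radiologist correct the needle by ear while keeping visual attention on the patient.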
- Workshop contribution: Gaze-supported Interaction (Mensch & Computer 2012 – Workshopband: interaktiv informiert – allgegenwärtig und allumfassend!?, 2012). Stellmach, Sophie; Dachselt, Raimund. Considering the increasing diversity of display arrangements, including wall-sized screens and multi-display setups, our eye gaze offers particularly high potential for implicit, seamless, and fast interactions. However, gaze-based interaction is often regarded as error-prone and unnatural, especially when the input is restricted to gaze as a single modality. For this reason, we have developed several interaction techniques that benefit from gaze as an additional, implicit, and fast pointing modality for roughly indicating a user's visual attention, combined with common smartphones for more explicit and precise specifications. In our demos, we showcase two examples of more natural yet effective ways of incorporating a user's gaze as a supporting input modality. The two application scenarios comprise (1) gaze-supported pan-and-zoom techniques using the example of GoogleEarth and (2) gaze-supported navigation and target selection in a virtual 3D scene.
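A minimal sketch of the division of labor this abstract describes: gaze coarsely places a cursor, and relative touch movement on the smartphone refines it for precise selection (the types, gain value, and offset scheme are illustrative assumptions, not the authors' techniques):

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def combined_cursor(gaze: Point, touch_offset: Point, gain: float = 0.2) -> Point:
    """Gaze supplies the coarse cursor position; scaled-down touch
    offsets from the smartphone allow fine, explicit corrections."""
    # gain < 1 turns large thumb movements into small on-screen nudges.
    return Point(gaze.x + gain * touch_offset.x,
                 gaze.y + gain * touch_offset.y)

# Example: gaze lands near a target; a small drag refines the cursor.
print(combined_cursor(Point(512.0, 300.0), Point(-40.0, 10.0)))
# Point(x=504.0, y=302.0)
```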
- Conference paper: ICAROSmulti - a VR test environment for the development of multimodal and multi-user interaction concepts (Mensch und Computer 2019 - Tagungsband, 2019). Treskunov, Anastasia; Fischer, Ben; Gerhardt, Emil; Gerhardt, Laurin; Nowottnik, David; Säger, Mitja; Geiger, Christian. With 'ICAROSmulti' we present a cooperative virtual reality application developed for presenting and validating multimodal interaction techniques. As an application scenario that is both attractive and demonstrable at trade fairs, we chose a flight-based virtual environment for multiple users, who interact with each other and take on different roles. As input devices we use up to two ICAROS flight devices, developed specifically to simulate flight in VR. Lying in a horizontal position, users can move and navigate through the virtual environment (see Figure X). An HMD displays the virtual surroundings, while sensors mounted on the ICAROS detect and communicate the user's movements. Actuators such as wind machines, heat lamps, and other devices provide multimodal output and increase the immersion of the flight. Furthermore, other devices and 3D interaction techniques, such as walking or climbing, can be integrated into the current scene. All users can see each other in virtual reality and communicate with one another. To simplify developing and testing 3D interaction techniques based on predefined templates in Unity3D, various components have already been developed and integrated; they are briefly described in this paper and can be tried out in a demonstration at the 'Mensch und Computer 2019' conference. A distinctive feature of the project are components that introduce a measurable degradation of immersion, such as restricting the field of view, increasing the latency, or lowering the frame rate.
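The controlled degradation of immersion mentioned at the end of the abstract can be illustrated engine-agnostically; here is a minimal sketch of a latency injector that holds tracking samples back for a configurable time before the renderer sees them (the class and its queue-based design are assumptions, not the project's actual Unity3D components):

```python
import time
from collections import deque

class LatencyInjector:
    """Delay tracking samples by a fixed amount so that the effect of
    added latency on immersion can be measured (hypothetical component)."""

    def __init__(self, added_latency_s: float):
        self.added_latency_s = added_latency_s
        self._queue = deque()  # (arrival_time, sample) pairs

    def push(self, sample) -> None:
        """Record a fresh tracking sample with its arrival time."""
        self._queue.append((time.monotonic(), sample))

    def pop_ready(self):
        """Return the newest sample old enough to be released, or None."""
        now = time.monotonic()
        ready = None
        while self._queue and now - self._queue[0][0] >= self.added_latency_s:
            _, ready = self._queue.popleft()
        return ready

# Example: with 150 ms injected latency, the renderer sees stale poses.
injector = LatencyInjector(added_latency_s=0.150)
```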
- Conference paper: IDIAR: Augmented Reality Dashboards to Supervise Mobile Intervention Studies (Mensch und Computer 2021 - Tagungsband, 2021). Vock, Katja; Hubenschmid, Sebastian; Zagermann, Johannes; Butscher, Simon; Reiterer, Harald. Mobile intervention studies employ mobile devices to observe participants' behavior change over several weeks. Researchers regularly monitor high-dimensional data streams to ensure data quality and prevent data loss (e.g., missing engagement or malfunctions). The multitude of possible problem sources hampers automated detection of such irregularities, making this a use case for interactive dashboards. With the advent of untethered head-mounted AR devices, these dashboards can be placed anywhere in the user's physical environment, leveraging the available space and allowing for flexible information arrangement and natural navigation. In this work, we present the user-centered design and evaluation of IDIAR: Interactive Dashboards in AR, combining a head-mounted display with the familiar interaction of a smartphone. A user study with 15 domain experts for mobile intervention studies shows that participants appreciated the multimodal interaction approach. Based on our findings, we provide implications for the research and design of interactive dashboards in AR.