Listing by keyword "Multimodality"
1 - 5 of 5
- Conference Paper: Elevating Stress Levels - Exploring Multimodality for Stress Induction in VR (Mensch und Computer 2022 - Tagungsband, 2022). Weiß, Sebastian; Kimmel, Simon; Withöft, Ani; Jung, Frederike; Boll, Susanne; Heuten, Wilko. Simulation training in Virtual Reality (VR) has gained traction in recent years. With its broad application possibilities and implicit safety for users, simulation-based training can be envisioned for safety-critical situations and exposure therapy. Beyond visual and auditory representation of the environment and stressors, upcoming hardware supports olfactory and haptic feedback. To examine the benefits of these technological advances in stress training, we present a Wizard of Oz pilot study (N=12). Therein, a bimodal presentation of the scenario ‘being stuck in an elevator’ was compared to a multimodal one. For the comparison, we measured qualitative feedback, the iGroup presence questionnaire scores, and physiological stress reactions by recording changes in cardiac and pulmonary activity. Results show trends towards moderately more pronounced stress levels and perceived presence for the multimodal presentation. Thus, we argue that multimodal stress induction may better simulate hazardous situations in stress training.
- Journal Article: Influence of Age and Gender on the Reaction Performance in Human-Vehicle Interaction (i-com: Vol. 8, No. 2, 2009). Riener, Andreas. Excess workload in vehicle control and the inappropriateness of the two common interaction modalities, seeing and hearing, require us to consider ways and means for new interaction capabilities in vehicles. We have investigated haptic force displays for transmitting feedback from vehicular services to the driver, using vibro-tactile elements integrated into the car seat and backrest. A haptic display would be implicitly perceivable and passive in its attentiveness, and would furthermore display only private messages. Empirical studies regarding reaction times for the different modalities vision, sound, and touch, as well as age- and gender-dependent evaluations, have been conducted with the aim of identifying general conditions for an all-purpose vehicle interaction system and justifying the use of haptic feedback. Experimental data have been acquired in a simulated driving environment in order to guarantee safety for test persons, repeatability of the experiment itself, and similar conditions for ...
- Workshop Paper: M3I: A Framework for Mobile Multimodal Interaction (Mensch & Computer 2014 - Tagungsband, 2014). Möller, Andreas; Diewald, Stefan; Roalter, Luis; Kranz, Matthias. We present M3I, an extensive multimodal interaction framework for mobile devices, which simplifies and accelerates the creation of multimodal applications for prototyping and research. It provides an abstraction of information representations in different communication channels, unifies access to implicit and explicit information, and wires together the logic behind context-sensitive modality switches using a rule-based approach. In this paper, we present the structure and major features of our framework and show exemplary implementations of interaction modalities with the help of M3I. (A hypothetical sketch of such a rule-based modality switch appears after this listing.)
- Workshop Paper: Referential Practices for a Museum Guide Robot. Human-Robot-Interaction as a Methodological Tool to Investigate Multimodal Interaction (Mensch und Computer 2019 - Workshopband, 2019). Pitsch, Karola. An autonomous robot system was equipped with basic means to monitor the users’ success or failure in following the robot’s verbal-gestural deictic reference to an object and, in case of problems, to provide additional help, i.e. to suggest a ‘repair’ action. A real-world field trial with the robot acting as a museum guide constitutes the basis for the analysis of the users’ reactions. This example is used to explore HRI as a tool to investigate multimodal interaction.
- Journal Article: Semantic Interpretation of Multi-Modal Human-Behaviour Data (KI - Künstliche Intelligenz: Vol. 31, No. 4, 2017). Bhatt, Mehul; Kersting, Kristian. This special issue presents interdisciplinary research—at the interface of artificial intelligence, cognitive science, and human-computer interaction—focussing on the semantic interpretation of human behaviour. The special issue constitutes an attempt to highlight and steer foundational methods research in artificial intelligence, in particular knowledge representation and reasoning, for the development of human-centred cognitive assistive technologies. Of specific interest and focus have been application outlets for basic research in knowledge representation and reasoning and computer vision for the cognitive, behavioural, and social sciences.
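The M3I entry above describes a rule-based approach to context-sensitive modality switches. The following sketch only illustrates the general idea; all type and rule names are hypothetical and do not reflect M3I's actual API.

```kotlin
// Hypothetical sketch of a rule-based modality switch, loosely following the idea
// described in the M3I abstract; none of these types belong to M3I itself.

// Output modalities an application could switch between.
enum class Modality { VISUAL, AUDITORY, HAPTIC }

// Implicit context information (e.g. sensor readings) plus an explicit user setting.
data class Context(
    val ambientNoiseDb: Double,
    val deviceInPocket: Boolean,
    val userPrefersSilent: Boolean
)

// A rule maps a context condition to a preferred output modality.
data class Rule(val condition: (Context) -> Boolean, val modality: Modality)

// Evaluates the rules in order and returns the first match, or a fallback modality.
class ModalitySelector(
    private val rules: List<Rule>,
    private val fallback: Modality = Modality.VISUAL
) {
    fun select(context: Context): Modality =
        rules.firstOrNull { it.condition(context) }?.modality ?: fallback
}

fun main() {
    val selector = ModalitySelector(
        rules = listOf(
            Rule({ it.deviceInPocket }, Modality.HAPTIC),        // screen not visible: vibrate
            Rule({ it.userPrefersSilent }, Modality.VISUAL),     // explicit preference overrides sound
            Rule({ it.ambientNoiseDb > 70.0 }, Modality.VISUAL)  // too loud for auditory output
        )
    )
    val context = Context(ambientNoiseDb = 45.0, deviceInPocket = true, userPrefersSilent = false)
    println(selector.select(context)) // HAPTIC
}
```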