Listing by subject "Automotive User Interfaces"
1 - 3 of 3
- Conference paper: 6th Workshop “Automotive HMI”: Cars in the Transition from Manual to Automated Driving (Mensch und Computer 2017 - Workshopband, 2017) Riener, Andreas; Pfleging, Bastian; Geisler, Stefan; Laack, Alexander van; Wintersberger, Philipp. Automotive user interfaces and, in particular, automated vehicle technology pose plenty of challenges to researchers, vehicle manufacturers, and third-party suppliers in supporting all the diverse facets of user needs. These challenges emerge, for example, from the variation among user groups, ranging from inexperienced, thrill-seeking young novice drivers to elderly drivers with all their natural limitations. To allow the quality of automotive user interfaces and automated driving technology to be assessed already during development and within virtual test processes, the proposed workshop is dedicated to the quest of finding objective, quantifiable quality criteria for describing future driving experiences. The workshop is intended for HCI, AutomotiveUI, and "Human Factors" researchers and practitioners, as well as for designers and developers. In adherence to the conference's main topic "Spielend einfach interagieren" ("interacting with playful ease"), this workshop calls in particular for contributions in the area of human factors and ergonomics (user acceptance, trust, user experience, driving fun, natural user interfaces, etc.) and artificial intelligence (predictive HMIs, adaptive systems, intuitive interaction).
- Dissertation: Affective automotive user interfaces (2020) Braun, Michael. Technological progress in the fields of ubiquitous sensing and machine learning has been fueling the development of user-aware human-computer interaction in recent years. Especially natural user interfaces, like digital voice assistants, can benefit from understanding their users in order to provide a more naturalistic experience. Such systems can, for example, detect the emotional state of users and accordingly act in an empathic way. One major research field working on this topic is Affective Computing, where psycho-physiological measures, speech input, and facial expressions are used to sense human emotions. Affective data allows natural user interfaces to respond to emotions, providing promising perspectives not only for user experience design but also for safety aspects. In automotive environments, informed estimations of the driver’s state can potentially avoid dangerous errors, and evoking positive emotions can improve the experience of driving. This dissertation explores affective automotive user interfaces using two basic interaction paradigms: firstly, emotion regulation systems react to the current emotional state of the user based on live sensing data, allowing for quick interventions. Secondly, emotional interaction synthesizes experiences which resonate with the user on an emotional level. The goals of these two interaction approaches are the promotion of safe behavior and an improvement of user experience. Promoting safe behavior through emotion regulation: Systems which detect and react to the driver’s state are expected to have great potential for improving road safety. This work presents a model and methods needed to investigate such systems, along with an exploration of several approaches to keep the driver in a safe state. The presented methods include techniques to induce emotions and to sample the emotional state of drivers.
Three driving simulator studies investigate the impacts of emotion-aware interventions in the form of implicit cues, visual mirroring, and empathic speech synthesis. We envision emotion-awareness as a safety feature which can detect whether a driver is unfit or in need of support, based on the propagation of robust emotion detection technology. Improving user experience with emotional interaction: Emotional perception is an essential part of user experience. This thesis presents methods to build emotional experiences, derived from a variety of lab and simulator studies, expert feedback, car-storming sessions, and design thinking workshops. Systems capable of adapting to the user’s preferences and traits in order to create an emotionally satisfactory user experience do not require emotion detection as input. They rather create value through general knowledge about the user by adapting the output they generate. During this research, cultural and generational influences became evident, which have to be considered when implementing affective automotive user interfaces in future cars. We argue that the future of user-aware interaction lies in adapting not only to the driver’s preferences and settings but also to their current state. This paves the way for the regulation of safe behavior, especially in safety-critical environments like cars, and an improvement of the driving experience.
- Journal article: “What’s the Robo-Driver up to?” Requirements for Screen-based Awareness and Intent Communication in Autonomous Buses (i-com: Vol. 18, No. 2, 2019) Fröhlich, Peter; Schatz, Raimund; Buchta, Markus; Schrammel, Johann; Suette, Stefan; Tscheligi, Manfred. Autonomous buses are expected to become a cornerstone of future mobility systems. Especially during their introduction, passengers may require reassurance about the vehicle’s awareness of the situation on the road and about its intended next actions in order to foster acceptance. To investigate the need and requirements for information about the vehicle’s awareness and intent from the perspective of first-time users, we conducted two user studies in a state-of-the-art autonomous bus at public demonstration spaces. In the first study, participants underwent a demonstration ride with the bus and were then asked about their needs for awareness and intent communication. The second study took participants on a ‘simulated ride’ within a stationary bus, in which typical scenarios of the road ahead were presented together with different awareness and intent cues. Our results suggest, first, that future autonomous bus passengers may be in need of such awareness and intent communication screens. Second, we found that awareness and intent communication may be of greater importance for indicating potential hazard recognition than for indicating route directions. Third, due to their complementary strengths, none of the three compared types of visual communication (text, icon, and augmented reality) should be used in isolation.