Affective automotive user interfaces

dc.contributor.author: Braun, Michael
dc.date.accessioned: 2023-09-03T11:47:53Z
dc.date.available: 2023-09-03T11:47:53Z
dc.date.issued: 2020
dc.description.abstract: Technological progress in the fields of ubiquitous sensing and machine learning has been fueling the development of user-aware human-computer interaction in recent years. Natural user interfaces in particular, such as digital voice assistants, can benefit from understanding their users in order to provide a more natural experience. Such systems can, for example, detect the emotional state of users and act in an empathic way accordingly. One major research field working on this topic is Affective Computing, in which psycho-physiological measures, speech input, and facial expressions are used to sense human emotions. Affective data allows natural user interfaces to respond to emotions, opening promising perspectives not only for user experience design but also for safety. In automotive environments, informed estimates of the driver's state can help avoid dangerous errors, and evoking positive emotions can improve the experience of driving. This dissertation explores Affective Automotive User Interfaces through two basic interaction paradigms: first, emotion regulation systems react to the current emotional state of the user based on live sensing data, allowing for quick interventions; second, emotional interaction synthesizes experiences which resonate with the user on an emotional level. The overarching goals of these two approaches are the promotion of safe behavior and an improvement of user experience.

Promoting safe behavior through emotion regulation: Systems which detect and react to the driver's state are expected to have great potential for improving road safety. This work presents a model and the methods needed to investigate such systems, along with an exploration of several approaches to keep the driver in a safe state. The presented methods include techniques to induce emotions and to sample the emotional state of drivers. Three driving simulator studies investigate the impact of emotion-aware interventions in the form of implicit cues, visual mirroring, and empathic speech synthesis. We envision emotion awareness as a safety feature which can detect whether a driver is unfit or in need of support, building on the proliferation of robust emotion detection technology.

Improving user experience with emotional interaction: Emotional perception is an essential part of user experience. This thesis presents methods for building emotional experiences, derived from a variety of lab and simulator studies, expert feedback, car-storming sessions, and design thinking workshops. Systems that adapt to the user's preferences and traits in order to create an emotionally satisfying user experience do not require emotion detection as input; rather, they create value through general knowledge about the user by adapting the output they generate. During this research, cultural and generational influences became evident which have to be considered when implementing affective automotive user interfaces in future cars.

We argue that the future of user-aware interaction lies in adapting not only to the driver's preferences and settings but also to their current state. This paves the way for the regulation of safe behavior, especially in safety-critical environments like cars, and for an improvement of the driving experience.
dc.description.uri: https://edoc.ub.uni-muenchen.de/26309/
dc.identifier.doi: 10.5282/edoc.26309
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/42256
dc.language.iso: en
dc.publisher: Ludwig-Maximilians-Universität München, Fakultät für Mathematik, Informatik und Statistik
dc.relation.ispartofseries: Dissertationen Ludwig-Maximilians-Universität München, Fakultät für Mathematik, Informatik und Statistik (Prof. F. Alt)
dc.subject: Affective Computing
dc.subject: Emotion Detection
dc.subject: Interaction Design
dc.subject: Automotive User Interfaces
dc.subject: Human-Computer Interaction
dc.subject: User Studies
dc.title: Affective automotive user interfaces
dc.type: Text/Dissertation
gi.citation.publisherPlace: München
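
A purely illustrative sketch, not taken from the dissertation: the abstract's first interaction paradigm (emotion regulation reacting to live sensing data with quick interventions) could be prototyped as a simple rule that maps a driver-state estimate to an in-car intervention. All class names, thresholds, and intervention types below are assumptions chosen for illustration only.

# Illustrative sketch of a rule-based emotion regulation step, in the spirit of the
# abstract's first interaction paradigm. Names and thresholds are assumptions, not
# taken from the dissertation.
from dataclasses import dataclass
from enum import Enum, auto


class Intervention(Enum):
    NONE = auto()
    AMBIENT_LIGHT_CUE = auto()      # implicit cue, e.g. calming cabin lighting
    EMPATHIC_VOICE_PROMPT = auto()  # empathic speech synthesis


@dataclass
class DriverState:
    """Emotional state estimate from live sensing (e.g. face or speech analysis)."""
    valence: float     # -1.0 (negative) .. 1.0 (positive)
    arousal: float     #  0.0 (calm) .. 1.0 (highly aroused)
    confidence: float  #  0.0 .. 1.0, reliability of the estimate


def choose_intervention(state: DriverState) -> Intervention:
    """Map a driver-state estimate to an intervention.

    Acts only when the estimate is reasonably confident, and escalates from an
    implicit cue to an explicit empathic prompt for strong negative states.
    """
    if state.confidence < 0.6:
        return Intervention.NONE
    if state.valence < -0.5 and state.arousal > 0.7:
        return Intervention.EMPATHIC_VOICE_PROMPT  # e.g. anger or frustration
    if state.valence < -0.2:
        return Intervention.AMBIENT_LIGHT_CUE
    return Intervention.NONE


if __name__ == "__main__":
    # A frustrated driver (negative valence, high arousal) triggers an empathic prompt.
    print(choose_intervention(DriverState(valence=-0.7, arousal=0.8, confidence=0.9)))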
