Title: Evaluating feedback requirements for trust calibration in automated vehicles
Authors: Wintersberger, Philipp; Janotta, Frederica; Peintner, Jakob; Löcken, Andreas; Riener, Andreas
Dates: 2021-06-16; 2021-06-16; 2021
URI: https://dl.gi.de/handle/20.500.12116/36537
DOI: 10.1515/itit-2020-0024
ISSN: 2196-7032
Type: Text/Journal Article
Language: en
Keywords: Automated Driving; Trust in Automation; User Acceptance; User Studies; Mixed/Augmented Reality; Adaptive UIs; Personalized UIs

Abstract: The inappropriate use of automation as a result of trust issues is a major barrier to broad market penetration of automated vehicles. Studies so far have shown that providing information about the vehicle's actions and intentions can help calibrate trust and promote user acceptance. However, how such feedback should be designed optimally is still an open question. This article presents the results of two user studies. In the first study, we investigated the subjective trust and user experience of participants (N=21) riding in a fully automated vehicle that interacts with other traffic participants in virtual reality. The analysis of questionnaires and semi-structured interviews shows that participants request feedback about the vehicle's status and intentions and prefer visual feedback over other modalities. Consequently, we conducted a second study to derive concrete requirements for future feedback systems. We showed participants (N=56) various videos of an automated vehicle from the ego perspective and asked them to select elements in the environment they want feedback about so that they would feel safe, trust the vehicle, and understand its actions. The results confirm a correlation between subjective user trust and feedback needs and highlight essential requirements for automatic feedback generation. The results of both experiments provide a scientific basis for designing more adaptive and personalized in-vehicle interfaces for automated driving.