Authors: Riener, Andreas; Pfleging, Bastian; Detjen, Henrik; Braun, Michael; Peintner, Jakob; Wienrich, Carolin; Wintersberger, Philipp; Weyers, Benjamin
Date: 2021-09-03
Year: 2021
URL: https://dl.gi.de/handle/20.500.12116/37339
Abstract: Modern vehicles allow the driver to control them through multimodal user interfaces (UIs): touch interaction on screens, speech input, and mid-air gestures. Such UIs are driver-focused and optimized for minimal distraction so as not to compromise road safety in manual driving. Nevertheless, they are often complex, and it can be difficult to find specific features. Automated driving in L3+ will disrupt the design of automotive UIs as drivers become passengers, at least for parts of the journey. Similarly, the car is being transformed into a social space where passengers can be granted control over systems because they can devote their full attention to them without imposing safety risks. The complexity of advanced driver assistance systems and in-vehicle information and interaction systems requires explanation to the user, e.g., which state the system is in, which interactions are possible, what is expected of the driver, or when a takeover is required. We expect novel technologies for natural interaction and adaptivity to enable valuable and future-proof interaction concepts for the changing interior of (automated) vehicles. The goal of this workshop is, thus, to discuss how natural and adaptive user interfaces can help to solve the mentioned challenges and to identify opportunities for future research and collaboration.
Language: en
Keywords: Contextual UIs; Automotive HMIs; Automated Driving; Natural Interaction; Adaptive Interfaces
Title: 9th Workshop Automotive HMIs: Natural and Adaptive UIs to Support Future Vehicles
Type: Text/Workshop Paper
DOI: 10.18420/muc2021-mci-ws10-119