Authors: Rosbach, Emely; Muhammad, Muhammad; Talabani-Durmus, Larin; Maral, Muhammed; Ziegler, Carina; Reuter, Hannah; Lell, Alice; Hoffer, Sabrina; Peintner, Jakob; Riener, Andreas

Date issued: 2023-10-01 (2023)

URI: https://dl.gi.de/handle/20.500.12116/42439

Abstract: Recently, public spaces have seen a shift towards touch-free interaction to address hygiene concerns. HapTech, a prototype of a gesture-controlled interface with mid-air haptic feedback, offers a solution: it allows users to control essential functions such as lights and HVAC without physical contact. To understand the impact of visual user interfaces, a Wizard-of-Oz study was conducted. The findings suggest that while including a visual UI improves self-explainability, it also leads to longer task completion times and more errors. Striking a balance is crucial, favoring simplistic UIs with an intuitive gesture language and optional visual feedback. This optimization enhances touch-free interaction and the overall user experience in public environments. We conclude that the inclusion of a visual user interface influences gesture choice and task completion, but it plays a pivotal role in improving user experience and self-explainability.

Language: en

Keywords: Mid-Air Haptics; Human-Computer Interaction; Natural User Interfaces; Gesture-Based Interaction

Title: HapTech: Intelligent controls in public spaces through mid-air haptic interaction

Type: Text

DOI: 10.18420/muc2023-mci-src-404