Authors: Schrills, Tim; Schmid, Leon; Jetter, Hans-Christian; Franke, Thomas; Wienrich, Carolin; Wintersberger, Philipp; Weyers, Benjamin
Date available: 2021-09-05
Date issued: 2021
URL: https://dl.gi.de/handle/20.500.12116/37352
Title: An Explainability Case-Study for Conversational User Interfaces in Walk-Up-And-Use Contexts
Abstract: Conversational user interfaces (CUI) miss requirements for good usability, e.g. sufficient feedback regarding system status. Within a user-centred design process, we created different design approaches to explain the CUI's state. A prototypical explainable conversational user interface (XCUI) was developed, which explains its state by means of representations of (1) confidence, (2) intent alternatives, (3) entities, and (4) a context timeline. The XCUI was then tested in a user study (N = 49) and compared with a conventional CUI in terms of user satisfaction and task completion time. Results indicated that improvements in completion time and satisfaction depended on specific task characteristics. The effects of the implemented XCUI features potentially resulted from task-specific needs for explanation. This could be attributed to the tasks' differing complexity, indicating a potential need for adaptive presentation of explainability features.
Language: en
Keywords: Conversational User Interfaces; Explainable Artificial Intelligence; Explainable Conversational User Interfaces
Type: Text/Workshop Paper
DOI: 10.18420/muc2021-mci-ws02-377