Münch, Tobias
Gaedke, Martin
2024-08-21
2024-08-21
2024
https://dl.gi.de/handle/20.500.12116/44270
Abstract: Nowadays, Generative Artificial Intelligence (GenAI) can outperform humans in creative professions such as design. As a result, GenAI has attracted a lot of attention from researchers and industry. However, GenAI could also be used to augment humans through a multimodal user interface, as proposed by Ben Shneiderman in his recent work on Human-Centered Artificial Intelligence (HCAI). Most studies of HCAI have mainly focused on greenfield projects. In contrast to existing research, we describe a brownfield software architecture approach with a loosely coupled GenAI-driven multimodal user interface that combines human interaction with third-party systems. A domain-specific language for user interaction connects natural language and signals of the existing system through GenAI. Our proposed architecture enables research and industry to provide user interfaces for existing software systems that allow hands-free interaction.
en
http://purl.org/eprint/accessRights/RestrictedAccess
Title: Exploring an Architecture of an Adaptable GenAI-Driven Multimodal User Interface for Third-Party Systems
Text/Workshop Paper
DOI: 10.18420/muc2024-mci-ws09-128