Pfeuffer, Nicolas; Baum, Lorenz; Stammer, Wolfgang; Abdel-Karim, Benjamin M.; Schramowski, Patrick; Bucher, Andreas M.; Hügel, Christian; Rohde, Gernot; Kersting, Kristian; Hinz, Oliver

Title: Explanatory Interactive Machine Learning
Type: Text/Journal Article
Year: 2023
Date: 2023-12-12
ISSN: 1867-0202
DOI: 10.1007/s12599-023-00806-x
URI: http://dx.doi.org/10.1007/s12599-023-00806-x
URI: https://dl.gi.de/handle/20.500.12116/43297
Keywords: Action design research; Corona virus; Data science; Explainable artificial intelligence; Interactive machine learning; Pneumonia

Abstract: The most promising standard machine learning methods can deliver highly accurate classification results, often outperforming standard white-box methods. However, it is hardly possible for humans to fully understand the rationale behind these black-box results, and thus such powerful methods hamper both the creation of new knowledge on the part of humans and the broader acceptance of this technology. Explainable Artificial Intelligence attempts to overcome this problem by making the results more interpretable, while Interactive Machine Learning integrates humans into the process of insight discovery. The paper builds on recent successes in combining these two cutting-edge technologies and proposes how Explanatory Interactive Machine Learning (XIL) can be embedded in a generalizable Action Design Research (ADR) process, called XIL-ADR. This approach can be used to analyze data, inspect models, and iteratively improve them. The paper shows the application of this process using the diagnosis of viral pneumonia, e.g., Covid-19, as an illustrative example. By these means, the paper also illustrates how XIL-ADR can help identify shortcomings of standard machine learning projects, generate new insights for human users, and thereby help to unlock the full potential of AI-based systems for organizations and research.
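The abstract describes an iterative loop in which a model's explanation is inspected by a human, whose feedback then improves the model. The toy sketch below illustrates that general XIL idea under stated assumptions; all names (`train`, `explain`, the two-feature data) are illustrative and not from the paper, and a real implementation would use an actual learner and saliency-based explanations.

```python
# Illustrative sketch of an explanatory interactive loop (XIL idea):
# train, expose the model's "explanation" (here simply its weights),
# let a human-style critique mask a spurious feature, then retrain.

def train(data, mask):
    # Stand-in for real learning: each unmasked feature's weight is its
    # accumulated agreement with the label.
    n_feat = len(data[0][0])
    w = [0.0] * n_feat
    for x, y in data:
        for j in range(n_feat):
            if mask[j]:
                w[j] += x[j] * (1 if y else -1)
    return w

def explain(w):
    # "Explanation": index of the feature the model relies on most.
    return max(range(len(w)), key=lambda j: abs(w[j]))

# Toy data: feature 1 is a spurious shortcut perfectly aligned with y.
data = [([1.0, 1.0], True), ([0.8, 1.0], True),
        ([-1.0, -1.0], False), ([-0.7, -1.0], False)]

mask = [True, True]
w = train(data, mask)
if explain(w) == 1:      # human inspects the explanation, flags feature 1
    mask[1] = False      # feedback: "right answer, wrong reason"
    w = train(data, mask)

print(explain(w))  # → 0: the model now relies on the genuine feature
```

After the single feedback round, the model's explanation points to feature 0, mirroring how XIL-ADR iterations are meant to steer a model away from shortcuts.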