Title: Evaluation of Usability Criteria Addressed by Static Analysis Tools on a Large Scale
Authors: Nachtigall, Marcus; Schlichtig, Michael; Bodden, Eric; Engels, Gregor; Hebig, Regina; Tichy, Matthias
Date issued: 2023 (deposited 2023-01-18)
Type: Text/Conference Paper
Language: en
Keywords: Automated static analysis; Software usability
ISBN: 978-3-88579-726-5
ISSN: 1617-5468
URI: https://dl.gi.de/handle/20.500.12116/40100

Abstract: Static analysis tools support developers in detecting potential coding issues, such as bugs or vulnerabilities. Research emphasizes the technical challenges of such tools but also mentions severe usability shortcomings. These shortcomings hinder the adoption of static analysis tools, and user dissatisfaction may even lead to tool abandonment. To comprehensively assess the state of the art, we present the first systematic usability evaluation of a wide range of static analysis tools. We derived a set of 36 relevant criteria from the literature and used them to evaluate a total of 46 static analysis tools complying with our inclusion and exclusion criteria, a representative set of mainly non-proprietary tools. The evaluation against the usability criteria in a multiple-raters approach shows that two-thirds of the considered tools offer poor warning messages, while about three-quarters provide hardly any fix support. Furthermore, the integration of user knowledge is strongly neglected, although it could be used, for instance, to improve the handling of false positives. Finally, issues regarding workflow integration and specialized user interfaces are revealed. These findings should prove useful in guiding and focusing further research and development on the user experience of static code analyses.