Authors: Linghui Luo; Felix Pauck; Goran Piskachev; Manuel Benz; Ivan Pashchenko; Martin Mory; Eric Bodden; Ben Hermann; Fabio Massacci
Editors: Lars Grunske; Janet Siegmund; Andreas Vogelsang
Title: TaintBench: Automatic Real-World Malware Benchmarking of Android Taint Analyses
Type: Text/Conference Paper
Date available: 2022-01-19
Date issued: 2022
ISBN: 978-3-88579-714-2
ISSN: 1617-5468
DOI: 10.18420/se2022-ws-020
URI: https://dl.gi.de/handle/20.500.12116/37973
Language: en
Keywords: Taint analysis; Benchmark; Real-world benchmark; Android malware

Abstract: Due to the lack of established real-world benchmark suites for static taint analyses of Android applications, evaluations of these analyses are often restricted and hard to compare. Even evaluations that do use real-world applications rarely document the ground truth in those apps, which makes the results difficult to compare and reproduce. Our recent study fills this gap. It first defines a set of sensible construction criteria for such a benchmark suite, and then proposes the TaintBench benchmark suite, designed to fulfil these criteria. Along with the suite, this paper introduces the TaintBench framework, which enables tool-assisted benchmark suite construction, evaluation and inspection. Our experiments using TaintBench reveal new insights into popular Android taint analysis tools.