Authors: Seybold, Daniel; Domaschka, Jörg; Herrmann, Andrea
Dates: 2024-02-22; 2024-02-22
Year: 2023
ISSN: 0720-8928
URI: https://dl.gi.de/handle/20.500.12116/43640
Title: Experiences from Building the Open Database Performance Ranking with benchANT
Type: Text/Conference Paper
Language: en
Keywords: performance; benchmark; database; DBMS; transparency; reproducibility

Abstract: Benchmarking is an important method for advancing database management systems (DBMS) from both the industry and the research perspective. Transparent and reproducible results are a key requirement for the acceptance and credibility of benchmarking. To advance research towards transparent and reproducible benchmark data, we report on building an open DBMS performance ranking with 130 benchmark configurations while ensuring comparability, transparency, and reproducibility. We derive the data required at the cloud, resource, DBMS, and benchmark levels to enable transparency and reproducibility, and demonstrate the generation of such data sets with benchANT. Building upon such data, we outline future research directions for DBMS performance modelling, DBMS auto-tuning, and decision support.