Listing by author "Gemulla, Rainer"
1 - 4 of 4
- Report: Dagstuhl-Erklärung: Bildung in der digitalen vernetzten Welt (2016). Brinda, Torsten; Diethelm, Ira; Gemulla, Rainer; Romeike, Ralf; Schöning, Johannes; Schulte, Carsten; Bartoschek, Thomas; Bergner, Nadine; Dietrich, Leonore; Döbeli, Beat; Fries, Rüdiger; Hellmig, Lutz; Herzig, Bardo; Hollatz, Jürgen; Jörissen, Benjamin; Kommer, Sven; Mittag, Alexander; Kusterer, Peter; Oberweis, Andreas; Otto, Torsten; Rabe, Alexander; Röhner, Gerhard; Schelhowe, Heidi; Scheuermann, Björn; Schmitz, Birgit; Sommer, Hartmut; Zimnol, Martin
- Conference paper: The DESQ Framework for Declarative and Scalable Frequent Sequence Mining (INFORMATIK 2019: 50 Jahre Gesellschaft für Informatik – Informatik für Gesellschaft, 2019). Beedkar, Kaustubh; Gemulla, Rainer; Renz-Wieland, Alexander
  Abstract: DESQ is a general-purpose framework for declarative and scalable frequent sequence mining. Applications express their specific sequence mining tasks using a simple yet powerful pattern expression language, and DESQ’s computation engine automatically executes the mining task in an efficient and scalable way. In this paper, we give a brief overview of DESQ and its components. (A toy sketch of the underlying mining task follows the listing.)
- Journal: Editorial (Datenbank-Spektrum: Vol. 18, No. 2, 2018). Michel, Sebastian; Gemulla, Rainer; Schenkel, Ralf; Härder, Theo
- Conference paper: Fully parallel inference in Markov logic networks (Datenbanksysteme für Business, Technologie und Web (BTW), 2013). Beedkar, Kaustubh; Corro, Luciano Del; Gemulla, Rainer
  Abstract: Markov logic is a powerful tool for handling the uncertainty that arises in real-world structured data; it has been applied successfully to a number of data management problems. In practice, the resulting ground Markov logic networks can get very large, which poses challenges to scalable inference. In this paper, we present the first fully parallelized approach to inference in Markov logic networks. Inference decomposes into a grounding step and a probabilistic inference step, both of which can be cost-intensive. We propose a parallel grounding algorithm that partitions the Markov logic network based on its corresponding join graph; each partition is grounded independently and in parallel. Our partitioning scheme is based on importance sampling, which we use for parallel probabilistic inference, and is also well-suited to other, more efficient parallel inference techniques. Preliminary experiments suggest that significant speedup can be gained by parallelizing both grounding and probabilistic inference. (An illustrative importance sampling sketch follows the listing.)
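The DESQ abstract above refers to frequent sequence mining as the underlying task. The sketch below is a minimal, illustrative Python take on that task (order-preserving subsequences, with gaps, that occur in at least a given number of input sequences); it does not use DESQ, its pattern expression language, or its computation engine, and the function names and parameters are invented for illustration.

```python
from itertools import combinations
from collections import Counter

def frequent_sequences(database, min_support, max_length=3):
    """Return subsequences (order-preserving, gaps allowed) of length up to
    max_length that occur in at least min_support input sequences."""
    counts = Counter()
    for sequence in database:
        seen = set()  # distinct patterns of this sequence, counted once each
        for length in range(1, max_length + 1):
            for idx in combinations(range(len(sequence)), length):
                seen.add(tuple(sequence[i] for i in idx))
        counts.update(seen)
    return {pattern: c for pattern, c in counts.items() if c >= min_support}

if __name__ == "__main__":
    db = [["a", "b", "c"], ["a", "c", "b"], ["a", "b", "d"]]
    for pattern, support in sorted(frequent_sequences(db, min_support=2).items()):
        print(pattern, support)
```

This naive generate-and-count approach is exponential in the sequence length; DESQ's contribution, per the abstract, is to let applications restrict the patterns of interest declaratively and to execute such restricted mining tasks efficiently and at scale.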
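The BTW paper's abstract names importance sampling as the probabilistic inference technique being parallelized. Below is a minimal, self-contained sketch of self-normalized importance sampling for a tiny weighted-formula model in the Markov-logic style, with p(x) proportional to exp(sum_i w_i * f_i(x)) over binary ground atoms. It is a toy under stated assumptions, not the paper's parallel grounding or inference algorithm; the model, weights, and function names are made up for illustration.

```python
import math
import random

def score(x, weighted_features):
    # Unnormalized log-probability: sum of weights of satisfied features.
    return sum(w for w, f in weighted_features if f(x))

def estimate_marginal(weighted_features, n_atoms, query_atom,
                      n_samples=20000, seed=0):
    """Estimate P(x[query_atom] = True) with a uniform proposal over {0,1}^n
    and self-normalized importance weights exp(score(x))."""
    rng = random.Random(seed)
    numerator = denominator = 0.0
    for _ in range(n_samples):
        x = [rng.random() < 0.5 for _ in range(n_atoms)]  # uniform proposal
        w = math.exp(score(x, weighted_features))          # importance weight
        denominator += w
        if x[query_atom]:
            numerator += w
    return numerator / denominator

if __name__ == "__main__":
    # Two ground atoms; a soft "x0 implies x1" rule and a bias toward x0.
    features = [
        (1.5, lambda x: (not x[0]) or x[1]),
        (0.8, lambda x: x[0]),
    ]
    print("P(x1 = True) ~", round(estimate_marginal(features, 2, query_atom=1), 3))
```

Because the samples are drawn and weighted independently, an estimator like this is straightforward to run on partitions of a ground network in parallel, which is the property the abstract exploits.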