Authors: Beedkar, Kaustubh; Corro, Luciano Del; Gemulla, Rainer
Editors: Markl, Volker; Saake, Gunter; Sattler, Kai-Uwe; Hackenbroich, Gregor; Mitschang, Bernhard; Härder, Theo; Köppen, Veit
Dates: 2018-10-24 (accessioned); 2018-10-24 (available); 2013 (issued)
ISBN: 978-3-88579-608-4
URI: https://dl.gi.de/handle/20.500.12116/17322
Abstract: Markov logic is a powerful tool for handling the uncertainty that arises in real-world structured data; it has been applied successfully to a number of data management problems. In practice, the resulting ground Markov logic networks can get very large, which poses challenges to scalable inference. In this paper, we present the first fully parallelized approach to inference in Markov logic networks. Inference decomposes into a grounding step and a probabilistic inference step, both of which can be cost-intensive. We propose a parallel grounding algorithm that partitions the Markov logic network based on its corresponding join graph; each partition is grounded independently and in parallel. Our partitioning scheme is based on importance sampling, which we use for parallel probabilistic inference, and is also well-suited to other, more efficient parallel inference techniques. Preliminary experiments suggest that significant speedup can be gained by parallelizing both grounding and probabilistic inference.
Language: en
Title: Fully parallel inference in Markov logic networks
Type: Text/Conference Paper
ISSN: 1617-5468
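
The abstract describes decomposing inference into grounding and probabilistic inference, with grounding parallelized by partitioning the network along its join graph and grounding each partition independently. The sketch below illustrates only that general idea, assuming a toy clause representation; the formulas, the connected-component partitioning, and the helpers `join_graph_partitions` and `ground_partition` are hypothetical stand-ins and not the paper's importance-sampling-based partitioning scheme.

```python
# Minimal illustrative sketch (not the authors' implementation): group formulas that
# share predicates into partitions of a join graph, then ground each partition in
# parallel. All names and the toy MLN below are hypothetical.
from itertools import product, combinations
from multiprocessing import Pool

# Toy MLN: each formula is a list of (predicate, arity) atoms over shared constants.
FORMULAS = [
    [("Smokes", 1), ("Cancer", 1)],    # Smokes(x) => Cancer(x)
    [("Friends", 2), ("Smokes", 1)],   # Friends(x, y) ^ Smokes(x) => Smokes(y)
    [("Teaches", 2), ("Professor", 1)] # Teaches(x, y) => Professor(x)
]
CONSTANTS = ["Anna", "Bob"]

def join_graph_partitions(formulas):
    """Partition formulas into connected components of the join graph:
    formulas sharing a predicate end up in the same partition."""
    parent = list(range(len(formulas)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(formulas)), 2):
        if {p for p, _ in formulas[i]} & {p for p, _ in formulas[j]}:
            parent[find(i)] = find(j)
    groups = {}
    for i, f in enumerate(formulas):
        groups.setdefault(find(i), []).append(f)
    return list(groups.values())

def ground_partition(partition):
    """Ground one partition: enumerate variable bindings for each formula."""
    groundings = []
    for formula in partition:
        n_vars = max(arity for _, arity in formula)  # crude bound on variable count
        for binding in product(CONSTANTS, repeat=n_vars):
            groundings.append((formula, binding))
    return groundings

if __name__ == "__main__":
    partitions = join_graph_partitions(FORMULAS)
    with Pool() as pool:  # each partition is grounded independently and in parallel
        ground_clauses = pool.map(ground_partition, partitions)
    print(sum(len(g) for g in ground_clauses), "ground clauses from",
          len(partitions), "partitions")
```

In this toy example the first two formulas share the Smokes predicate and form one partition, while the third formula forms a second partition; the two partitions are grounded by separate worker processes, mirroring the parallel grounding step sketched in the abstract.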