Extending and Completing Probabilistic Knowledge and Beliefs Without Bias

dc.contributor.authorBeierle, Christoph
dc.contributor.authorKern-Isberner, Gabriele
dc.contributor.authorFinthammer, Marc
dc.contributor.authorPotyka, Nico
dc.date.accessioned2018-01-08T09:17:54Z
dc.date.available2018-01-08T09:17:54Z
dc.date.issued2015
dc.description.abstractCombining logic with probability theory provides a solid ground for the representation of and the reasoning with uncertain knowledge. Given a set of probabilistic conditionals like “If A then B with probability x”, a crucial question is how to extend this explicit knowledge, thereby avoiding any unnecessary bias. The connection between such probabilistic reasoning and commonsense reasoning has been elaborated especially by Jeff Paris, advocating the principle of Maximum Entropy (MaxEnt). In this paper, we address the general concepts and ideas underlying MaxEnt and leading to it, illustrate the use of MaxEnt by reporting on an example application from the medical domain, and give a brief survey of recent approaches to extending the MaxEnt principle to first-order logic.
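As a minimal illustration of the completion idea the abstract describes (this toy example is not taken from the paper): given the single conditional (B|A)[0.9] over two binary variables, the MaxEnt principle fixes all remaining degrees of freedom without extra bias. The sketch below parametrizes the joint distribution by a = P(A) and t = P(B|¬A), holds P(B|A) = 0.9 fixed, and maximizes the entropy H = h(a) + a·h(0.9) + (1−a)·h(t) by plain gradient ascent on logits; variable names and the numeric approach are my own choices.

```python
import math

def h(p):
    """Binary entropy in nats, with h(0) = h(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)

def maxent_completion(x=0.9, steps=5000, lr=0.1, eps=1e-6):
    """MaxEnt-complete a knowledge base consisting of the single
    conditional (B|A)[x] over two binary variables A, B.

    The joint entropy decomposes as
        H = h(a) + a*h(x) + (1 - a)*h(t),
    where a = P(A) and t = P(B|~A).  We ascend H in unconstrained
    logit coordinates (sigmoid reparametrization) so the constraint
    P(B|A) = x stays satisfied by construction.
    """
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))

    def H(u, v):
        a, t = sig(u), sig(v)
        return h(a) + a * h(x) + (1.0 - a) * h(t)

    u = v = 0.0  # start from the uniform marginal P(A) = P(B|~A) = 0.5
    for _ in range(steps):
        # central-difference numerical gradients
        gu = (H(u + eps, v) - H(u - eps, v)) / (2.0 * eps)
        gv = (H(u, v + eps) - H(u, v - eps)) / (2.0 * eps)
        u += lr * gu
        v += lr * gv
    return sig(u), sig(v)

a, t = maxent_completion(0.9)
print(round(a, 3), round(t, 3))  # P(A) ≈ 0.409, P(B|~A) = 0.5
```

The result shows the "no unnecessary bias" effect: the unconstrained conditional P(B|¬A) comes out maximally noncommittal at 0.5, while P(A) is pulled slightly below 0.5 only because the constrained branch A carries less entropy (h(0.9) < h(0.5)).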
dc.identifier.pissn1610-1987
dc.identifier.urihttps://dl.gi.de/handle/20.500.12116/11470
dc.publisherSpringer
dc.relation.ispartofKI - Künstliche Intelligenz: Vol. 29, No. 3
dc.relation.ispartofseriesKI - Künstliche Intelligenz
dc.subjectCommonsense reasoning
dc.subjectConditional logic
dc.subjectFirst-order conditional
dc.subjectMaximum entropy
dc.subjectProbabilistic logic
dc.titleExtending and Completing Probabilistic Knowledge and Beliefs Without Bias
dc.typeText/Journal Article
gi.citation.endPage262
gi.citation.startPage255