Extending and Completing Probabilistic Knowledge and Beliefs Without Bias
dc.contributor.author | Beierle, Christoph
dc.contributor.author | Kern-Isberner, Gabriele
dc.contributor.author | Finthammer, Marc
dc.contributor.author | Potyka, Nico
dc.date.accessioned | 2018-01-08T09:17:54Z
dc.date.available | 2018-01-08T09:17:54Z
dc.date.issued | 2015
dc.description.abstract | Combining logic with probability theory provides a solid foundation for representing and reasoning with uncertain knowledge. Given a set of probabilistic conditionals like “If A then B with probability x”, a crucial question is how to extend this explicit knowledge while avoiding any unnecessary bias. The connection between such probabilistic reasoning and commonsense reasoning has been elaborated especially by Jeff Paris, advocating the principle of Maximum Entropy (MaxEnt). In this paper, we address the general concepts and ideas underlying MaxEnt and leading to it, illustrate the use of MaxEnt by reporting on an example application from the medical domain, and give a brief survey of recent approaches to extending the MaxEnt principle to first-order logic.
dc.identifier.pissn | 1610-1987
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/11470
dc.publisher | Springer
dc.relation.ispartof | KI - Künstliche Intelligenz: Vol. 29, No. 3
dc.relation.ispartofseries | KI - Künstliche Intelligenz
dc.subject | Commonsense reasoning
dc.subject | Conditional logic
dc.subject | First-order conditional
dc.subject | Maximum entropy
dc.subject | Probabilistic logic
dc.title | Extending and Completing Probabilistic Knowledge and Beliefs Without Bias
dc.type | Text/Journal Article
gi.citation.endPage | 262
gi.citation.startPage | 255 |
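
The MaxEnt completion described in the abstract can be illustrated with a minimal, hedged sketch that is not taken from the paper itself: for a single probabilistic conditional (B|A)[x] over two propositional atoms, the constraint P(B|A) = x is linear over the four possible worlds, so the maximum-entropy distribution has the exponential form p_w ∝ exp(μ·c_w), and the Lagrange multiplier μ can be found by bisection. The function name `maxent` and the two-atom setup are illustrative assumptions, not the authors' implementation.

```python
import math

# Possible worlds over atoms A, B as (A, B) truth-value pairs.
WORLDS = [(True, True), (True, False), (False, True), (False, False)]

def maxent(prob, lo=-50.0, hi=50.0, iters=200):
    """MaxEnt distribution satisfying the conditional (B|A)[prob].

    P(B|A) = prob rewrites to the linear constraint sum_w c_w * p_w = 0,
    with c_w = (1 - prob) for A-and-B worlds, -prob for A-and-not-B
    worlds, and 0 for worlds falsifying A. The MaxEnt solution then has
    the form p_w proportional to exp(mu * c_w); mu is found by bisection,
    since the constraint expectation is monotonically increasing in mu.
    """
    c = [(1 - prob) if (a and b) else (-prob if a else 0.0)
         for a, b in WORLDS]

    def expectation(mu):
        # Expected value of c under the distribution induced by mu.
        w = [math.exp(mu * ci) for ci in c]
        z = sum(w)
        return sum(ci * wi for ci, wi in zip(c, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if expectation(mid) < 0:
            lo = mid
        else:
            hi = mid
    mu = (lo + hi) / 2
    w = [math.exp(mu * ci) for ci in c]
    z = sum(w)
    return [wi / z for wi in w]

# The completion satisfies the given conditional and, without bias,
# spreads the remaining probability uniformly over the not-A worlds.
p = maxent(0.8)
print(round(p[0] / (p[0] + p[1]), 3))  # P(B|A) → 0.8
```

Note how the two worlds falsifying A receive equal probability: the MaxEnt distribution adds no information beyond what the conditional states, which is precisely the "no unnecessary bias" behavior the abstract refers to.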