Author: Augenstein, Isabelle
Date: 2023-01-18
Year: 2022
DOI: 10.1007/s13218-022-00774-6
URL: https://dl.gi.de/handle/20.500.12116/40053

Title: Habilitation Abstract: Towards Explainable Fact Checking

Abstract: With the substantial rise in the amount of mis- and disinformation online, fact checking has become an important task to automate. This article is a summary of a habilitation (doctor scientiarum) thesis submitted to the University of Copenhagen, which was successfully defended in December 2021 (Augenstein in Towards Explainable Fact Checking. Dr. Scient. thesis, University of Copenhagen, Faculty of Science, 2021). The dissertation addresses several fundamental research gaps within automatic fact checking. The contributions are organised along three verticals: (1) the fact-checking subtask they address; (2) methods which require only small amounts of manually labelled data; (3) methods for explainable fact checking, addressing the opaqueness of the decision-making of black-box fact-checking models.

Keywords: Automatic fact checking; Explainable AI; Low-resource learning; Multi-task learning; Natural language understanding

Type: Text/Journal Article
ISSN: 1610-1987