Social Signal Interpretation (SSI)

dc.contributor.author: Wagner, Johannes
dc.contributor.author: Lingenfelser, Florian
dc.contributor.author: Bee, Nikolaus
dc.contributor.author: André, Elisabeth
dc.date.accessioned: 2018-01-08T09:15:15Z
dc.date.available: 2018-01-08T09:15:15Z
dc.date.issued: 2011
dc.description.abstract: The development of anticipatory user interfaces is a key issue in human-centred computing. Building systems that allow humans to communicate with a machine in the same natural and intuitive way as they would with each other requires the detection and interpretation of the user's affective and social signals. These are expressed in various and often complementary ways, including gestures, speech, facial expressions, etc. Implementing fast and robust recognition engines is not only a necessary but also a challenging task. In this article, we introduce our Social Signal Interpretation (SSI) tool, a framework dedicated to supporting the development of such online recognition systems. The paper at hand discusses the processing of four modalities, namely audio, video, gesture and biosignals, with a focus on affect recognition, and explains various approaches to fusing the extracted information into a final decision.
dc.identifier.pissn: 1610-1987
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/11222
dc.publisher: Springer
dc.relation.ispartof: KI - Künstliche Intelligenz: Vol. 25, No. 3
dc.relation.ispartofseries: KI - Künstliche Intelligenz
dc.subject: Affective computing
dc.subject: Human-centred computing
dc.subject: Machine learning
dc.subject: Multimodal fusion
dc.subject: Real-time recognition
dc.subject: Social signal processing
dc.title: Social Signal Interpretation (SSI)
dc.type: Text/Journal Article
gi.citation.startPage: 251
gi.citation.endPage: 256
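
The abstract above mentions fusing information extracted from several modalities into a final decision. Purely as an illustration of the general idea of decision-level (late) fusion, and not of the SSI API itself, the following minimal Python sketch combines per-modality class scores by a weighted average; all names, weights and scores are invented for the example.

# Hypothetical sketch of decision-level (late) fusion: each modality's
# classifier outputs class scores, which are combined by a weighted
# average into a final decision. Not the SSI API; names are illustrative.
from typing import Dict, List

def fuse_decisions(
    modality_scores: Dict[str, List[float]],
    weights: Dict[str, float],
) -> int:
    """Return the index of the winning class after weighted score fusion."""
    num_classes = len(next(iter(modality_scores.values())))
    fused = [0.0] * num_classes
    total_weight = sum(weights.get(m, 1.0) for m in modality_scores)
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 1.0) / total_weight
        for i, s in enumerate(scores):
            fused[i] += w * s
    return max(range(num_classes), key=lambda i: fused[i])

if __name__ == "__main__":
    # Example: three affect classes scored by audio, video and biosignal models.
    scores = {
        "audio":     [0.6, 0.3, 0.1],
        "video":     [0.2, 0.7, 0.1],
        "biosignal": [0.3, 0.4, 0.3],
    }
    weights = {"audio": 1.0, "video": 1.5, "biosignal": 0.5}
    print(fuse_decisions(scores, weights))  # prints the winning class index (here: 1)

In this toy run the video modality is weighted most heavily, so its preferred class wins; other fusion schemes discussed in the literature (e.g. feature-level fusion or classifier-based combination) replace the weighted average with a different combination rule.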
