Human Capacities for Emotion Recognition and their Implications for Computer Vision

dc.contributor.author: Liebold, Benny
dc.contributor.author: Richter, René
dc.contributor.author: Teichmann, Michael
dc.contributor.author: Hamker, Fred H.
dc.contributor.author: Ohler, Peter
dc.contributor.editor: Ziegler, Jürgen
dc.date.accessioned: 2017-06-17T18:54:35Z
dc.date.available: 2017-06-17T18:54:35Z
dc.date.issued: 2015
dc.description.abstract: Current models for automated emotion recognition are developed under the assumption that emotion expressions are distinct expression patterns for basic emotions. As a result, these approaches fail to account for the emotional processes underlying emotion expressions. We review the literature on human emotion processing and suggest an alternative approach to affective computing. We postulate that the generalizability and robustness of these models can be greatly increased by three major steps: (1) modeling emotional processes as a necessary foundation of emotion recognition; (2) basing models of emotional processes on our knowledge about the human brain; (3) conceptualizing emotions based on appraisal processes and thus regarding emotion expressions as expressive behavior linked to these appraisals rather than fixed neuro-motor patterns. Since modeling emotional processes after neurobiological processes is a long-term effort, we suggest that researchers focus on early appraisals, which evaluate intrinsic stimulus properties with little higher cortical involvement. With this goal in mind, we focus on the amygdala and its neural connectivity pattern as a promising structure for early emotional processing. We derive a model of the amygdala–visual cortex circuit from the current state of neuroscientific research. This model is capable of conditioning visual stimuli with body reactions to enable rapid emotional processing of stimuli, consistent with early stages of psychological appraisal theories (a minimal computational sketch of this mechanism is given after the metadata record below). Additionally, amygdala activity can feed back to visual areas to modulate attention allocation according to the emotional relevance of a stimulus. The implications of the model for other approaches to automated emotion recognition are discussed.
dc.identifier.pissn: 2196-6826
dc.language.iso: en
dc.publisher: De Gruyter
dc.relation.ispartof: i-com: Vol. 14, No. 2
dc.subject: Emotions in HCI
dc.subject: Emotion Recognition
dc.subject: Neural Networks
dc.title: Human Capacities for Emotion Recognition and their Implications for Computer Vision
dc.type: Text/Journal Article
gi.citation.publisherPlace: Berlin
gi.citation.startPage: 126–137
gi.document.quality: digidoc
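
The amygdala–visual cortex mechanism described in the abstract can be illustrated with a minimal computational sketch. The code below is not the authors' model; it is a toy Python/NumPy sketch under simple assumptions: visual features are associated with a body-reaction signal through delta-rule conditioning in a single "amygdala" unit, and the learned emotional relevance is fed back to the visual layer as a multiplicative attentional gain. The class name AmygdalaVisualCircuit, the parameters learning_rate and feedback_strength, and the learning rule itself are illustrative choices, not part of the published model.

# Minimal sketch (not the authors' implementation) of the circuit described in the
# abstract: visual features are conditioned on a body-reaction signal, and the
# learned relevance modulates attention over the visual input. All names, sizes,
# and the learning rule are illustrative assumptions.
import numpy as np

class AmygdalaVisualCircuit:
    def __init__(self, n_visual_features: int, learning_rate: float = 0.1,
                 feedback_strength: float = 0.5):
        # Associative weights from visual features to the amygdala unit.
        self.w = np.zeros(n_visual_features)
        self.lr = learning_rate
        self.fb = feedback_strength

    def amygdala_response(self, visual_features: np.ndarray) -> float:
        # Rapid appraisal of intrinsic stimulus relevance: a weighted sum of
        # visual features, squashed to [0, 1].
        return 1.0 / (1.0 + np.exp(-self.w @ visual_features))

    def condition(self, visual_features: np.ndarray, body_reaction: float) -> None:
        # Delta-rule conditioning: the prediction error between the body
        # reaction (e.g., arousal in [0, 1]) and the current amygdala response
        # drives the weight update.
        error = body_reaction - self.amygdala_response(visual_features)
        self.w += self.lr * error * visual_features

    def attend(self, visual_features: np.ndarray) -> np.ndarray:
        # Feedback to visual areas: emotionally relevant stimuli receive a
        # multiplicative attentional gain proportional to amygdala activity.
        gain = 1.0 + self.fb * self.amygdala_response(visual_features)
        return gain * visual_features

# Toy usage: pair a stimulus with a strong body reaction, then observe the
# increased attentional weighting of that stimulus.
rng = np.random.default_rng(0)
circuit = AmygdalaVisualCircuit(n_visual_features=8)
threat_like = rng.random(8)
for _ in range(50):
    circuit.condition(threat_like, body_reaction=1.0)
print(circuit.amygdala_response(threat_like))     # close to 1 after conditioning
print(circuit.attend(threat_like) / threat_like)  # attentional gain > 1 per feature

In the toy usage, repeatedly pairing a stimulus with a strong body reaction drives the amygdala response toward 1, so the same stimulus subsequently receives an attentional gain greater than 1 on every feature, mirroring the rapid, feedback-driven prioritization the abstract describes.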
