Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions
dc.contributor.author | Roth, Daniel
dc.contributor.author | Westermeier, Franziska
dc.contributor.author | Brübach, Larissa
dc.contributor.author | Feigl, Tobias
dc.contributor.author | Schell, Christian
dc.contributor.author | Latoschik, Marc Erich
dc.date.accessioned | 2019-09-05T01:06:05Z
dc.date.available | 2019-09-05T01:06:05Z
dc.date.issued | 2019
dc.description.abstract | The perception and expression of emotion are fundamental parts of social interaction. This project aims to utilize neuronal signals to augment avatar-mediated communication. We recognize emotions with a brain-computer interface (BCI) and supervised machine learning. Using an avatar-based communication interface that supports head tracking, gaze tracking, and speech-to-animation, we leverage the BCI-based affect detection to visualize emotional states. | en
dc.identifier.doi | 10.18420/muc2019-ws-571
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/25205
dc.language.iso | en | |
dc.publisher | Gesellschaft für Informatik e.V.
dc.relation.ispartof | Mensch und Computer 2019 - Workshopband
dc.relation.ispartofseries | Mensch und Computer
dc.subject | brain-computer interfaces | |
dc.subject | augmented social interaction | |
dc.subject | avatars | |
dc.subject | shared virtual environments | |
dc.title | Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions | en |
dc.type | Text/Workshop Paper
gi.citation.publisherPlace | Bonn
gi.conference.date | 8-11 September 2019
gi.conference.location | Hamburg
gi.conference.sessiontitle | MCI-WS24: User-embodied Interaction in Virtual Reality (UIVR)
gi.document.quality | digidoc |
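
The abstract describes a pipeline in which EEG signals recorded via a BCI are classified with supervised machine learning and the detected affective state is then visualized on an avatar. Below is a minimal sketch of such an affect classifier, assuming band-power features over standard EEG frequency bands and an SVM from scikit-learn; the sampling rate, bands, channel count, classifier, and synthetic labels are all illustrative assumptions and are not taken from the paper.

```python
# A minimal sketch, assuming band-power EEG features and a generic
# scikit-learn classifier; sampling rate, frequency bands, channel count,
# classifier, and labels are illustrative assumptions, not the paper's setup.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 250  # assumed EEG sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands


def band_power_features(epoch, fs=FS):
    """Mean spectral power per band and channel for one epoch
    of shape (n_channels, n_samples)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs, axis=-1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)


# Synthetic stand-in data: 120 two-second epochs from 8 channels with
# placeholder binary affect labels (e.g., positive vs. negative valence).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((120, 8, 2 * FS))
labels = rng.integers(0, 2, size=120)

X = np.array([band_power_features(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

In the system described by the abstract, the predicted affective class would then drive the emotion visualization of the avatar in the shared virtual environment; that mapping is not part of this sketch.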