Title: Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions
Authors: Roth, Daniel; Westermeier, Franziska; Brübach, Larissa; Feigl, Tobias; Schell, Christian; Latoschik, Marc Erich
Date: 2019-09-05
URI: https://dl.gi.de/handle/20.500.12116/25205
DOI: 10.18420/muc2019-ws-571
Type: Text/Workshop Paper
Language: en
Keywords: brain-computer interfaces; augmented social interaction; avatars; shared virtual environments

Abstract: The perception and expression of emotion is a fundamental part of social interaction. This project aims to use neuronal signals to augment avatar-mediated communication. We recognize emotions with a brain-computer interface (BCI) and supervised machine learning. Using an avatar-based communication interface that supports head tracking, gaze tracking, and speech-to-animation, we leverage the BCI-based affect detection to visualize emotional states.