Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions

Daniel Roth, Franziska Westermeier, Larissa Brübach, Tobias Feigl, Christian Schell, Marc Erich Latoschik

Research output: Contribution to journal › Conference article › peer-review


Abstract

The perception and expression of emotion is a fundamental part of social interaction. This project aims to utilize neuronal signals to augment avatar-mediated communication. We recognize emotions with a brain-computer interface (BCI) and supervised machine learning. Using an avatar-based communication interface that supports head tracking, gaze tracking, and speech-to-animation, we leverage the BCI-based affect detection to visualize the users' emotional states.
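The abstract does not specify the recognition pipeline, so the following is only a minimal illustrative sketch of EEG-based affect classification with supervised learning: band-power features per channel feed a standard classifier whose predicted label could then drive the avatar's expression. The sampling rate, frequency bands, channel count, and the SVM classifier are assumptions, not the authors' method.

```python
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 256  # assumed EEG sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

def band_power_features(epoch, fs=FS):
    """Mean band power per channel for one EEG epoch of shape (n_channels, n_samples)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)        # PSD along the last axis
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))          # average power in band
    return np.concatenate(feats)                         # n_channels * n_bands vector

# Hypothetical labelled data: 40 two-second epochs, 8 channels, binary affect labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 8, FS * 2))
labels = rng.integers(0, 2, size=40)                     # e.g. 0 = neutral, 1 = positive

X = np.array([band_power_features(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)

# At runtime, the predicted label would be mapped to an avatar expression.
print(clf.predict(X[:5]))
```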

Keywords

  • affective computing
  • avatars
  • brain-computer interfaces
