A social interaction interface supporting affective augmentation based on neuronal data

Daniel Roth, Larissa Bröbach, Franziska Westermeier, Tobias Feigl, Christian Schell, Marc Erich Latoschik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

In this demonstration we present a prototype for an avatar-mediated social interaction interface that supports the replication of head and eye movement in distributed virtual environments. In addition to retargeting these natural behaviors, the system can augment the interaction through the visual presentation of affective states. We derive those states from neuronal data captured by electroencephalographic (EEG) sensing, combined with a machine-learning-driven classification of emotional states.
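The abstract does not disclose which features or classifier the prototype uses, but a common approach to EEG-based emotion classification pairs spectral band-power features (e.g., alpha and beta power) with a simple supervised classifier. The sketch below is purely illustrative of that general pipeline, not the authors' implementation: the sampling rate, the `make_window` signal generator, the two-class labels, and the toy nearest-centroid classifier are all assumptions chosen to keep the example self-contained.

```python
import numpy as np

FS = 128  # assumed EEG sampling rate in Hz

def band_power(window, fs, lo, hi):
    """Mean spectral power of one EEG window in the [lo, hi) Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

def features(window, fs=FS):
    """Alpha (8-13 Hz) and beta (13-30 Hz) band power as a 2-D feature vector."""
    return np.array([band_power(window, fs, 8, 13),
                     band_power(window, fs, 13, 30)])

class NearestCentroid:
    """Tiny stand-in classifier: one feature-space centroid per emotional state."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in sorted(set(y))}
        return self

    def predict(self, X):
        return [min(self.centroids_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]

# Synthetic training data: "relaxed" windows dominated by a 10 Hz (alpha)
# oscillation, "stressed" windows dominated by a 20 Hz (beta) oscillation.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS

def make_window(freq):
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(FS)

X_train = np.array([features(make_window(10)) for _ in range(20)] +
                   [features(make_window(20)) for _ in range(20)])
y_train = ["relaxed"] * 20 + ["stressed"] * 20

clf = NearestCentroid().fit(X_train, y_train)
state = clf.predict([features(make_window(10))])[0]
print(state)  # an alpha-dominated test window classifies as "relaxed"
```

In a system like the one described, the predicted state would then drive the avatar's visual affect display (e.g., facial expression or color cues) in the shared virtual environment; a real deployment would replace the synthetic windows with streamed EEG data and the toy classifier with a properly trained model.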

Original language: English
Title of host publication: Proceedings - SUI 2019
Subtitle of host publication: ACM Conference on Spatial User Interaction
Editors: Stephen N. Spencer
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450369756
DOIs
State: Published - 19 Oct 2019
Externally published: Yes
Event: 7th ACM Symposium on Spatial User Interaction, SUI 2019 - New Orleans, United States
Duration: 19 Oct 2019 – 20 Oct 2019

Publication series

Name: Proceedings - SUI 2019: ACM Conference on Spatial User Interaction

Conference

Conference: 7th ACM Symposium on Spatial User Interaction, SUI 2019
Country/Territory: United States
City: New Orleans
Period: 19/10/19 – 20/10/19

Keywords

  • Affective computing
  • Avatars
  • Brain-computer interfaces
  • Communication interfaces
  • Embodiment
