Multimodal data communication for human-robot interactions

Frank Wallhoff, Tobias Rehrl, Jürgen Gast, Alexander Bannat, Gerhard Rigoll

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, the development of a framework based on the Realtime Database (RTDB) for processing multimodal data is presented. This framework allows ready integration of input and output modules. Furthermore, asynchronous data streams from different sources can be processed in an approximately synchronous manner. Depending on the included modules, online as well as offline data processing is possible. The goal is to establish a genuinely multimodal interaction system that can recognize and react to the situations relevant for human-robot interaction.
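The paper itself does not publish source code; as a rough illustration of the approximate-synchronization idea described in the abstract, the hypothetical sketch below buffers timestamped samples per asynchronous stream and reads out the most recent sample of every modality at a common query time. All class and method names are illustrative assumptions, not the RTDB's actual API.

```python
import bisect
from collections import defaultdict


class StreamBuffer:
    """Timestamped buffer for one asynchronous input modality."""

    def __init__(self):
        self.timestamps = []  # kept in ascending order (monotonic clock)
        self.samples = []

    def push(self, t, sample):
        """Append a sample that arrived at time t."""
        self.timestamps.append(t)
        self.samples.append(sample)

    def latest_before(self, t):
        """Return the most recent sample at or before time t, or None."""
        i = bisect.bisect_right(self.timestamps, t)
        return self.samples[i - 1] if i else None


class Synchronizer:
    """Approximately synchronous view over asynchronous streams."""

    def __init__(self):
        self.buffers = defaultdict(StreamBuffer)

    def push(self, stream, t, sample):
        self.buffers[stream].push(t, sample)

    def snapshot(self, t):
        """One aligned frame: the latest sample per stream at time t."""
        return {name: buf.latest_before(t) for name, buf in self.buffers.items()}
```

For example, pushing camera frames at 25 Hz and audio chunks at a different rate, `snapshot(t)` yields one dictionary per query time, so downstream recognition modules can treat the inputs as if they were sampled synchronously.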

Original language: English
Title of host publication: Proceedings - 2009 IEEE International Conference on Multimedia and Expo, ICME 2009
Pages: 1146-1149
Number of pages: 4
DOIs
State: Published - 2009
Event: 2009 IEEE International Conference on Multimedia and Expo, ICME 2009 - New York, NY, United States
Duration: 28 Jun 2009 - 3 Jul 2009

Publication series

Name: Proceedings - 2009 IEEE International Conference on Multimedia and Expo, ICME 2009

Conference

Conference: 2009 IEEE International Conference on Multimedia and Expo, ICME 2009
Country/Territory: United States
City: New York, NY
Period: 28/06/09 - 3/07/09

Keywords

  • Augmented reality
  • Human-robot interaction
  • Multimodal data communication
  • Real-time data processing
