A multimodal human-robot-dialog applying emotional feedbacks

Alexander Bannat, Jürgen Blume, Jürgen T. Geiger, Tobias Rehrl, Frank Wallhoff, Christoph Mayer, Bernd Radig, Stefan Sosnowski, Kolja Kühnlenz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

This paper presents a system for human-robot communication situated in an ambient assisted living scenario, in which the robot performs an order-and-serve procedure. The interaction is based on different modalities that extract information from the auditory and the visual channel in order to obtain an intuitive and natural dialog. The underlying dialog structure is represented in first-order logic, which allows a complex task to be split into simpler subtasks. The different communication modalities are used to conclude these subtasks by determining information about the human interaction partner. The system works in real time, is robust, and utilizes emotional feedback to enrich the communication process.
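The decomposition described in the abstract could be sketched as follows. This is not the authors' implementation; it is a minimal illustrative example in which a dialog goal is a conjunction of open predicates, each grounded by one perception modality. All names (`Subtask`, `resolve_order`, the modality labels and readings) are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Subtask:
    """One open predicate of the dialog goal, e.g. drink(X) or person(Y)."""
    predicate: str
    modality: str                # which input channel can ground this predicate
    value: Optional[str] = None  # filled in once the modality reports a result

    @property
    def resolved(self) -> bool:
        return self.value is not None

def resolve_order(subtasks: List[Subtask], readings: Dict[str, str]) -> bool:
    """Ground each open subtask from its modality's latest reading.

    The complex goal counts as concluded once every subtask is resolved.
    """
    for st in subtasks:
        if not st.resolved and st.modality in readings:
            st.value = readings[st.modality]
    return all(st.resolved for st in subtasks)

# Example: an order-and-serve goal serve(Person, Drink), split into two
# simpler subtasks, one per input channel.
goal = [Subtask("person(Y)", modality="vision"),
        Subtask("drink(X)", modality="speech")]

done = resolve_order(goal, {"vision": "guest_1"})       # only a face seen so far
done = resolve_order(goal, {"speech": "orange_juice"})  # the spoken order arrives
```

After the second call `done` is true, mirroring how the paper's dialog concludes a complex task once each modality has supplied the information for its subtask.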

Original language: English
Title of host publication: Social Robotics - Second International Conference on Social Robotics, ICSR 2010, Proceedings
Pages: 1-10
Number of pages: 10
DOIs
State: Published - 2010
Event: 2nd International Conference on Social Robotics, ICSR 2010 - Singapore, Singapore
Duration: 23 Nov 2010 – 24 Nov 2010

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 6414 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 2nd International Conference on Social Robotics, ICSR 2010
Country/Territory: Singapore
City: Singapore
Period: 23/11/10 – 24/11/10
