Did I get it right: Head gestures analysis for human-machine interactions

Jürgen Gast, Alexander Bannat, Tobias Rehrl, Gerhard Rigoll, Frank Wallhoff, Christoph Mayer, Bernd Radig

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

This paper presents a system for an additional input modality in a multimodal human-machine interaction scenario. In addition to other common input modalities, e.g. speech, we extract head gestures by image interpretation techniques based on machine learning algorithms to provide a nonverbal and familiar way of interacting with the system. Our experimental evaluation proves that the presented approach works reliably and in real time.
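The abstract does not disclose the exact recognition algorithm, so the following Python sketch is an illustration only, not the authors' method: it approximates real-time head-nod/head-shake detection with an off-the-shelf OpenCV Haar-cascade face tracker and simple motion statistics over a sliding window of face-center positions. The `window` and `min_amplitude` parameters and the nod/shake heuristic are assumptions introduced for this example.

```python
import cv2
import numpy as np
from collections import deque

# Hypothetical sketch: approximates head-gesture detection with a
# Haar-cascade face tracker; the paper's actual machine-learning
# pipeline is not reproduced here.
CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_gesture(centers, min_amplitude=10.0):
    """Classify recent face-center motion as 'nod', 'shake', or None."""
    if len(centers) < centers.maxlen:
        return None                      # not enough history yet
    pts = np.array(centers, dtype=np.float32)
    dx = pts[:, 0].std()                 # horizontal spread -> head shake
    dy = pts[:, 1].std()                 # vertical spread   -> head nod
    if max(dx, dy) < min_amplitude:      # too little motion: no gesture
        return None
    return "shake" if dx > dy else "nod"

def run(camera_index=0, window=15):
    centers = deque(maxlen=window)       # sliding window of face centers
    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = CASCADE.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            centers.append((x + w / 2.0, y + h / 2.0))
            gesture = classify_gesture(centers)
            if gesture:
                print("detected:", gesture)  # e.g. confirm/decline input
                centers.clear()              # avoid double-triggering
        cv2.imshow("head gestures", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run()
```

Comparing horizontal against vertical variance is a deliberately cheap heuristic: it keeps the per-frame cost at one cascade pass plus a few array operations, which is consistent with the real-time requirement the abstract emphasizes.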

Original language: English
Title of host publication: Human-Computer Interaction
Subtitle of host publication: Novel Interaction Methods and Techniques - 13th International Conference, HCI International 2009, Proceedings
Pages: 170-177
Number of pages: 8
Edition: PART 2
State: Published - 2009
Event: 13th International Conference on Human-Computer Interaction, HCI International 2009 - San Diego, CA, United States
Duration: 19 Jul 2009 - 24 Jul 2009

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 5611 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th International Conference on Human-Computer Interaction, HCI International 2009
Country/Territory: United States
City: San Diego, CA
Period: 19/07/09 - 24/07/09
