A Graphical Model for unifying tracking and classification within a multimodal Human-Robot Interaction scenario

Tobias Rehrl, Jürgen Gast, Nikolaus Theißing, Alexander Bannat, Dejan Arsić, Frank Wallhoff, Gerhard Rigoll, Christoph Mayer, Bernd Radig

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

This paper introduces our research platform for enabling a multimodal Human-Robot Interaction scenario, as well as our research vision of approaching the underlying problems holistically. The main focus of this paper, however, lies on the image processing domain, where this vision is realized by combining particle tracking and Dynamic Bayesian Network classification in a unified Graphical Model. This combination enhances the tracking process with an adaptive motion model, realized via a Dynamic Bayesian Network that models several motion classes. The Graphical Model thus integrates the classification step directly into the tracking process. First promising results show the potential of this approach.
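The paper's core idea of switching a tracker's motion model via a discrete motion-class variable can be illustrated with a minimal sketch. This is not the authors' implementation: the two motion classes ("static" and "constant velocity"), the transition matrix, and all noise parameters are hypothetical, and the full DBN is reduced to a per-particle Markov chain over motion classes that selects the dynamics used for propagation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PARTICLES = 500
# Hypothetical motion classes: 0 = "static", 1 = "constant velocity".
# TRANSITION[i, j] = P(class_t = j | class_{t-1} = i) -- the DBN-style
# temporal link between consecutive class variables.
TRANSITION = np.array([[0.9, 0.1],
                       [0.1, 0.9]])

def propagate(pos, vel, cls):
    """Sample the next motion class per particle, then move each
    particle according to its (new) class-dependent dynamics."""
    new_cls = np.array([rng.choice(2, p=TRANSITION[c]) for c in cls])
    step = np.where(new_cls == 1, vel, 0.0)  # class picks the motion model
    new_pos = pos + step + rng.normal(0.0, 0.1, pos.shape)
    return new_pos, new_cls

def reweight(pos, observation, sigma=0.5):
    """Gaussian likelihood of a 1-D observation under each particle."""
    w = np.exp(-0.5 * ((pos - observation) / sigma) ** 2)
    return w / w.sum()

def resample(pos, cls, weights):
    """Multinomial resampling; class labels follow their particles,
    so the observation indirectly classifies the motion."""
    idx = rng.choice(len(pos), size=len(pos), p=weights)
    return pos[idx], cls[idx]

# Track a target moving at +1.0 per step; as evidence accumulates,
# the surviving particles should favour the "constant velocity" class.
pos = rng.normal(0.0, 1.0, N_PARTICLES)
cls = rng.choice(2, N_PARTICLES)
for t in range(1, 20):
    pos, cls = propagate(pos, vel=1.0, cls=cls)
    weights = reweight(pos, observation=float(t))
    pos, cls = resample(pos, cls, weights)

estimate = pos.mean()                    # tracked position
moving_fraction = (cls == 1).mean()      # inferred motion class
```

The key design point mirrored from the abstract is that classification is not a separate post-processing step: the class variable lives inside the tracker's state, so the same resampling that refines the position estimate also sharpens the motion-class belief.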

Original language: English
Title of host publication: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010
Pages: 17-23
Number of pages: 7
DOIs
State: Published - 2010
Event: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010 - San Francisco, CA, United States
Duration: 13 Jun 2010 - 18 Jun 2010

Publication series

Name: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010

Conference

Conference: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010
Country/Territory: United States
City: San Francisco, CA
Period: 13/06/10 - 18/06/10

