Fusion of multi-modal sensors in a voxel occupancy grid for tracking and behaviour analysis

Martin Hofmann, Moritz Kaiser, Hadi Aliakbarpour, Gerhard Rigoll

Research output: Contribution to journal › Conference article › peer-review

10 Scopus citations

Abstract

In this paper, we present a multi-modal fusion scheme for tracking and behaviour analysis in Smart Home environments. It is applied to tracking multiple people and detecting their behaviour. To this end, information from multiple heterogeneous sensors (visual colour sensors, thermal sensors, infrared sensors and photonic mixer devices) is combined in a common 3D voxel occupancy grid. Graph cuts are used for data fusion and to accurately reconstruct the people in the scene. A Viterbi tracking framework is then applied to track all people and simultaneously determine their behaviour. We evaluate the proposed fusion scheme on the PROMETHEUS Smart Home database and show the impact of the different sensors and modalities on the final results.
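To illustrate the core idea of accumulating evidence from heterogeneous sensors in a shared 3D voxel grid, the sketch below fuses per-sensor occupancy probabilities by summing log-odds per voxel. This is a simple independence-assuming baseline, not the graph-cut fusion described in the paper; the grid size, sensor probabilities, and threshold are illustrative assumptions.

```python
import numpy as np

GRID = (4, 4, 4)  # illustrative (x, y, z) voxel grid, far smaller than a real room

def log_odds(p):
    """Convert a probability to its log-odds representation."""
    return np.log(p / (1.0 - p))

def fuse(sensor_probs, prior=0.5, threshold=0.0):
    """Fuse per-sensor occupancy probabilities into one boolean voxel grid.

    sensor_probs: list of arrays of shape GRID, each giving one sensor's
    probability that each voxel is occupied. Evidence is combined additively
    in log-odds space (i.e. assuming conditionally independent sensors).
    """
    fused = np.full(GRID, log_odds(prior))
    for probs in sensor_probs:
        # Clip to avoid infinite log-odds at exactly 0 or 1.
        fused += log_odds(np.clip(probs, 1e-6, 1 - 1e-6)) - log_odds(prior)
    return fused > threshold

# Two hypothetical sensors (e.g. a colour camera and a thermal sensor)
# agree that a person occupies voxel (1, 1, 1):
cam = np.full(GRID, 0.3)
cam[1, 1, 1] = 0.9
thermal = np.full(GRID, 0.4)
thermal[1, 1, 1] = 0.8
occupied = fuse([cam, thermal])  # True only where joint evidence is positive
```

Only the voxel both sensors support ends up marked occupied; everywhere else the weak evidence cancels below the threshold.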

Original language: English
Journal: International Workshop on Image Analysis for Multimedia Interactive Services
State: Published - 2011
Event: 12th International Workshop on Image Analysis for Multimedia Interactive Services, WIAMIS 2011 - Delft, Netherlands
Duration: 13 Apr 2011 - 15 Apr 2011
