Dynamic composition of tracking primitives for interactive vision-guided navigation

Darius Burschka, Gregory Hager

Research output: Contribution to journal › Conference article › peer-review

Abstract

We present a system architecture for robust target following with a mobile robot. The system is based on tracking multiple cues in binocular stereo images using the XVision toolkit. Fusion of complementary information in the images, including texture, color and depth, combined with fast, optimized processing, reduces the possibility of losing the tracked object in a dynamic scene with several moving targets on intersecting paths. The presented system is capable of detecting objects obstructing its way as well as gaps. This supports applications in more cluttered terrain, where a wheeled mobile robot cannot take the same path as a walking person. We describe the basic principles of the fast feature extraction and tracking in the luminance, chrominance and disparity domains. The optimized tracking algorithms compensate for illumination variations and perspective distortions, as presented in our previous publications on the XVision system.
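
As a rough illustration of the "dynamic composition" idea described in the abstract, the sketch below shows independent single-cue tracking primitives behind a common interface and a composite tracker that fuses their confidence-weighted position estimates. All class names and the weighted-mean fusion rule are assumptions made for exposition; they are not the XVision API or the exact fusion scheme of the paper.

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// Placeholder for a rectified binocular stereo frame (left/right images).
struct Frame {};

struct Estimate {
    double x = 0.0, y = 0.0;   // target position in image coordinates
    double confidence = 0.0;   // 0 = cue lost, 1 = fully confident
};

// Common interface shared by all tracking primitives (color, texture, disparity).
class TrackingPrimitive {
public:
    virtual ~TrackingPrimitive() = default;
    virtual Estimate update(const Frame& frame) = 0;
};

// Illustrative stand-in for a single-cue tracker; a real primitive would
// search the luminance, chrominance or disparity image around the last
// known target position.
class DummyColorTracker : public TrackingPrimitive {
public:
    Estimate update(const Frame&) override {
        Estimate e;
        e.x = 120.0; e.y = 80.0; e.confidence = 0.9;
        return e;
    }
};

// Composite tracker: primitives can be added at run time, and their
// estimates are fused with a confidence-weighted mean so that a single
// occluded or distracted cue does not dominate the result.
class CompositeTracker {
public:
    void add(std::unique_ptr<TrackingPrimitive> p) {
        primitives_.push_back(std::move(p));
    }

    Estimate update(const Frame& frame) {
        Estimate fused;
        double total = 0.0;
        for (auto& p : primitives_) {
            const Estimate e = p->update(frame);
            fused.x += e.confidence * e.x;
            fused.y += e.confidence * e.y;
            total   += e.confidence;
        }
        if (total > 0.0) {
            fused.x /= total;
            fused.y /= total;
            fused.confidence = total / static_cast<double>(primitives_.size());
        }
        return fused;   // confidence stays 0 if every cue reported loss
    }

private:
    std::vector<std::unique_ptr<TrackingPrimitive>> primitives_;
};

int main() {
    CompositeTracker tracker;
    tracker.add(std::make_unique<DummyColorTracker>());
    // Further cues (texture/SSD, stereo disparity) would be added here.

    Frame frame;
    const Estimate e = tracker.update(frame);
    std::printf("target at (%.1f, %.1f), confidence %.2f\n",
                e.x, e.y, e.confidence);
    return 0;
}
```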

Original language: English
Pages (from-to): 114-125
Number of pages: 12
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 4573
DOIs
State: Published - 2001
Externally published: Yes
Event: Mobile Robots XVI - Newton, MA, United States
Duration: 29 Oct 2001 - 30 Oct 2001

Keywords

  • 3D tracking
  • Color tracking
  • Vision-based navigation
