Real-time visual behaviours for navigating a mobile robot

Gordon Cheng, Alexander Zelinsky

Research output: Contribution to conference › Paper › peer-review

36 Scopus citations

Abstract

In this paper, we present an approach that uses vision as the primary source of sensing to guide a mobile robot through an unknown environment. We define a set of primitive visual behaviours for navigating a mobile robot in real time. By combining these behaviours with a purposive map, our mobile robot exhibits goal-seeking behaviour. We also present a fast segmentation technique for vision processing; this technique is shared by the different behaviours to produce an overall competent behaviour in our Yamabico robot. Experimental results show that the robot navigates competently in dynamic indoor environments.
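The paper itself defines the specific behaviours and the purposive map; as a rough illustration only, the sketch below shows one common way such a behaviour-based controller can be structured: a set of primitive behaviours, each consuming percepts derived from segmented images, arbitrated by fixed priority, with a goal bearing supplied by map-level planning. All names here (Percept, avoid_obstacle, seek_goal, wander, control_step) and the fixed-priority arbitration scheme are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Velocity = Tuple[float, float]  # (linear m/s, angular rad/s) command


@dataclass
class Percept:
    """Quantities a fast image-segmentation stage might produce per frame."""
    obstacle_ahead: bool            # free-floor region blocked in front
    free_space_bearing: float       # bearing (rad) of the widest free corridor
    goal_bearing: Optional[float]   # bearing of the current waypoint, if known


def avoid_obstacle(p: Percept) -> Optional[Velocity]:
    """Primitive behaviour: turn toward free space when the path is blocked."""
    if p.obstacle_ahead:
        return (0.0, 0.8 if p.free_space_bearing >= 0 else -0.8)
    return None  # not applicable; defer to lower-priority behaviours


def seek_goal(p: Percept) -> Optional[Velocity]:
    """Primitive behaviour: steer toward the waypoint given by the map."""
    if p.goal_bearing is not None:
        return (0.3, 0.5 * p.goal_bearing)
    return None


def wander(p: Percept) -> Optional[Velocity]:
    """Fallback behaviour: drive straight until something better applies."""
    return (0.2, 0.0)


# Fixed-priority arbitration: the first applicable behaviour wins.
BEHAVIOURS: List[Callable[[Percept], Optional[Velocity]]] = [
    avoid_obstacle,
    seek_goal,
    wander,
]


def control_step(p: Percept) -> Velocity:
    for behaviour in BEHAVIOURS:
        cmd = behaviour(p)
        if cmd is not None:
            return cmd
    return (0.0, 0.0)  # unreachable while a fallback behaviour is present


if __name__ == "__main__":
    # Clear path, goal visible slightly to the left: goal seeking fires.
    print(control_step(Percept(False, 0.0, 0.2)))   # -> (0.3, 0.1)
    # Obstacle ahead, free space to the right: avoidance overrides the goal.
    print(control_step(Percept(True, -0.4, 0.2)))   # -> (0.0, -0.8)
```

The key property this sketch shares with behaviour-based navigation generally is that the goal-directed and reactive behaviours stay independent: each maps percepts to a motor command (or abstains), and competence emerges from arbitration rather than from a single monolithic planner.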

Original language: English
Pages: 973-980
Number of pages: 8
State: Published - 1996
Externally published: Yes
Event: Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS. Part 3 (of 3) - Osaka, Japan
Duration: 4 Nov 1996 - 8 Nov 1996

Conference

Conference: Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS. Part 3 (of 3)
City: Osaka, Japan
Period: 4/11/96 - 8/11/96
