Autonomous behavior-based switched top-down and bottom-up visual attention for mobile robots

Tingting Xu, Kolja Kühnlenz, Martin Buss

Publication: Contribution to journal › Article › Peer review

21 citations (Scopus)

Abstract

In this paper, autonomous switching between two basic attention selection mechanisms, i.e., top-down and bottom-up, is proposed. This approach fills a gap in object search using conventional top-down biased bottom-up attention selection, which fails when a group of objects is searched whose appearances cannot be uniquely described by the low-level features used in bottom-up computational models. Three internal robot states, namely observing, operating, and exploring, are included to determine the visual selection behavior. A vision-guided mobile robot equipped with an active stereo camera is used to demonstrate our strategy and to evaluate its performance experimentally. This approach facilitates adaptation of the visual behavior to different internal robot states and benefits further development toward cognitive visual perception in the robotics domain.
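The abstract describes a switch between attention mechanisms driven by the robot's internal state. The sketch below is a minimal illustration of that idea only, under assumptions of mine: the state-to-mechanism mapping, the function names, and the placeholder saliency and target-bias computations are hypothetical and not the model from the paper.

```python
from enum import Enum, auto

import numpy as np


class RobotState(Enum):
    """Internal robot states named in the abstract."""
    OBSERVING = auto()
    OPERATING = auto()
    EXPLORING = auto()


def bottom_up_saliency(image: np.ndarray) -> np.ndarray:
    """Placeholder bottom-up map: deviation from the mean intensity,
    standing in for a proper computational saliency model."""
    gray = image.astype(float).mean(axis=2)
    contrast = np.abs(gray - gray.mean())
    return contrast / (contrast.max() + 1e-9)


def top_down_bias(image: np.ndarray, target_color: np.ndarray) -> np.ndarray:
    """Placeholder top-down map: similarity to a known target color."""
    dist = np.linalg.norm(image.astype(float) - target_color, axis=2)
    return 1.0 - dist / (dist.max() + 1e-9)


def select_attention_map(state: RobotState,
                         image: np.ndarray,
                         target_color: np.ndarray) -> np.ndarray:
    """Switch the attention mechanism according to the internal state.

    Hypothetical mapping: exploring -> pure bottom-up, operating ->
    top-down (task-driven search), observing -> an equal-weight blend.
    """
    if state is RobotState.EXPLORING:
        return bottom_up_saliency(image)
    if state is RobotState.OPERATING:
        return top_down_bias(image, target_color)
    return 0.5 * bottom_up_saliency(image) + 0.5 * top_down_bias(image, target_color)


if __name__ == "__main__":
    # Toy usage: pick the most attended pixel in a random image.
    img = np.random.randint(0, 256, (48, 64, 3), dtype=np.uint8)
    attention = select_attention_map(RobotState.OPERATING, img,
                                     target_color=np.array([255.0, 0.0, 0.0]))
    print("most attended pixel:", np.unravel_index(attention.argmax(), attention.shape))
```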

Original language: English
Article number: 5557826
Pages (from-to): 947-954
Number of pages: 8
Journal: IEEE Transactions on Robotics
Volume: 26
Issue number: 5
DOIs
Publication status: Published - Oct. 2010
