Towards Autonomous Robotic Assembly: Using Combined Visual and Tactile Sensing for Adaptive Task Execution

Korbinian Nottensteiner, Arne Sachtler, Alin Albu-Schäffer

Publication: Contribution to journal › Article › Peer-reviewed

17 citations (Scopus)

Abstract

Robotic assembly tasks are typically implemented in static settings in which parts are kept at fixed locations by making use of part holders. Very few works deal with the problem of moving parts in industrial assembly applications. However, having autonomous robots that are able to execute assembly tasks in dynamic environments could lead to more flexible facilities with reduced implementation efforts for individual products. In this paper, we present a general approach towards autonomous robotic assembly that combines visual and intrinsic tactile sensing to continuously track parts within a single Bayesian framework. Based on this, it is possible to implement object-centric assembly skills that are guided by the estimated poses of the parts, including cases where occlusions block the vision system. In particular, we investigate the application of this approach for peg-in-hole assembly. A tilt-and-align strategy is implemented using a Cartesian impedance controller, and combined with an adaptive path executor. Experimental results with multiple part combinations are provided and analyzed in detail.
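To illustrate the kind of Bayesian fusion the abstract describes, the sketch below shows a standard Gaussian (Kalman-style) update combining a visual and a tactile measurement of a single pose coordinate. This is a minimal, hypothetical example for intuition only: the function name, measurement values, and variances are invented here, and the paper's framework tracks full part poses rather than a scalar.

```python
def fuse_gaussian(mu_prior, var_prior, z, var_z):
    """Bayesian update of a scalar Gaussian belief (mu_prior, var_prior)
    with one noisy measurement z of variance var_z."""
    k = var_prior / (var_prior + var_z)      # Kalman gain: trust in z
    mu_post = mu_prior + k * (z - mu_prior)  # shift belief toward z
    var_post = (1.0 - k) * var_prior         # uncertainty shrinks
    return mu_post, var_post

# Belief over one coordinate of a part's position (values illustrative).
mu, var = 0.0, 1.0                                    # broad prior [m]
mu, var = fuse_gaussian(mu, var, z=0.10, var_z=0.01)  # precise visual cue
mu, var = fuse_gaussian(mu, var, z=0.12, var_z=0.04)  # noisier tactile cue
print(round(mu, 4), round(var, 4))  # → 0.1032 0.0079
```

Note how the fused estimate weights each cue by its precision: the more accurate visual measurement dominates, while the tactile measurement still refines the belief, which is what allows tracking to continue when occlusions degrade or remove the visual channel.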

Original language: English
Article number: 49
Journal: Journal of Intelligent and Robotic Systems: Theory and Applications
Volume: 101
Issue number: 3
DOIs
Publication status: Published - March 2021
