Towards Autonomous Robotic Assembly: Using Combined Visual and Tactile Sensing for Adaptive Task Execution

Korbinian Nottensteiner, Arne Sachtler, Alin Albu-Schäffer

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Robotic assembly tasks are typically implemented in static settings in which parts are kept at fixed locations by part holders. Very few works address moving parts in industrial assembly applications. However, autonomous robots that can execute assembly tasks in dynamic environments could lead to more flexible facilities with reduced implementation effort for individual products. In this paper, we present a general approach towards autonomous robotic assembly that combines visual and intrinsic tactile sensing to continuously track parts within a single Bayesian framework. On this basis, object-centric assembly skills can be implemented that are guided by the estimated poses of the parts, even when occlusions block the vision system. In particular, we investigate the application of this approach to peg-in-hole assembly. A tilt-and-align strategy is implemented using a Cartesian impedance controller and combined with an adaptive path executor. Experimental results with multiple part combinations are provided and analyzed in detail.
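The abstract names the building blocks (sequential Monte Carlo fusion of visual and tactile cues, impedance-controlled tilt-and-align) without implementation detail. As a hedged illustration only, not the authors' code, the minimal Python sketch below shows how a sequential Monte Carlo (particle) filter might fuse a camera pose measurement with a binary contact cue from intrinsic tactile sensing; the 3-DoF planar pose, all function names, and all noise parameters are assumptions made for this example.

    import numpy as np

    def predict(particles, noise=(0.002, 0.002, 0.01)):
        # Diffuse pose hypotheses (x, y, theta) to model unmodeled
        # part motion between sensor updates. Noise scales are assumed.
        return particles + np.random.normal(0.0, noise, particles.shape)

    def visual_weights(particles, z_vis, sigma=(0.005, 0.005, 0.05)):
        # Gaussian likelihood of a camera pose measurement z_vis.
        d = (particles - np.asarray(z_vis)) / np.asarray(sigma)
        return np.exp(-0.5 * np.sum(d * d, axis=1))

    def tactile_weights(particles, contact, expected_contact):
        # Favor particles whose hypothesized pose predicts the measured
        # contact state (a single boolean here, e.g. derived from
        # joint-torque-based force estimation).
        consistent = expected_contact(particles) == contact
        return np.where(consistent, 1.0, 0.1)

    def smc_step(particles, z_vis, contact, expected_contact):
        # One predict-weight-resample cycle of the fused tracker.
        particles = predict(particles)
        w = tactile_weights(particles, contact, expected_contact)
        if z_vis is not None:  # camera occluded -> drop the visual factor
            w = w * visual_weights(particles, z_vis)
        w = w / w.sum()
        idx = np.random.choice(len(particles), len(particles), p=w)
        return particles[idx]

    # Example: 500 hypotheses around a rough initial pose estimate.
    particles = np.random.normal((0.4, 0.1, 0.0), 0.02, (500, 3))
    particles = smc_step(particles, z_vis=(0.41, 0.09, 0.02),
                         contact=False,
                         expected_contact=lambda p: p[:, 0] > 0.45)

The weighted particle mean would then serve as the estimated part pose guiding the object-centric skill; when the camera view is occluded, the visual factor is simply dropped and the tactile factor alone keeps the estimate consistent with the measured contacts. The tilt-and-align motion itself would run under a Cartesian impedance law of the general form F = K(x_d - x) - D x_dot, with the stiffness K chosen soft enough to tolerate residual pose error.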

Original language: English
Article number: 49
Journal: Journal of Intelligent and Robotic Systems: Theory and Applications
Volume: 101
Issue number: 3
DOIs
State: Published - Mar 2021

Keywords

  • Autonomous assembly
  • Compliant manipulation
  • Future manufacturing
  • Peg-in-hole
  • Sensor fusion
  • Sequential Monte Carlo
