Sim-to-Real Transfer of Robotic Assembly with Visual Inputs Using CycleGAN and Force Control

Chengjie Yuan, Yunlei Shi, Qian Feng, Chunyang Chang, Michael Liu, Zhaopeng Chen, Alois Christian Knoll, Jianwei Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

Recently, deep reinforcement learning (RL) has shown impressive successes in robotic manipulation applications. However, training robots in the real world is nontrivial owing to sample-efficiency and safety concerns. Sim-to-real transfer has been proposed to address these concerns, but it introduces a new issue: the reality gap. In this work, we introduce a sim-to-real learning framework for vision-based assembly tasks that performs training entirely in a simulated environment using inputs from a single camera. We present a domain adaptation method based on cycle-consistent generative adversarial networks (CycleGAN) and a force control transfer approach to bridge the reality gap. We demonstrate that the proposed framework, trained only in simulation, transfers successfully to a real peg-in-hole setup.
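To make the CycleGAN-based domain adaptation mentioned in the abstract concrete, the sketch below shows one generator update combining an LSGAN adversarial loss with the cycle-consistency loss that gives CycleGAN its name. This is a minimal illustration of the standard CycleGAN objective, not the authors' implementation: the tiny networks, the names G_r2s/G_s2r/D_sim/D_real, and the weight lambda_cyc are all placeholder assumptions, since the paper's architectures and hyperparameters are not given here.

```python
import torch
import torch.nn as nn

# Tiny stand-in generators/discriminators (placeholders; the paper's
# actual network architectures are not specified on this page).
def make_generator():
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
    )

def make_discriminator():
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(16, 1, 3, stride=2, padding=1),  # PatchGAN-style logit map
    )

G_r2s, G_s2r = make_generator(), make_generator()    # real->sim, sim->real
D_sim, D_real = make_discriminator(), make_discriminator()

l1, mse = nn.L1Loss(), nn.MSELoss()
opt_G = torch.optim.Adam(
    list(G_r2s.parameters()) + list(G_s2r.parameters()), lr=2e-4
)

def generator_step(real_img, sim_img, lambda_cyc=10.0):
    """One generator update: LSGAN adversarial loss + cycle-consistency loss."""
    fake_sim = G_r2s(real_img)   # translate a real camera image into sim style
    fake_real = G_s2r(sim_img)
    # Adversarial terms: generators try to make discriminators output 1 on fakes.
    pred_sim, pred_real = D_sim(fake_sim), D_real(fake_real)
    loss_adv = (mse(pred_sim, torch.ones_like(pred_sim))
                + mse(pred_real, torch.ones_like(pred_real)))
    # Cycle consistency: real -> sim -> real and sim -> real -> sim
    # should each reconstruct the input image.
    loss_cyc = (l1(G_s2r(fake_sim), real_img)
                + l1(G_r2s(fake_real), sim_img))
    loss = loss_adv + lambda_cyc * loss_cyc
    opt_G.zero_grad()
    loss.backward()
    opt_G.step()
    return loss.item()

# Smoke test with random unpaired images scaled to [-1, 1].
real_batch = torch.rand(2, 3, 64, 64) * 2 - 1
sim_batch = torch.rand(2, 3, 64, 64) * 2 - 1
print(generator_step(real_batch, sim_batch))
```

The cycle-consistency term is what lets the two image domains (real camera frames and rendered simulation frames) be unpaired: no pixel-aligned real/sim image pairs are required, which is what makes this style of domain adaptation practical for sim-to-real transfer.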

Original language: English
Title of host publication: 2022 IEEE International Conference on Robotics and Biomimetics, ROBIO 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1426-1432
Number of pages: 7
ISBN (Electronic): 9781665481090
DOIs
State: Published - 2022
Event: 2022 IEEE International Conference on Robotics and Biomimetics, ROBIO 2022 - Jinghong, China
Duration: 5 Dec 2022 - 9 Dec 2022

Publication series

Name: 2022 IEEE International Conference on Robotics and Biomimetics, ROBIO 2022

Conference

Conference: 2022 IEEE International Conference on Robotics and Biomimetics, ROBIO 2022
Country/Territory: China
City: Jinghong
Period: 5/12/22 - 9/12/22
