TY - GEN
T1 - Vision-driven Collaborative Mobile Robotic Human Assistant System for Daily Living Activities
AU - Wu, Yuankai
AU - Messaoud, Rayene
AU - Chen, Xiao
AU - Hildebrandt, Arne Christoph
AU - Baldini, Marco
AU - Patsch, Constantin
AU - Sadeghian, Hamid
AU - Haddadin, Sami
AU - Steinbach, Eckehard
N1 - Publisher Copyright:
Copyright © 2023 The Authors. This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
PY - 2023/7/1
Y1 - 2023/7/1
N2 - Assistive robotics is a rapidly growing research area, with increasing demand in industry, medical services, and even domestic environments. Many assistance systems have been developed in recent decades, mainly to help with industrial tasks. However, the majority of assistive robotic systems can only execute commands according to pre-defined instructions and are not capable of social interaction with humans. To this end, a human-centered assistive robotic system that can automatically assist humans, or fully complete tasks on their behalf, is desirable. In this paper, we first propose a novel triplet concept that describes the task intended by a human but executed by the assistive robot. The triplet comprises the target object, subject, and action, enabling the robot to identify auxiliary subjects, locate desired objects, and perform the required actions. Furthermore, a vision-driven mobile robotic human assistant system for assisting humans in daily life tasks is developed. The proposed triplet is shown to serve as a predefined input guide that completes a fully autonomous robotic system. Sensor calibration, deep-learning-based object detection, manipulation, and grasping techniques are used to achieve this goal. We evaluated the proposed system in various simulation and experimental scenarios and confirmed its effectiveness.
AB - Assistive robotics is a rapidly growing research area, with increasing demand in industry, medical services, and even domestic environments. Many assistance systems have been developed in recent decades, mainly to help with industrial tasks. However, the majority of assistive robotic systems can only execute commands according to pre-defined instructions and are not capable of social interaction with humans. To this end, a human-centered assistive robotic system that can automatically assist humans, or fully complete tasks on their behalf, is desirable. In this paper, we first propose a novel triplet concept that describes the task intended by a human but executed by the assistive robot. The triplet comprises the target object, subject, and action, enabling the robot to identify auxiliary subjects, locate desired objects, and perform the required actions. Furthermore, a vision-driven mobile robotic human assistant system for assisting humans in daily life tasks is developed. The proposed triplet is shown to serve as a predefined input guide that completes a fully autonomous robotic system. Sensor calibration, deep-learning-based object detection, manipulation, and grasping techniques are used to achieve this goal. We evaluated the proposed system in various simulation and experimental scenarios and confirmed its effectiveness.
KW - Assistive robot
KW - Autonomous robotic systems
KW - Human-centred automation and design
KW - Perception and sensing
UR - http://www.scopus.com/inward/record.url?scp=85182524214&partnerID=8YFLogxK
U2 - 10.1016/j.ifacol.2023.10.1824
DO - 10.1016/j.ifacol.2023.10.1824
M3 - Conference contribution
AN - SCOPUS:85182524214
T3 - IFAC-PapersOnLine
SP - 4400
EP - 4405
BT - IFAC-PapersOnLine
A2 - Ishii, Hideaki
A2 - Ebihara, Yoshio
A2 - Imura, Jun-ichi
A2 - Yamakita, Masaki
PB - Elsevier B.V.
T2 - 22nd IFAC World Congress
Y2 - 9 July 2023 through 14 July 2023
ER -