TY - JOUR
T1 - Anthropomorphic Grasping with Neural Object Shape Completion
AU - Hidalgo-Carvajal, Diego
AU - Chen, Hanzhi
AU - Bettelani, Gemma C.
AU - Jung, Jaesug
AU - Busse, Laura
AU - Naceri, Abdeldjallil
AU - Leutenegger, Stefan
AU - Haddadin, Sami
AU - Zavaglia, Melissa
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/12/1
Y1 - 2023/12/1
N2 - The growing prevalence of robots in human-suited environments has given rise to a myriad of object manipulation techniques, in which dexterity plays a paramount role. It is well established that humans exhibit extraordinary dexterity when handling objects. This dexterity appears to derive from a robust understanding of object properties (such as weight, size, and shape), as well as a remarkable capacity to interact with them. Hand postures commonly reflect the influence of the specific object regions to be grasped, especially when objects are only partially visible. In this work, we leverage human-like object understanding by reconstructing and completing the full geometry of objects from partial observations, and manipulating them with a 7-DoF anthropomorphic robot hand. Our approach improves the grasping success rate by nearly 30% over baselines that rely only on partial reconstructions, and achieves over 150 successful grasps across three object categories. This demonstrates our approach's consistent ability to predict and execute grasping postures based on completed object shapes from various directions and positions in real-world scenarios. Our work opens up new possibilities for robotic applications that require precise grasping and manipulation of real-world reconstructed objects.
KW - Deep learning in grasping and manipulation
KW - dexterous manipulation
KW - grasping
KW - multifingered hands
UR - http://www.scopus.com/inward/record.url?scp=85174809185&partnerID=8YFLogxK
U2 - 10.1109/LRA.2023.3322086
DO - 10.1109/LRA.2023.3322086
M3 - Article
AN - SCOPUS:85174809185
SN - 2377-3766
VL - 8
SP - 8034
EP - 8041
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 12
ER -