TY - JOUR
T1 - Fusing joint measurements and visual features for in-hand object pose estimation
AU - Pfanne, Martin
AU - Chalon, Maxime
AU - Stulp, Freek
AU - Albu-Schäffer, Alin
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/10
Y1 - 2018/10
N2 - For a robot to perform complex manipulation tasks, such as in-hand manipulation, knowledge about the state of the grasp is required at all times. Moreover, even simple pick-and-place tasks may fail because unexpected motions of the object during the grasp are not accounted for. This letter proposes an approach that estimates the grasp state by combining finger measurements, i.e., joint positions and torques, with visual features that are extracted from monocular camera images. The different sensor modalities are fused using an extended Kalman filter. While the finger measurements make it possible to detect contacts and resolve collisions between the fingers and the estimated object, the visual features are used to align the object with the camera view. Experiments with the DLR robot David demonstrate the wide range of objects and manipulation scenarios that the method can be applied to. They also provide insight into the strengths and limitations of the different complementary types of measurements.
AB - For a robot to perform complex manipulation tasks, such as in-hand manipulation, knowledge about the state of the grasp is required at all times. Moreover, even simple pick-and-place tasks may fail because unexpected motions of the object during the grasp are not accounted for. This letter proposes an approach that estimates the grasp state by combining finger measurements, i.e., joint positions and torques, with visual features that are extracted from monocular camera images. The different sensor modalities are fused using an extended Kalman filter. While the finger measurements make it possible to detect contacts and resolve collisions between the fingers and the estimated object, the visual features are used to align the object with the camera view. Experiments with the DLR robot David demonstrate the wide range of objects and manipulation scenarios that the method can be applied to. They also provide insight into the strengths and limitations of the different complementary types of measurements.
KW - Perception for grasping and manipulation
KW - dexterous manipulation
KW - sensor fusion
UR - http://www.scopus.com/inward/record.url?scp=85060806191&partnerID=8YFLogxK
U2 - 10.1109/LRA.2018.2853652
DO - 10.1109/LRA.2018.2853652
M3 - Article
AN - SCOPUS:85060806191
SN - 2377-3766
VL - 3
SP - 3497
EP - 3504
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 4
ER -