TY - JOUR
T1 - Navigation and manipulation planning using a visuo-haptic sensor on a mobile platform
AU - Alt, Nicolas
AU - Steinbach, Eckehard
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/11/1
Y1 - 2014/11/1
N2 - Mobile systems interacting with objects in unstructured environments require both haptic and visual sensors to acquire sufficient scene knowledge for tasks such as navigation and manipulation. Typically, separate sensors and processing systems are used for the two modalities. We propose to acquire haptic and visual measurements simultaneously, providing naturally coherent data. For this, compression of a passive, deformable foam rod mounted on the actuator is measured visually by a low-cost camera, yielding a 1-D stress function sampled along the contour of the rod. The same camera observes the nearby scene to detect objects and their reactions to manipulation. The system is passively compliant and the complexity of the sensor subsystems is reduced. Furthermore, we present an integrated approach for navigation and manipulation on mobile platforms, which integrates haptic data from the sensor. A high-level planning graph represents both the structure of a visually acquired map and manipulable obstacles. Paths within this graph represent high-level navigation and manipulation tasks, e.g., pushing of obstacles. A cost-optimal task plan is generated using standard pathfinding techniques. The approach is implemented and validated on a mobile robotic platform. Obtained forces are compared with a reference, showing high accuracy within the medium sensor range. A real-world experiment is presented, which uses the sensor for haptic exploration of obstacles in an office environment. Substantially faster task plans can be found in cluttered scenes compared with purely visual navigation.
AB - Mobile systems interacting with objects in unstructured environments require both haptic and visual sensors to acquire sufficient scene knowledge for tasks such as navigation and manipulation. Typically, separate sensors and processing systems are used for the two modalities. We propose to acquire haptic and visual measurements simultaneously, providing naturally coherent data. For this, compression of a passive, deformable foam rod mounted on the actuator is measured visually by a low-cost camera, yielding a 1-D stress function sampled along the contour of the rod. The same camera observes the nearby scene to detect objects and their reactions to manipulation. The system is passively compliant and the complexity of the sensor subsystems is reduced. Furthermore, we present an integrated approach for navigation and manipulation on mobile platforms, which integrates haptic data from the sensor. A high-level planning graph represents both the structure of a visually acquired map and manipulable obstacles. Paths within this graph represent high-level navigation and manipulation tasks, e.g., pushing of obstacles. A cost-optimal task plan is generated using standard pathfinding techniques. The approach is implemented and validated on a mobile robotic platform. Obtained forces are compared with a reference, showing high accuracy within the medium sensor range. A real-world experiment is presented, which uses the sensor for haptic exploration of obstacles in an office environment. Substantially faster task plans can be found in cluttered scenes compared with purely visual navigation.
KW - Cognitive robotics
KW - motion planning
KW - robot sensing systems
KW - robot vision systems
KW - tactile sensors
UR - http://www.scopus.com/inward/record.url?scp=84908039908&partnerID=8YFLogxK
U2 - 10.1109/TIM.2014.2315734
DO - 10.1109/TIM.2014.2315734
M3 - Article
AN - SCOPUS:84908039908
SN - 0018-9456
VL - 63
SP - 2570
EP - 2582
JO - IEEE Transactions on Instrumentation and Measurement
JF - IEEE Transactions on Instrumentation and Measurement
IS - 11
M1 - 6822616
ER -