TY - GEN
T1 - Exploiting Augmented Reality for Extrinsic Robot Calibration and Eye-based Human-Robot Collaboration
AU - Weber, Daniel
AU - Kasneci, Enkelejda
AU - Zell, Andreas
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - For sensible human-robot interaction, it is crucial for the robot to have an awareness of its physical surroundings. In practical applications, however, environments are manifold and possible objects for interaction are innumerable. Due to this fact, the use of robots in variable situations surrounded by unknown interaction entities is challenging, and the inclusion of pre-trained object-detection neural networks is not always feasible. In this work, we propose deploying augmented reality and eye tracking to make robots more flexible in non-predefined scenarios. To this end, we present and evaluate a method for extrinsic calibration of robot sensors, specifically a camera in our case, that is both fast and user-friendly, achieving competitive accuracy compared to classical approaches. By incorporating human gaze into the robot's segmentation process, we enable the 3D detection and localization of unknown objects without any training. Such an approach can facilitate interaction with objects for which training data is not available. At the same time, a visualization of the resulting 3D bounding boxes in the human's augmented reality provides exceedingly direct feedback, offering insight into the robot's state of knowledge. Our approach thus opens the door to additional interaction possibilities, such as the subsequent initialization of actions like grasping.
KW - augmented reality
KW - eye tracking
KW - human-robot collaboration
KW - object detection
KW - robot calibration
UR - http://www.scopus.com/inward/record.url?scp=85128085167&partnerID=8YFLogxK
U2 - 10.1109/HRI53351.2022.9889538
DO - 10.1109/HRI53351.2022.9889538
M3 - Conference contribution
AN - SCOPUS:85128085167
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 284
EP - 293
BT - HRI 2022 - Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction
PB - IEEE Computer Society
T2 - 17th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2022
Y2 - 7 March 2022 through 10 March 2022
ER -