TY - GEN
T1 - Eye-Hand Behavior in Human-Robot Shared Manipulation
AU - Aronson, Reuben M.
AU - Santini, Thiago
AU - Kübler, Thomas C.
AU - Kasneci, Enkelejda
AU - Srinivasa, Siddhartha
AU - Admoni, Henny
N1 - Publisher Copyright:
© 2018 ACM.
PY - 2018/2/26
Y1 - 2018/2/26
N2 - Shared autonomy systems enhance people's abilities to perform activities of daily living using robotic manipulators. Recent systems succeed by first identifying their operators' intentions, typically by analyzing the user's joystick input. To enhance this recognition, it is useful to characterize people's behavior while performing such a task. Furthermore, eye gaze is a rich source of information for understanding operator intention. The goal of this paper is to provide novel insights into the dynamics of control behavior and eye gaze in human-robot shared manipulation tasks. To achieve this goal, we conduct a data collection study that uses an eye tracker to record eye gaze during a human-robot shared manipulation activity, both with and without shared autonomy assistance. We process the gaze signals from the study to extract gaze features like saccades, fixations, smooth pursuits, and scan paths. We analyze those features to identify novel patterns of gaze behaviors and highlight where these patterns are similar to and different from previous findings about eye gaze in human-only manipulation tasks. The work described in this paper lays a foundation for a model of natural human eye gaze in human-robot shared manipulation.
KW - Eye gaze
KW - Eye tracking
KW - Human-robot interaction
KW - Nonverbal communication
KW - Shared autonomy
UR - http://www.scopus.com/inward/record.url?scp=85045148532&partnerID=8YFLogxK
U2 - 10.1145/3171221.3171287
DO - 10.1145/3171221.3171287
M3 - Conference contribution
AN - SCOPUS:85045148532
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 4
EP - 13
BT - HRI 2018 - Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
PB - IEEE Computer Society
T2 - 13th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2018
Y2 - 5 March 2018 through 8 March 2018
ER -