TY - GEN
T1 - Passer Kinematic Cues for Object Weight Prediction in a Simulated Robot-Human Handover
AU - Günter, Clara
AU - Figueredo, Luis
AU - Hermsdörfer, Joachim
AU - Franklin, David W.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Object handovers, a seemingly straightforward action, involve a complex interplay of predictive and reactive control mechanisms in both partners. Understanding the cues that humans use to predict object properties is needed for planning natural robot handovers. In human-human interactions, the receiver can extract information from the passer's movement. Here, we show in a VR-simulated agent-human object handover that the human receiver can use passer kinematic cues to predict the transported object's properties, such as weight, and preemptively adapt the grasping strategy accordingly. We show that when the agent's movement is correlated with the object weight, humans can interpret this cue and produce proportional anticipatory grip forces before object release. This adaptation is learned even when objects are presented in a random order and is strengthened with repeated presentation of the pairing. The outcome of this study contributes to a better understanding of non-verbal cues in handover tasks and enables more transparent and efficient real-world physical robot-human interactions.
AB - Object handovers, a seemingly straightforward action, involve a complex interplay of predictive and reactive control mechanisms in both partners. Understanding the cues that humans use to predict object properties is needed for planning natural robot handovers. In human-human interactions, the receiver can extract information from the passer's movement. Here, we show in a VR-simulated agent-human object handover that the human receiver can use passer kinematic cues to predict the transported object's properties, such as weight, and preemptively adapt the grasping strategy accordingly. We show that when the agent's movement is correlated with the object weight, humans can interpret this cue and produce proportional anticipatory grip forces before object release. This adaptation is learned even when objects are presented in a random order and is strengthened with repeated presentation of the pairing. The outcome of this study contributes to a better understanding of non-verbal cues in handover tasks and enables more transparent and efficient real-world physical robot-human interactions.
UR - http://www.scopus.com/inward/record.url?scp=85214461194&partnerID=8YFLogxK
U2 - 10.1109/Humanoids58906.2024.10769881
DO - 10.1109/Humanoids58906.2024.10769881
M3 - Conference contribution
AN - SCOPUS:85214461194
T3 - IEEE-RAS International Conference on Humanoid Robots
SP - 173
EP - 180
BT - 2024 IEEE-RAS 23rd International Conference on Humanoid Robots, Humanoids 2024
PB - IEEE Computer Society
T2 - 23rd IEEE-RAS International Conference on Humanoid Robots, Humanoids 2024
Y2 - 22 November 2024 through 24 November 2024
ER -