TY - GEN
T1 - [POSTER] Natural user interface for ambient objects
AU - Ma, Meng
AU - Merckx, Kevin
AU - Fallavollita, Pascal
AU - Navab, Nassir
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/11/11
Y1 - 2015/11/11
N2 - To help a computing device understand the spatial relationship between the user's gestures and ambient objects, a methodology is proposed to find the user's virtual eye center in the wearable camera coordinate system and then accurately calculate where the user is pointing in order to perform natural interaction. First, a wearable RGB-D sensor is affixed to the user's forehead. A tool-free calibration is performed by having the user move a fingertip along the lines of sight from the eye center to randomly selected targets. The fingertips are detected in the depth camera, and the intersection of these lines of sight is calculated. We then present how to determine where the user is pointing in different scenarios: with a depth map, a detected object, and a controlled virtual element. To validate our methods, we perform a point-to-screen experiment. Results demonstrate that when a user interacts with a display up to 1.5 meters away, our natural gesture interface has an average error of 2.1 cm. In conclusion, the presented technique is a viable option for reliable user interaction.
AB - To help a computing device understand the spatial relationship between the user's gestures and ambient objects, a methodology is proposed to find the user's virtual eye center in the wearable camera coordinate system and then accurately calculate where the user is pointing in order to perform natural interaction. First, a wearable RGB-D sensor is affixed to the user's forehead. A tool-free calibration is performed by having the user move a fingertip along the lines of sight from the eye center to randomly selected targets. The fingertips are detected in the depth camera, and the intersection of these lines of sight is calculated. We then present how to determine where the user is pointing in different scenarios: with a depth map, a detected object, and a controlled virtual element. To validate our methods, we perform a point-to-screen experiment. Results demonstrate that when a user interacts with a display up to 1.5 meters away, our natural gesture interface has an average error of 2.1 cm. In conclusion, the presented technique is a viable option for reliable user interaction.
KW - H.1.2 [Human-centered computing]
KW - Human computer interaction (HCI) - Life cycle
KW - Interaction design process and methods - Interaction design
UR - https://www.scopus.com/pages/publications/84962264943
U2 - 10.1109/ISMAR.2015.25
DO - 10.1109/ISMAR.2015.25
M3 - Conference contribution
AN - SCOPUS:84962264943
T3 - Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2015
SP - 76
EP - 79
BT - Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2015
A2 - Sakata, Nobuchika
A2 - Newcombe, Richard
A2 - Lindeman, Robert
A2 - Sandor, Christian
A2 - Mayol-Cuevas, Walterio
A2 - Teichrieb, Veronica
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 14th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2015
Y2 - 29 September 2015 through 3 October 2015
ER -