TY - GEN
T1 - RetroDepth: 3D Silhouette Sensing for High-Precision Input On and Above Physical Surfaces
T2 - 32nd Annual ACM Conference on Human Factors in Computing Systems, CHI 2014
AU - Kim, David
AU - Izadi, Shahram
AU - Dostal, Jakub
AU - Rhemann, Christoph
AU - Keskin, Cem
AU - Zach, Christopher
AU - Shotton, Jamie
AU - Large, Tim
AU - Bathiche, Steven
AU - Nießner, Matthias
AU - Butler, D. Alex
AU - Fanello, Sean
AU - Pradeep, Vivek
PY - 2014
Y1 - 2014
N2 - We present RetroDepth, a new vision-based system for accurately sensing the 3D silhouettes of hands, styluses, and other objects as they interact on and above physical surfaces. Our setup is simple, cheap, and easily reproducible, comprising two infrared cameras, diffuse infrared LEDs, and any off-the-shelf retro-reflective material. The retro-reflector aids image segmentation, creating a strong contrast between the surface and any object in proximity. A new, highly efficient stereo matching algorithm precisely estimates the 3D contours of interacting objects and the retro-reflective surfaces. A novel pipeline enables 3D finger, hand, and object tracking, as well as gesture recognition, purely using these 3D contours. We demonstrate high-precision sensing, allowing robust disambiguation between a finger or stylus touching, pressing, or interacting above the surface. This allows many interactive scenarios that seamlessly mix freehand 3D interactions with touch, pressure, and stylus input. As shown, these rich modalities of input are enabled on and above any retro-reflective surface, including custom "physical widgets" fabricated by users. We compare our system with Kinect and Leap Motion, and conclude with limitations and future work.
KW - 3D contours
KW - 3D input
KW - Contour classification
KW - Depth sensing
KW - NUI
KW - Stereo matching
KW - Stylus
KW - Touch
KW - Vision-based UIs
UR - http://www.scopus.com/inward/record.url?scp=84900401228&partnerID=8YFLogxK
U2 - 10.1145/2556288.2557336
DO - 10.1145/2556288.2557336
M3 - Conference contribution
AN - SCOPUS:84900401228
SN - 9781450324731
T3 - Conference on Human Factors in Computing Systems - Proceedings
SP - 1377
EP - 1386
BT - CHI 2014
PB - Association for Computing Machinery
Y2 - 26 April 2014 through 1 May 2014
ER -