TY - GEN
T1 - Sticky projections - A new approach to interactive shader lamp tracking
AU - Resch, Christoph
AU - Keitler, Peter
AU - Klinker, Gudrun
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/11/5
Y1 - 2014/11/5
AB - Shader lamps can augment physical objects with projected virtual replications using a camera-projector system, provided that the physical and virtual objects are well registered. Precise registration and tracking have been a cumbersome and intrusive process in the past. In this paper, we present a new method for tracking arbitrarily shaped physical objects interactively. In contrast to previous approaches, our system is mobile and relies solely on the projection of the virtual replication to track the physical object and 'stick' the projection to it. Our method consists of two stages: a fast pose initialization based on structured light patterns and non-intrusive frame-by-frame tracking based on features detected in the projection. In the initialization phase, a dense point cloud of the physical object is reconstructed and precisely matched to the virtual model to perfectly overlay the projection. During the tracking phase, a radiometrically corrected virtual camera view based on the current pose prediction is rendered and compared to the captured image. Matched features are triangulated, providing a sparse set of surface points that is robustly aligned to the virtual model. The alignment transformation serves as an input for the new pose prediction. Quantitative experiments show that our approach can robustly track complex objects at interactive rates.
UR - http://www.scopus.com/inward/record.url?scp=84943806822&partnerID=8YFLogxK
U2 - 10.1109/ISMAR.2014.6948421
DO - 10.1109/ISMAR.2014.6948421
M3 - Conference contribution
AN - SCOPUS:84943806822
T3 - ISMAR 2014 - IEEE International Symposium on Mixed and Augmented Reality - Science and Technology 2014, Proceedings
SP - 151
EP - 156
BT - ISMAR 2014 - IEEE International Symposium on Mixed and Augmented Reality - Science and Technology 2014, Proceedings
A2 - Lindeman, Robert W.
A2 - Sandor, Christian
A2 - Julier, Simon
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2014
Y2 - 10 September 2014 through 12 September 2014
ER -