TY - GEN
T1 - Leveraging Motion Tracking for Intuitive Interactions in a Tablet-Based 3D Scene Annotation System
AU - Song, Tianyu
AU - Eck, Ulrich
AU - Navab, Nassir
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - In the rapidly evolving field of computer vision, efficient and accurate annotation of 3D scenes plays a crucial role. While automation has streamlined this process, manual intervention is still essential for obtaining precise annotations. Existing annotation tools often lack intuitive interactions and efficient interfaces, particularly when annotating complex elements such as 3D bounding boxes, 6D human poses, and semantic relationships in a 3D scene; as a result, annotation is often time-consuming and error-prone. Emerging technologies such as augmented reality (AR) and virtual reality (VR) have shown potential to provide an immersive and interactive environment for annotators to label objects and their relationships. However, the cost and accessibility of these technologies can be a barrier to their widespread adoption. This work introduces a novel tablet-based system that utilizes built-in motion tracking to facilitate an efficient and intuitive 3D scene annotation process. The system supports a variety of annotation tasks and leverages the tracking and mobility features of the tablet to enhance user interactions. Through a thorough user study covering three distinct tasks (creating bounding boxes, adjusting human poses, and annotating scene relationships), we evaluate the effectiveness and usability of two interaction methods: touch-based interactions and hybrid interactions that combine touch with device motion tracking. Our results suggest that leveraging the tablet's motion tracking feature could lead to more intuitive and efficient annotation processes. This work contributes to the understanding of tablet-based interaction and its potential for annotating complex 3D scenes.
KW - Empirical studies in interaction design
KW - Human-centered computing
KW - Interaction design
KW - Interaction design process and methods
KW - Ubiquitous and mobile computing design and evaluation methods
UR - http://www.scopus.com/inward/record.url?scp=85180370808&partnerID=8YFLogxK
DO - 10.1109/ISMAR59233.2023.00071
M3 - Conference contribution
AN - SCOPUS:85180370808
T3 - Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023
SP - 563
EP - 572
BT - Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023
A2 - Bruder, Gerd
A2 - Olivier, Anne-Helene
A2 - Cunningham, Andrew
A2 - Peng, Evan Yifan
A2 - Grubert, Jens
A2 - Williams, Ian
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 22nd IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023
Y2 - 16 October 2023 through 20 October 2023
ER -