Leveraging Motion Tracking for Intuitive Interactions in a Tablet-Based 3D Scene Annotation System

Tianyu Song, Ulrich Eck, Nassir Navab

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In the rapidly evolving field of computer vision, efficient and accurate annotation of 3D scenes plays a crucial role. While automation has streamlined this process, manual intervention is still essential for obtaining precise annotations. Existing annotation tools often lack intuitive interactions and efficient interfaces, particularly when it comes to annotating complex elements such as 3D bounding boxes, 6D human poses, and semantic relationships in a 3D scene. As a result, annotation is often time-consuming and error-prone. Emerging technologies such as augmented reality (AR) and virtual reality (VR) have shown potential to provide an immersive and interactive environment for annotators to label objects and their relationships. However, the cost and accessibility of these technologies can be a barrier to their widespread adoption. This work introduces a novel tablet-based system that utilizes built-in motion tracking to facilitate an efficient and intuitive 3D scene annotation process. The system supports a variety of annotation tasks and leverages the tracking and mobility features of the tablet to enhance user interactions. Through a thorough user study investigating three distinct tasks - creating bounding boxes, adjusting human poses, and annotating scene relationships - we evaluate the effectiveness and usability of two interaction methods: touch-based interactions and hybrid interactions that utilize both touch and device motion tracking. Our results suggest that leveraging the tablet's motion tracking feature could lead to more intuitive and efficient annotation processes. This work contributes to the understanding of tablet-based interaction and the potential it holds for annotating complex 3D scenes.

Original language: English
Title of host publication: Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023
Editors: Gerd Bruder, Anne-Helene Olivier, Andrew Cunningham, Evan Yifan Peng, Jens Grubert, Ian Williams
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 563-572
Number of pages: 10
ISBN (Electronic): 9798350328387
State: Published - 2023
Event: 22nd IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023 - Sydney, Australia
Duration: 16 Oct 2023 - 20 Oct 2023

Publication series

Name: Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023

Conference

Conference: 22nd IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2023
Country/Territory: Australia
City: Sydney
Period: 16/10/23 - 20/10/23

Keywords

  • Empirical studies in interaction design
  • Human-centered computing
  • Interaction design
  • Interaction design process and methods
  • Ubiquitous and mobile computing design and evaluation methods
