Eye Tracking Auto-Correction Using Domain Information

Parviz Asghari, Maike Schindler, Achim J. Lilienthal

Publication: Chapter in Book/Report/Conference proceedings › Conference contribution › Peer-reviewed

1 Citation (Scopus)

Abstract

Webcam-based eye tracking (wcET) promises to become a pervasive platform for inexpensive, easy, and quick collection of gaze data without requiring dedicated hardware. To fulfill this promise, wcET must address issues with poor and variable spatial accuracy due to, e.g., participant movement, calibration validity, and the uncertainty of the gaze prediction method used. Eye-tracking (ET) data often suffer in particular from a considerable spatial offset that reduces data quality and heavily affects both qualitative and quantitative ET data analysis. Previous work attempted to mitigate specific sources of spatial offset, e.g., by using chin rests to limit participant movement during ET experiments, by frequent re-calibration, or by incorporating head position and facial features into the gaze prediction algorithm. Yet spatial offset remains an issue for wcET, particularly in daily life settings involving children. It is currently unclear (1) whether the spatial offset can be automatically estimated in the absence of ground truth; and (2) whether the estimated offset can be used to obtain substantially higher data quality. In response to the first research question, we propose a method to estimate the spatial offset using domain information. We estimate the spatial offset by maximizing the correlation of the ET data with Areas of Interest (AOIs) defined over the stimulus. To address the second research question, we developed a wcET system and ran it simultaneously with a commercial remote eye tracker, the Tobii Pro X3-120. After temporal synchronization, we calculated the average distance between the gaze points of the two systems as a measure of data quality. For all tasks investigated, the corrected data improved on the raw data. Specifically, we observed an improvement of 1.35, 1.02, and 0.92 in three tasks with varying characteristics of AOIs. This is an important step towards pervasive use of wcET data with a large variety of practical applications.
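
The abstract describes the auto-correction idea only at a high level; the paper's actual optimization is not reproduced here. The following is a minimal, hypothetical Python sketch of that idea under simplifying assumptions: a single constant (dx, dy) offset, rectangular AOIs, and a brute-force grid search that uses the AOI hit rate of the shifted gaze as a stand-in for the "correlation with AOIs" mentioned in the abstract. All names and parameters (aoi_hit_rate, estimate_offset, search_px, step_px) are illustrative, not the authors' implementation.

```python
# Hypothetical sketch: estimate a constant (dx, dy) spatial offset by grid search,
# choosing the shift that maximizes agreement between gaze samples and AOIs.
# Rectangular AOIs, the search range, and the step size are assumptions.
import numpy as np


def aoi_hit_rate(gaze_xy, aois):
    """Fraction of gaze samples falling inside any rectangular AOI.

    gaze_xy: (N, 2) array of gaze points in stimulus coordinates.
    aois:    list of (x_min, y_min, x_max, y_max) rectangles.
    """
    hits = np.zeros(len(gaze_xy), dtype=bool)
    for x_min, y_min, x_max, y_max in aois:
        inside = ((gaze_xy[:, 0] >= x_min) & (gaze_xy[:, 0] <= x_max) &
                  (gaze_xy[:, 1] >= y_min) & (gaze_xy[:, 1] <= y_max))
        hits |= inside
    return hits.mean()


def estimate_offset(gaze_xy, aois, search_px=100, step_px=5):
    """Return the (dx, dy) shift that maximizes AOI agreement of the shifted gaze."""
    best_shift, best_score = (0.0, 0.0), -1.0
    candidates = np.arange(-search_px, search_px + step_px, step_px)
    for dx in candidates:
        for dy in candidates:
            score = aoi_hit_rate(gaze_xy + np.array([dx, dy]), aois)
            if score > best_score:
                best_score, best_shift = score, (float(dx), float(dy))
    return best_shift, best_score


# Synthetic usage example: gaze data shifted 40 px right / 20 px down relative to
# a single AOI should recover roughly the opposite correction.
rng = np.random.default_rng(0)
true_gaze = rng.uniform([100, 100], [300, 200], size=(500, 2))  # inside the AOI
observed = true_gaze + np.array([40.0, 20.0])                   # offset wcET data
aois = [(100, 100, 300, 200)]
shift, score = estimate_offset(observed, aois)
print(shift, score)  # expected: roughly (-40, -20) with a high hit rate
```

In a real recording the shift recovered this way could then be subtracted from all gaze samples before AOI-based analysis; the paper evaluates such corrections against a Tobii Pro X3-120 reference, which is not modeled in this sketch.
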

Original language: English
Title: Human-Computer Interaction - Thematic Area, HCI 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Proceedings
Editors: Masaaki Kurosu, Ayako Hashizume
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 373-391
Number of pages: 19
ISBN (Print): 9783031355950
DOIs
Publication status: Published - 2023
Event: Thematic Area on Human Computer Interaction, HCI 2023, held as part of the 25th International Conference on Human-Computer Interaction, HCII 2023 - Copenhagen, Denmark
Duration: 23 July 2023 - 28 July 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14011 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Thematic Area on Human Computer Interaction, HCI 2023, held as part of the 25th International Conference on Human-Computer Interaction, HCII 2023
Country/Territory: Denmark
City: Copenhagen
Period: 23/07/23 - 28/07/23
