TY - GEN
T1 - Can Eye Tracking with Pervasive Webcams Replace Dedicated Eye Trackers? An Experimental Comparison of Eye-Tracking Performance
AU - Asghari, Parviz
AU - Schindler, Maike
AU - Lilienthal, Achim J.
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
N2 - Eye tracking (ET) is increasingly used in cognitive science and human-computer interaction research. Currently, however, expensive ET devices are required, which limits their pervasive application, for example, as a tool for digital learning. The emergence of low-cost, Artificial Intelligence (AI)-based, consumer-grade webcam eye tracking, partially already available as open-source software, promises to change this situation. It is currently unclear (1) what performance in terms of, e.g., tracking accuracy, calibration stability, and real-time sampling rate stability can be achieved with webcam ET (wcET) in comparison to expensive dedicated hardware eye tracking (dhET); and (2) how the expected performance degradation affects ET-based applications. In this work, we address the first question and present a wcET system that we developed based on open-source code and publicly available data sets. We ran this system with a consumer-grade Logitech BRIO Ultra HD Pro webcam, which provides a stream of HD images at 60 Hz, simultaneously with a commercial remote eye tracker, the Tobii Pro X3-120, which delivers tracked gaze points at a rate of 120 Hz. Based on recordings of 20 participants (age: 27.0 ± 4.5 years), we assessed the data quality in terms of accuracy, precision, and sampling rate stability. The observed performance of the wcET system (accuracy: 2.5° ± 0.7°, precision: 0.3° ± 0.3°) is, as expected, worse than the performance of the dhET system (accuracy: 0.9° ± 0.9°, precision: 0.7° ± 0.8°), yet it is sufficient for many applications, especially if the stimulus can be designed and adapted to the gaze-tracking quality accordingly. Running the wcET system in real time, we obtained a sampling rate of 26.3 ± 1.03 Hz, i.e., nearly the frequency with which the camera provides images.
KW - Eye tracking
KW - Eye-tracking hardware
KW - Performance evaluation
KW - Webcam eye tracking
UR - http://www.scopus.com/inward/record.url?scp=85144182926&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-19679-9_1
DO - 10.1007/978-3-031-19679-9_1
M3 - Conference contribution
AN - SCOPUS:85144182926
SN - 9783031196782
T3 - Communications in Computer and Information Science
SP - 3
EP - 10
BT - HCI International 2022 – Late Breaking Posters - 24th International Conference on Human-Computer Interaction, HCII 2022, Proceedings
A2 - Stephanidis, Constantine
A2 - Antona, Margherita
A2 - Ntoa, Stavroula
A2 - Salvendy, Gavriel
PB - Springer Science and Business Media Deutschland GmbH
T2 - 24th International Conference on Human-Computer Interaction, HCII 2022
Y2 - 26 June 2022 through 1 July 2022
ER -