Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing

Nora Castner, Thomas C. Kübler, Katharina Scheiter, Juliane Richter, Thérèse Eder, Fabian Hüttig, Constanze Keutel, Enkelejda Kasneci

Publication: Chapter in book/report/conference proceeding › Conference contribution › peer-reviewed

41 citations (Scopus)

Abstract

Modeling eye movement indicative of expertise behavior is decisive in user evaluation. However, it is indisputable that task semantics affect gaze behavior. We present a novel approach to gaze scanpath comparison that incorporates convolutional neural networks (CNN) to process scene information at the fixation level. Image patches linked to respective fixations are used as input for a CNN and the resulting feature vectors provide the temporal and spatial gaze information necessary for scanpath similarity comparison. We evaluated our proposed approach on gaze data from expert and novice dentists interpreting dental radiographs using a local alignment similarity score. Our approach was capable of distinguishing experts from novices with 93% accuracy while incorporating the image semantics. Moreover, our scanpath comparison using image patch features has the potential to incorporate task semantics from a variety of tasks.
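
The approach described in the abstract lends itself to a short illustration: crop an image patch around each fixation, embed it with a pretrained CNN, and score pairs of scanpaths with a local alignment over the resulting embedding sequences. The Python sketch below follows that outline only in broad strokes; the VGG16 backbone, the 96-pixel patch size, the cosine-similarity threshold, and the match/mismatch/gap scores are illustrative assumptions, not the configuration reported in the paper.

import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T

# Example embedding network (assumption): ImageNet-pretrained VGG16 convolutional
# features; requires torchvision >= 0.13 for the weights argument.
cnn = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()
to_input = T.Compose([T.ToPILImage(), T.Resize((224, 224)), T.ToTensor()])

def fixation_embeddings(image, fixations, patch=96):
    """Crop a patch around each fixation (x, y) of an HxWx3 uint8 image
    and embed it with the CNN; returns one feature vector per fixation."""
    vectors = []
    for x, y in fixations:
        x0 = max(0, int(x) - patch // 2)
        y0 = max(0, int(y) - patch // 2)
        crop = image[y0:y0 + patch, x0:x0 + patch]
        with torch.no_grad():
            features = cnn(to_input(crop).unsqueeze(0))
        vectors.append(features.flatten().numpy())
    return np.stack(vectors)

def local_alignment_score(a, b, sim_threshold=0.8, match=1.0, mismatch=-1.0, gap=-0.5):
    """Smith-Waterman-style local alignment over two sequences of fixation
    embeddings; two fixations 'match' when their cosine similarity is high."""
    def substitution(u, v):
        cos = float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))
        return match if cos >= sim_threshold else mismatch
    H = np.zeros((len(a) + 1, len(b) + 1))
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            H[i, j] = max(0.0,
                          H[i - 1, j - 1] + substitution(a[i - 1], b[j - 1]),
                          H[i - 1, j] + gap,
                          H[i, j - 1] + gap)
    return float(H.max())

A downstream expert/novice classifier could, for instance, compare a new scanpath's alignment scores against labeled reference scanpaths; the 93% accuracy quoted above refers to the authors' full pipeline, not to this sketch.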

Original language: English
Title: Proceedings ETRA 2020 Full Papers - ACM Symposium on Eye Tracking Research and Applications
Editors: Stephen N. Spencer
Publisher: Association for Computing Machinery
ISBN (electronic): 9781450371339
DOIs
Publication status: Published - 6 Feb. 2020
Published externally: Yes
Event: 2020 ACM Symposium on Eye Tracking Research and Applications, ETRA 2020 - Stuttgart, Germany
Duration: 2 June 2020 - 5 June 2020

Publication series

Name: Eye Tracking Research and Applications Symposium (ETRA)

Conference

Conference: 2020 ACM Symposium on Eye Tracking Research and Applications, ETRA 2020
Country/Territory: Germany
City: Stuttgart
Period: 2/06/20 - 5/06/20
