TY - GEN
T1 - Towards automated comparison of eye-tracking recordings in dynamic scenes
AU - Kübler, Thomas C.
AU - Bukenberger, Dennis R.
AU - Ungewiss, Judith
AU - Wörner, Alexandra
AU - Rothe, Colleen
AU - Schiefer, Ulrich
AU - Rosenstiel, Wolfgang
AU - Kasneci, Enkelejda
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2015/1/22
Y1 - 2015/1/22
N2 - Experiments involving eye-tracking usually require the analysis of large amounts of data. While there is a rich landscape of tools for extracting information about fixations and saccades from such data, analysis at a higher level of abstraction (e.g., the comparison of visual scanpaths between subjects) is still performed manually. In particular, the comparison of scanpaths derived from dynamic scenarios, where the observer is in permanent interaction with her environment, is highly challenging. In this work we (i) introduce a new workflow for automated scanpath comparison in dynamic environments, which combines image processing, object tracking, and sequence comparison algorithms, and (ii) provide a new data set for the performance evaluation of scanpath comparison methods, extracted from eye-tracking data recorded during an interactive tea-cooking task, referring to the experiments by Land et al. [1]. Furthermore, to showcase the applicability of our workflow, we applied our method to this data set to find differences in visual behavior between several runs of the tea-cooking task.
AB - Experiments involving eye-tracking usually require the analysis of large amounts of data. While there is a rich landscape of tools for extracting information about fixations and saccades from such data, analysis at a higher level of abstraction (e.g., the comparison of visual scanpaths between subjects) is still performed manually. In particular, the comparison of scanpaths derived from dynamic scenarios, where the observer is in permanent interaction with her environment, is highly challenging. In this work we (i) introduce a new workflow for automated scanpath comparison in dynamic environments, which combines image processing, object tracking, and sequence comparison algorithms, and (ii) provide a new data set for the performance evaluation of scanpath comparison methods, extracted from eye-tracking data recorded during an interactive tea-cooking task, referring to the experiments by Land et al. [1]. Furthermore, to showcase the applicability of our workflow, we applied our method to this data set to find differences in visual behavior between several runs of the tea-cooking task.
KW - scan pattern
KW - area of interest annotation
KW - automation
KW - eye tracking data analysis
KW - image segmentation
KW - scanpath comparison
UR - http://www.scopus.com/inward/record.url?scp=84923622483&partnerID=8YFLogxK
U2 - 10.1109/EUVIP.2014.7018371
DO - 10.1109/EUVIP.2014.7018371
M3 - Conference contribution
AN - SCOPUS:84923622483
T3 - EUVIP 2014 - 5th European Workshop on Visual Information Processing
BT - EUVIP 2014 - 5th European Workshop on Visual Information Processing
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th European Workshop on Visual Information Processing, EUVIP 2014
Y2 - 10 December 2014 through 12 December 2014
ER -