TY - GEN
T1 - Automatic generation of saliency-based areas of interest for the visualization and analysis of eye-tracking data
AU - Fuhl, Wolfgang
AU - Kuebler, Thomas
AU - Santini, Thiago
AU - Kasneci, Enkelejda
N1 - Publisher Copyright:
© 2018 The Author(s) Eurographics Proceedings © 2018 The Eurographics Association.
PY - 2018
Y1 - 2018
N2 - Areas of interest (AOIs) are a powerful basis for the analysis and visualization of eye-tracking data. They allow eye-tracking metrics to be related to semantic stimulus regions and support further statistical analysis. In this work, we propose a novel method for the automated generation of AOIs based on saliency maps. In contrast to existing state-of-the-art methods, which generate AOIs from eye-tracking data, our method generates AOIs based solely on the stimulus saliency, thus mimicking natural vision. In this way, our method is not only independent of the eye-tracking data but also enables AOI-based analysis even for complex stimuli, such as abstract art, where a proper manual definition of AOIs is not trivial. For evaluation, we cross-validate support vector machine classifiers on the task of separating the visual scanpaths of art experts from those of novices. The motivation for this evaluation is to use AOIs as projection functions and to evaluate their robustness across different feature spaces. A good AOI separation should result in distinct feature sets that enable fast evaluation with a largely automated workflow. The proposed method, together with the data shown in this paper, is available as part of the software EyeTrace [?] http://www.ti.unituebingen.de/Eyetrace.1751.0.html.
AB - Areas of interest (AOIs) are a powerful basis for the analysis and visualization of eye-tracking data. They allow eye-tracking metrics to be related to semantic stimulus regions and support further statistical analysis. In this work, we propose a novel method for the automated generation of AOIs based on saliency maps. In contrast to existing state-of-the-art methods, which generate AOIs from eye-tracking data, our method generates AOIs based solely on the stimulus saliency, thus mimicking natural vision. In this way, our method is not only independent of the eye-tracking data but also enables AOI-based analysis even for complex stimuli, such as abstract art, where a proper manual definition of AOIs is not trivial. For evaluation, we cross-validate support vector machine classifiers on the task of separating the visual scanpaths of art experts from those of novices. The motivation for this evaluation is to use AOIs as projection functions and to evaluate their robustness across different feature spaces. A good AOI separation should result in distinct feature sets that enable fast evaluation with a largely automated workflow. The proposed method, together with the data shown in this paper, is available as part of the software EyeTrace [?] http://www.ti.unituebingen.de/Eyetrace.1751.0.html.
UR - http://www.scopus.com/inward/record.url?scp=85086180602&partnerID=8YFLogxK
U2 - 10.2312/vmv.20181252
DO - 10.2312/vmv.20181252
M3 - Conference contribution
AN - SCOPUS:85086180602
T3 - Vision, Modeling and Visualization, VMV 2018
BT - Vision, Modeling and Visualization, VMV 2018
A2 - Beck, Fabian
A2 - Dachsbacher, Carsten
A2 - Sadlo, Filip
PB - Eurographics Association
T2 - 2018 Conference on Vision, Modeling and Visualization, VMV 2018
Y2 - 10 October 2018 through 12 October 2018
ER -