SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies

Thomas C. Kübler, Colleen Rothe, Ulrich Schiefer, Wolfgang Rosenstiel, Enkelejda Kasneci

Research output: Contribution to journal › Article › peer-review

Abstract

Our eye movements are driven by a continuous trade-off between the need for detailed examination of objects of interest and the necessity to keep an overview of our surroundings. Consequently, behavioral patterns that are characteristic of our actions and their planning are typically manifested in the way we move our eyes to interact with our environment. Identifying such patterns from individual eye movement measurements is, however, highly challenging. In this work, we tackle the challenge of quantifying the influence of experimental factors on eye movement sequences. We introduce an algorithm for extracting sequence-sensitive features from eye movements and for classifying eye movements based on the frequencies of small subsequences. Our approach is evaluated against the state of the art on a novel and very rich collection of eye movement data derived from four experiments, ranging from static viewing tasks to highly dynamic outdoor scenarios. Our results show that the proposed method can classify eye movement sequences across a variety of experimental designs. The choice of parameters is discussed in detail, with special focus on how they highlight different aspects of general scanpath shape. Algorithms and evaluation data are available at: http://www.ti.uni-tuebingen.de/scanpathcomparison.html.
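
The authors' actual algorithms are available at the URL above; the following is only a minimal, hypothetical sketch of the subsequence-frequency idea the abstract describes. It assumes a scanpath has already been quantized into a string of region labels (e.g., "ABACAB"), maps it to a histogram of short contiguous subsequences (n-grams), and compares two scanpaths with a string-kernel-style similarity on those histograms. All function names here are illustrative, not the authors' API.

```python
# Sketch of subsequence-frequency features for scanpath comparison.
# NOT the SubsMatch 2.0 implementation; names and alphabet are assumptions.

from collections import Counter
from math import sqrt

def subsequence_frequencies(scanpath: str, n: int = 2) -> dict:
    """Relative frequencies of all contiguous length-n subsequences."""
    grams = [scanpath[i:i + n] for i in range(len(scanpath) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def kernel_similarity(f1: dict, f2: dict) -> float:
    """Cosine similarity between two n-gram frequency histograms,
    playing the role of a simple string kernel."""
    keys = set(f1) | set(f2)
    dot = sum(f1.get(k, 0.0) * f2.get(k, 0.0) for k in keys)
    norm1 = sqrt(sum(v * v for v in f1.values()))
    norm2 = sqrt(sum(v * v for v in f2.values()))
    return dot / (norm1 * norm2) if norm1 and norm2 else 0.0

# Two hypothetical scanpaths over regions A-C:
sim = kernel_similarity(subsequence_frequencies("ABACABAC"),
                        subsequence_frequencies("ABABABAB"))
print(f"similarity: {sim:.3f}")
```

The subsequence length n corresponds to the parameter choice discussed in the paper: small n emphasizes local transition behavior, while larger n captures more of the general scanpath shape.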

Original language: English
Pages (from-to): 1048-1064
Number of pages: 17
Journal: Behavior Research Methods
Volume: 49
Issue number: 3
DOIs
State: Published - 1 Jun 2017
Externally published: Yes

Keywords

  • Comparison
  • Eye movements
  • Eye tracking
  • Scan pattern
  • String kernel
