Guidelines for selecting and using eye tracking analysis methods depending on the analysis task

The following table provides guidelines for selecting methods, and combinations of methods, for analyzing eye movement data depending on the analysis task. The first column lists the tasks. The second column specifies the size of the data set or subset for which the methods listed in the third column are effective; in most cases the size is given as the number of users whose eye trajectories are analyzed, but in some cases it is the number of user groups or the number of different displays (visual stimuli) whose eye movements are compared. The third column lists the visual analytics methods, each represented by its name and an image; clicking on the name or image opens a page with an illustrated description of the method. The last column refers to relevant papers, which are listed below the table.

Some of the methods involve data transformations, such as generalization and aggregation. Before applying the analysis methods, it may also be useful to transform the temporal references in the data.
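As an illustration of such a temporal transformation, the following sketch replaces absolute timestamps by the time elapsed since each user's first fixation, so that trajectories recorded at different times can be compared on a common time axis. The record layout `(user, timestamp, x, y)` and the function name are our illustrative assumptions, not part of the tools described here:

```python
# Hypothetical fixation records of the form (user, timestamp_ms, x, y).
# A common temporal transformation before comparing trajectories is to
# replace absolute timestamps by time elapsed since each user's first fixation.

def to_relative_time(fixations):
    """Map each user's timestamps to milliseconds since that user's first fixation."""
    starts = {}
    for user, t, x, y in fixations:
        starts[user] = min(starts.get(user, t), t)
    return [(user, t - starts[user], x, y) for user, t, x, y in fixations]

records = [
    ("u1", 1000, 320, 240),
    ("u1", 1250, 400, 300),
    ("u2", 5000, 100, 120),
    ("u2", 5600, 150, 180),
]
# After the transformation, both users' trajectories start at time 0.
relative = to_relative_time(records)
```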

Note about the illustrations

The images used as visual stimuli in the eye tracking experiment appear in the background of most of the illustrations. Although the original images had a high resolution (1920x1200 pixels), they appear at a lower resolution in the illustrations because they were automatically scaled down to fit the display window.

Task: Overall spatial pattern of movements; relation to display content or structure
N of users: multiple
Methods: Map display of trajectories (MT); Flow map (FM)
References: [1] [2]

Task: General character of movements; individual spatial pattern of movements; relation to display content or structure; individual search strategy
N of users: 1-few
Methods: Map display of trajectories (MT) with interactive filtering; Space-time cube (STC); Flow map (FM) with interactive filtering and dynamic aggregation
References: [3] [4]

Task: Spatial pattern of attention distribution; relation of attention foci to display content or structure; repeated visits
N of users: 1-multiple
Methods: Attention map (AM)
References: [3]

Task: Relation of movements to particular AOIs; returns to previous points; places where users have difficulties
N of users: 1-multiple
Methods: Temporal view of trajectories (TVT); Trajectory segment filtering (TSF); Map of filtered trajectory segments (MTF); Extraction of events from trajectories (TEE)
References: [5]

Task: Connections between AOIs; presence and frequency of repeated moves
N of users: 1-multiple
Methods: Flow map (FM)
References: [3]

Task: Comparison of trajectories
N of users: few
Methods: Map display of trajectories (MT) with interactive filtering; Space-time cube (STC)
References: [4]

Task: Comparison of trajectories
N of users: multiple
Methods: Path similarity analysis (PSA)
References: [6]

Task: Comparison of spatial patterns of movements of different users or user groups
N of users: few users or few groups
Methods: Multiple flow maps (MFM); Flow maps of differences (FMD)
References: [2]

Task: Comparison of spatial patterns of attention of different users or user groups
N of users: few users or few groups
Methods: Multiple attention maps (MAM); Attention maps of differences (AMD)
References: [2]

Task: Comparison of spatial patterns of eye movements on different displays
N of users: 1-multiple users; few displays
Methods: Juxtaposed flow maps (JFM)

Task: Comparison of attention distributions on different displays
N of users: 1-multiple users; few displays
Methods: Juxtaposed attention maps (JAM)

Task: Progression of eye movements over time; general search strategy; types of activities and their temporal order
N of users: 1-multiple
Methods: Multiple flow maps (MFM); Time clustering by similarity of flows (CTF)
References: [2]

Task: Changes of attention distribution over time
N of users: 1-multiple
Methods: Multiple attention maps (MAM); Time clustering by similarity of attention distribution (CTA)
References: [2]

Task: Frequent/typical sequences of attending AOIs; cyclic scanning behaviors
N of users: multiple
Methods: Discovery of frequent sequences of area visits (FSD)
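To make the spatial aggregation behind the attention map methods (AM, MAM) concrete, here is a minimal sketch that bins fixations into a regular grid and sums fixation durations per cell. The record layout `(x, y, duration)`, the function name `attention_grid`, and the cell size are our illustrative assumptions, not part of the referenced tools:

```python
# Minimal sketch of grid-based spatial aggregation for an attention map:
# fixations (x, y, duration_ms) in display pixel coordinates are binned
# into a regular grid, and durations are summed per cell.

def attention_grid(fixations, width, height, cell=100):
    """Sum fixation durations per grid cell; returns rows x cols matrix."""
    cols, rows = width // cell, height // cell
    grid = [[0] * cols for _ in range(rows)]
    for x, y, dur in fixations:
        if 0 <= x < width and 0 <= y < height:  # ignore off-screen samples
            grid[y // cell][x // cell] += dur
    return grid

fix = [(50, 50, 200), (60, 40, 300), (150, 50, 100)]
g = attention_grid(fix, width=200, height=100, cell=100)
# g[0][0] accumulates 500 ms of attention; g[0][1] accumulates 100 ms
```

The resulting matrix can be rendered as a heatmap; difference maps (AMD) can then be computed cell-wise between two such grids.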

References:

  1. Natalia Andrienko, Gennady Andrienko
     Spatial Generalization and Aggregation of Massive Movement Data
     IEEE Transactions on Visualization and Computer Graphics (TVCG), 2011, v.17 (2), pp.205-219
     published version: http://doi.ieeecomputersociety.org/10.1109/TVCG.2010.44
  2. Natalia Andrienko, Gennady Andrienko, Hendrik Stange, Thomas Liebig, Dirk Hecker
     Visual Analytics for Understanding Spatial Situations from Episodic Movement Data
     Künstliche Intelligenz, 2012
     published version: http://dx.doi.org/10.1007/s13218-012-0177-4
  3. Gennady Andrienko, Natalia Andrienko
     A General Framework for Using Aggregation in Visual Exploration of Movement Data
     The Cartographic Journal, 2010, v.47 (1), pp.22-40
     published version: http://www.ingentaconnect.com/content/maney/caj/2010/00000047/00000001/art00004
  4. Gennady Andrienko, Natalia Andrienko
     Poster: Dynamic Time Transformation for Interpreting Clusters of Trajectories with Space-Time Cube
     IEEE Visual Analytics Science and Technology (VAST 2010), Proceedings, IEEE Computer Society Press, pp.213-214
  5. Gennady Andrienko, Natalia Andrienko, Marco Heurich
     An Event-Based Conceptual Model for Context-Aware Movement Analysis
     International Journal of Geographical Information Science, 2011, v.25 (9), pp.1347-1370
     published version: http://www.tandfonline.com/doi/abs/10.1080/13658816.2011.556120
  6. Gennady Andrienko, Natalia Andrienko, Stefan Wrobel
     Visual Analytics Tools for Analysis of Movement Data
     ACM SIGKDD Explorations, 2007, v.9 (2), pp.38-46
     published version: http://www.sigkdd.org/explorations/issues/9-2-2007-12/5_Adrienko_Geo.pdf