Computational Models of Eye Movements in Reading: A Data-Driven Approach to the Eye-Mind Link
Abstract: This thesis investigates new methods for understanding eye movement behavior in reading, based on eye-tracking corpora and data-driven modeling. Eye movement behavior is characterized by two basic, largely unconscious decisions: where and when to move the eyes. We explore the idea that empirical eye movement data carry rich information about the processes that guide these decisions. Two methods are investigated, each addressing a different aspect of eye movements in reading. Throughout, the role of prediction in eye movement modeling is emphasized, and new evaluation methods for assessing the predictive accuracy of models are proposed.

The decision of where to move the eyes is approached using standard machine learning methods. The proposed model learns where to move the eyes under different conditions associated with the words being read. Applied to new text, the model moves the eyes in the ways it has learnt, exhibiting characteristics similar to those of human readers. Furthermore, we propose the use of entropy to measure the similarity between observed and predicted eye movement behavior on held-out data. The main contribution is a flexible model, with few fixed parameters, that can be used to investigate decisions about where the eyes move during reading.

The decision of when to move the eyes is approached using time-to-event modeling (survival analysis). The proposed model learns the timing of eye movements under different conditions associated with the words being read. Applied to new text, the model estimates the probability that a fixation survives for any given length of time. We propose an entropy-related measure to assess the model's probabilistic temporal predictions. The main contribution is the use of Cox proportional hazards modeling to address questions about the strength, as well as the timing, of the processes that influence the decision of when to move the eyes during reading.
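To illustrate the kind of entropy-based evaluation the abstract describes, the sketch below scores a model's predicted distribution over saccade targets against the targets actually fixated, using average negative log-probability (cross-entropy in bits). This is a minimal, hypothetical illustration, not the thesis's actual implementation: the function name, the dictionary representation of predicted distributions, and the toy numbers are all assumptions made for the example.

```python
# Hypothetical sketch (not the thesis code): evaluating predicted
# saccade-target distributions on held-out data via cross-entropy.
import math

def cross_entropy_bits(predicted_probs, observed_targets):
    """Mean -log2 p(target) over the observed saccade targets.

    predicted_probs: list of dicts mapping candidate word index -> probability
    observed_targets: list of the word indices actually fixated next
    """
    total = 0.0
    for probs, target in zip(predicted_probs, observed_targets):
        p = probs.get(target, 1e-12)  # floor to avoid log(0)
        total += -math.log2(p)
    return total / len(observed_targets)

# Toy example: two saccades, with probability spread over candidate words.
preds = [{0: 0.1, 1: 0.7, 2: 0.2}, {1: 0.25, 2: 0.5, 3: 0.25}]
obs = [1, 2]
score = cross_entropy_bits(preds, obs)  # lower = closer to observed behavior
```

A model that concentrates probability on the words readers actually fixate receives a lower score, so held-out cross-entropy serves as the similarity measure between predicted and observed behavior.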