Date of Award

7-25-2019

Publication Type

Master Thesis

Degree Name

M.A.Sc.

Department

Electrical and Computer Engineering

Keywords

Estimation, Eye-gaze tracking, Hidden Markov Models, Human Machine Systems, Kalman Filters, Statistical Modeling

Supervisor

Balasingam, B.

Rights

info:eu-repo/semantics/openAccess

Abstract

In this thesis, we investigate methods to accurately track reading progression by analyzing eye-gaze fixation points, using commercially available eye-tracking devices and without imposing unnatural movement constraints. To obtain the most accurate eye-gaze fixation point data possible, the current state of the art relies on expensive, cumbersome apparatuses. Eye-gaze tracking using less expensive hardware, and without constraints imposed on the individual whose gaze is being tracked, results in less reliable, noise-corrupted data that is difficult to interpret. Extending the accessibility of accurate reading progression tracking beyond its current limits, and making it feasible in a real-world, constraint-free environment, will enable a multitude of futuristic functionalities for educational, enterprise, and consumer technologies. We first discuss the "Line Detection System" (LDS), a Kalman filter and hidden Markov model based algorithm designed to infer from noisy data the line of text associated with each eye-gaze fixation point reported every few milliseconds during reading. This system is shown to yield an average line detection accuracy of 88.1%. Next, we discuss a "Horizontal Saccade Tracking System" (HSTS), which aims to track horizontal progression within each line, using a least squares approach to filter out noise. Finally, we discuss a novel "Slip-Kalman" filter custom designed to track the progression of reading. This method improves upon the original LDS, achieving an average line detection accuracy of 97.8%, and offers more advanced horizontal tracking capability than the HSTS. The performance of each method is demonstrated using 25 pages' worth of data collected during reading.
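
Illustrative Code Sketch

The sketch below is not the implementation described in the thesis; it is a minimal illustration of the general idea of combining a Kalman filter with a hidden Markov model for line assignment. A one-dimensional Kalman filter smooths the vertical coordinate of noisy fixations, and a simple Viterbi decode then assigns each fixation to a line of text. All function names, line spacings, noise levels, and transition probabilities are assumptions chosen for demonstration only.

import numpy as np

def kalman_smooth_y(y_obs, q=5.0, r=250.0):
    """Smooth noisy vertical fixation coordinates (pixels) with a 1-D Kalman filter."""
    x, p = y_obs[0], r          # initial state estimate and its variance
    out = []
    for z in y_obs:
        p += q                  # predict: constant-position model with process noise q
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

def viterbi_lines(y_filt, line_centers, sigma=20.0, p_stay=0.90, p_next=0.08):
    """Most likely sequence of text-line indices for the filtered fixations."""
    n_lines = len(line_centers)
    # Transitions: mostly stay on the current line, sometimes move to the next line,
    # small probability of any other jump (regressions, line skips).
    trans = np.full((n_lines, n_lines), (1 - p_stay - p_next) / max(n_lines - 2, 1))
    np.fill_diagonal(trans, p_stay)
    for i in range(n_lines - 1):
        trans[i, i + 1] = p_next
    log_trans = np.log(trans / trans.sum(axis=1, keepdims=True))
    # Gaussian emission: filtered y-coordinate near the line's vertical center.
    log_emit = -0.5 * ((y_filt[:, None] - line_centers[None, :]) / sigma) ** 2
    delta = log_emit[0].copy()
    back = np.zeros((len(y_filt), n_lines), dtype=int)
    for t in range(1, len(y_filt)):
        scores = delta[:, None] + log_trans
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    path = [int(delta.argmax())]
    for t in range(len(y_filt) - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Example: 4 lines of text spaced 40 px apart, with noisy fixations drifting downward.
lines = np.array([100.0, 140.0, 180.0, 220.0])
true_idx = np.repeat([0, 1, 2, 3], 15)
noisy_y = lines[true_idx] + np.random.default_rng(0).normal(0, 25, true_idx.size)
print(viterbi_lines(kalman_smooth_y(noisy_y), lines))

In this sketch the transition structure favours staying on the current line or advancing to the next one, mirroring the top-to-bottom structure of reading; the actual LDS and Slip-Kalman designs, including horizontal tracking, are detailed in the thesis itself.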
