Date of Award

9-19-2019

Publication Type

Master's Thesis

Degree Name

M.A.Sc.

Department

Electrical and Computer Engineering

First Advisor

Balasingam, B.

Second Advisor

Biondi, F.

Keywords

ADAS, Cognitive load, Cognitive load classification, Eye Tracking, SNR, Statistical Analysis

Rights

info:eu-repo/semantics/embargoedAccess

Creative Commons License

Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Abstract

In this thesis, we investigate cognitive load detection and classification based on minimally invasive methods. Cognitive load detection is crucial for many emerging applications, such as advanced driver assistance systems (ADAS) and industrial automation. Numerous past studies have reported several physiological measures, such as eye-tracking, electrocardiogram (ECG), and electroencephalogram (EEG) signals, as indicators of cognitive load. However, most existing physiological features are invasive in nature. Consequently, the objective of this study is to determine the feasibility of non-invasive features, such as pupil dilation measured by a low-cost eye-tracker with minimal constraints on the subject, for cognitive load detection. In this study, data from 33 participants were collected while they performed tasks designed to induce three different cognitive difficulty levels, with and without cognitive maskers, and the following measurements were recorded: eye-tracking measures (pupil dilation, eye-gaze, and eye-blinks) and the response time from the detection response task (DRT). We also demonstrate the classification of the cognitive load experienced by humans under different task conditions with the help of pupil dilation and reaction time. A model that can accurately classify cognitive load could be applied in various sectors, such as semi-autonomous vehicles and aviation. We used a data fusion approach, combining pupil dilation and DRT reaction time, to determine whether classification accuracy increases. Further, we compared the classifier with the highest classification accuracy under data fusion against the accuracy of the same classifier using only one feature (pupil dilation or reaction time) at a time.
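The feature-fusion comparison described in the abstract can be sketched in outline. The following is a hypothetical illustration only: the synthetic class means, noise levels, and the simple nearest-centroid classifier are assumptions made for demonstration, not the classifiers, parameters, or measurements used in the thesis.

```python
import random

random.seed(0)

# Hypothetical synthetic trials for three cognitive-load levels. Each trial
# has a pupil-dilation change (mm) and a DRT reaction time (ms); the class
# means and spreads below are illustrative, not values from the thesis.
LEVEL_MEANS = {0: (0.10, 350.0), 1: (0.30, 450.0), 2: (0.50, 550.0)}

def make_trials(n_per_level=60):
    trials = []
    for label, (pupil_mu, rt_mu) in LEVEL_MEANS.items():
        for _ in range(n_per_level):
            pupil = random.gauss(pupil_mu, 0.07)
            rt = random.gauss(rt_mu, 55.0)
            trials.append(((pupil, rt), label))
    random.shuffle(trials)
    return trials

def normalize(features):
    # z-score each column so pupil (mm) and RT (ms) contribute comparably
    means, stds = [], []
    for col in zip(*features):
        m = sum(col) / len(col)
        s = (sum((x - m) ** 2 for x in col) / len(col)) ** 0.5 or 1.0
        means.append(m)
        stds.append(s)
    return [tuple((x - m) / s for x, m, s in zip(row, means, stds))
            for row in features]

def nearest_centroid_accuracy(features, labels):
    # Hold out the last 30% of trials, fit one centroid per class on the
    # rest, and classify held-out trials by nearest Euclidean centroid.
    n_train = int(0.7 * len(features))
    train_x, test_x = features[:n_train], features[n_train:]
    train_y, test_y = labels[:n_train], labels[n_train:]
    centroids = {}
    for lab in set(train_y):
        rows = [x for x, y in zip(train_x, train_y) if y == lab]
        centroids[lab] = tuple(sum(c) / len(rows) for c in zip(*rows))
    correct = 0
    for x, y in zip(test_x, test_y):
        pred = min(centroids,
                   key=lambda lab: sum((a - b) ** 2
                                       for a, b in zip(x, centroids[lab])))
        correct += pred == y
    return correct / len(test_y)

trials = make_trials()
feats = normalize([f for f, _ in trials])
labs = [l for _, l in trials]

# Compare each feature alone against the fused (concatenated) feature vector.
acc_pupil = nearest_centroid_accuracy([(f[0],) for f in feats], labs)
acc_rt = nearest_centroid_accuracy([(f[1],) for f in feats], labs)
acc_fused = nearest_centroid_accuracy(feats, labs)
print("pupil only:", acc_pupil, "RT only:", acc_rt, "fused:", acc_fused)
```

The fusion step here is simple feature concatenation before classification; the thesis's actual classifiers and fusion scheme may differ.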

Available for download on Thursday, November 12, 2020
