Date of Award

2-15-2024

Publication Type

Thesis

Degree Name

M.Sc.

Department

Computer Science

Keywords

CNN;IMAGE AUGMENTATION;MARIJUANA INTOXICATION DETECTION

Supervisor

DAN WU

Abstract

The primary psychoactive component of marijuana is ∆-9-tetrahydrocannabinol (THC). Misuse of marijuana, often leading to intoxication, has contributed to a significant increase in motor vehicle accidents and workplace mishaps, affecting societies worldwide. Civil bodies and organizations continue to rely on conventional marijuana intoxication detection techniques, such as field sobriety tests, breath analyzer tests, blood tests, and DRUID, to combat these problems. These tests for detecting cannabis use have demonstrated a range of limitations. Consequently, the emphasis is directed toward developing a machine learning-based solution that can reliably and instantaneously determine whether a person is under the influence of marijuana.

Developing a machine learning solution for marijuana detection requires extensive, credible data for training, and the scarcity of such data highlights the need for improved data generation and classification methods. Recent work addresses the issue of data availability by sourcing images of marijuana-intoxicated individuals from YouTube and Google searches. The sourced images were used to train MobileNet, SVM, Decision Tree, and Random Forest classifiers to detect the presence of marijuana intoxication. However, this recent work does not incorporate current state-of-the-art neural classification models or deep learning-based image augmentation techniques. This research implements StyleGAN3, a state-of-the-art model for image generation, to expand the dataset of screenshots of faces of marijuana-intoxicated individuals sourced from the internet. Additionally, ResNet-50, InceptionV3, and VGG-16 classifiers were used to detect marijuana intoxication. The VGG-16 classifier outperformed the other classifiers, achieving an accuracy of 94.66%, precision of 96.84%, recall of 89.32%, and an F1-score of 92.92%, surpassing recent work.
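To make the classification step concrete, the sketch below shows how a frozen, ImageNet-pretrained VGG-16 base can be fine-tuned for binary intoxication detection in Keras. It is a minimal illustration, not the thesis's actual pipeline: the directory layout (data/train, data/val), hyperparameters, and classification head are assumptions, and the StyleGAN3 augmentation step that produces the training images is not shown.

```python
# Hypothetical sketch: transfer learning with VGG-16 for binary
# intoxicated-vs.-not classification. Paths and hyperparameters are
# illustrative assumptions, not the thesis's reported configuration.
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input

IMG_SIZE = (224, 224)   # VGG-16's standard input resolution
BATCH_SIZE = 32

# Assumed folder layout: data/train/{intoxicated,sober}/*.jpg (same for data/val).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=BATCH_SIZE, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", image_size=IMG_SIZE, batch_size=BATCH_SIZE, label_mode="binary")

# ImageNet-pretrained convolutional base, frozen for transfer learning.
base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = preprocess_input(inputs)           # VGG-16's standard ImageNet preprocessing
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(256, activation="relu")(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)  # probability of intoxication
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])

model.fit(train_ds, validation_data=val_ds, epochs=10)
```

The same head-and-frozen-base pattern applies to the ResNet-50 and InceptionV3 baselines by swapping the `base` model and its matching `preprocess_input`.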
