Date of Award

6-21-2024

Publication Type

Thesis

Degree Name

M.Sc.

Department

Computer Science

Supervisor

Xiaobu Yuan

Abstract

Embodied Conversational Agents (ECAs) are increasingly incorporated into diverse sectors such as customer service, healthcare, education, and entertainment, where they are used to improve user experiences and make interactions more efficient. Lip synchronization and human-like expressions play a crucial role in enhancing user trust, satisfaction, and the overall anthropomorphism of ECAs. By ensuring that an ECA's lip movements correspond with its spoken words, lip synchronization improves the naturalness and perceived realism of interactions. Through expression manipulation, ECAs can use their facial expressions to communicate intents and emotions, leading to more empathetic and engaging conversations. Combined, these characteristics make ECAs more believable and effective, improving user engagement, understanding, and rapport in human-computer interactions. Their seamless integration strengthens the ECA's capacity to build meaningful relationships and accomplish its communication objectives. This study presents a novel approach to accurate lip synchronization and facial expression manipulation using machine learning and 3D animation. The algorithm uses a pre-trained multilingual machine learning model to obtain real-time phoneme representations for Text-to-Speech generation and lip synchronization. 3D object files are used to represent the phonemes derived from the machine learning model; these files are used to interpolate between different lip shapes and to track and map lip motions, ensuring precise synchronization of the avatar's lip movements with the produced speech. Additionally, the algorithm incorporates facial expression manipulation, enabling the ECA to express happiness, surprise, concern, and other emotions, increasing audience engagement. Through the integration of the newly developed algorithm with the overall E-tutoring system architecture, this research contributes to the advancement of interactive online learning experiences.
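The abstract describes interpolating between phoneme-derived lip shapes over the timing of generated speech. The sketch below illustrates that general idea only; it is not the thesis implementation, and the phoneme labels, timings, and viseme weight values are hypothetical placeholders standing in for the 3D object files and the timings reported by the Text-to-Speech model.

```python
# Minimal sketch of phoneme-to-viseme interpolation for lip synchronization.
# Assumption: each phoneme maps to a set of mouth blendshape weights; the real
# system would derive these from 3D object files and TTS phoneme timings.

from dataclasses import dataclass

# Hypothetical mapping from phonemes to (jaw_open, lip_round, lip_close) weights.
VISEMES = {
    "AA": (0.9, 0.1, 0.0),   # open vowel
    "OW": (0.5, 0.8, 0.0),   # rounded vowel
    "M":  (0.0, 0.2, 1.0),   # bilabial closure
    "SIL": (0.0, 0.0, 0.0),  # silence / rest pose
}

@dataclass
class PhonemeEvent:
    phoneme: str
    start: float  # seconds, as reported by the speech model
    end: float

def lerp(a, b, t):
    """Linearly interpolate between two weight tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blendshape_weights(events, time):
    """Interpolate viseme weights for the current playback time."""
    for cur, nxt in zip(events, events[1:]):
        if cur.start <= time < nxt.start:
            t = (time - cur.start) / max(nxt.start - cur.start, 1e-6)
            return lerp(VISEMES[cur.phoneme], VISEMES[nxt.phoneme], t)
    return VISEMES[events[-1].phoneme] if events else VISEMES["SIL"]

if __name__ == "__main__":
    # Illustrative phoneme timeline for a short utterance.
    timeline = [
        PhonemeEvent("SIL", 0.0, 0.1),
        PhonemeEvent("M", 0.1, 0.2),
        PhonemeEvent("AA", 0.2, 0.4),
        PhonemeEvent("OW", 0.4, 0.6),
    ]
    # Sample the mouth pose at 30 frames per second.
    for frame in range(20):
        t = frame / 30.0
        print(f"t={t:.2f}s weights={blendshape_weights(timeline, t)}")
```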
