Date of Award

8-31-2020

Publication Type

Master's Thesis

Degree Name

M.Sc.

Department

Computer Science

First Advisor

Xiaobu Yuan

Keywords

Dialogue Management, ECA, Expressive, POMDP, Q-Learning, Virtual Reality

Rights

info:eu-repo/semantics/openAccess

Creative Commons License

Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Abstract

Recent advancements in virtual reality have enabled the creation of autonomous agents that help humans retrieve and process useful digital information or carry out requested tasks. Known as embodied conversational agents (ECAs), these intelligent agents bridge the physical and virtual worlds by communicating with the user through natural verbal and non-verbal channels. To provide a positive user experience, an ECA must not only appear human-like but also correctly identify the user's intention so that it can assist the user appropriately. This thesis continues our research group's work on improving POMDP-based dialogue management by applying machine learning to the POMDP's belief state history. It also integrates a technique for matching the ECA's lip movements to the rendered audio alongside the automatically selected emotion. Finally, it conducts experiments that use machine learning techniques to adjust POMDP policies and compares their effectiveness in terms of dialogue length and successful intention discovery rate.
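To illustrate the kind of policy adjustment the abstract refers to, the sketch below applies tabular Q-learning to a toy dialogue task. The belief states, actions, rewards, and transitions here are all hypothetical simplifications invented for illustration (the thesis itself works with full POMDP belief state histories); the reward shape merely mirrors the stated evaluation criteria, penalizing long dialogues and rewarding successful intention discovery.

```python
import random
from collections import defaultdict

# Hypothetical discretized belief states and dialogue actions (illustrative only).
STATES = ["uncertain", "partial_intent", "intent_known"]
ACTIONS = ["ask_clarify", "confirm", "execute"]

def reward(state, action):
    """Toy reward: short dialogues and correct intention discovery score higher."""
    if state == "intent_known" and action == "execute":
        return 10.0   # successful intention discovery
    if action == "ask_clarify":
        return -1.0   # each clarifying turn lengthens the dialogue
    return -0.5

def next_state(state, action):
    """Toy transition: clarifying questions gradually sharpen the belief."""
    sharpen = {"uncertain": "partial_intent",
               "partial_intent": "intent_known",
               "intent_known": "intent_known"}
    return sharpen[state] if action == "ask_clarify" else state

def q_learn(episodes=500, alpha=0.1, gamma=0.9, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    Q = defaultdict(float)          # Q[(state, action)] -> value estimate
    for _ in range(episodes):
        s = "uncertain"
        for _ in range(10):         # cap on dialogue turns
            # Epsilon-greedy action selection.
            if rng.random() < epsilon:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: Q[(s, act)])
            r, s2 = reward(s, a), next_state(s, a)
            best_next = max(Q[(s2, act)] for act in ACTIONS)
            # Standard Q-learning update.
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            if s == "intent_known" and a == "execute":
                break               # intention discovered; dialogue ends
            s = s2
    return Q

Q = q_learn()
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)
```

Under these assumptions the learned policy asks clarifying questions until the belief sharpens, then executes, which is the behavior the dialogue-length and discovery-rate metrics are meant to reward.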
