Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems

  • Author / Creator
    Saffar Mehrjardi, Mansour
  • Dialogue generation systems (chatbots) are currently one of the most noted topics in natural language processing, and many companies are investing heavily in such systems to automate tasks. Given the notable success of deep learning in producing state-of-the-art models in domains such as computer vision, recommender systems, and natural language processing, recent years have seen a surge in chatbot systems built with deep learning methods. In dialogue generation, deep learning makes it possible to bootstrap from a large corpus of data with minimal feature engineering, allowing chatbot systems to learn feature representations from data and produce coherent, well-structured responses. Task-oriented chatbots are dialogue generation systems that help users accomplish specific tasks, such as booking a restaurant table or buying movie tickets, through a continuous, uninterrupted conversational interface and usually in as few steps as possible. Given the abundance of such systems in industry and how effective chatbots can be at saving cost and increasing user satisfaction, we decided to focus our study on this type of chatbot system. Self-attentional models are a new deep learning paradigm for sequence modelling tasks. They differ from previous sequence modelling methods, e.g. models based on recurrent or convolutional neural networks, in that their architecture is based solely on the attention mechanism. The Transformer, recently proposed by Google Brain, was the first self-attentional model, and it beat state-of-the-art models on neural machine translation tasks. This motivated us to explore self-attentional models for building task-oriented chatbots. We compare common sequence modelling models with self-attentional models and provide a comprehensive analysis.
Our results show that self-attentional models can produce task-oriented chatbots with higher BLEU scores and faster training times than recurrent neural networks. Lastly, we address the problem of evaluating chatbots by implementing a user-simulation-based evaluation method for task-oriented chatbots. Our evaluation method is a pipelined user simulator, i.e. one containing natural language understanding, user simulation, and natural language generation components, similar to the user simulators Microsoft uses to train reinforcement learning agents, but conditioned on a user profile and characteristics, which makes it more realistic.

  • Subjects / Keywords
  • Graduation date
    Fall 2019
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-jjv1-kt52
  • License
    Permission is hereby granted to the University of Alberta Libraries to reproduce single copies of this thesis and to lend or sell such copies for private, scholarly or scientific research purposes only. Where the thesis is converted to, or otherwise made available in digital form, the University of Alberta will advise potential users of the thesis of these terms. The author reserves all other publication and other rights in association with the copyright in the thesis and, except as herein before provided, neither the thesis nor any substantial portion thereof may be printed or otherwise reproduced in any material form whatsoever without the author's prior written permission.
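For readers unfamiliar with the mechanism the abstract refers to, the core of a self-attentional model such as the Transformer is scaled dot-product self-attention, which can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not code from the thesis; the function and weight-matrix names are assumptions for exposition.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence.

    X  : (seq_len, d_model) input token representations
    Wq, Wk, Wv : (d_model, d_k) illustrative projection matrices
    Returns (seq_len, d_k): each output row is a weighted mix of all
    value rows, so every token attends to every other token directly,
    with no recurrence or convolution involved.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise token affinities
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

Because each output position depends on all input positions in a single matrix product, the whole sequence can be processed in parallel, which is one source of the faster training times the abstract reports relative to recurrent networks.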