BERT-Based Multi-Task Learning for Aspect-Based Opinion Mining

Document Type

Conference Proceeding

Publication Date

1-1-2021

Publication Title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Volume

12923 LNCS

First Page

192

Keywords

Aspect-based opinion mining, BERT, Multi-task learning, Pooling strategies, Sentiment analysis

Last Page

204

Abstract

Aspect-Based Opinion Mining (ABOM) focuses on mining aspect terms (a product's features) and the related opinion polarities (e.g., positive, negative, and neutral) from users' reviews. The most prominent neural network-based methods for ABOM tasks include BERT-based approaches such as BERT-PT and BAT. These approaches build separate models for each ABOM subtask, such as aspect term extraction (e.g., pizza, staff member) and aspect sentiment classification, and use different training algorithms, such as post-training and adversarial training. The BERT-LSTM/Attention approach also applies different pooling strategies to the intermediate layers of the BERT model to achieve better results. However, none of these approaches consider the subtasks of aspect category extraction (e.g., the category of the aspect pizza in a review is food) and the related opinion polarity. This paper proposes a new system for ABOM, called BERT-MTL, which uses a Multi-Task Learning (MTL) approach and differs from previous approaches by solving two tasks, aspect term extraction and aspect category extraction, simultaneously; exploiting the similarities between the tasks improves the model's accuracy and reduces training time. Our system also builds models that identify users' opinions on aspect terms and aspect categories by applying different pooling strategies to the last layer of the BERT model. To evaluate the model's performance, we use the SemEval-14 Task 4 restaurant dataset. Our model outperforms previous models on several ABOM tasks, and the experimental results support its validity.
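
As a rough illustration of the multi-task design the abstract describes, the sketch below shows a single shared BERT encoder with two heads: a token-level head for aspect term extraction and a pooled head for aspect category prediction, with mean pooling over the last layer standing in for one of the pooling strategies mentioned. The label counts, head shapes, and pooling choice are illustrative assumptions, not the authors' exact configuration.

    # Minimal sketch of the shared-encoder multi-task idea (assumptions noted inline).
    import torch
    import torch.nn as nn
    from transformers import BertModel

    class BertMultiTask(nn.Module):
        def __init__(self, num_term_labels=3, num_category_labels=5):
            super().__init__()
            # One BERT encoder shared by both tasks.
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            hidden = self.bert.config.hidden_size
            # Token-level head: BIO-style tags for aspect term extraction (assumed label set).
            self.term_head = nn.Linear(hidden, num_term_labels)
            # Sentence-level head: aspect categories, e.g. food, service (assumed count).
            self.category_head = nn.Linear(hidden, num_category_labels)

        def forward(self, input_ids, attention_mask):
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            token_states = out.last_hidden_state  # per-token representations, last layer
            # Mean pooling over the last layer for the sentence-level category task;
            # this is one plausible pooling strategy, not necessarily the paper's.
            mask = attention_mask.unsqueeze(-1).float()
            pooled = (token_states * mask).sum(1) / mask.sum(1)
            term_logits = self.term_head(token_states)      # aspect term extraction
            category_logits = self.category_head(pooled)    # aspect category prediction
            return term_logits, category_logits

Training would sum the two task losses so that gradients from both subtasks update the shared encoder, which is the mechanism by which MTL exploits the similarity between the tasks.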

DOI

10.1007/978-3-030-86472-9_18

ISSN

0302-9743

E-ISSN

1611-3349

ISBN

978-3-030-86471-2
