
PRIOR-BERT AND MULTITASK LEARNING FOR TARGET-ASPECT-SENTIMENT JOINT DETECTION

Submitted by:
Cai Ke
Last updated:
15 May 2022 - 10:57am
Document Type:
Poster
Document Year:
2022
Presenters:
Cai Ke
Paper Code:
SPE-63.2

Aspect-Based Sentiment Analysis (ABSA) is a fine-grained sentiment analysis task with significant real-world value. The core challenge is to generate an effective text representation and to construct an end-to-end model that can simultaneously detect (target, aspect, sentiment) triples from a sentence. Moreover, existing models neither account for the heavily imbalanced distribution of labels nor give enough consideration to long-distance dependencies between targets and aspect-sentiment pairs. To overcome these challenges, we propose a novel end-to-end model named Prior-BERT and Multi-Task Learning (PBERT-MTL), which can detect all triples more efficiently. We evaluate our model on the SemEval-2015 and SemEval-2016 datasets, and extensive results show the validity of our work. In addition, our model also achieves higher performance on a series of subtasks of target-aspect-sentiment detection. Code is available at https://github.com/CQUPTCaiKe/PBERT-MTL.
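To make the high-level description concrete, the following is a minimal PyTorch sketch of one way a multi-task model of this kind could be laid out: a shared BERT encoder with a token-level head for target extraction and sentence-level heads for aspect and sentiment, using class weights in the loss as one common remedy for label imbalance. The head layout, label counts, and all names (JointABSAModel, num_aspects, aspect_class_weights, etc.) are illustrative assumptions, not the released PBERT-MTL implementation; see the linked repository for the authors' actual code.

```python
# Hypothetical multi-task sketch for joint target-aspect-sentiment detection.
# Assumes torch and transformers are installed; not the authors' PBERT-MTL code.
import torch
import torch.nn as nn
from transformers import BertModel


class JointABSAModel(nn.Module):
    def __init__(self, num_target_tags=3, num_aspects=12, num_sentiments=4,
                 aspect_class_weights=None):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.encoder.config.hidden_size
        # Token-level head: BIO tags for target (opinion-target) extraction.
        self.target_head = nn.Linear(hidden, num_target_tags)
        # Sentence-level heads: aspect category and sentiment polarity.
        self.aspect_head = nn.Linear(hidden, num_aspects)
        self.sentiment_head = nn.Linear(hidden, num_sentiments)
        # Class-weighted loss as one possible way to counter label imbalance.
        self.aspect_loss = nn.CrossEntropyLoss(weight=aspect_class_weights)
        self.target_loss = nn.CrossEntropyLoss()
        self.sentiment_loss = nn.CrossEntropyLoss()

    def forward(self, input_ids, attention_mask,
                target_tags=None, aspect_labels=None, sentiment_labels=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_repr = out.last_hidden_state   # (batch, seq_len, hidden)
        sent_repr = out.pooler_output        # (batch, hidden)

        target_logits = self.target_head(token_repr)
        aspect_logits = self.aspect_head(sent_repr)
        sentiment_logits = self.sentiment_head(sent_repr)

        loss = None
        if target_tags is not None:
            # Multi-task objective: sum of the three per-task losses.
            loss = (self.target_loss(target_logits.view(-1, target_logits.size(-1)),
                                     target_tags.view(-1))
                    + self.aspect_loss(aspect_logits, aspect_labels)
                    + self.sentiment_loss(sentiment_logits, sentiment_labels))
        return {"target": target_logits, "aspect": aspect_logits,
                "sentiment": sentiment_logits, "loss": loss}
```

In this kind of setup, the shared encoder lets the three tasks regularize each other, while the weighted aspect loss up-weights rare aspect categories; the sketch omits the prior-knowledge component implied by "Prior-BERT", which is specific to the paper.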
