Dual Features Local-Global Attention Model with BERT for Aspect Sentiment Analysis
Published in: Jisuanji kexue yu tansuo, Vol. 18, No. 1, pp. 205-216
Format: Journal Article
Language: Chinese
Published: Journal of Computer Engineering and Applications Beijing Co., Ltd., Science Press, 01-01-2024
Summary: Aspect-based sentiment analysis aims to predict the sentiment polarity of a specific aspect in a sentence or document. Most recent research uses attention mechanisms to model the context. However, when a BERT-based sentiment classification model computes dependencies between representations to extract features, contextual information needs to be weighed differently in different contexts; otherwise the modelled features lack contextual knowledge. In addition, aspect words are not given sufficient attention, which affects the overall classification performance of the model. To address these problems, this paper proposes a dual features local-global attention model with BERT (DFLGA-BERT). Local and global feature extraction modules are designed to fully capture the semantic association between aspect words and the context. Moreover, an improved quasi-attention mechanism is used in DFLGA-BERT, which allows the model to use minus attention...
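The summary describes a quasi-attention mechanism that permits "minus" (subtractive) attention weights so the model can actively suppress irrelevant context rather than only reweight it positively. The paper's exact formulation is not given in this record; the following is a minimal PyTorch sketch of one way such a head could look, with all names, gating choices, and dimensions being illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuasiAttention(nn.Module):
    """Sketch of a single quasi-attention head whose effective weights can
    be negative, loosely inspired by the mechanism described in the abstract.
    The gating scheme and parameter names here are assumptions."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.q = nn.Linear(hidden_size, hidden_size)
        self.k = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, hidden_size)
        # Learned scalar gate controlling how much subtractive attention is applied.
        self.lambda_gate = nn.Linear(hidden_size, 1)
        self.scale = hidden_size ** -0.5

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size), e.g. BERT encoder outputs.
        q, k, v = self.q(hidden), self.k(hidden), self.v(hidden)
        scores = torch.matmul(q, k.transpose(-1, -2)) * self.scale

        # Standard softmax attention weights in [0, 1].
        attn = F.softmax(scores, dim=-1)

        # Quasi term: sigmoid-normalised scores scaled by a per-token gate and
        # subtracted, so effective weights can drop below zero ("minus" attention).
        gate = torch.sigmoid(self.lambda_gate(hidden))   # (batch, seq_len, 1)
        quasi = torch.sigmoid(scores)                     # (batch, seq_len, seq_len)
        effective = attn - gate * quasi

        return torch.matmul(effective, v)

# Usage with BERT-base-sized hidden states (dimensions are illustrative).
layer = QuasiAttention(hidden_size=768)
out = layer(torch.randn(2, 16, 768))
print(out.shape)  # torch.Size([2, 16, 768])
```

The key design point illustrated is that combining a softmax term with a gated, subtracted score term extends the effective attention range below zero, which is what lets such a model down-weight or cancel context tokens that mislead aspect-level prediction.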
ISSN: 1673-9418
DOI: 10.3778/j.issn.1673-9418.2210012