Journal Article

Transformer-Encoder-GRU (T-E-GRU) for Chinese Sentiment Analysis on Chinese Comment Text
Document Type
Original Paper
Source
Neural Processing Letters. 55(2):1847-1867
Subject
Chinese comments
Chinese sentiment analysis
Recurrent model
Transformer-Encoder
T-E-GRU
Language
English
ISSN
1370-4621 (print)
1573-773X (electronic)
Abstract
Chinese sentiment analysis (CSA) has long been one of the challenges in natural language processing due to its complexity and uncertainty. The Transformer has been used successfully for semantic understanding. However, it captures sequence features in text only through positional encoding, which is naturally weaker than the explicit sequence modelling of recurrent models. To address this problem, we propose T-E-GRU, which combines the powerful global feature extraction of the Transformer encoder with the natural sequence feature extraction of the GRU for CSA. Experimental evaluations are conducted on three real Chinese comment datasets, and the results show that T-E-GRU has clear advantages over recurrent models, recurrent models with attention, and BERT-based models.
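The abstract describes the architecture only at a high level. The sketch below is a minimal PyTorch illustration of a T-E-GRU-style classifier, assuming an embedding layer, a single Transformer encoder layer, and a GRU whose final hidden state feeds a linear classifier; all layer sizes, head counts, and the omission of positional encoding are illustrative assumptions, not the configuration reported in the paper.

```python
import torch
import torch.nn as nn

class TEGRU(nn.Module):
    """Illustrative T-E-GRU-style model: Transformer encoder followed by a GRU.

    Hyperparameters are placeholders, not the paper's reported settings.
    """

    def __init__(self, vocab_size, embed_dim=128, num_heads=4,
                 ff_dim=256, num_layers=1, gru_hidden=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=num_heads,
            dim_feedforward=ff_dim, batch_first=True)
        # Transformer encoder: global (long-range) feature extraction.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # GRU: explicit sequence-order modelling over the encoder output.
        self.gru = nn.GRU(embed_dim, gru_hidden, batch_first=True)
        self.classifier = nn.Linear(gru_hidden, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
        x = self.encoder(x)                # global features
        _, h_n = self.gru(x)               # h_n: (1, batch, gru_hidden)
        return self.classifier(h_n[-1])    # sentiment logits


# Example: classify a batch of two already-tokenized comments.
model = TEGRU(vocab_size=10000)
dummy_ids = torch.randint(1, 10000, (2, 50))  # batch of 2, length 50
logits = model(dummy_ids)
print(logits.shape)                            # torch.Size([2, 2])
```

In this reading, the encoder supplies global context while the GRU restores order-sensitive processing, so positional encoding is left out here; whether the published model retains it is detailed in the full paper.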