Academic Paper

Research on high precision Chinese text sentiment Classification based on ALBERT Optimization
Document Type
Conference
Source
2023 15th International Conference on Advanced Computational Intelligence (ICACI), pp. 1-6, May 2023
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Training
Fuses
Computational modeling
Bit error rate
Semantics
Optimization methods
Feature extraction
ALBERT
sentiment classification
combinatorial model
TextCNN
Attention
Language
Abstract
Text sentiment classification is an important research field in natural language processing, with significant value for social opinion analysis and business development. Existing sentiment classification models suffer from problems such as low classification accuracy, huge parameter counts, and difficult training, and therefore cannot provide high-precision models to users with limited computing power. ALBERT reduces the parameter count of a BERT model of comparable size by more than a factor of ten, making it feasible for ordinary users to run the model. In this paper, we dynamically fuse the intermediate layers of the ALBERT model through two channels to obtain text semantic information at different granularities and improve the model's accuracy. We add TextCNN to enhance the model's ability to capture local sentiment words and an Attention mechanism to enrich its feature extraction process, constructing a new model, ALBERT-TextCNN-Attention (PATA). The proposed model not only compensates for the slight accuracy loss of the lightweight ALBERT model relative to BERT, but also offers a low-parameter, high-accuracy model for text sentiment classification research. Experimental results show that the proposed model has only 19.72% of the parameters of the BERT model and achieves 90.63% accuracy on the public dataset waimai-10k, outperforming other recent ALBERT-based models.
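The abstract's "dynamic fusion" of intermediate ALBERT layers can be illustrated by a common scheme: a learnable scalar weight per encoder layer, softmax-normalized, forming a weighted sum of the per-layer hidden states. The sketch below is an illustrative assumption only (the paper's exact fusion and two-channel design may differ), using random arrays in place of real ALBERT hidden states.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical dimensions: 12 encoder layers, 8 tokens, hidden size 32.
num_layers, seq_len, hidden = 12, 8, 32
rng = np.random.default_rng(0)

# Stand-in for the hidden states produced by each ALBERT encoder layer.
layer_outputs = rng.normal(size=(num_layers, seq_len, hidden))

# One learnable scalar weight per layer (uniform initialization here);
# in training these would be updated by backpropagation.
layer_logits = np.zeros(num_layers)
weights = softmax(layer_logits)  # shape: (num_layers,), sums to 1

# Dynamically fused representation: weighted sum over layers.
fused = np.tensordot(weights, layer_outputs, axes=1)  # (seq_len, hidden)
print(fused.shape)  # (8, 32)
```

In a full model, the fused representation would then feed the TextCNN and Attention channels described in the abstract; this sketch only shows the fusion step itself.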