Academic Paper

Classifying Textual Sentiment Using Bidirectional Encoder Representations from Transformers
Document Type
Conference
Source
2023 26th International Conference on Computer and Information Technology (ICCIT), pp. 1-6, Dec. 2023
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Fields, Waves and Electromagnetics
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Training
Sentiment analysis
Social networking (online)
Computational modeling
Transformers
Grammar
Task analysis
Natural language processing
Transformer learning
Sentiment corpus
Text processing
Language
Abstract
Textual sentiment analysis (TSA) has gained significant attention recently for its wide-ranging applications across various research domains and industries. However, most existing research and sentiment analysis tools are primarily tailored to English texts. The unique linguistic complexities of the Bengali language, coupled with a paucity of comprehensive resources and tools, pose distinctive challenges for TSA in Bengali. This paper introduces an intelligent approach based on transformer learning, harnessing self-attention mechanisms to handle Bengali sentences containing ungrammatical structures or local dialects. To tackle the downstream TSA task in Bengali, this work explores a range of machine learning (ML), deep learning (DL), and transformer-based baselines. Experimental results reveal that the Bangla BERT model outperforms the other baselines, achieving the highest weighted F1-score of 0.69.
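The sketch below illustrates, in broad strokes, the kind of transformer fine-tuning pipeline the abstract describes: a pretrained Bangla BERT encoder fine-tuned for sentence-level sentiment classification and evaluated with a weighted F1-score. It is a minimal illustration, not the authors' implementation; the checkpoint name (sagorsarker/bangla-bert-base), the three-class label scheme, the CSV file layout, and all hyperparameters are assumptions made for the example.

```python
# Minimal sketch of fine-tuning a Bangla BERT model for textual sentiment analysis.
# Assumptions (not confirmed by the paper): the "sagorsarker/bangla-bert-base"
# checkpoint, three sentiment classes, and hypothetical train.csv / test.csv files
# with "text" and "label" columns.

import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "sagorsarker/bangla-bert-base"  # assumed Bangla BERT checkpoint
NUM_LABELS = 3                               # e.g. positive / neutral / negative

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=NUM_LABELS)

# Hypothetical sentiment corpus split into train/test CSV files.
dataset = load_dataset("csv", data_files={"train": "train.csv",
                                          "test": "test.csv"})

def tokenize(batch):
    # Truncate/pad Bengali sentences to a fixed length for batching.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # The abstract reports a weighted F1-score, so evaluate the same way.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"weighted_f1": f1_score(labels, preds, average="weighted")}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bangla-bert-tsa",
                           num_train_epochs=3,
                           per_device_train_batch_size=16,
                           evaluation_strategy="epoch"),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())
```

Because the encoder's self-attention layers are pretrained on Bengali text, fine-tuning only adds a small classification head on top, which is what lets such a model cope with noisy or ungrammatical input better than feature-based ML/DL baselines.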