Academic Paper
Comparative Approaches to Sentiment Analysis Using Datasets in Major European and Arabic Languages
Document Type
Working Paper
Author
Source
Subject
Language
Abstract
This study explores transformer-based models such as BERT, mBERT, and XLM-R for multilingual sentiment analysis across diverse linguistic structures. Key contributions include the identification of XLM-R's superior adaptability to morphologically complex languages, where it achieves accuracy levels above 88%. The work highlights fine-tuning strategies and emphasizes their significance for improving sentiment classification in underrepresented languages.
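The abstract refers to fine-tuning XLM-R for sentiment classification. As a minimal sketch of what such a setup typically looks like with the Hugging Face transformers and datasets libraries; the model checkpoint, file names ("train.csv", "dev.csv"), three-way label scheme, and hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Sketch: fine-tuning XLM-RoBERTa for sentiment classification.
# Assumes Hugging Face `transformers`/`datasets` and a hypothetical CSV
# dataset with "text" and "label" columns; not the authors' actual pipeline.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "xlm-roberta-base"
NUM_LABELS = 3  # assumed negative / neutral / positive labels

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=NUM_LABELS
)

# Hypothetical train/validation splits stored as CSV files.
dataset = load_dataset(
    "csv", data_files={"train": "train.csv", "validation": "dev.csv"}
)

def tokenize(batch):
    # Truncate/pad sentences to a fixed length for batching.
    return tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=128
    )

encoded = dataset.map(tokenize, batched=True)

training_args = TrainingArguments(
    output_dir="xlmr-sentiment",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
)

trainer.train()
```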
Comment: 11th International Conference on Advances in Computer Science and Information Technology (ACSTY 2025)