Journal Article

Context-Aware Dynamic Word Embeddings for Aspect Term Extraction
Document Type
Periodical
Source
IEEE Transactions on Affective Computing, 15(1):144-156, Jan. 2024
Subject
Computing and Processing
Robotics and Control Systems
Signal Processing and Analysis
Task analysis
Fans
Feature extraction
Data mining
Context modeling
Touch sensitive screens
Portable computers
Aspect term extraction
attention mechanism
sentiment analysis
word embedding
Language
English
ISSN
1949-3045
2371-9850
Abstract
The aspect term extraction (ATE) task aims to extract aspect terms, which describe a part or an attribute of a product, from review sentences. Most existing works rely on either general or domain embeddings to address this problem. Despite promising results, most methods still overlook the respective importance of general and domain embeddings, which degrades performance. In addition, word embeddings are tied to downstream tasks, and how to regularize them to capture context-aware information remains an open problem. To address these issues, we first propose context-aware dynamic word embedding (CDWE), which simultaneously considers the general meanings, domain-specific meanings, and contextual information of words. Based on CDWE, we propose an attention-based convolutional neural network for ATE, called ADWE-CNN, which adaptively captures the aforementioned meanings of words by using an attention mechanism to assign different importance to the respective embeddings. Experimental results show that ADWE-CNN achieves performance comparable to state-of-the-art approaches, and ablation studies explore the contribution of each component. Our code is publicly available at http://github.com/xiejiajia2018/ADWE-CNN.
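To make the mechanism described in the abstract concrete, the following PyTorch sketch shows one plausible reading of it: per-token attention weights fuse general, domain-specific, and context-aware embedding views into a single dynamic embedding, and a convolutional tagger predicts BIO labels for aspect terms. All module names, dimensions, and the BIO tagging head are illustrative assumptions rather than the authors' implementation; the actual code is in the linked repository.

```python
# Illustrative sketch only: attention-weighted fusion of three embedding
# views (general / domain / context) followed by a CNN sequence tagger.
# Names and dimensions are assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveDynamicEmbedding(nn.Module):
    """Fuse general, domain, and context embeddings with per-token attention."""

    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)  # scores each embedding view per token

    def forward(self, general, domain, context):
        # each input: (batch, seq_len, dim)
        views = torch.stack([general, domain, context], dim=2)      # (B, T, 3, D)
        weights = F.softmax(self.score(views).squeeze(-1), dim=-1)  # (B, T, 3)
        return (weights.unsqueeze(-1) * views).sum(dim=2)           # (B, T, D)


class CNNTagger(nn.Module):
    """Convolutional tagger mapping fused embeddings to per-token BIO logits."""

    def __init__(self, dim, num_tags=3, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)
        self.out = nn.Linear(dim, num_tags)

    def forward(self, x):                         # x: (B, T, D)
        h = F.relu(self.conv(x.transpose(1, 2)))  # (B, D, T)
        return self.out(h.transpose(1, 2))        # (B, T, num_tags)


if __name__ == "__main__":
    # Random tensors stand in for the three precomputed embedding views.
    B, T, D = 2, 10, 100
    fuse, tagger = AttentiveDynamicEmbedding(D), CNNTagger(D)
    gen, dom, ctx = (torch.randn(B, T, D) for _ in range(3))
    logits = tagger(fuse(gen, dom, ctx))
    print(logits.shape)  # torch.Size([2, 10, 3]): B-/I-/O scores per token
```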