Academic Paper

A Named Entity Recognition Method Based on ALBERT's Multi-Headed Attention Mechanism with Word Fusion
Document Type
Conference
Source
2023 IEEE 14th International Conference on Software Engineering and Service Science (ICSESS), pp. 195-200, Oct. 2023
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Engineering Profession
Photonics and Electrooptics
Signal Processing and Analysis
Training
Annotations
Semantics
Neural networks
Feature extraction
Decoding
Data mining
named entity recognition
multi-headed attention mechanism
word fusion
ALBERT pretrained language model
BiLSTM
CRF
Language
English
ISSN
2327-0594
Abstract
To address the shortcomings of Chinese named entity recognition with respect to sub-word ambiguity and performance, we propose a named entity recognition method based on ALBERT's multi-headed attention mechanism and word fusion. The method first uses the pretrained language model ALBERT to obtain dynamic word vectors for the text and fully extract its information. In the word-fusion (CAW) layer, a CNN extracts features from the character vectors while a BiLSTM extracts the contextual semantic features of each word, and the two outputs are concatenated and fused. The output of the CAW layer is then fed into a BiLSTM to obtain deep features, after which a multi-headed attention mechanism captures how tightly the elements of a sentence are connected. Finally, a CRF obtains the most likely label sequence as the recognition result. Experiments on the People's Daily dataset achieve an F1 value of 97.14%, outperforming the comparison models and demonstrating the effectiveness of the model.
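A minimal PyTorch sketch of the pipeline described in the abstract follows, not the authors' implementation. Assumptions: ALBERT's dynamic word vectors are stood in for by a plain embedding layer, the CAW word-fusion layer is modelled as a character CNN concatenated with a word-level BiLSTM, and CRF decoding is replaced by a per-token argmax for brevity; all dimensions and names (WordFusionNER, hidden sizes, tag count) are illustrative.

# Sketch of the ALBERT + CAW word fusion + BiLSTM + multi-head attention + CRF pipeline.
# ALBERT and the CRF decoder are replaced by simple stand-ins to keep the example self-contained.
import torch
import torch.nn as nn

class WordFusionNER(nn.Module):
    def __init__(self, vocab_size=5000, char_vocab=100, emb_dim=128,
                 char_dim=32, hidden=128, heads=4, num_tags=9):
        super().__init__()
        # Stand-in for ALBERT dynamic word vectors (a real system would call
        # a pretrained ALBERT encoder here instead of a static embedding).
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # CAW layer, part 1: character-level features via a 1-D CNN.
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        self.char_cnn = nn.Conv1d(char_dim, char_dim, kernel_size=3, padding=1)
        # CAW layer, part 2: contextual word features via a BiLSTM.
        self.caw_lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Deep-feature BiLSTM over the fused (concatenated) representation.
        self.deep_lstm = nn.LSTM(2 * hidden + char_dim, hidden,
                                 batch_first=True, bidirectional=True)
        # Multi-head self-attention to capture dependencies between tokens.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        # Per-tag emission scores; a CRF layer would normally decode these jointly.
        self.emit = nn.Linear(2 * hidden, num_tags)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq); char_ids: (batch, seq, max_chars)
        words = self.word_emb(word_ids)                          # (B, T, emb_dim)
        b, t, c = char_ids.shape
        chars = self.char_emb(char_ids.view(b * t, c))           # (B*T, C, char_dim)
        chars = self.char_cnn(chars.transpose(1, 2))             # (B*T, char_dim, C)
        chars = chars.max(dim=2).values.view(b, t, -1)           # pooled character features
        ctx, _ = self.caw_lstm(words)                            # (B, T, 2*hidden)
        fused = torch.cat([ctx, chars], dim=-1)                  # word fusion (CAW output)
        deep, _ = self.deep_lstm(fused)                          # deep features
        attended, _ = self.attn(deep, deep, deep)                # multi-head attention
        return self.emit(attended)                               # (B, T, num_tags)

# Toy usage: per-token argmax stands in for CRF Viterbi decoding.
model = WordFusionNER()
scores = model(torch.randint(0, 5000, (2, 10)), torch.randint(0, 100, (2, 10, 6)))
print(scores.argmax(-1).shape)  # torch.Size([2, 10])

In practice the argmax step would be replaced by a CRF that scores whole label sequences, which is what gives the method its final recognition result.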