Academic Paper
Joint Extraction of Entity and Relation Based on Pre-trained Language Model
Document Type
Conference
Author
Source
2020 12th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), vol. 2, pp. 179-183, Aug. 2020
Subject
Language
Abstract
Extracting entities and relations from unstructured text is key to building a large-scale knowledge graph. In recent years, neural-network-based relation extraction approaches have achieved good results. However, existing methods cannot accurately extract overlapping entities (i.e., one entity shared by multiple relations). In this paper, we propose a simple Electra-based joint model for relation extraction. We first use a subject extractor to identify the subject, and then use a predicate-object extractor to predict the corresponding predicates and objects conditioned on the encoded representation vector of the detected subject. Experiments on the DuIE2.0 Chinese dataset show good performance in extracting triples with overlapping entities.
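The cascade described above (a subject extractor followed by a subject-conditioned predicate-object extractor) can be sketched with plain numpy. This is a minimal illustration under loose assumptions, not the paper's implementation: the random matrix H stands in for Electra token encodings, the weight matrices are untrained, and span pairing simply matches each start tag with the nearest following end tag.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, hidden, num_relations = 6, 8, 3

# Stand-in for Electra's per-token encodings (hypothetical values;
# the real model would produce these from the input sentence)
H = rng.standard_normal((seq_len, hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Subject extractor: two binary taggers per token (subject start / subject end)
W_s = rng.standard_normal((hidden, 2))
subj_probs = sigmoid(H @ W_s)
subj_starts = np.where(subj_probs[:, 0] > 0.5)[0]
subj_ends = np.where(subj_probs[:, 1] > 0.5)[0]

# Pair each start with the nearest following end to form subject spans
subjects = []
for s in subj_starts:
    later_ends = subj_ends[subj_ends >= s]
    if len(later_ends):
        subjects.append((int(s), int(later_ends[0])))

# Predicate-object extractor: add the pooled subject vector to every token
# encoding, then tag object start/end positions for each relation type
W_o = rng.standard_normal((hidden, num_relations * 2))
triples = []
for (s, e) in subjects:
    v_sub = H[s:e + 1].mean(axis=0)  # encoded representation of the subject
    obj_probs = sigmoid((H + v_sub) @ W_o).reshape(seq_len, num_relations, 2)
    for r in range(num_relations):
        o_starts = np.where(obj_probs[:, r, 0] > 0.5)[0]
        o_ends = np.where(obj_probs[:, r, 1] > 0.5)[0]
        for o_s in o_starts:
            later = o_ends[o_ends >= o_s]
            if len(later):
                triples.append(((s, e), r, (int(o_s), int(later[0]))))

print(triples)
```

Because the object tagger runs once per detected subject, one subject span can yield triples under several relations, which is how this cascade scheme handles overlapping entities.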