Journal of Jilin University Science Edition ›› 2021, Vol. 59 ›› Issue (6): 1439-1444.


Causality Extraction Based on BERT

JIANG Bo1, ZUO Wanli1,2, WANG Ying1,2   

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China;
    2. Key Laboratory of Symbol Computation and Knowledge Engineering, Ministry of Education, Jilin University, Changchun 130012, China
  • Received: 2020-11-16  Online: 2021-11-26  Published: 2021-11-26

Abstract: Aiming at the problem that traditional relation extraction models rely on machine-learning methods such as feature engineering, which suffer from low accuracy and complicated rules, we propose a BERT+BiLSTM+CRF method. First, bidirectional encoder representations from transformers (BERT) is used to pre-train on the corpus. Second, BERT dynamically generates word vectors according to context features, and the generated word vectors are encoded by a bidirectional long short-term memory network (BiLSTM). Finally, the encoded representations are fed into a conditional random field (CRF) layer to complete the causality extraction. Experimental results show that the accuracy of the model on the SemEval-CE dataset is 0.0541 higher than that of the BiLSTM+CRF+self-ATT model, improving the performance of deep-learning methods on the causality extraction task.
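The abstract casts causality extraction as sequence labeling: BERT produces context-dependent word vectors, a BiLSTM encodes them into per-token tag scores, and a CRF layer decodes the best tag sequence. The sketch below illustrates only the final CRF decoding step (Viterbi) over such scores; the BERT and BiLSTM stages are replaced by random emission scores, and the BIO tag set over cause/effect spans is an assumption for illustration, not the paper's exact labeling scheme.

```python
import numpy as np

# Assumed BIO tag set for cause/effect spans (illustrative only).
TAGS = ["O", "B-Cause", "I-Cause", "B-Effect", "I-Effect"]

def viterbi_decode(emissions, transitions):
    """CRF decoding: find the highest-scoring tag sequence.

    emissions: (seq_len, n_tags) per-token scores (in the paper these would
               come from the BiLSTM over BERT word vectors).
    transitions: (n_tags, n_tags) score of moving from tag i to tag j.
    Returns the best tag-index sequence.
    """
    seq_len, n_tags = emissions.shape
    score = emissions[0].copy()                    # best score ending in each tag
    back = np.zeros((seq_len, n_tags), dtype=int)  # back-pointers
    for t in range(1, seq_len):
        # total[i, j]: best path ending at tag i, then transitioning to tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        back[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # Follow back-pointers from the best final tag.
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(back[t][best[-1]]))
    return best[::-1]

rng = np.random.default_rng(0)
n_tags = len(TAGS)
emissions = rng.normal(size=(6, n_tags))     # stand-in for BiLSTM outputs
transitions = rng.normal(size=(n_tags, n_tags))
# Forbid illegal BIO moves (e.g. O -> I-Cause) with a large negative score;
# in a trained CRF these constraints are learned from data.
transitions[TAGS.index("O"), TAGS.index("I-Cause")] = -1e4
transitions[TAGS.index("O"), TAGS.index("I-Effect")] = -1e4

path = viterbi_decode(emissions, transitions)
print([TAGS[i] for i in path])
```

In the full model, the emission scores would come from a linear layer on top of the BiLSTM hidden states, and both the emissions and the transition matrix would be trained jointly by maximizing the CRF log-likelihood.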

Key words: causality extraction, sequence labeling, bidirectional long short-term memory (BiLSTM), bidirectional encoder representations from transformers (BERT) model

CLC Number: TP391