Journal of Jilin University Science Edition ›› 2026, Vol. 64 ›› Issue (2): 291-300.


Knowledge Graph Completion Algorithm Based on Large Language Models and Adapter Driver

JIANG Yunqi1, HAN Xiaotong2, TIAN Yuan2   

1. College of Computer Science and Technology, Jilin University, Changchun 130012, China;
    2. College of Artificial Intelligence, Jilin University, Changchun 130012, China
• Received: 2024-12-19  Online: 2026-03-26  Published: 2026-03-26

Abstract: To address three problems of knowledge graph completion methods that use Transformer as the backbone network, namely parameter redundancy in feed-forward networks, difficulty in identifying tail entities in commonsense scenarios, and embedding bias in contrastive learning, we proposed an adapter-enhanced knowledge graph completion algorithm that integrates large language models with multi-positive sample contrastive learning. The algorithm reduces redundant features by introducing multi-head adapters into the feed-forward network, uses large language models to enhance commonsense reasoning ability, and corrects embedding bias through multi-positive sample contrastive learning. Experimental results show that, compared with current state-of-the-art models, the algorithm improves MRR by 5.4% and 9.2% on the WN18RR and FB15k-237 datasets, respectively, and by 3.6% and 6.7% in the transductive and inductive settings on the more complex Wikidata5M dataset, respectively, demonstrating superior generalization ability in low-resource and complex scenarios.
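To illustrate the multi-positive sample contrastive learning mentioned in the abstract, the sketch below implements a generic InfoNCE-style loss in which each anchor (e.g. a head-entity-plus-relation encoding) is contrasted against a candidate pool containing several valid tail entities, and the log-probability is averaged over all positives rather than a single one. This is a minimal illustrative sketch, not the paper's actual loss; the function name, the cosine-similarity scoring, and the temperature value are all assumptions.

```python
import numpy as np

def multi_positive_info_nce(anchor, candidates, positive_mask, temperature=0.05):
    """InfoNCE-style loss averaged over multiple positives per anchor.

    anchor:        (B, D) anchor embeddings (e.g. head + relation encodings)
    candidates:    (N, D) candidate tail-entity embeddings
    positive_mask: (B, N) boolean array, True where a candidate is a valid tail
    """
    # L2-normalize so the dot product is a cosine similarity
    a = anchor / np.linalg.norm(anchor, axis=-1, keepdims=True)
    c = candidates / np.linalg.norm(candidates, axis=-1, keepdims=True)
    logits = a @ c.T / temperature                       # (B, N) scaled similarities

    # numerically stable log-softmax over the candidate axis
    m = logits.max(axis=-1, keepdims=True)
    log_z = m + np.log(np.exp(logits - m).sum(axis=-1, keepdims=True))
    log_prob = logits - log_z

    # average the log-probability over each anchor's positives
    pos_counts = np.maximum(positive_mask.sum(axis=-1), 1)
    loss = -(log_prob * positive_mask).sum(axis=-1) / pos_counts
    return loss.mean()

# Example: 4 anchors, 10 candidates; anchor 0 has two valid tails
rng = np.random.default_rng(0)
anchor = rng.normal(size=(4, 8))
cands = rng.normal(size=(10, 8))
mask = np.zeros((4, 10), dtype=bool)
mask[np.arange(4), np.arange(4)] = True
mask[0, 5] = True
print(multi_positive_info_nce(anchor, cands, mask))
```

Averaging over multiple positives, rather than picking one positive per anchor, is one common way to reduce the embedding bias that arises when equally valid tail entities are treated as negatives of each other.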

Key words: knowledge graph completion, knowledge graph, large language model, contrastive learning, adapter learning

CLC Number: 

  • TP391.1