LI Weijun, GU Jianlai, ZHANG Xinyong, et al. Relation Learning Completion Model for Few-shot Knowledge Graphs[J]. Journal of Zhengzhou University (Engineering Science), 2024, 45(04): 53-61. [doi:10.13705/j.issn.1671-6833.2024.01.016]

Relation Learning Completion Model for Few-shot Knowledge Graphs

Journal of Zhengzhou University (Engineering Science) [ISSN: 1671-6833 / CN: 41-1339/T]

Volume:
45
Issue:
No. 04, 2024
Pages:
53-61
Publication date:
2024-06-16

Article Info

Title:
Relation Learning Completion Model for Few-shot Knowledge Graphs
Article number:
1671-6833(2024)04-0053-09
Authors:
LI Weijun (李卫军), GU Jianlai (顾建来), ZHANG Xinyong (张新勇), GAO Yuxiao (高庾潇), LIU Jintong (刘锦彤)
Affiliation:
School of Computer Science and Engineering, North Minzu University, Yinchuan 750021, China
Keywords:
knowledge graph completion; few-shot relation; neighborhood aggregation; link prediction
CLC number:
TP391; TP183
DOI:
10.13705/j.issn.1671-6833.2024.01.016
Document code:
A
Abstract:
In few-shot knowledge graphs, the relationships between entity pairs are diverse and complex. However, existing few-shot knowledge graph completion methods commonly suffer from insufficient relation learning capability and neglect the contextual semantics associated with entities. To address these challenges, a few-shot relation learning completion model (FRLC) is proposed. First, a gating mechanism is introduced when aggregating high-order neighborhood entity information, enriching the representation of the central entity while mitigating the adverse effect of noisy neighbors. Second, in the relation representation learning phase, the correlations among entity pairs in the reference set are leveraged to obtain more accurate relation representations. Finally, an LSTM structure is incorporated into the Transformer-based learner to further capture the contextual semantic information of entities and relations, which is used to predict new factual knowledge. To validate the effectiveness of FRLC, 5-shot link prediction experiments were conducted on the publicly available NELL-One and Wiki-One datasets, comparing FRLC with six few-shot knowledge graph completion models and five traditional models. The results show improvements for FRLC on all four metrics (MRR, Hits@10, Hits@5, and Hits@1), demonstrating the model's effectiveness.
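The gated aggregation of high-order neighborhood information is only outlined in the abstract. The following PyTorch sketch is an illustrative approximation, not the authors' actual FRLC module: an attention score weights each (relation, neighbor) pair, and a learned gate decides how much of the aggregated neighbor summary to mix into the central entity embedding, which is how noisy neighbors can be suppressed. Class names, tensor shapes, and the exact gating form are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedNeighborAggregator(nn.Module):
    """Illustrative gated neighbor aggregation; a sketch, not the paper's exact FRLC module."""

    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)    # scores each (relation, neighbor) pair
        self.gate = nn.Linear(2 * dim, dim)  # decides how much neighbor information to admit

    def forward(self, entity: torch.Tensor, rel_emb: torch.Tensor, nbr_emb: torch.Tensor) -> torch.Tensor:
        # entity:  (dim,)           embedding of the central entity
        # rel_emb: (num_nbrs, dim)  embeddings of the relations to its neighbors
        # nbr_emb: (num_nbrs, dim)  embeddings of the neighbor entities
        pair = torch.cat([rel_emb, nbr_emb], dim=-1)            # (num_nbrs, 2*dim)
        alpha = F.softmax(self.attn(pair).squeeze(-1), dim=0)   # attention weights over neighbors
        agg = torch.sum(alpha.unsqueeze(-1) * nbr_emb, dim=0)   # weighted neighbor summary, (dim,)
        g = torch.sigmoid(self.gate(torch.cat([entity, agg], dim=-1)))  # per-dimension gate in (0, 1)
        return g * agg + (1.0 - g) * entity  # the gate suppresses noisy neighborhood information

# Hypothetical usage: a 100-dimensional central entity with 8 neighbors.
aggregator = GatedNeighborAggregator(dim=100)
enriched = aggregator(torch.randn(100), torch.randn(8, 100), torch.randn(8, 100))
```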

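Likewise, the relation learning stage, in which a relation representation is inferred from the correlations among the K reference (support) entity pairs, can be pictured with the generic sketch below: each support pair is encoded from its head and tail embeddings, and self-attention over the pairs pools them into a single relation vector. Shapes, module names, and the pooling choice are assumptions for illustration rather than the exact FRLC procedure.

```python
import torch
import torch.nn as nn

class ReferenceRelationEncoder(nn.Module):
    """Illustrative sketch: pool a relation representation from K reference (support) pairs."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.pair_proj = nn.Linear(2 * dim, dim)  # encode each (head, tail) support pair
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, head_emb: torch.Tensor, tail_emb: torch.Tensor) -> torch.Tensor:
        # head_emb, tail_emb: (K, dim) embeddings of the K reference pairs of one relation
        pairs = torch.tanh(self.pair_proj(torch.cat([head_emb, tail_emb], dim=-1)))  # (K, dim)
        x = pairs.unsqueeze(0)              # (1, K, dim), a batch holding one relation
        ctx, _ = self.self_attn(x, x, x)    # pairs attend to each other, capturing their correlations
        return ctx.mean(dim=1).squeeze(0)   # (dim,) pooled relation representation

# Hypothetical usage: a 5-shot reference set with 100-dimensional embeddings.
encoder = ReferenceRelationEncoder(dim=100)
relation_vec = encoder(torch.randn(5, 100), torch.randn(5, 100))
```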
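For completeness, the reported evaluation metrics (MRR and Hits@1/5/10) are the standard link prediction measures derived from the rank of the correct entity among all candidates; the short snippet below shows the usual computation on hypothetical ranks.

```python
def mrr_and_hits(ranks, ks=(1, 5, 10)):
    """Compute MRR and Hits@k from the 1-based ranks of the correct entities."""
    n = len(ranks)
    mrr = sum(1.0 / r for r in ranks) / n
    hits = {k: sum(1 for r in ranks if r <= k) / n for k in ks}
    return mrr, hits

# Hypothetical ranks of the true tail entity for five test queries.
ranks = [1, 3, 12, 2, 7]
mrr, hits = mrr_and_hits(ranks)
print(f"MRR={mrr:.3f}  Hits@1={hits[1]:.2f}  Hits@5={hits[5]:.2f}  Hits@10={hits[10]:.2f}")
```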

Last Update: 2024-06-14