[1] CHEN H, CHEN X C, GONG X B, et al. Construction and application of wind turbine diagnosis system based on knowledge graph[J]. Journal of Zhengzhou University (Engineering Science), 2023, 44(6): 54-60, 98.
[2] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[EB/OL]. (2019-07-26)[2025-07-08]. https://doi.org/10.48550/arXiv.1907.11692.
[3] ZELENKO D, AONE C, RICHARDELLA A. Kernel methods for relation extraction[C]//Proceedings of the ACL-02 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2002: 71-78.
[4] YU X F, LAM W. Jointly identifying entities and extracting relations in encyclopedia text via a graphical model approach[C]//Proceedings of the 23rd International Conference on Computational Linguistics. Stroudsburg: ACL, 2010: 1399-1407.
[5] LI Q, JI H. Incremental joint extraction of entity mentions and relations[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg: ACL, 2014: 402-412.
[6] MIWA M, SASAKI Y. Modeling joint entity and relation extraction with table representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2014: 1858-1869.
[7] WEI Z P, SU J L, WANG Y, et al. A novel cascade binary tagging framework for relational triple extraction[EB/OL]. (2019-09-07)[2025-07-08]. https://doi.org/10.48550/arXiv.1909.03227.
[8] WANG Y C, YU B W, ZHANG Y Y, et al. TPLinker: single-stage joint extraction of entities and relations through token pair linking[EB/OL]. (2020-10-26)[2025-07-08]. https://doi.org/10.48550/arXiv.2010.13415.
[9] YAN Z H, ZHANG C, FU J L, et al. A partition filter network for joint entity and relation extraction[C]//Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2021: 185-197.
[10] ZHENG H Y, WEN R, CHEN X, et al. PRGC: potential relation and global correspondence based joint relational triple extraction[EB/OL]. (2021-06-18)[2025-07-08]. https://doi.org/10.48550/arXiv.2106.09895.
[11] LI X M, LUO X T, DONG C H, et al. TDEER: an efficient translating decoding schema for joint extraction of entities and relations[C]//Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2021: 8055-8064.
[12] SUI D B, ZENG X R, CHEN Y B, et al. Joint entity and relation extraction with set prediction networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(9): 12784-12795.
[13] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[EB/OL]. (2018-10-11)[2025-07-08]. https://doi.org/10.48550/arXiv.1810.04805.
[14] GAO C, ZHANG X, LI L Y, et al. ERGM: a multi-stage joint entity and relation extraction with global entity match[J]. Knowledge-Based Systems, 2023, 271: 110550.
[15] LI R, LA K J, LEI J S, et al. Joint extraction model of entity relations based on decomposition strategy[J]. Scientific Reports, 2024, 14(1): 1786.
[16] SONG L, WEI Z J, CHEN Y, et al. Joint extraction method and system of Chinese entities and relations based on RoBERTa and pointer network: CN116663539A[P]. 2023-08-29.
[17] CUI Y M, CHE W X, LIU T, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504-3514.
[18] VINYALS O, FORTUNATO M, JAITLY N. Pointer networks[EB/OL]. (2015-06-09)[2025-07-08]. https://doi.org/10.48550/arXiv.1506.03134.
[19] ZHANG Q, ZENG J W, CHEN R. Entity-relation joint extraction model based on contrastive learning and gradient penalty[J]. Journal of Jilin University (Science Edition), 2024, 62(5): 1155-1162.
[20] LI S J, HE W, SHI Y B, et al. DuIE: a large-scale Chinese dataset for information extraction[C]//Natural Language Processing and Chinese Computing: 8th CCF International Conference. Cham: Springer, 2019: 791-800.
[21] LAN Z Z, CHEN M D, GOODMAN S, et al. ALBERT: a lite BERT for self-supervised learning of language representations[EB/OL]. (2019-09-26)[2025-07-08]. https://doi.org/10.48550/arXiv.1909.11942.
[22] CLARK K, LUONG M T, LE Q V, et al. ELECTRA: pre-training text encoders as discriminators rather than generators[EB/OL]. (2020-03-23)[2025-07-08]. https://doi.org/10.48550/arXiv.2003.10555.