
Topic Matching Algorithm Based on Multi-feature Fusion of Key Entities and Text Abstracts
[1] JI Ke, ZHANG Xiu, MA Kun, et al. Topic Matching Algorithm Based on Multi-feature Fusion of Key Entities and Text Abstracts[J]. Journal of Zhengzhou University (Engineering Science), 2024, 45(02): 51-59. [doi:10.13705/j.issn.1671-6833.2024.02.008]
References:
[1] MALA V, LOBIYAL D K. Semantic and keyword based web techniques in information retrieval[C]//2016 International Conference on Computing, Communication and Automation (ICCCA). Piscataway: IEEE, 2017: 23-26.
[2] CHEN N. Key words retrieval skills based on network[J]. China Science and Technology Information, 2008(2): 115, 117.
[3] COHEN W W, RAVIKUMAR P, FIENBERG S. A comparison of string distance metrics for name-matching tasks[C]//Proceedings of the 2003 International Conference on Information Integration on the Web. New York: ACM, 2003: 73-78.
[4] PANG L, LAN Y Y, XU J, et al. A survey on deep text matching[J]. Chinese Journal of Computers, 2017, 40(4): 985-1003.
[5] LIU J, KONG X, ZHOU X, et al. Data mining and information retrieval in the 21st century: a bibliographic review[J]. Computer Science Review, 2019, 34: 100193.
[6] ARORA S, BATRA K, SINGH S. Dialogue system: a brief review[EB/OL]. (2013-06-18)[2023-06-15]. https://arxiv.org/abs/1306.4134.
[7] MUELLER J, THYAGARAJAN A. Siamese recurrent architectures for learning sentence similarity[C]//Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence. New York: ACM, 2016: 2786-2792.
[8] YIN W P, SCHÜTZE H, XIANG B, et al. ABCNN: attention-based convolutional neural network for modeling sentence pairs[J]. Transactions of the Association for Computational Linguistics, 2016, 4: 259-272.
[9] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[EB/OL]. (2018-10-11)[2023-06-15]. https://arxiv.org/abs/1810.04805.
[10] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[EB/OL]. (2019-07-26)[2023-06-15]. https://arxiv.org/abs/1907.11692.
[11] WEI J Q, REN X Z, LI X G, et al. NEZHA: neural contextualized representation for Chinese language understanding[EB/OL]. (2019-08-31)[2023-06-15]. https://arxiv.org/abs/1909.00204.
[12] PEINELT N, NGUYEN D, LIAKATA M. TBERT: topic models and BERT joining forces for semantic similarity detection[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2020: 7047-7055.
[13] ZHOU A H, WENG Z Y, ZHOU S Y, et al. A service discovery method based on topic filtering and semantic matching[J]. Journal of Zhengzhou University (Engineering Science), 2022, 43(6): 36-41, 56.
[14] MIAO C Y, CAO Z, TAM Y C. Keyword-attentive deep semantic matching[EB/OL]. (2020-05-11)[2023-06-15]. https://arxiv.org/abs/2003.11516.
[15] ZOU Y C, LIU H W, GUI T, et al. Divide and conquer: text semantic matching with disentangled keywords and intents[EB/OL]. (2022-05-06)[2023-06-15]. https://arxiv.org/abs/2203.02898.
[16] HUANG Z H, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[EB/OL]. (2015-08-09)[2023-06-15]. https://arxiv.org/abs/1508.01991.
[17] LI J Y, FEI H, LIU J, et al. Unified named entity recognition as word-word relation classification[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2022, 36(10): 10965-10973.
[18] LIU Y. Fine-tune BERT for extractive summarization[EB/OL]. (2019-05-25)[2023-06-15]. https://arxiv.org/abs/1903.10318.
[19] ZHANG J Q, ZHAO Y, SALEH M, et al. PEGASUS: pre-training with extracted gap-sentences for abstractive summarization[EB/OL]. (2019-12-18)[2023-06-15]. https://arxiv.org/abs/1912.08777.
[20] YU Y, SI X, HU C, et al. A review of recurrent neural networks: LSTM cells and network architectures[J]. Neural Computation, 2019, 31(7): 1235-1270.
[21] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL]. (2013-01-16)[2023-06-15]. https://arxiv.org/abs/1301.3781.
[22] ZHANG J X, GAN R Y, WANG J J, et al. Fengshenbang 1.0: being the foundation of Chinese cognitive intelligence[EB/OL]. (2022-09-07)[2023-06-15]. https://arxiv.org/abs/2209.02970.
[23] LI Y, JIN Q Y, ZHANG Q C. Improved BLSTM food review sentiment analysis with positional attention mechanisms[J]. Journal of Zhengzhou University (Engineering Science), 2020, 41(1): 58-62.
[24] Sohu. 2021 Sohu campus text matching algorithm competition[EB/OL]. (2021-03-29)[2023-06-15]. https://www.biendata.xyz/competition/sohu_2021/.
[25] REIMERS N, GUREVYCH I. Sentence-BERT: sentence embeddings using Siamese BERT-networks[EB/OL]. (2019-08-27)[2023-06-15]. https://arxiv.org/abs/1908.10084.
[26] SUN Y, WANG S H, FENG S K, et al. ERNIE 3.0: large-scale knowledge enhanced pre-training for language understanding and generation[EB/OL]. (2021-07-05)[2023-06-15]. https://arxiv.org/abs/2107.02137.
Last Update: 2024-03-08
Copyright © 2023 Editorial Board of Journal of Zhengzhou University (Engineering Science)