A Knowledge Graph Enhanced Pre-Trained Large Language Model for Predicting MicroRNA-circRNA Interactions
Authors: Jiren Zhou, Rui Niu, Boya Ji, Zhuhong You, Xuequn Shang. Big Data Mining and Analytics, 2025, Issue 6, pp. 1405-1417 (13 pages).
The interactions between circular RNAs (circRNAs) and microRNAs are one of the key mechanisms determining the functions of non-coding RNAs (ncRNAs) in biological processes such as DNA methylation and RNA-induced silencing. Studying these relationships can deepen our understanding of the roles these RNAs play in developing cancer vaccines and designing treatments. Therefore, we propose a knowledge graph enhanced pre-trained Large Language Model (LLM) for predicting circRNA-microRNA interactions. Our approach employs graph contrastive learning to represent a knowledge graph consisting of circRNA and microRNA entities from multiple views. The features of these entities are derived by fine-tuning a sequential LLM on the two types of ncRNAs separately. Finally, the embeddings are fed into a classifier for prediction. We employ an independent testing set to evaluate the model's performance and compare our model against recently reported models on two datasets. Our model achieves approximately a 3% improvement in Area Under the Receiver Operating Characteristic Curve (AUROC), reaching 93.77% and 93.07%, respectively. The stability of our model is assessed by performing 10-fold cross-validation on the remaining training set, where our model exhibits the best stability. In the ablation study, we comprehensively compare strategies for sequence processing and the effectiveness of each independent module. Finally, on a case study dataset derived from real-world scenarios, the model assigns scores to all candidates and ranks them accordingly. Among the top 10 highest-scoring results, 7 have been validated by wet-lab experiments, highlighting the model's strong generalization capability.
Keywords: knowledge graph, sequential Large Language Model (LLM), graph contrastive learning, circRNA-microRNA interactions, self-supervised neural network
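The abstract describes a pipeline in which knowledge-graph embeddings learned via graph contrastive learning are combined with sequence features from a fine-tuned language model, and the fused representation of a candidate circRNA-microRNA pair is scored by a classifier. The sketch below is a minimal illustration of that fusion-and-classification step, not the authors' implementation; the embedding dimensions and the MLP classifier head are assumptions for illustration.

```python
# Minimal sketch: fuse graph-view and sequence-view embeddings of a
# circRNA-microRNA pair and score the pair with an MLP classifier.
import torch
import torch.nn as nn

class InteractionClassifier(nn.Module):
    def __init__(self, graph_dim=128, seq_dim=768, hidden_dim=256):
        super().__init__()
        # Input is the concatenation of graph + sequence features for both entities
        self.mlp = nn.Sequential(
            nn.Linear(2 * (graph_dim + seq_dim), hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, circ_graph, circ_seq, mirna_graph, mirna_seq):
        # Concatenate the multi-view features of the circRNA and the microRNA
        x = torch.cat([circ_graph, circ_seq, mirna_graph, mirna_seq], dim=-1)
        # Interaction score in [0, 1]
        return torch.sigmoid(self.mlp(x)).squeeze(-1)

# Usage example with random placeholder features for 4 candidate pairs
model = InteractionClassifier()
scores = model(torch.randn(4, 128), torch.randn(4, 768),
               torch.randn(4, 128), torch.randn(4, 768))
print(scores.shape)  # torch.Size([4])
```

In the case-study setting described above, such scores would be computed for all candidate pairs and sorted to produce the ranked list from which the top 10 predictions are taken.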