2 articles found
New criteria on the existence and global exponential stability of periodic solutions for quaternion-valued cellular neural networks
Authors: LI Ai-ling, ZHOU Zheng, ZHANG Zheng-qiu. Applied Mathematics (A Journal of Chinese Universities), 2025, No. 3, pp. 523-542 (20 pages)
In this paper, a class of quaternion-valued cellular neural networks (QVCNNs) with time-varying delays is considered. Combining graph theory with the continuation theorem of Mawhin's coincidence degree theory as well as the Lyapunov functional method, we establish new criteria on the existence and exponential stability of periodic solutions for QVCNNs by removing the assumptions that the activation functions are bounded and that they vanish at the origin. Hence, our results are less conservative and new.
Keywords: existence of periodic solutions; exponential stability; quaternion-valued cellular neural networks; combining graph theory with Mawhin's continuation theorem of coincidence degree theory; Lyapunov function method; inequality techniques
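For context, cellular neural networks with time-varying delays are commonly written in the following generic form; this is a sketch of the standard model class, and the paper's exact quaternion-valued system may differ in detail:

```latex
\dot{q}_i(t) = -a_i(t)\,q_i(t)
  + \sum_{j=1}^{n} b_{ij}(t)\, f_j\!\big(q_j(t)\big)
  + \sum_{j=1}^{n} c_{ij}(t)\, g_j\!\big(q_j(t-\tau_{ij}(t))\big)
  + I_i(t), \qquad q_i(t)\in\mathbb{H},
```

where $\mathbb{H}$ denotes the quaternions, $f_j, g_j$ are activation functions, $\tau_{ij}(t)$ are time-varying delays, and $I_i(t)$ are external inputs. The criteria described in the abstract drop the usual requirements that $f_j, g_j$ be bounded and satisfy $f_j(0)=g_j(0)=0$.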
E-CGL: an efficient continual graph learner
Authors: Jianhao GUO, Zixuan NI, Yun ZHU, Siliang TANG. Frontiers of Information Technology & Electronic Engineering, 2025, No. 8, pp. 1441-1453 (13 pages)
Continual learning (CL) has emerged as a crucial paradigm for learning from sequential data while retaining previous knowledge. Continual graph learning (CGL), characterized by dynamically evolving graphs built from streaming data, presents distinct challenges that demand efficient algorithms to prevent catastrophic forgetting. The first challenge stems from the interdependencies between different graph data, in which previous graphs influence new data distributions. The second challenge is handling large graphs efficiently. To address these challenges, we propose an efficient continual graph learner (E-CGL) in this paper. We address the interdependence issue by demonstrating the effectiveness of replay strategies and introducing a combined sampling approach that considers both node importance and diversity. To improve efficiency, E-CGL leverages a simple yet effective multilayer perceptron (MLP) model that shares weights with a graph neural network (GNN) during training, thereby accelerating computation by circumventing the expensive message-passing process. Our method achieves state-of-the-art results on four CGL datasets under two settings, while significantly lowering the catastrophic forgetting value to an average of −1.1%. Additionally, E-CGL achieves training and inference speedups averaging 15.83× and 4.89×, respectively, across the four datasets. These results indicate that E-CGL not only effectively manages correlations between different graph data during continual training but also enhances efficiency in large-scale CGL.
Keywords: graph neural networks; continual learning; dynamic graphs; continual graph learning; graph acceleration
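The weight-sharing idea in the abstract — an MLP that reuses a GNN's transformation weights so inference can skip message passing — can be illustrated with a minimal numpy sketch. All names and sizes here are invented for illustration; the actual E-CGL implementation presumably uses a deep-learning framework and a trained GNN:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, 8-dim features, 3 output classes (sizes hypothetical).
n, d, c = 4, 8, 3
X = rng.standard_normal((n, d))                     # node feature matrix
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)           # adjacency matrix
A_hat = A + np.eye(n)                               # add self-loops
A_norm = A_hat / A_hat.sum(axis=1, keepdims=True)   # row-normalized propagation

W = rng.standard_normal((d, c))                     # weights SHARED by both paths

def relu(x):
    return np.maximum(x, 0.0)

# GNN path (training-time view): aggregate neighbor features, then transform.
gnn_out = relu(A_norm @ X @ W)

# MLP path (fast view): the same weights W applied to raw features,
# with no message passing — this is what makes inference cheap.
mlp_out = relu(X @ W)

assert gnn_out.shape == mlp_out.shape == (n, c)
```

The two paths produce outputs of identical shape from the same parameters; the speedup comes from the MLP path avoiding the sparse aggregation `A_norm @ X`, which dominates cost on large graphs.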