Journal Articles: 26 articles found
1. Graph Transformer Techniques and Research Progress: From Fundamental Theory to Frontier Applications (Cited: 2)
Authors: 游浩, 丁苍峰, 马乐荣, 延照耀, 曹璐. 《计算机应用研究》 (PKU Core), 2025, No. 4, pp. 975-986 (12 pages)
Abstract: Graph data processing is a methodology for analyzing and manipulating graph-structured data and is widely used across many fields. The Graph Transformer, a model framework that learns directly from graph-structured data, combines the Transformer's self-attention mechanism with graph neural network techniques. By capturing global dependencies between nodes and precisely encoding graph topology, Graph Transformers achieve excellent performance and accuracy on tasks such as node classification, link prediction, and graph generation. Through self-attention, they can effectively capture both local and global information about nodes and edges, significantly improving model efficiency and performance. This survey examines the Graph Transformer model in depth, covering its background, basic principles, and detailed architecture, and analyzes it from three angles: attention mechanisms, module architecture, and the ability to handle complex graphs (including hypergraphs and dynamic graphs). It also reviews the current state of Graph Transformer applications and future trends, discusses open problems and challenges, and proposes possible improvements to advance research and applications in this area.
Keywords: graph neural network; Graph Transformer; graph representation learning; node classification
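To make the mechanism concrete, the following is a minimal NumPy sketch of one graph-transformer-style self-attention layer in which every node attends to every other node (global dependencies) and the adjacency matrix is added as a structural bias; the weights, dimensions, and the simple additive bias are illustrative assumptions, not the specific designs surveyed in the paper.

```python
import numpy as np

def graph_attention_layer(X, A, Wq, Wk, Wv, bias_scale=1.0):
    """One self-attention layer where every node attends to every other node
    (global dependencies); the adjacency matrix A is added as a structural
    bias so the graph topology still shapes the attention pattern."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])              # dense node-to-node scores
    scores = scores + bias_scale * A                     # inject topology as a bias
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = weights / weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ V                                   # updated node embeddings

rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))                              # toy node features
A = (rng.random((n, n)) < 0.4).astype(float)             # toy adjacency matrix
W = [rng.normal(size=(d, d)) * 0.1 for _ in range(3)]    # illustrative projection weights
print(graph_attention_layer(X, A, *W).shape)             # (5, 8)
```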
2. Towards automated software model checking using graph transformation systems and Bogor
Authors: Vahid Rafe, Adel T. Rahmani. Journal of Zhejiang University-Science A (Applied Physics & Engineering), SCIE/EI/CAS/CSCD, 2009, No. 8, pp. 1093-1105 (13 pages)
Abstract: Graph transformation systems have become a general formal modeling language for describing many models in the software development process. Behavioral modeling of dynamic systems and model-to-model transformations are only a few examples in which graphs have been applied to software development. But even a perfect graph transformation system must be equipped with automated analysis capabilities to let users understand whether such a formal specification fulfills their requirements. In this paper, we present a new solution to verify graph transformation systems using the Bogor model checker. Attributed graph grammar (AGG)-like graph transformation systems are translated to the Bandera intermediate representation (BIR), the input language of Bogor, and Bogor verifies the model against properties defined by combining linear temporal logic (LTL) and special-purpose graph rules. Experimental results are encouraging, showing that in most cases our solution improves on existing approaches in terms of both performance and expressiveness.
Keywords: graph transformation; verification; Bogor; attributed graph grammars (AGG); software model checking
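As a rough illustration of what a single graph transformation step does, the toy sketch below matches a left-hand-side edge pattern and rewrites it; real AGG/GROOVE-style rules carry attributes, typing, and negative application conditions that are omitted here, and all names are illustrative.

```python
from itertools import product

def apply_rule(edges, lhs, rhs):
    """Toy graph rewrite: find assignments of pattern variables to graph nodes
    so that every lhs edge is present, then replace the matched edges with the
    rhs edges under the same assignment, yielding each successor state."""
    nodes = sorted({u for e in edges for u in e})
    vars_ = sorted({v for e in lhs for v in e})
    for assignment in product(nodes, repeat=len(vars_)):
        m = dict(zip(vars_, assignment))
        matched = {(m[a], m[b]) for a, b in lhs}
        if matched <= set(edges):
            yield frozenset((set(edges) - matched) | {(m[a], m[b]) for a, b in rhs})

edges = {(0, 1), (1, 2)}
# toy rule: reverse an edge x -> y into y -> x
for new_state in apply_rule(edges, lhs=[("x", "y")], rhs=[("y", "x")]):
    print(sorted(new_state))
```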
3. A Graph Transformer-Based Complete Coverage Path Planning Method for UAVs
Authors: 陈旭, 王从庆, 曾强, 李战. 《计算机测量与控制》, 2025, No. 12, pp. 224-229, 277 (7 pages)
Abstract: To enable UAV damage inspection of 3D structures while avoiding collisions between the UAV and the structure and keeping the inspection accurate and efficient, a Graph Transformer-based complete coverage path planning method is proposed for the UAV coverage path planning problem. The problem is treated as a variant of the traveling salesman problem and solved with a graph neural network on a fully connected graph. An attention module is introduced into the graph neural network to mitigate the limitations of sparse message passing, and graph convolution is combined with the attention mechanism to extract node and edge features. The decoder estimates the probability that each edge appears in the solution, producing a probability heat map; a preliminary solution is obtained by beam search and then refined by local search. Experimental results show that, compared with reinforcement learning-based methods, search-based deep learning methods, an improved ant colony algorithm, and a genetic algorithm, the proposed method has clear advantages in performance and generalization. It applies to both Euclidean and non-Euclidean distances in 2D and 3D space and has strong practical value for UAV navigation and complete coverage path planning.
Keywords: Graph Transformer; complete coverage path planning; graph neural network; traveling salesman problem; attention mechanism
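The decode-and-search stage described above can be sketched roughly as follows: given an edge-probability heat map from the decoder, beam search keeps the most probable partial tours at each step. This is a simplified sketch; the paper's decoder, beam width, and local search refinement are not reproduced.

```python
import numpy as np

def beam_search_tour(P, beam_width=3):
    """Given an edge-probability heat map P (P[i, j] is the model's estimate
    that edge i->j belongs to the tour), extend partial tours and keep only
    the `beam_width` highest log-probability candidates at each step."""
    n = P.shape[0]
    beams = [([0], 0.0)]                                  # (partial tour, log-probability)
    for _ in range(n - 1):
        candidates = []
        for tour, logp in beams:
            last = tour[-1]
            for j in range(n):
                if j not in tour:
                    candidates.append((tour + [j], logp + np.log(P[last, j] + 1e-12)))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    best_tour, best_logp = beams[0]
    return best_tour + [0], best_logp                     # close the tour back to the start

rng = np.random.default_rng(1)
P = rng.random((6, 6))                                    # toy heat map
print(beam_search_tour(P))
```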
4. DEGENERATE OPTIMAL BASIS GRAPHS IN LINEAR PROGRAMMING (Cited: 1)
Authors: Lin Yixun, Wen Jianjun (Dept. of Math., Zhengzhou Univ., Zhengzhou 450052). Applied Mathematics (A Journal of Chinese Universities), SCIE/CSCD, 2000, No. 2, pp. 184-192 (9 pages)
Abstract: The basis graph G for a linear program consists of all bases under pivot transformations. A degenerate optimal basis graph G* is the subgraph of G induced by all optimal bases at a degenerate optimal vertex x_0. In this paper, several conditions for the characterization of G* are presented.
Keywords: linear programming; degeneracy; transformation graphs
5. Full Graph Methods of Switched Current Circuit Solution
Authors: Bohumil Brtnik. Computer Technology and Application, 2011, No. 6, pp. 471-478 (8 pages)
Abstract: Circuits with switched current are described by an admittance matrix, and seeking current transfers then means calculating the ratio of algebraic supplements of this matrix. As there are also graph methods of circuit analysis in addition to algebraic methods, it is in principle possible to analyze a whole switched circuit with two-phase switching exclusively by graph methods. For this purpose one can plot a Mason graph of the circuit, use transformation graphs to reduce the Mason graphs for all four phases of switching, and then plot a summary graph from the transformed graphs obtained this way. First the author draws nodes and the possible branches obtained from the transformation graphs for transfers of the EE (even-even) and OO (odd-odd) phases. In the next step, branches obtained from the transformation graphs for the EO and OE phases are drawn between these nodes, their resulting transfer being z^(1/2). This summary graph is extended by two branches, one from the input node and one to the output node; the extended graph can then be interpreted with Mason's formula to provide transparent current transfers. It is therefore not necessary to compose a summary admittance matrix and evaluate it numerically, and the final result can be reached in a purely graphical way.
Keywords: switched current circuits; two phases; transformation graph; Mason's formula; current transfer; summary MC-graph
6. A Survey of Research Progress on Graph Transformers (Cited: 3)
Authors: 周诚辰, 于千城, 张丽丝, 胡智勇, 赵明智. 《计算机工程与应用》 (CSCD, PKU Core), 2024, No. 14, pp. 37-49 (13 pages)
Abstract: With the widespread use of graph-structured data in real-world scenarios, the need to model and process it effectively keeps growing. Graph Transformers (GTs), a class of models that apply Transformers to graph data, can effectively alleviate the over-smoothing and over-squashing problems of traditional graph neural networks (GNNs) and therefore learn better feature representations. Based on a study of recent GT literature, existing architectures are grouped into two categories: the first injects positional and structural information about the graph into the Transformer through absolute and relative encodings, strengthening the Transformer's ability to understand and process graph-structured data; the second combines GNNs with Transformers in different ways (serial, alternating, or parallel) to exploit the strengths of both. The survey covers GT applications in information security, drug discovery, knowledge graphs, and other areas, and compares and summarizes models for different purposes together with their advantages and drawbacks. Finally, future research challenges for GTs are analyzed in terms of scalability, complex graphs, and better ways of combining GNNs with Transformers.
Keywords: Graph Transformers (GTs); graph neural network; graph representation learning; heterogeneous graph
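The first category above (absolute positional encodings) is often realized with Laplacian eigenvector encodings; a minimal sketch of that idea on a toy graph, with assumed feature dimensions:

```python
import numpy as np

def laplacian_positional_encoding(A, k=4):
    """Absolute positional encoding: the k smallest non-trivial eigenvectors
    of the normalized graph Laplacian, which can be concatenated to node
    features so a Transformer sees where each node sits in the graph."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    L = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, 1:k + 1]           # skip the trivial constant eigenvector

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
pe = laplacian_positional_encoding(A, k=2)
X_aug = np.concatenate([np.ones((4, 3)), pe], axis=1)   # features ++ positional encoding
print(X_aug.shape)                                       # (4, 5)
```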
7. Graph Transformer for Communities Detection in Social Networks (Cited: 2)
Authors: G. Naga Chandrika, Khalid Alnowibet, K. Sandeep Kautish, E. Sreenivasa Reddy, Adel F. Alrasheedi, Ali Wagdy Mohamed. Computers, Materials & Continua, SCIE/EI, 2022, No. 3, pp. 5707-5720 (14 pages)
Abstract: Graphs are used in various disciplines such as telecommunication, biological networks, as well as social networks. In large-scale networks, it is challenging to detect the communities by learning the distinct properties of the graph. As deep learning has made contributions in a variety of domains, we try to use deep learning techniques to mine knowledge from large-scale graph networks. In this paper, we aim to provide a strategy for detecting communities using deep autoencoders and to obtain generic neural attention on graphs. The advantages of neural attention are widely seen in the fields of NLP and computer vision, and it has low computational complexity for large-scale graphs. The contributions of the paper are summarized as follows. Firstly, a transformer is utilized to downsample the first-order proximities of the graph into a latent space, which can capture the structural properties and eventually assist in detecting the communities. Secondly, a fine-tuning task is conducted by carefully tuning various hyperparameters, applied to multiple social networks (Facebook and Twitch). Furthermore, the objective function (cross-entropy) is tuned by L0 regularization. Lastly, the reconstructed model forms communities that present the relationships between the groups. The proposed robust model provides good generalization and is applicable to obtaining not only the community structures in social networks but also node classification. The proposed graph transformer shows advanced performance on social networks, with average NMIs of 0.67±0.04, 0.198±0.02, 0.228±0.02, and 0.68±0.03 on the Wikipedia crocodiles, GitHub Developers, Twitch England, and Facebook Page-Page networks, respectively.
Keywords: social networks; graph transformer; community detection; graph classification
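The paper reports community quality as average NMI against ground-truth labels; a minimal example of how such a score is computed with scikit-learn, using made-up toy labels:

```python
from sklearn.metrics import normalized_mutual_info_score

# Toy illustration only: NMI between ground-truth community labels and a
# detected partition (1.0 means identical partitions up to relabeling).
true_communities = [0, 0, 1, 1, 2, 2]
detected         = [0, 0, 1, 2, 2, 2]
print(normalized_mutual_info_score(true_communities, detected))
```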
8. A Graph Transformer-Based Semi-Supervised Representation Learning Model for Heterophilic Graphs
Authors: 黎施彬, 龚俊, 汤圣君. 《计算机应用》 (CSCD, PKU Core), 2024, No. 6, pp. 1816-1823 (8 pages)
Abstract: Existing graph convolutional network (GCN) models rely on the homophily assumption and cannot be applied directly to representation learning on heterophilic graphs, while much of the work on heterophilic graph representation learning is constrained by the message-passing mechanism, where node feature confusion and over-squashing of features lead to over-smoothing. To address these issues, a semi-supervised heterophilic graph representation learning model based on Graph Transformer, HPGT (HeteroPhilic Graph Transformer), is proposed. First, the path neighborhood of each node is sampled using a degree connection probability matrix; the heterophilic connection patterns of nodes along the path are adaptively aggregated through self-attention and encoded to obtain the node's structural information, and the self-attention module of the Transformer layer is built from the node's original attribute information together with this structural information. Second, the hidden representation of each node itself is updated separately from those of its neighborhood nodes, so that a node does not aggregate excessive self-information through the self-attention module, and each node representation is then concatenated with its neighborhood representation to obtain the output of a single Transformer layer; in addition, the outputs of all Transformer layers are skip-connected to the final node representation to prevent the loss of intermediate-layer information. Finally, a linear layer and a Softmax layer map the hidden representations to the predicted node labels. Experimental results show that, compared with the model without structural encoding (SE), the degree-connection-probability-based SE provides effective bias information for the self-attention module of the Transformer layer, improving the average accuracy of HPGT by 0.99% to 11.98%; compared with baseline models, node classification accuracy on heterophilic datasets (Texas, Cornell, Wisconsin, and Actor) improves by 0.21% to 1.69%, while on homophilic datasets (Cora, CiteSeer, and PubMed) accuracies reach 0.8379, 0.7467, and 0.8862, respectively. These results verify that HPGT has a strong ability for heterophilic graph representation learning and is especially suitable for node classification on strongly heterophilic graphs.
Keywords: graph convolutional network; heterophilic graph; graph representation learning; Graph Transformer; node classification
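A rough sketch of the path-neighborhood sampling idea: starting from a node, repeatedly step to a neighbor drawn with a degree-based probability and collect the nodes visited. The inverse-degree weighting below is an assumption standing in for the paper's degree connection probability matrix.

```python
import random

def sample_path_neighborhood(adj, start, length, rng=random.Random(0)):
    """Sample a path of `length` steps from `start`; the next node is drawn
    with probability inversely proportional to its degree (an assumed
    weighting, not the paper's exact degree connection probability matrix)."""
    path = [start]
    cur = start
    for _ in range(length):
        nbrs = adj[cur]
        if not nbrs:
            break
        weights = [1.0 / len(adj[v]) for v in nbrs]      # degree-based, illustrative
        cur = rng.choices(nbrs, weights=weights, k=1)[0]
        path.append(cur)
    return path

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}       # toy undirected graph
print(sample_path_neighborhood(adj, start=0, length=4))
```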
9. Markov Chains Based on Random Generalized 1-Flipper Operations for Connected Regular Multi-digraphs
Authors: 邓爱平, 伍陈晨, 王枫杰, 胡宇庭. Journal of Donghua University (English Edition), CAS, 2023, No. 1, pp. 110-115 (6 pages)
Abstract: The properties of generalized flip Markov chains on connected regular digraphs are discussed. The 1-Flipper operation on Markov chains for undirected graphs is generalized to multi-digraphs. The generalized 1-Flipper operation preserves the regularity and weak connectivity of multi-digraphs and is proved to be symmetric. Moreover, it is shown that a series of random generalized 1-Flipper operations eventually leads to a uniform probability distribution over all connected d-regular multi-digraphs without loops.
Keywords: random graph transformation; regular multi-digraph; Markov chain; 1-Flipper; triangle reverse
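In the spirit of the flip operations discussed above, the toy sketch below performs a degree-preserving double-arc swap on a list of arcs; the paper's generalized 1-Flipper has additional conditions (for example, preserving weak connectivity) that are not modeled here.

```python
import random

def arc_swap(arcs, rng=random.Random(0)):
    """Pick two arcs (a, b) and (c, d) and rewire them to (a, d) and (c, b).
    Every vertex keeps its in-degree and out-degree, so regularity is
    preserved; this is only a rough stand-in for the generalized 1-Flipper."""
    i, j = rng.sample(range(len(arcs)), 2)
    (a, b), (c, d) = arcs[i], arcs[j]
    new = arcs.copy()
    new[i], new[j] = (a, d), (c, b)
    return new

arcs = [(0, 1), (1, 2), (2, 3), (3, 0)]   # a 1-regular digraph (directed 4-cycle)
print(arc_swap(arcs))
```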
10. Mechatronic Modeling and Domain Transformation of Multi-physics Systems
Authors: Clarence W. de Silva. Instrumentation, 2021, No. 1, pp. 14-28 (15 pages)
Abstract: The enhanced definition of Mechatronics involves the four underlying characteristics of integrated, unified, unique, and systematic approaches. In this realm, Mechatronics is not limited to electro-mechanical systems in the multi-physics sense but involves other physical domains such as fluid and thermal. This paper summarizes the mechatronic approach to modeling. Linear graphs facilitate the development of state-space models of mechatronic systems through this approach. The use of linear graphs in mechatronic modeling is outlined and an illustrative example of sound system modeling is given. Both time-domain and frequency-domain approaches are presented for the use of linear graphs. A mechatronic model of a multi-physics system may be simplified by converting all the physical domains into an equivalent single-domain system that lies entirely in the output domain of the system. This approach of converting (transforming) physical domains is presented, with an illustrative example of a pressure-controlled hydraulic actuator system that operates a mechanical load.
Keywords: mechatronic modeling; multi-physics systems; integrated, unified, unique, and systematic approach; linear graphs; physical domain conversion/transformation
11. A New Multimodal Virtual Screening Method Combining Graph Transformer and Vina-GPU+
Authors: 张豪, 张堃然, 阮晓东, 沐勇, 吴建盛. 《南京大学学报(自然科学版)》 (PKU Core), 2025, No. 1, pp. 83-93 (11 pages)
Abstract: Modern drug discovery faces the challenge of virtually screening very large compound libraries, and improving the speed and accuracy of molecular docking is the core problem. AutoDock Vina is one of the most popular docking tools; our Vina-GPU and Vina-GPU+ methods achieve up to 50x and 65.6x speedups over AutoDock Vina while preserving docking accuracy. In recent years, large-scale pretrained models have achieved great success in natural language processing and computer vision, and this paradigm also has great potential for addressing the major challenges of virtual screening. We therefore propose a new multimodal virtual screening method, Vina-GPU GT, which combines Vina-GPU+ molecular docking with a pretrained Graph Transformer (GT) model to achieve fast and accurate virtual screening. The method consists of three consecutive steps: (1) a small SMILES Transformer (ST) model is learned by knowledge distillation from a pretrained GT model for molecular property prediction; (2) the ST model is run over all molecules in the compound library and fine-tuned according to an active learning rule; (3) the fine-tuned ST model is used for virtual screening. Virtual screening experiments on three important targets and two compound libraries, compared against two other virtual screening methods, show that Vina-GPU GT achieves the best screening performance.
Keywords: virtual screening; Graph Transformer; Vina-GPU+; multimodal; knowledge distillation; active learning
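The distillation step in (1) can be sketched roughly as an objective that pulls the small student (ST) model toward the teacher's (GT) predictions as well as toward available labels; the MSE form and the weighting below are assumptions, not the paper's exact loss.

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, labels, lam=0.7):
    """Assumed distillation objective: a soft term matching the pretrained
    teacher's predictions plus a hard term matching known property labels."""
    soft = np.mean((student_pred - teacher_pred) ** 2)   # follow the teacher
    hard = np.mean((student_pred - labels) ** 2)         # follow the labels
    return lam * soft + (1 - lam) * hard

rng = np.random.default_rng(3)
s, t, y = rng.normal(size=100), rng.normal(size=100), rng.normal(size=100)
print(distillation_loss(s, t, y))
```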
12. Multi-Granularity Prediction of Interaction Relationships in Software Systems Based on Graph Neural Networks
Authors: 邓文涛, 程璨, 何鹏, 陈孟瑶, 李兵. 《软件学报》 (PKU Core), 2025, No. 5, pp. 2043-2063 (21 pages)
Abstract: Interactions among the elements of today's software systems are intricate, spanning package-level, class-level, and function-level relationships. Accurately understanding these relationships is essential for optimizing system structure and improving software quality: analyzing package-level relationships reveals dependencies among modules and helps developers manage and organize the architecture; a clear understanding of class-level relationships helps build more extensible and maintainable code bases; and a clear picture of function-level relationships makes it possible to quickly locate and fix logic errors, improving robustness and reliability. However, existing approaches to predicting interaction relationships in software systems suffer from granularity differences, insufficient features, and version changes. To address these challenges, software network models are built at the three granularities of package, class, and function, and a new method combining local and global features is proposed that enhances the analysis and prediction of software systems through feature extraction and link prediction on software networks. Concretely, node2vec is used to learn local features of the software network, combined with Laplacian eigenvector encodings to represent the global position of each node; a Graph Transformer model then further refines the node feature vectors, and the refined representations are used to predict interaction relationships. Extensive experiments on three open-source Java projects cover both within-version and cross-version prediction tasks. Results show that, compared with baseline methods, the proposed method improves average AUC and AP by 8.2% and 8.5% on within-version tasks and by 3.5% and 2.4% on cross-version tasks, respectively.
Keywords: software network; interaction relationship prediction; Graph Transformer; granularity difference; software quality
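A minimal sketch of the feature-combination and link-prediction idea: node2vec-style local embeddings (assumed precomputed here) are concatenated with Laplacian eigenvector encodings, and candidate interactions are scored with a dot product; in the paper the combined features are further refined by a Graph Transformer before prediction.

```python
import numpy as np

def score_links(local_emb, lap_pe, candidate_pairs):
    """Concatenate local (e.g. node2vec) embeddings with Laplacian positional
    encodings and score each candidate interaction with a dot product."""
    Z = np.concatenate([local_emb, lap_pe], axis=1)
    return {(u, v): float(Z[u] @ Z[v]) for u, v in candidate_pairs}

rng = np.random.default_rng(4)
local_emb = rng.normal(size=(5, 8))   # stand-in for node2vec output
lap_pe = rng.normal(size=(5, 4))      # stand-in for Laplacian eigenvector encodings
print(score_links(local_emb, lap_pe, [(0, 1), (2, 4)]))
```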
13. A 3D Mesh Segmentation Framework Combining Global and Local Information
Authors: 张梦瑶, 周杰, 李文婷, 赵勇. 《浙江大学学报(工学版)》 (PKU Core), 2025, No. 5, pp. 912-919 (8 pages)
Abstract: Graph Transformers are good at capturing global information but do not extract fine-grained local information sufficiently. To address this, a graph convolutional network (GCN) is introduced into the Graph Transformer to form a Graph Transformer and GCN (GTG) module, and a mesh segmentation framework that combines global and local information is built on it. The GTG module exploits the global self-attention mechanism of the Graph Transformer and the local connectivity of the GCN, so it captures global information while strengthening the extraction of fine local detail. To better preserve information in boundary regions, an edge-preserving coarsening algorithm is designed so that coarsening acts only on non-boundary regions, and the loss function is weighted by boundary information to increase the network's attention to boundary regions. Visual results and quantitative comparisons show that the proposed algorithm produces high-quality segmentations, and ablation studies confirm the effectiveness of the GTG module and the edge-preserving coarsening algorithm.
Keywords: 3D mesh; mesh segmentation; Graph Transformer; graph convolutional network (GCN); edge-preserving coarsening algorithm
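The GTG module's pairing of a global branch with a local one can be sketched roughly as one dense self-attention pass over all nodes plus one GCN-style propagation over the adjacency, fused by a weighted sum; the fusion weight and matrix shapes are assumptions, not the paper's exact design.

```python
import numpy as np

def softmax_rows(S):
    E = np.exp(S - S.max(axis=1, keepdims=True))
    return E / E.sum(axis=1, keepdims=True)

def gtg_block(X, A, Wq, Wk, Wv, Wg, alpha=0.5):
    """Global branch: dense self-attention over all nodes.
    Local branch: one GCN propagation with a row-normalized adjacency.
    The two outputs are fused by an assumed weighted sum."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    global_out = softmax_rows(Q @ K.T / np.sqrt(K.shape[1])) @ V
    A_hat = (A + np.eye(len(A))) / (A.sum(axis=1, keepdims=True) + 1.0)
    local_out = A_hat @ X @ Wg
    return alpha * global_out + (1 - alpha) * local_out

rng = np.random.default_rng(5)
n, d = 6, 8
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.3).astype(float)
W = [rng.normal(size=(d, d)) * 0.1 for _ in range(4)]
print(gtg_block(X, A, *W).shape)   # (6, 8)
```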
14. Graph transformer with disease subgraph positional encoding for improved comorbidity prediction
Authors: Xihan Qin, Li Liao. Quantitative Biology, 2025, No. 4, pp. 91-100 (10 pages)
Abstract: Comorbidity, the co-occurrence of multiple medical conditions in a single patient, profoundly impacts disease management and outcomes. Understanding these complex interconnections is crucial, especially in contexts where comorbidities exacerbate outcomes. Leveraging insights from the human interactome and advances in graph-based methodologies, this study introduces the transformer with subgraph positional encoding (TSPE) for disease comorbidity prediction. Inspired by biologically supervised embedding, TSPE employs the transformer's attention mechanisms and subgraph positional encoding (SPE) to capture interactions between nodes and disease associations. Our proposed SPE proves more effective than the Laplacian positional encoding used in Dwivedi et al.'s graph transformer, underscoring the importance of integrating clustering and disease-specific information for improved predictive accuracy. Evaluated on real clinical benchmark datasets (RR0 and RR1), TSPE demonstrates substantial performance improvements over the state-of-the-art method, achieving up to 28.24% higher ROC AUC (area under the receiver operating characteristic curve) and 4.93% higher accuracy. The method shows promise for adaptation to other complex graph-based tasks and applications. The source code is available on GitHub (xihan-qin/TSPE-GraphTransformer).
Keywords: comorbidity; graph embedding; graph transformer; human interactome; subgraph positional encoding
15. Using Markov Chain Based Estimation of Distribution Algorithm for Model-Based Safety Analysis of Graph Transformation
Authors: Einollah Pira. Journal of Computer Science & Technology, SCIE/EI/CSCD, 2021, No. 4, pp. 839-855 (17 pages)
Abstract: The ability to assess the reliability of safety-critical systems is one of the most crucial requirements in the design of modern safety-critical systems, where even a minor failure can result in loss of life or irreparable damage to the environment. Model checking is an automatic technique that verifies or refutes system properties by exploring all reachable states (the state space) of a model. In large and complex systems, the state space explosion problem is likely to occur. In exploring the state space of systems modeled by graph transformations, the rule applied in the current state constrains which rule can be applied in the next state; in other words, the rule allowed in the current state depends only on the rule applied in the previous state, not on earlier ones. This fact motivates us to use a Markov chain (MC) to capture this type of dependency and to apply the Estimation of Distribution Algorithm (EDA) to improve the quality of the MC. EDA is an evolutionary algorithm that directs the search for an optimal solution by learning and sampling probabilistic models from the best individuals of a population at each generation. To show the effectiveness of the proposed approach, we implement it in GROOVE, an open-source toolset for designing and model checking graph transformation systems. Experimental results confirm that the proposed approach achieves higher speed and accuracy than existing meta-heuristic and evolutionary techniques for the safety analysis of systems specified formally through graph transformations.
Keywords: safety analysis; model checking; Markov chain; estimation of distribution algorithm; graph transformation system
16. TRANSFORMATIONS FOR THE PRIZE-COLLECTING STEINER TREE PROBLEM AND THE MAXIMUM-WEIGHT CONNECTED SUBGRAPH PROBLEM TO SAP
Authors: Daniel Rehfeldt, Thorsten Koch. Journal of Computational Mathematics, SCIE/CSCD, 2018, No. 3, pp. 459-468 (10 pages)
Abstract: Transformations of Steiner tree problem variants have been frequently discussed in the literature. Besides allowing complexity results to be transferred easily, they constitute a central pillar of exact state-of-the-art solvers for well-known variants such as the Steiner tree problem in graphs. In this article, transformations of both the prize-collecting Steiner tree problem and the maximum-weight connected subgraph problem to the Steiner arborescence problem are introduced for the first time. Furthermore, the considerable implications for practical solving approaches are demonstrated, including the computation of strong upper and lower bounds.
Keywords: prize-collecting Steiner tree problem; maximum-weight connected subgraph problem; graph transformations; dual-ascent heuristics
17. A Family of Inertial Manifolds of Coupled Kirchhoff Equations
Authors: Guoguang Lin, Fumei Chen. Journal of Applied Mathematics and Physics, 2022, No. 6, pp. 2074-2085 (12 pages)
Abstract: In this paper, we study the long-time behavior of the solution of the initial boundary value problem for the coupled Kirchhoff equations. Based on the relevant assumptions, an equivalent norm on E_k is obtained by using the Hadamard graph transformation method, and the Lipschitz constant l_F of F is further estimated. Finally, a family of inertial manifolds satisfying the spectral interval condition is obtained.
Keywords: Kirchhoff equation; family of inertial manifolds; Hadamard graph transformation; spectral interval condition
18. A Family of Inertial Manifolds for a Class of Asymmetrically Coupled Generalized Higher-Order Kirchhoff Equations
Authors: Guoguang Lin, Min Shao. Open Journal of Applied Sciences, CAS, 2022, No. 7, pp. 1174-1183 (10 pages)
Abstract: In this paper, we study the inertial manifolds for a class of asymmetrically coupled generalized higher-order Kirchhoff equations. Under appropriate assumptions, we first use Hadamard's graph transformation method to construct a graph norm of a Lipschitz continuous function, and then prove the existence of a family of inertial manifolds by showing that the spectral gap condition holds.
Keywords: inertial manifold; Hadamard's graph transformation method; Lipschitz continuous; spectral gap condition
19. Learning Efficient Linear Graph Transformer via Graph-attention Distillation
Authors: Haotian Tao, Ziyan Zhang, Bo Jiang, Bin Luo. Machine Intelligence Research, 2025, No. 6, pp. 1138-1152 (15 pages)
Abstract: In recent years, graph transformers have been demonstrated to be effective learning architectures for various graph-based learning tasks. However, their scalability to large-scale data is usually restricted by the quadratic computational complexity of graph transformers compared with graph convolutional network (GCN) models. To overcome this issue, in this work we propose to learn an efficient linear graph transformer by employing a graph-attention distillation model. The proposed method provides a faster and lighter graph transformer framework for graph data learning tasks. The core of the proposed distillation model is to employ a kernel decomposition approach to rebuild the graph transformer architecture, thereby reducing the quadratic complexity to linear complexity. Furthermore, to seamlessly transfer the rich learning capacity from the regular graph transformer of the teacher branch to its linear student counterpart, we devise a novel graph-attention knowledge distillation strategy to enhance the capabilities of the student network. Empirical evaluations conducted on six commonly employed benchmark datasets validate our model's superiority, as it consistently outperforms existing methods in terms of both effectiveness and efficiency.
Keywords: graph neural network; graph transformer; knowledge distillation; node classification; graph representation learning
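The kernel decomposition is what yields the linear complexity claimed above: replacing softmax(QK^T)V with phi(Q)(phi(K)^T V) lets the small (d x d) summary phi(K)^T V be computed first, so cost grows linearly with the number of nodes. A minimal NumPy sketch, with an assumed feature map phi:

```python
import numpy as np

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0) + 1e-6):
    """Kernel-decomposed attention: build the (d x d) summary phi(K)^T V once,
    then apply it to every query row, avoiding the n x n attention matrix."""
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                       # (d, d) summary, built in O(n d^2)
    Z = Qp @ Kp.sum(axis=0)             # per-row normalizer
    return (Qp @ KV) / Z[:, None]

rng = np.random.default_rng(2)
n, d = 1000, 16
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
print(linear_attention(Q, K, V).shape)   # (1000, 16)
```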
20. Fake News Detection: Extendable to Global Heterogeneous Graph Attention Network with External Knowledge
Authors: Yihao Guo, Longye Qiao, Zhixiong Yang, Jianping Xiang, Xinlong Feng, Hongbing Ma. Tsinghua Science and Technology, 2025, No. 3, pp. 1125-1138 (14 pages)
Abstract: Distinguishing genuine news from false information is crucial in today's digital era. Most existing methods are based on either traditional neural sequence models or graph neural network models, which have become more popular in recent years. Of these two types, the latter solves the former's problem of neglecting correlations among news sentences. However, one layer of a graph neural network only considers information from nodes directly connected to the current node and omits important information carried by distant nodes. This study therefore proposes the Extendable-to-Global Heterogeneous Graph Attention network (EGHGAT) to manage heterogeneous graphs by extending local attention to global attention, addressing the drawback that local attention can only collect information from directly connected nodes. The shortest-distance matrix is computed among all nodes on the graph. Specifically, shortest-distance information is used to enable the current node to aggregate information from more distant nodes, taking into account the influence of different node types on the current node at the current network layer. This mechanism highlights the importance of directly or indirectly connected nodes and the effect of different node types on the current node, which can substantially enhance model performance. Information from an external knowledge base is used to compare the contextual entity representation with the corresponding knowledge-base entity representation to capture its consistency with the news content. Experimental results on a benchmark dataset show that the proposed model significantly outperforms the state-of-the-art approach. Our code is publicly available at https://github.com/gyhhk/EGHGAT_FakeNewsDetection.
Keywords: fake news detection; attention mechanism; graph neural network (GNN); graph transformer; heterogeneous graph
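The shortest-distance matrix used to extend local attention toward global attention can be computed with one BFS per node; a minimal sketch is below (the paper's distance-aware, type-aware attention weighting itself is not reproduced).

```python
from collections import deque

def shortest_distance_matrix(adj):
    """All-pairs shortest path lengths by BFS on an unweighted graph; such a
    matrix can bias attention so a node also aggregates from nodes that are
    only indirectly connected to it."""
    n = len(adj)
    dist = [[float("inf")] * n for _ in range(n)]
    for s in range(n):
        dist[s][s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[s][v] == float("inf"):
                    dist[s][v] = dist[s][u] + 1
                    q.append(v)
    return dist

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # toy path graph
print(shortest_distance_matrix(adj)[0])        # [0, 1, 2, 3]
```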