Limits of Depth: Over-Smoothing and Over-Squashing in GNNs (Cited: 1)
Authors: Aafaq Mohi ud din, Shaima Qureshi. Big Data Mining and Analytics (EI, CSCD), 2024, No. 1, pp. 205-216 (12 pages).
Abstract: Graph Neural Networks (GNNs) have become a widely used tool for learning and analyzing data on graph structures, largely due to their ability to preserve graph structure and properties via graph representation learning. However, the effect of depth on the performance of GNNs, particularly isotropic and anisotropic models, remains an active area of research. This study presents a comprehensive exploration of the impact of depth on GNNs, with a focus on the phenomena of over-smoothing and the bottleneck effect in deep graph neural networks. Our research investigates the tradeoff between depth and performance, revealing that increasing depth can lead to over-smoothing and a decrease in performance due to the bottleneck effect. We also examine the impact of node degrees on classification accuracy, finding that nodes with low degrees can pose challenges for accurate classification. Our experiments use several benchmark datasets and a range of evaluation metrics to compare isotropic and anisotropic GNNs of varying depths, and also explore the scalability of these models. Our findings provide valuable insights into the design of deep GNNs and offer potential avenues for future research to improve their performance.
Keywords: Graph Neural Networks (GNNs); learning on graphs; over-smoothing; over-squashing; isotropic GNNs; anisotropic GNNs
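The over-smoothing phenomenon the abstract describes can be illustrated with a minimal sketch (not taken from the paper): repeatedly applying mean-neighborhood aggregation, the core propagation step shared by many GNN layers, drives node features toward a common value, so their variance collapses as depth grows. The 4-node path graph and initial features below are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical 4-node path graph with self-loops (illustration only).
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
A_hat = A / A.sum(axis=1, keepdims=True)  # row-normalized propagation matrix

X = np.array([[1.0], [0.0], [0.0], [-1.0]])  # initial scalar node features

variances = []
for _ in range(20):
    X = A_hat @ X            # one round of message passing (no weights/nonlinearity)
    variances.append(float(X.var()))

# The feature variance across nodes shrinks monotonically with depth:
# after enough layers the nodes become nearly indistinguishable.
print(variances[0], variances[-1])
```

This strips out learned weights and nonlinearities, so it isolates the propagation effect alone; in practice those components slow, but do not prevent, the same collapse in sufficiently deep stacks.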