Funding: Supported by the National Natural Science Foundation of China (No. 12374057) and the Fundamental Research Funds for the Central Universities. The work (S.T.) at Los Alamos National Laboratory (LANL) was performed at the Center for Integrated Nanotechnologies (CINT), a U.S. Department of Energy, Office of Science user facility at LANL.
Abstract: Recent advances in machine learning have demonstrated the enormous utility of deep learning approaches, particularly Graph Neural Networks (GNNs), for materials science. These methods have emerged as powerful tools for high-throughput prediction of material properties, offering a compelling enhancement of, and alternative to, traditional first-principles calculations. While the community has predominantly focused on developing increasingly complex and universal models to enhance predictive accuracy, such approaches often lack physical interpretability and insight into materials behavior. Here, we introduce a novel computational paradigm, Self-Adaptable Graph Attention Networks integrated with Symbolic Regression (SA-GAT-SR), that synergistically combines the predictive capability of GNNs with the interpretative power of symbolic regression. Our framework employs a self-adaptable encoding algorithm that automatically identifies and adjusts attention weights to screen critical features from an expansive 180-dimensional feature space while maintaining O(n) computational scaling. The integrated SR module subsequently distills these features into compact analytical expressions that explicitly reveal quantum-mechanically meaningful relationships, achieving a 23× acceleration over conventional SR implementations that rely heavily on features derived from first-principles calculations. This work suggests a new framework for computational materials science, bridging the gap between predictive accuracy and physical interpretability and offering valuable physical insights into materials behavior.
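The two-stage idea described in the abstract, attention-weighted feature screening followed by symbolic regression over the surviving features, can be illustrated with a minimal toy sketch. Everything below is illustrative, not the authors' SA-GAT-SR implementation: a simple correlation score stands in for learned attention weights, and the "symbolic regression" stage is a brute-force search over a tiny hypothetical expression library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 180 candidate features (matching the abstract's feature-space
# size), but the synthetic target depends on only two of them.
n_samples, n_features = 400, 180
X = rng.normal(size=(n_samples, n_features))
y = 2.0 * X[:, 3] ** 2 + 1.5 * X[:, 7]  # hidden ground-truth relationship

# Stage 1: feature screening. As a crude stand-in for learned attention
# weights, score each feature by its strongest linear-or-quadratic
# correlation with the target, then keep only the top two features.
def dep_score(xj, y):
    return max(abs(np.corrcoef(xj, y)[0, 1]),
               abs(np.corrcoef(xj ** 2, y)[0, 1]))

scores = np.array([dep_score(X[:, j], y) for j in range(n_features)])
feat_a, feat_b = sorted(np.argsort(scores)[-2:])

# Stage 2: symbolic search over a small library of candidate analytic
# forms, each fit by least squares; keep the form with the lowest error.
candidates = {
    "x": lambda v: v,
    "x^2": lambda v: v ** 2,
    "exp(x)": lambda v: np.exp(np.clip(v, -10.0, 10.0)),
}
best = None  # (mse, form for feat_a, form for feat_b)
for name_a, f_a in candidates.items():
    for name_b, f_b in candidates.items():
        A = np.column_stack([f_a(X[:, feat_a]),
                             f_b(X[:, feat_b]),
                             np.ones(n_samples)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        mse = np.mean((A @ coef - y) ** 2)
        if best is None or mse < best[0]:
            best = (mse, name_a, name_b)

print(f"screened features: {feat_a}, {feat_b}")
print(f"best expression: a*{best[1]}(x{feat_a}) + b*{best[2]}(x{feat_b}) + c")
```

The point of the sketch is the division of labor the abstract emphasizes: the screening stage collapses a 180-dimensional space to a handful of candidates in a single O(n)-per-feature pass, so the (combinatorially expensive) symbolic search only ever operates on the screened subset.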