Dear Editor, Severe fever with thrombocytopenia syndrome (SFTS) is an emerging tick-borne infectious disease caused by a novel bunyavirus called SFTS virus (SFTSV). It was initially identified in China in 2009 (Yu et al., 2011). Since then, the number of reported SFTS cases has rapidly increased in China, South Korea, and Japan (Li et al., 2018; Takahashi et al., 2014; Kim et al., 2018). Sporadic SFTS cases have also been identified in several other Asian countries, such as Vietnam, Pakistan, Myanmar, and Thailand (Takahashi et al., 2014; Tran et al., 2019; Li et al., 2021). This disease is recognized as a highly lethal viral hemorrhagic fever with a mortality rate ranging from 12% to 50% (Yu et al., 2011; Li et al., 2018; Takahashi et al., 2014; Kim et al., 2013). SFTS primarily spreads to humans through bites from ticks infected with SFTSV, with the Haemaphysalis longicornis tick acting as the predominant vector (Zhuang et al., 2018).
Due to the presence of a large amount of personal sensitive information in social networks, privacy preservation issues in social networks have attracted the attention of many scholars. Inspired by the self-nonself discrimination paradigm in the biological immune system, the negative representation of information exhibits features such as simplicity and efficiency, making it well suited to preserving social network privacy. We therefore propose a method, called AttNetNRI, to preserve the topology privacy and node-attribute privacy of attributed social networks. Specifically, a negative survey-based method is developed to disturb the relationships between nodes in the social network so that the topological structure is kept private. Moreover, a negative database-based method is proposed to hide node attributes, so that the privacy of node attributes is preserved while still supporting similarity estimation between different node attributes, which is crucial to the analysis of social networks. To evaluate the performance of AttNetNRI, empirical studies have been conducted on various attributed social networks and compared with several state-of-the-art methods tailored to preserving the privacy of social networks. The experimental results show the superiority of the developed method in preserving the privacy of attributed social networks and demonstrate the effectiveness of the topology-disturbing and attribute-hiding components.
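The negative survey idea behind this approach can be illustrated in a few lines: each participant reports a category (for example, a relation) they do NOT have, and the true distribution is recovered only in aggregate from the negative counts. The sketch below is a generic negative survey estimator under the standard uniform-reporting assumption, not the AttNetNRI algorithm itself; the estimator t_i = n - (c - 1) * r_i is the textbook one.

```python
import random

def negative_survey(true_categories, c, rng=None):
    """Each participant reports one category they do NOT belong to,
    chosen uniformly from the other c - 1 categories."""
    rng = rng or random.Random(0)
    return [rng.choice([k for k in range(c) if k != t]) for t in true_categories]

def estimate_true_counts(reports, c):
    """Standard negative survey estimator: t_i = n - (c - 1) * r_i,
    where r_i is how often category i was negatively reported."""
    n = len(reports)
    return [n - (c - 1) * reports.count(i) for i in range(c)]
```

Because the negative counts sum to n, the estimates always sum back to n; with enough participants they concentrate around the true counts, while no individual ever reveals their actual category.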
Many types of real-world information systems, including social media and e-commerce platforms, can be modelled by means of attribute-rich, connected networks. The goal of anomaly detection in artificial intelligence is to identify instances that deviate significantly from the main distribution of the data or that differ from known cases. Anomalous nodes in node-attributed networks can be identified with greater precision if both graph structure and node attributes are taken into account. Almost all of the studies in this area focus on supervised techniques for spotting outliers. While supervised algorithms for anomaly detection work well in theory, they cannot be applied to real-world applications owing to a lack of labelled data. To account for the underlying data distribution, our model employs a dual variational autoencoder (VAE), while a generative adversarial network (GAN) ensures that the model is robust to adversarial training. The dual VAEs also serve a second role: as a fake-node generator. Adversarial training is used to ensure that our latent codes have a Gaussian or uniform distribution. To provide a faithful representation of the graph, the discriminator instructs the generator to produce latent variables whose distributions are more consistent with the actual distribution of the data. Once the model has been learned, the discriminator, trained to distinguish between the normal and artificial data distributions, is used for anomaly detection via reconstruction loss. First, using a dual VAE, our model simultaneously captures cross-modality interactions between topological structure and node characteristics and overcomes the problem of unlabelled anomalies, allowing us to better handle network sparsity and nonlinearity. Second, the proposed model considers the regularization of the latent codes, addressing the issue that unregularized embedding techniques can quickly lead to unsatisfactory representations. Finally, we use the discriminator reconstruction loss for anomaly detection, as the discriminator is well trained to separate the normal and generated data distributions and the reconstruction-based loss does not include the adversarial component. Experiments conducted on attributed networks demonstrate the effectiveness of the proposed model and show that it greatly surpasses previous methods. The area-under-the-curve scores of our proposed model on the BlogCatalog, Flickr, and Enron datasets are 0.83680, 0.82020, and 0.71180, respectively, proving the effectiveness of the proposed model. The result on the Enron dataset is slightly worse than that of other models; we attribute this most probably to the dataset's low dimensionality.
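The final scoring step, ranking samples by how poorly a trained model reconstructs them, is independent of the dual-VAE/GAN machinery and can be sketched generically. Here `reconstruct` is a stand-in for any trained decoder (the toy mean-projection decoder in the test is purely illustrative); samples the model reconstructs poorly receive high anomaly scores.

```python
def anomaly_scores(rows, reconstruct):
    """Score each sample by its L2 reconstruction error; higher = more anomalous."""
    return [
        sum((x, - r)[0] ** 2 if False else (x - r) ** 2 for x, r in zip(row, reconstruct(row))) ** 0.5
        for row in rows
    ]

def top_anomalies(rows, reconstruct, k=1):
    """Indices of the k samples with the largest reconstruction error."""
    scores = anomaly_scores(rows, reconstruct)
    return sorted(range(len(rows)), key=lambda i: -scores[i])[:k]
```

In the paper's setting the reconstruction comes from the adversarially trained model; the scoring arithmetic is the same.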
Purpose: Based on real-world academic data, this study aims to use network embedding technology to mine academic relationships and to investigate the effectiveness of the proposed embedding model on academic collaborator recommendation tasks. Design/methodology/approach: We propose an academic collaborator recommendation model based on attributed network embedding (ACR-ANE), which obtains enhanced scholar embeddings by taking full advantage of the topological structure of the network and multi-type scholar attributes. Non-local neighbors of scholars are defined to capture strong relationships among scholars. A deep auto-encoder is adopted to encode the academic collaboration network structure and scholar attributes into a low-dimensional representation space. Findings: (1) The proposed non-local neighbors describe real-world relationships among scholars better than first-order neighbors. (2) It is important to simultaneously consider the structure of the academic collaboration network and scholar attributes when recommending collaborators. Research limitations: The designed method works for static networks and does not take network dynamics into account. Practical implications: The designed model embeds the academic collaboration network structure and scholarly attributes, and can be used to recommend potential collaborators to scholars. Originality/value: Experiments on two real-world scholarly datasets, Aminer and APS, show that our proposed method performs better than other baselines.
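Once scholar embeddings are learned, the recommendation step reduces to nearest-neighbour search in the embedding space. A minimal sketch follows; the scholar names and 2-D vectors in the test are made up, and ACR-ANE's learned embeddings would take their place.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def recommend_collaborators(scholar, embeddings, k=2):
    """Rank all other scholars by embedding similarity to `scholar`."""
    others = [s for s in embeddings if s != scholar]
    return sorted(others,
                  key=lambda s: cosine(embeddings[scholar], embeddings[s]),
                  reverse=True)[:k]
```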
Software architectures shift the focus of developers from lines of code to coarser-grained architectural elements and their overall interconnection structure. There are, however, many features of distributed software that make its development quite different from traditional approaches. Furthermore, traditional centralized approaches with fixed interfaces cannot adapt to the flexible requirements of distributed software. In this paper, the attributed grammar (AG) is extended to capture the characteristics of distributed software, a distributed software architecture description language (DSADL) based on attributed grammar is introduced, and a model of an integrated environment for software architecture design is proposed. Practice demonstrates that DSADL can help programmers analyze and design distributed software effectively, so development efficiency can be improved greatly.
Parameter estimation of the attributed scattering center (ASC) model is significant for automatic target recognition (ATR). Sparse-representation-based parameter estimation methods have developed rapidly, and construction of a separable dictionary is a key issue for sparse representation technology. A compressive time-domain dictionary (TD) for the ASC model is presented. Two-dimensional frequency-domain responses of the ASC are produced and transformed into the time domain. These time-domain responses are then truncated and stacked into vectors, which are amalgamated to form the TD. Compared with the traditional frequency-domain dictionary (FD), the TD is a quite sparse matrix and can markedly reduce the data size of the dictionary. Based on the basic TD construction method, we present four extended TD construction methods, which are suitable for different applications. In the experiments, the performance of the TD, including the basic model and the extended models, is first analyzed in comparison with the FD. Second, an example of parameter estimation from synthetic aperture radar (SAR) measurements of a target collected in an anechoic chamber is exhibited. Finally, a sparse image reconstruction example from two separate apertures is given. Experimental results demonstrate the effectiveness and efficiency of the proposed TD.
Sparse recovery algorithms formulate the synthetic aperture radar (SAR) imaging problem in terms of the sparse representation (SR) of a small number of strong scatterers' positions among a much larger number of potential scatterers' positions, and provide an effective approach to improving SAR image resolution. Based on the attributed scattering center model, several experiments were performed under different practical considerations to evaluate the performance of five representative SR techniques, namely sparse Bayesian learning (SBL), fast Bayesian matching pursuit (FBMP), the smoothed l0 norm method (SL0), sparse reconstruction by separable approximation (SpaRSA), and the fast iterative shrinkage-thresholding algorithm (FISTA); the parameter settings of the five SR algorithms and their performance in different situations were also discussed. Through comparison of the MSE and failure rate in each algorithm's simulation, FBMP and SpaRSA are found suitable for SAR imaging problems based on the attributed scattering center model. Although SBL is time-consuming, it consistently achieves a lower failure rate and better performance at high SNR.
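Of the five algorithms, the shrinkage-thresholding family is the easiest to sketch. Below is plain ISTA (the non-accelerated ancestor of FISTA) for min_x 0.5*||Ax - y||^2 + lam*||x||_1, written for tiny dense problems only; the step size must not exceed 1/||A||^2 for convergence, and the toy problem in the test is illustrative, not a SAR dictionary.

```python
def soft_threshold(x, t):
    """Proximal operator of the l1 norm."""
    return [max(abs(v) - t, 0.0) * (1.0 if v > 0 else -1.0) for v in x]

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def ista(A, y, lam, step, iters=200):
    """Iterative shrinkage-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    At = [list(col) for col in zip(*A)]
    x = [0.0] * len(A[0])
    for _ in range(iters):
        residual = [ax - yi for ax, yi in zip(matvec(A, x), y)]  # Ax - y
        grad = matvec(At, residual)                              # A^T (Ax - y)
        x = soft_threshold([xi - step * g for xi, g in zip(x, grad)],
                           step * lam)
    return x
```

FISTA adds a momentum extrapolation step on top of exactly this update, which is what makes it "fast"; the l1 term is what drives small coefficients exactly to zero.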
Contrastive self-supervised representation learning on attributed graph networks with graph neural networks has attracted considerable research interest recently. However, two challenges remain. First, most real-world systems are multi-relational: entities are linked by different types of relations, and each relation is a view of the graph network. Second, the rich multi-scale information (structure-level and feature-level) of the graph network can serve as self-supervised signals, which are not fully exploited. A novel contrastive self-supervised representation learning framework on attributed multiplex graph networks with multi-scale information (named CoLM^(2)S) is presented in this study. It mainly contains two components: intra-relation contrastive learning and inter-relation contrastive learning. Specifically, a contrastive self-supervised representation learning framework on attributed single-layer graph networks with multi-scale information (CoLMS), using a graph convolutional network as the encoder to capture intra-relation information with multi-scale structure-level and feature-level self-supervised signals, is introduced first. The structure-level information includes the edge structure and sub-graph structure, and the feature-level information comprises the outputs of the different graph convolutional layers. Second, according to the consensus assumption among inter-relations, the CoLM^(2)S framework is proposed to jointly learn the various graph relations in an attributed multiplex graph network and achieve global consensus node embeddings. The proposed method can fully distil the graph information. Extensive experiments on unsupervised node clustering and graph visualisation tasks demonstrate the effectiveness of our methods, which outperform existing competitive baselines.
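The "contrast" in frameworks of this kind is typically an InfoNCE-style objective: pull an anchor's representation toward its positive view and away from negatives. A generic single-anchor version is sketched below; the temperature value and the similarity inputs are illustrative, not the paper's exact loss.

```python
import math

def info_nce(pos_sim, all_sims, tau=0.5):
    """Negative log-softmax of the positive pair's similarity among all candidates.
    Low loss = the positive view is clearly the closest candidate."""
    num = math.exp(pos_sim / tau)
    den = sum(math.exp(s / tau) for s in all_sims)
    return -math.log(num / den)
```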
Synthetic aperture radar (SAR) can acquire high-resolution images using active microwave imaging. SAR images, through their extracted features, are widely used in target recognition, classification, and surface analysis. The attributed scattering center (ASC) model can describe image features for these tasks. However, sidelobe effects reduce the accuracy and reliability of the estimated ASC model parameters. This paper incorporates SAR super-resolution into ASC extraction to improve its performance. Both filter-bank and subspace methods are demonstrated as preprocessing to suppress the sidelobes. Based on the preprocessed data, a reinforcement-based ASC method is used to obtain the parameters. The experimental results show that the super-resolution method can reduce noise and suppress the sidelobe effect, which improves the accuracy of the estimated ASC model parameters.
Homogeneity analysis of a multi-airport system can provide important decision-making support for route layout and cooperative operation. Existing research seldom analyzes the homogeneity of a multi-airport system from the perspective of route network analysis, and the attribute information of airport nodes is not appropriately integrated into the airport network. To solve this problem, a multi-airport system homogeneity analysis method based on airport attributed network representation learning is proposed. First, the route network of a multi-airport system with attribute information is constructed: if there are flights between airports, an edge is added between them, and regional attribute information is attached to each airport node. Second, the airport attributes and the airport network are each represented as vectors and embedded into a unified airport representation vector space by the network representation learning method, yielding an airport vector that integrates the airport attributes and the airport network characteristics. By calculating the similarity between airport vectors, the degree of homogeneity between airports and of the whole multi-airport system can be conveniently computed. The experimental results on the Beijing-Tianjin-Hebei multi-airport system show that, compared with other existing algorithms, the homogeneity analysis method based on attributed network representation learning yields results more consistent with the current situation of the Beijing-Tianjin-Hebei multi-airport system.
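The last step, turning airport vectors into a homogeneity score, is simple vector arithmetic. The sketch below measures pairwise and system-level homogeneity as average cosine similarity; the 2-D "airport vectors" in the test are made up, whereas the method obtains them from attributed network representation learning.

```python
import math
from itertools import combinations

def cosine(u, v):
    """Cosine similarity between two airport vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def system_homogeneity(vectors):
    """Average pairwise cosine similarity over all airport pairs."""
    pairs = list(combinations(vectors, 2))
    return sum(cosine(u, v) for u, v in pairs) / len(pairs)
```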
Objective To estimate the lung cancer burden attributable to ambient fine particulate matter (PM2.5) pollution in Guangzhou city, China, from 2005 to 2013. Methods Data on PM2.5 exposure were obtained from the 'Ambient air pollution exposure estimation for the Global Burden of Disease 2013' dataset at 0.1° × 0.1° spatial resolution. Disability-adjusted life years (DALYs) were estimated from lung cancer mortality and incidence data. Comparative risk analysis and an integrated exposure-response function were used to estimate the attributable disease burden. Results The population-weighted average concentration of PM2.5 increased by 34.6% between 1990 and 2013, from 38.37 μg/m^3 to 51.31 μg/m^3. Lung cancer DALYs in both men and women increased by 36.2% from 2005 to 2013. PM2.5-attributable lung cancer DALYs increased from 12105.0 (8181.0 for males and 3924.0 for females) in 2005 to 16489.3 (11291.7 for males and 5197.6 for females) in 2013. On average, 23.1% of the lung cancer burden was attributable to PM2.5 pollution in 2013. Conclusion PM2.5 has caused a serious but under-appreciated public health burden in Guangzhou, and the trend is worsening. Effective strategies are needed to tackle this major public health problem.
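The attribution arithmetic behind such estimates is the population attributable fraction (PAF). The sketch below shows only the simplest case, an entire population exposed at one level with relative risk RR, whereas the study integrated an exposure-response function over a concentration distribution; the RR values used here are hypothetical, not the study's.

```python
def attributable_fraction(rr):
    """PAF for a fully exposed population: (RR - 1) / RR."""
    return (rr - 1.0) / rr

def attributable_dalys(total_dalys, rr):
    """Portion of the total burden attributable to the exposure."""
    return attributable_fraction(rr) * total_dalys
```

For intuition only: a hypothetical RR of about 1.3 would attribute roughly 23% of the burden, the same order as the 23.1% average reported for 2013.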
During software development, developers tend to tangle multiple concerns into a single commit, resulting in many composite commits. This paper studies the problem of detecting and untangling composite commits, so as to improve the maintainability and understandability of software. Our approach is built upon the observation that both the textual content of code statements and the dependencies between code statements are helpful in comprehending a code commit. Based on this observation, we first construct an attributed graph for each commit, where code statements and the various code dependencies are modeled as nodes and edges, respectively, and the textual bodies of code statements are maintained as node attributes. Based on the attributed graph, we propose graph-based learning algorithms that first detect whether a given commit is a composite commit, and then untangle the composite commit into atomic ones. We evaluate our approach on nine C# projects, and the results demonstrate the effectiveness and efficiency of our approach.
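A crude baseline for the untangling step helps make the graph formulation concrete: treat statements as nodes, dependencies as edges, and split the commit into connected components. The paper's learning-based approach goes well beyond this, but the underlying data model is the same; the statement texts and the edge label in the test are invented.

```python
def untangle(nodes, edges):
    """Partition statement nodes into connected components
    (each component is a candidate atomic commit)."""
    parent = {n: n for n in nodes}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for u, v, _label in edges:          # union the endpoints of every dependency
        parent[find(u)] = find(v)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), []).append(n)
    return sorted(groups.values())
```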
Attributed graph clustering plays a vital role in uncovering hidden network structures, but it presents significant challenges. In recent years, various models have been proposed to identify meaningful clusters by integrating both structural and attribute-based information. However, these models often emphasize node proximities without adequately balancing the efficiency of clustering based on both structural and attribute data. Furthermore, they tend to neglect the critical fuzzy information inherent in attributed graph clusters. To address these issues, we introduce a new framework, Markov lumpability optimization, for efficient clustering of large-scale attributed graphs. Specifically, we define a lumped Markov chain on an attribute-augmented graph and introduce a new metric, Markov lumpability, to quantify the differences between the original and lumped Markov transition probability matrices. To minimize this measure, we propose a conjugate gradient projection-based approach that, through conditional optimization, ensures the partitioning closely aligns with the intrinsic structure of the fuzzy clusters. Extensive experiments on both synthetic and real-world datasets demonstrate the superior performance of the proposed framework compared with existing clustering algorithms. The framework has many potential applications, including dynamic community analysis of social networks, user profiling in recommendation systems, functional module identification in biological molecular networks, and financial risk control, offering a new paradigm for mining complex patterns in high-dimensional attributed graph data.
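The core object, a lumped Markov chain, can be shown in a few lines. Given a transition matrix and a candidate partition of the states into blocks, the lumped matrix aggregates the transition mass between blocks (uniform weighting here, for simplicity). The paper optimizes a lumpability metric over such partitions, which this sketch does not attempt; the 4-state chain in the test is invented.

```python
def lump(P, blocks):
    """Aggregate transition matrix P over a partition of states into blocks.
    Entry (I, J) is the average probability of jumping from block I into block J,
    assuming uniform weight over the states of block I."""
    Q = []
    for I in blocks:
        Q.append([sum(P[i][j] for i in I for j in J) / len(I) for J in blocks])
    return Q
```

Lumpability then measures how little information this aggregation loses: for a perfectly lumpable chain, every state in a block has the same total transition probability into each other block.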
Attributed graphs carry an additional sign vector for each node. Typically, edge signs represent a like or dislike relationship between node pairs, with applications in domains such as recommender systems and personalised search. However, the limited availability of edge sign information in attributed networks requires inferring the underlying graph embeddings to fill in the knowledge gap. Such inference is performed by way of node classification, which aims to deduce node characteristics from the topological structure of the graph and the signed interactions between nodes. The study of attributed networks is challenging due to noise, sparsity, and class imbalance issues. In this work, we consider node centrality in conjunction with edge signs to address the node classification problem in attributed networks. We propose Semi-supervised Node Classification in Attributed graphs (SNCA). SNCA is robust to underlying network noise and has built-in class imbalance handling capabilities. We perform an extensive experimental study on real-world datasets to showcase the efficiency, scalability, robustness, and pertinence of the solution. The performance results demonstrate the suitability of the solution for large attributed graphs in real-world settings.
Dear Editor, Inguinal hernia repair (IHR) performed during childhood is a prevalent etiological factor for obstructive azoospermia (OA) attributed to vasal injury. OA couples can achieve pregnancy through intracytoplasmic sperm injection, or naturally after microsurgical anastomosis. Recent advancements have highlighted the potential utility of laparoscopy-assisted vasovasostomy for treating OA caused by childhood herniorrhaphy.
Cross-domain graph anomaly detection (CD-GAD) is a promising task that leverages knowledge from a labelled source graph to guide anomaly detection on an unlabelled target graph. CD-GAD classifies anomalies as unique or common based on their presence in both the source and target graphs. However, existing models often fail to fully explore the domain-unique knowledge of the target graph for detecting unique anomalies. Additionally, they tend to focus solely on node-level differences, overlooking the structural-level differences that provide complementary information for common anomaly detection. To address these issues, we propose a novel method, Synthetic Graph Anomaly Detection via Graph Transfer and Graph Decouple (GTGD), which effectively detects both common and unique anomalies in the target graph. Specifically, our approach ensures deeper learning of domain-unique knowledge by decoupling the reconstruction graphs of common and unique features. Moreover, we simultaneously consider node-level and structural-level differences by transferring node and edge information from the source graph to the target graph, enabling a comprehensive representation of domain-common knowledge. Anomalies are detected using both common and unique features, with their synthetic score serving as the final result. Extensive experiments demonstrate the effectiveness of our approach, which improves average AUC-PR by 12.6% compared with state-of-the-art methods.
Numerous experimental and theoretical investigations have highlighted the power-law behavior of the proton structure function F_2(x, Q^2), particularly the dependence of its power constant on various kinematic variables. In this study, we analyze the proton structure function F_2 employing the analytical solution of the Balitsky-Kovchegov equation, with a focus on the high-Q^2 regime and small-x domain. Our results indicate that as Q^2 increases, the slope parameter λ, which characterizes the growth rate of F_2, exhibits a gradual decrease, approaching a limiting value of λ ≈ 0.41 ± 0.01 at large Q^2. We suggest that this behavior of λ may be attributed to mechanisms such as gluon overlap and the suppression of phase-space growth. To substantiate these conclusions, further high-precision electron-ion collision experiments are required, encompassing a broad range of Q^2 and x.
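The slope parameter λ in F_2 ∝ x^(-λ) at fixed Q^2 is extracted by a straight-line fit of ln F_2 against ln(1/x). A minimal version of that fit is sketched below and checked on synthetic power-law data; the data points are illustrative, not actual structure-function measurements.

```python
import math

def fit_lambda(xs, f2s):
    """Least-squares slope of ln F2 versus ln(1/x),
    i.e. the exponent lambda in F2 ~ x^(-lambda)."""
    lx = [math.log(1.0 / x) for x in xs]
    ly = [math.log(f) for f in f2s]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den
```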
Despite the availability of effective antibacterial drugs, the increasing prevalence of antibiotic-resistant bacteria is primarily attributed to their excessive and inappropriate use. Antimicrobial resistance (AMR) represents one of the most concerning global health and development threats [1]. The continuous escalation of AMR therefore necessitates urgent advancements in novel antibacterial strategies. The MraY enzyme, which plays a pivotal role in synthesizing the polysaccharides composing the bacterial cell wall, holds significant potential as a target for antibacterial agents [2]. However, its conformational dynamics have posed substantial challenges to the development of MraY-targeting inhibitors.
Dear Editor, Genetics-focused approaches have been widely used to uncover major genetic variants associated with performance variation. Selecting, manipulating, and editing genetic variants significantly improves crop performance. Meanwhile, the genetic component explains only a portion of performance variation; the environmental component contributes the remaining, often large, portion (Laidig et al., 2017; Bonecke et al., 2020; Li et al., 2021). To ensure superior and robust performance, elite varieties are extensively tested across multiple years and locations. These extensive performance records, coupled with climatic profiles, could be leveraged to understand climate's impact on agriculture through approaches parallel to quantitative genetics approaches (Figure 1A).
Funding: funded by the National Natural Science Foundation of China (81825019 and 82330103); performed with approval from the Ethical Committee of Beijing Institute of Microbiology and Epidemiology (AF/SC-08/02.128).
Funding: supported by the National Natural Science Foundation of China (Nos. 62006001, 62372001) and the Natural Science Foundation of Chongqing City (Grant No. CSTC2021JCYJ-MSXMX0002).
Abstract: Many types of real-world information systems, including social media and e-commerce platforms, can be modelled as attribute-rich, connected networks. The goal of anomaly detection in artificial intelligence is to identify instances that deviate significantly from the main distribution of the data or that differ from known cases. Anomalous nodes in node-attributed networks can be identified with greater precision if both the graph structure and node attributes are taken into account. Almost all studies in this area focus on supervised techniques for spotting outliers. Although supervised algorithms for anomaly detection work well in theory, they cannot be applied to real-world applications owing to a lack of labelled data. To account for the possible data distribution, our model employs a dual variational autoencoder (VAE), while a generative adversarial network (GAN) ensures that the model is robust under adversarial training. The dual VAEs also serve a second purpose: as a fake-node generator. Adversarial training is used to ensure that our latent codes follow a Gaussian or uniform distribution. To provide a faithful representation of the graph, the discriminator instructs the generator to produce latent variables whose distribution is more consistent with the actual distribution of the data. Once the model has been learned, the discriminator, which has been trained to distinguish between the normal and artificial data distributions, is used for anomaly detection via reconstruction loss. First, using a dual VAE, our model simultaneously captures cross-modality interactions between the topological structure and node characteristics and overcomes the problem of unlabelled anomalies, allowing us to better handle the network's sparsity and nonlinearity. Second, the proposed model regularizes the latent codes, addressing the issue that unregularized embedding techniques can quickly lead to unsatisfactory representations.
Finally, we use the discriminator reconstruction loss for anomaly detection, as the discriminator is well trained to separate the normal and generated data distributions and the reconstruction-based loss does not include the adversarial component. Experiments conducted on attributed networks demonstrate the effectiveness of the proposed model and show that it greatly surpasses previous methods. The area under the curve scores of our proposed model on the BlogCatalog, Flickr, and Enron datasets are 0.83680, 0.82020, and 0.71180, respectively, confirming its effectiveness. The result of the proposed model on the Enron dataset is slightly worse than that of other models; we attribute this most probably to the dataset's low dimensionality.
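As a rough illustration of reconstruction-loss-based anomaly scoring (a simplified stand-in for the trained dual-VAE/GAN discriminator described above; the low-rank model and function name are ours), one can flag nodes whose attribute vectors reconstruct poorly under a learned compression:

```python
import numpy as np

def reconstruction_scores(X, rank=2):
    """Score each row of X by its reconstruction error under a
    low-rank (PCA) approximation; large errors flag anomalies.
    This is an illustrative simplification of scoring by a trained
    model's reconstruction loss."""
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:rank]                      # top principal directions
    recon = Xc @ P.T @ P + mu          # project, then reconstruct
    return np.linalg.norm(X - recon, axis=1)
```

Normal points lie near the learned low-dimensional manifold and reconstruct well; anomalies do not, so their score is large.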
Funding: Supported by the National Natural Science Foundation of China (No. 61603310) and the Fundamental Research Funds for the Central Universities (No. XDJK2018B019).
Abstract: Purpose: Based on real-world academic data, this study aims to use network embedding technology to mine academic relationships and to investigate the effectiveness of the proposed embedding model on academic collaborator recommendation tasks. Design/methodology/approach: We propose an academic collaborator recommendation model based on attributed network embedding (ACR-ANE), which obtains an enhanced scholar embedding and takes full advantage of the topological structure of the network and multi-type scholar attributes. Non-local neighbors are defined for scholars to capture strong relationships among them. A deep auto-encoder is adopted to encode the academic collaboration network structure and scholar attributes into a low-dimensional representation space. Findings: 1. The proposed non-local neighbors describe real-world relationships among scholars better than first-order neighbors. 2. It is important to consider the structure of the academic collaboration network and scholar attributes simultaneously when recommending collaborators. Research limitations: The designed method works for static networks and does not take network dynamics into account. Practical implications: The designed model embeds the academic collaboration network structure and scholarly attributes, and can be used to recommend potential collaborators to scholars. Originality/value: Experiments on two real-world scholarly datasets, Aminer and APS, show that our proposed method performs better than other baselines.
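Once scholar embeddings are learned, the recommendation step can be sketched as a cosine-similarity ranking over the embedding space. This is an illustrative simplification, not the ACR-ANE code; the function name and toy vectors are hypothetical:

```python
import numpy as np

def recommend_collaborators(embeddings, scholar, existing, k=2):
    """Rank scholars by cosine similarity to `scholar` in the learned
    embedding space, excluding current collaborators.
    `embeddings`: dict name -> vector (e.g. from a deep auto-encoder)."""
    candidates = [n for n in embeddings if n != scholar and n not in existing]
    v = np.asarray(embeddings[scholar], dtype=float)

    def cos(name):
        u = np.asarray(embeddings[name], dtype=float)
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    return sorted(candidates, key=cos, reverse=True)[:k]
```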
Funding: Project (No. 2000K08-G12) supported by the Shaanxi Provincial Science and Technology Development Plan, China.
Abstract: Software architectures shift the focus of developers from lines of code to coarser-grained architectural elements and their overall interconnection structure. There are, however, many features of distributed software that make its development quite different from traditional approaches, and traditional centralized approaches with fixed interfaces cannot adapt to the flexible requirements of distributed software. In this paper, the attributed grammar (AG) is extended to capture the characteristics of distributed software, a distributed software architecture description language (DSADL) based on attributed grammar is introduced, and a model of an integrated environment for software architecture design is proposed. Practice demonstrates that DSADL can help programmers analyze and design distributed software effectively, so development efficiency can be improved greatly.
Funding: Project (NCET-11-0866) supported by the Ministry of Education's New Century Excellent Talents Supporting Plan, China.
Abstract: Parameter estimation of the attributed scattering center (ASC) model is significant for automatic target recognition (ATR), and sparse representation based parameter estimation methods have developed rapidly. Construction of a separable dictionary is a key issue for sparse representation technology. A compressive time-domain dictionary (TD) for the ASC model is presented. Two-dimensional frequency-domain responses of the ASC are produced and transformed into the time domain. These time-domain responses are then cut off and stacked into vectors, which are amalgamated to form the TD. Compared with the traditional frequency-domain dictionary (FD), the TD is a matrix that is quite sparse and can markedly reduce the data size of the dictionary. Based on the basic TD construction method, we present four extended TD construction methods, which are suitable for different applications. In the experiments, the performance of the TD, including the basic model and the extended models, is first analyzed in comparison with the FD. Second, an example of parameter estimation from synthetic aperture radar (SAR) measurements of a target collected in an anechoic room is exhibited. Finally, a sparse image reconstruction example from two separated apertures is given. Experimental results demonstrate the effectiveness and efficiency of the proposed TD.
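The basic TD construction described above (inverse-transform each frequency response, cut off the tail, stack into columns) can be sketched as follows. This is our illustrative reading of the procedure, with a hypothetical function name, not the authors' code:

```python
import numpy as np

def build_td_dictionary(freq_responses, cutoff):
    """Build a compressive time-domain dictionary: transform each
    frequency-domain ASC response to the time domain via inverse FFT,
    keep only the first `cutoff` samples (where the response energy
    concentrates), and stack the truncated responses as columns."""
    atoms = []
    for H in freq_responses:
        h = np.fft.ifft(H)           # frequency domain -> time domain
        atoms.append(h[:cutoff])     # cut off the (near-zero) tail
    return np.column_stack(atoms)
```

Because each time-domain atom is truncated to `cutoff` samples, the dictionary is far smaller than a full frequency-domain dictionary over the same responses.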
Funding: Project (61171133) supported by the National Natural Science Foundation of China; Project (11JJ1010) supported by the Natural Science Fund for Distinguished Young Scholars of Hunan Province, China; Project (61101182) supported by the National Natural Science Foundation for Young Scientists of China.
Abstract: Sparse recovery algorithms formulate the synthetic aperture radar (SAR) imaging problem in terms of the sparse representation (SR) of a small number of strong scatterers' positions among a much larger number of potential scatterers' positions, and provide an effective approach to improving SAR image resolution. Based on the attributed scattering center model, several experiments were performed under different practical considerations to evaluate the performance of five representative SR techniques, namely sparse Bayesian learning (SBL), fast Bayesian matching pursuit (FBMP), the smoothed l0 norm method (SL0), sparse reconstruction by separable approximation (SpaRSA), and the fast iterative shrinkage-thresholding algorithm (FISTA); the parameter settings of the five SR algorithms were discussed, along with their performance in different situations. Through comparison of the MSE and failure rate in each algorithm's simulation, FBMP and SpaRSA are found suitable for problems in SAR imaging based on the attributed scattering center model. Although SBL is time-consuming, it consistently achieves better performance in terms of failure rate at high SNR.
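For reference, FISTA, one of the five compared SR algorithms, reduces to a few lines for the l1-regularized least-squares formulation. This is a generic textbook sketch of the algorithm, not the authors' experimental code:

```python
import numpy as np

def fista(A, b, lam=0.1, n_iter=200):
    """Minimal FISTA sketch for the sparse recovery problem
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    soft = lambda v, th: np.sign(v) * np.maximum(np.abs(v) - th, 0.0)
    for _ in range(n_iter):
        # proximal gradient step (soft thresholding) ...
        x_new = soft(y - A.T @ (A @ y - b) / L, lam / L)
        # ... plus Nesterov momentum on the iterates
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)
        x, t = x_new, t_new
    return x
```

On a small compressed-sensing instance (a 20x40 random matrix, two strong scatterers), the recovered vector concentrates on the true support.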
Funding: Supported by the National Natural Science Foundation of China (NSFC) under grant number 61873274.
Abstract: Contrastive self-supervised representation learning on attributed graph networks with graph neural networks has attracted considerable research interest recently. However, two challenges remain. First, most real-world systems comprise multiple relations, where entities are linked by different types of relations and each relation is a view of the graph network. Second, the rich multi-scale information (structure-level and feature-level) of the graph network can be treated as self-supervised signals, which are not fully exploited. A novel contrastive self-supervised representation learning framework on attributed multiplex graph networks with multi-scale information (named CoLM^(2)S) is presented in this study. It mainly contains two components: intra-relation contrastive learning and inter-relation contrastive learning. Specifically, a contrastive self-supervised representation learning framework on attributed single-layer graph networks with multi-scale information (CoLMS), which uses a graph convolutional network as the encoder to capture intra-relation information with multi-scale structure-level and feature-level self-supervised signals, is introduced first. The structure-level information includes the edge structure and sub-graph structure, and the feature-level information comprises the outputs of the different graph convolutional layers. Second, according to the consensus assumption among inter-relations, the CoLM^(2)S framework is proposed to jointly learn the various graph relations in an attributed multiplex graph network to achieve a globally consensual node embedding. The proposed method can fully distil the graph information. Extensive experiments on unsupervised node clustering and graph visualisation tasks demonstrate the effectiveness of our methods, which outperform existing competitive baselines.
Funding: Supported by the National Natural Science Foundation of China (No. 62201158).
Abstract: Synthetic aperture radar (SAR) acquires high-resolution images using an active microwave imaging method. SAR images, with extracted features, are widely used in target recognition, classification, and surface analysis. The attributed scattering center (ASC) model is able to describe image features for these tasks. However, sidelobe effects reduce the accuracy and reliability of the estimated ASC model parameters. This paper incorporates SAR super-resolution into ASC extraction to improve its performance. Both filter bank and subspace methods are demonstrated as preprocessing to suppress the sidelobes. Based on the preprocessed data, a reinforcement-based ASC method is used to obtain the parameters. The experimental results show that the super-resolution method can reduce noise and suppress the sidelobe effect, improving the accuracy of the estimated ASC model parameters.
Funding: Supported by the Natural Science Foundation of Tianjin (No. 20JCQNJC00720) and the Fundamental Research Fund for the Central Universities (No. 3122021052).
Abstract: Homogeneity analysis of a multi-airport system can provide important decision-making support for route layout and cooperative operation. Existing research seldom analyzes the homogeneity of a multi-airport system from the perspective of route network analysis, and the attribute information of airport nodes in the route network is not appropriately integrated into the airport network. To solve this problem, a multi-airport system homogeneity analysis method based on airport attributed network representation learning is proposed. First, the route network of a multi-airport system with attribute information is constructed: if there are flights between airports, an edge is added between them, and regional attribute information is added for each airport node. Second, the airport attributes and the airport network vector are each represented and then embedded into a unified airport representation vector space by the network representation learning method, yielding an airport vector that integrates the airport attributes and the airport network characteristics. By calculating the similarity of the airport vectors, the degree of homogeneity between airports and the homogeneity of the multi-airport system can be conveniently computed. The experimental results on the Beijing-Tianjin-Hebei multi-airport system show that, compared with other existing algorithms, the homogeneity analysis method based on attributed network representation learning yields results more consistent with the current situation of the Beijing-Tianjin-Hebei multi-airport system.
Funding: Supported by the Centre for Health Statistics Information, National Health and Family Planning Commission of the People's Republic of China.
Abstract: Objective To estimate the lung cancer burden that may be attributable to ambient fine particulate matter (PM2.5) pollution in Guangzhou city, China, from 2005 to 2013. Methods The data regarding PM2.5 exposure were obtained from the 'Ambient air pollution exposure estimation for the Global Burden of Disease 2013' dataset at 0.1° × 0.1° spatial resolution. Disability-adjusted life years (DALYs) were estimated based on information on the mortality and incidence of lung cancer. Comparative risk analysis and an integrated exposure-response function were used to estimate the attributed disease burden. Results The population-weighted average concentration of PM2.5 increased by 34.6% between 1990 and 2013, from 38.37 μg/m^3 to 51.31 μg/m^3. Lung cancer DALYs in both men and women increased by 36.2% from 2005 to 2013. The PM2.5-attributed lung cancer DALYs increased from 12105.0 (8181.0 for males and 3924.0 for females) in 2005 to 16489.3 (11291.7 for males and 5197.6 for females) in 2013. On average, 23.1% of the lung cancer burden was attributable to PM2.5 pollution in 2013. Conclusion PM2.5 has caused a serious but under-appreciated public health burden in Guangzhou, and the trend is deteriorating. Effective strategies are needed to tackle this major public health problem.
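The comparative risk analysis step can be illustrated with the standard population attributable fraction (PAF). This is a simplified sketch (the study used an integrated exposure-response function over the PM2.5 distribution), and the function name and inputs are ours:

```python
def attributable_dalys(total_dalys, relative_risk, exposed_fraction=1.0):
    """Simplified comparative-risk calculation:
    PAF = p * (RR - 1) / (p * (RR - 1) + 1), where p is the exposed
    population fraction and RR the relative risk; the attributed
    burden is PAF * total DALYs."""
    p = exposed_fraction
    paf = p * (relative_risk - 1) / (p * (relative_risk - 1) + 1)
    return paf * total_dalys
```

For instance, a fully exposed population with a relative risk of 1.3 yields a PAF of about 23.1%, the same order as the average attributable fraction reported in the abstract.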
Funding: Supported by the National Natural Science Foundation of China under Grant No. 62025202 and the Fundamental Research Funds for the Central Universities under Grant No. 020214380102.
Abstract: During software development, developers tend to tangle multiple concerns into a single commit, resulting in many composite commits. This paper studies the problem of detecting and untangling composite commits, so as to improve the maintainability and understandability of software. Our approach is built upon the observation that both the textual content of code statements and the dependencies between code statements are helpful in comprehending a code commit. Based on this observation, we first construct an attributed graph for each commit, where code statements and various code dependencies are modeled as nodes and edges, respectively, and the textual bodies of code statements are maintained as node attributes. Based on the attributed graph, we propose graph-based learning algorithms that first detect whether a given commit is a composite commit, and then untangle the composite commit into atomic ones. We evaluate our approach on nine C# projects, and the results demonstrate the effectiveness and efficiency of our approach.
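The commit-as-attributed-graph construction can be sketched in plain Python. This is an illustrative data structure, not the paper's implementation, and connected components are only a crude proxy for the learned untangling:

```python
class CommitGraph:
    """Minimal attributed-graph sketch of a code commit: statements are
    nodes carrying their textual body; dependencies are typed edges."""

    def __init__(self):
        self.nodes = {}   # statement id -> statement text (node attribute)
        self.edges = []   # (src, dst, dependency type), e.g. "data", "control"

    def add_statement(self, sid, text):
        self.nodes[sid] = text

    def add_dependency(self, src, dst, dep_type):
        assert src in self.nodes and dst in self.nodes
        self.edges.append((src, dst, dep_type))

    def connected_components(self):
        """Statement groups with no dependency path between them are a
        hint (not proof) that the commit tangles independent concerns."""
        adj = {n: set() for n in self.nodes}
        for s, d, _ in self.edges:
            adj[s].add(d)
            adj[d].add(s)
        seen, comps = set(), []
        for n in self.nodes:
            if n in seen:
                continue
            stack, comp = [n], set()
            while stack:
                u = stack.pop()
                if u in comp:
                    continue
                comp.add(u)
                stack.extend(adj[u] - comp)
            seen |= comp
            comps.append(comp)
        return comps
```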
Funding: Supported by the National Natural Science Foundation of China (Grant No. 72571150) and the Beijing Natural Science Foundation (Grant No. 9182015).
Abstract: Attributed graph clustering plays a vital role in uncovering hidden network structures, but it presents significant challenges. In recent years, various models have been proposed to identify meaningful clusters by integrating both structural and attribute-based information. However, these models often emphasize node proximities without adequately balancing the efficiency of clustering based on both structural and attribute data. Furthermore, they tend to neglect the critical fuzzy information inherent in attributed graph clusters. To address these issues, we introduce a new framework, Markov lumpability optimization, for efficient clustering of large-scale attributed graphs. Specifically, we define a lumped Markov chain on an attribute-augmented graph and introduce a new metric, Markov lumpability, to quantify the differences between the original and lumped Markov transition probability matrices. To minimize this measure, we propose a conjugate gradient projection-based approach that ensures the partitioning closely aligns with the intrinsic structure of fuzzy clusters through conditional optimization. Extensive experiments on both synthetic and real-world datasets demonstrate the superior performance of the proposed framework compared to existing clustering algorithms. The framework has many potential applications, including dynamic community analysis of social networks, user profiling in recommendation systems, functional module identification in biological molecular networks, and financial risk control, offering a new paradigm for mining complex patterns in high-dimensional attributed graph data.
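The lumping step can be illustrated as follows: aggregate the transition matrix over a candidate partition, so it can then be compared with the original chain. This sketch weights states uniformly within blocks, a simplification of the stationary-distribution weighting implied by the paper, and the function name is ours:

```python
import numpy as np

def lump(P, partition):
    """Aggregate a Markov transition matrix P over a node partition.
    Q[I, J] is the average probability of jumping from a state in
    block I to any state in block J (uniform within-block weighting)."""
    k = len(partition)
    Q = np.zeros((k, k))
    for I, block_i in enumerate(partition):
        for J, block_j in enumerate(partition):
            Q[I, J] = P[np.ix_(block_i, block_j)].sum() / len(block_i)
    return Q
```

A partition that matches the cluster structure yields a lumped chain close to the original dynamics; for a perfectly block-structured chain, the lumped matrix stays stochastic and reflects the blocks exactly.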
Funding: Supported by the National Key Research and Development Program of China (No. 2020YFA0909100).
Abstract: Attributed graphs have an additional sign vector for each node. Typically, edge signs represent a like or dislike relationship between node pairs, with applications in domains such as recommender systems and personalised search. However, the limited availability of edge sign information in attributed networks requires inferring the underlying graph embeddings to fill in the knowledge gap. Such inference is performed by way of node classification, which aims to deduce node characteristics based on the topological structure of the graph and the signed interactions between nodes. The study of attributed networks is challenging due to noise, sparsity, and class imbalance issues. In this work, we consider node centrality in conjunction with edge signs to address the node classification problem in attributed networks. We propose Semi-supervised Node Classification in Attributed graphs (SNCA). SNCA is robust to underlying network noise and has built-in class imbalance handling capabilities. We perform an extensive experimental study on real-world datasets to showcase the efficiency, scalability, robustness, and pertinence of the solution. The performance results demonstrate the suitability of the solution for large attributed graphs in real-world settings.
Funding: Supported by the National Key R&D Program of China (No. 2022YFC2702701), the Shanghai Scientific and Technological Project (No. 20Y11907600), and the Clinical Research Innovation Plan of Shanghai General Hospital (No. CTCCR-2021C17).
Abstract: Dear Editor, Inguinal hernia repair (IHR) performed during childhood is a prevalent etiological factor for obstructive azoospermia (OA) attributed to vasal injury. OA couples can achieve pregnancy through intracytoplasmic sperm injection, or through natural pregnancy after microsurgical anastomosis. Recent advancements have highlighted the potential utility of laparoscopy-assisted vasovasostomy for treating OA caused by childhood herniorrhaphy.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 62337001 and 62037001) and the "Pioneer" and "Leading Goose" R&D Program of Zhejiang (Grant No. 2022C03106).
Abstract: Cross-domain graph anomaly detection (CD-GAD) is a promising task that leverages knowledge from a labelled source graph to guide anomaly detection on an unlabelled target graph. CD-GAD classifies anomalies as unique or common based on their presence in both the source and target graphs. However, existing models often fail to fully explore the domain-unique knowledge of the target graph for detecting unique anomalies. Additionally, they tend to focus solely on node-level differences, overlooking structural-level differences that provide complementary information for common anomaly detection. To address these issues, we propose a novel method, Synthetic Graph Anomaly Detection via Graph Transfer and Graph Decouple (GTGD), which effectively detects common and unique anomalies in the target graph. Specifically, our approach ensures deeper learning of domain-unique knowledge by decoupling the reconstruction graphs of common and unique features. Moreover, we simultaneously consider node-level and structural-level differences by transferring node and edge information from the source graph to the target graph, enabling a comprehensive domain-common knowledge representation. Anomalies are detected using both common and unique features, with their synthetic score serving as the final result. Extensive experiments demonstrate the effectiveness of our approach, which improves average performance by 12.6% on AUC-PR compared to state-of-the-art methods.
Funding: Supported by the National Key R&D Program of China (Grant Nos. 2024YFE0109800 and 2024YFE0109802), the National Natural Science Foundation of China (Grant No. 12305127), and the International Partnership Program of the Chinese Academy of Sciences (Grant No. 016GJHZ2022054FN).
Abstract: Numerous experimental and theoretical investigations have highlighted the power-law behavior of the proton structure function F_2(x, Q^2), particularly the dependence of its power constant on various kinematic variables. In this study, we analyze the proton structure function F_2 employing the analytical solution of the Balitsky-Kovchegov equation, with a focus on the high-Q^2 regime and small-x domain. Our results indicate that as Q^2 increases, the slope parameter λ, which characterizes the growth rate of F_2, exhibits a gradual decrease, approaching a limiting value of λ ≈ 0.41 ± 0.01 at large Q^2. We suggest that this behavior of λ may be attributed to mechanisms such as gluon overlap and the suppression of phase-space growth. To substantiate these conclusions, further high-precision electron-ion collision experiments are required, encompassing a broad range of Q^2 and x.
Abstract: Despite the availability of effective antibacterial drugs, the increasing prevalence of antibiotic-resistant bacteria is primarily attributed to their excessive and inappropriate utilization. Antimicrobial resistance (AMR) represents one of the most concerning global health and development threats [1]. The continuous escalation of AMR therefore necessitates urgent advancements in novel antibacterial strategies. The MraY enzyme, which plays a pivotal role in synthesizing the polysaccharides that compose the bacterial cell wall, holds significant potential as a target for antibacterial agents [2]. However, its conformational dynamics have presented substantial challenges in developing MraY-targeting inhibitors.
Funding: Supported by the Agriculture and Food Research Initiative competitive grant (2021-67013-33833) and the Federal Hatch Funds (IDA01312) from the USDA National Institute of Food and Agriculture, and by the USDA-ARS In-House Project 2090-21000-033-000.
Abstract: Dear Editor, Genetics-focused approaches have been widely used to uncover major genetic variants associated with performance variation. Selecting, manipulating, and editing genetic variants significantly improve crop performance. Meanwhile, the genetic component explains only a portion of performance variation, and the environmental component contributes the remaining, often large, portion (Laidig et al., 2017; Bonecke et al., 2020; Li et al., 2021). To ensure superior and robust performance, elite varieties are extensively tested across multiple years and locations. These extensive performance records, coupled with climatic profiles, could be leveraged to understand climate's impact on agriculture through approaches parallel to quantitative genetics approaches (Figure 1A).