Funding: supported by the National Natural Science Foundation of China (No. 62103143), the Hunan Province Key Research and Development Program (No. 2022WK2006), the Special Project for the Construction of Innovative Provinces in Hunan (Nos. 2020TP2018 and 2019GK4030), the Young Backbone Teacher of Hunan Province (No. 2022101), and the Scientific Research Fund of Hunan Provincial Education Department (No. 22B0471).
Abstract: Due to the heterogeneity of nodes and edges, heterogeneous network embedding, which maps a highly coupled network into a set of low-dimensional vectors, is a very challenging task. Existing models learn embedding vectors either only for nodes or only for edges; the two kinds of embedding learning are rarely performed in the same model, and both overlook the internal correlation between nodes and edges. To address these problems, a node and edge joint embedding model, called NEJE, is proposed for Heterogeneous Information Networks (HINs). The NEJE model better captures the latent structural and semantic information of an HIN through two joint learning strategies: type-level joint learning and element-level joint learning. First, node-type-aware structure learning and edge-type-aware semantic learning are performed sequentially on the original network and its line graph to obtain the initial node embeddings and the edge embeddings. Then, to optimize performance, type-level joint learning is performed by alternately training node embeddings on the original network and edge embeddings on the line graph. Finally, a new homogeneous network is constructed from the original heterogeneous network, and a graph attention model is applied to this new network to perform element-level joint learning. Experiments on three tasks and five public datasets show that NEJE improves performance over other models by about 2.83% on average, and by 6.42% on average for the node clustering task on the Digital Bibliography & Library Project (DBLP) dataset.
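The type-level joint learning described above can be pictured with a minimal sketch. The code below is an illustrative outline only, not the authors' implementation: it builds the line graph of a toy heterogeneous graph with networkx and alternates two placeholder update passes that stand in for the node-type-aware structure learner and the edge-type-aware semantic learner.

```python
# Illustrative sketch of NEJE-style type-level joint learning (not the authors' code).
# The "training" steps are stand-ins: one simple neighbor-averaging pass per view.
import networkx as nx
import numpy as np

DIM = 8
rng = np.random.default_rng(0)

# Toy heterogeneous graph with node/edge types stored as attributes.
G = nx.Graph()
G.add_node("a1", ntype="author")
G.add_node("p1", ntype="paper")
G.add_node("v1", ntype="venue")
G.add_edge("a1", "p1", etype="writes")
G.add_edge("p1", "v1", etype="published_in")

# Line graph: each node of L corresponds to an edge of G, so edge
# embeddings of G can be learned as node embeddings on L.
L = nx.line_graph(G)

node_emb = {v: rng.normal(size=DIM) for v in G.nodes}
edge_emb = {e: rng.normal(size=DIM) for e in L.nodes}

def propagate(graph, emb):
    """Placeholder for one learning step: average each node with its neighbors."""
    return {v: np.mean([emb[v]] + [emb[u] for u in graph.neighbors(v)], axis=0)
            for v in graph.nodes}

for epoch in range(10):
    # Alternating training: update node embeddings on G, then edge embeddings on L.
    node_emb = propagate(G, node_emb)
    edge_emb = propagate(L, edge_emb)
```

In the actual model, each placeholder pass would be replaced by the corresponding type-aware objective, with each view conditioning on the other's latest embeddings at every alternation.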
Funding: Project supported by the National Basic Research Program (973) of China (No. 2015CB352302) and the National Natural Science Foundation of China (Nos. U1509206 and 61472353).
Abstract: Knowledge embedding is solved here in a joint embedding scheme, treated as a joint-optimization problem that simultaneously fulfills two different but correlated embedding tasks: entity embedding and relation embedding. In this scheme, we design a joint compatibility scoring function to quantitatively evaluate relational facts with respect to entities and relations, and incorporate this scoring function into a max-margin structure learning process that explicitly learns the embedding vectors of entities and relations from the context information of the knowledge base. By optimizing the joint problem, our design effectively captures the intrinsic topological structures of the learned embedding spaces. Experimental results demonstrate the effectiveness of our embedding scheme in characterizing the semantic correlations among different relation units and in relation prediction for knowledge inference.
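The abstract does not give the exact form of the joint compatibility scoring function, so the sketch below only illustrates the general max-margin pattern it describes: a score over entity and relation embeddings, trained so that observed triples score higher than corrupted ones by a fixed margin. The TransE-style translational distance used here is an assumption for illustration, not the paper's definition.

```python
# Illustrative max-margin joint embedding sketch (assumed TransE-style score,
# not necessarily the paper's joint compatibility function).
import torch
import torch.nn as nn

class JointEmbedding(nn.Module):
    def __init__(self, n_entities, n_relations, dim=50):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)   # entity embeddings
        self.rel = nn.Embedding(n_relations, dim)  # relation embeddings

    def score(self, h, r, t):
        # Compatibility of a triple (h, r, t); higher is better.
        return -torch.norm(self.ent(h) + self.rel(r) - self.ent(t), dim=-1)

def max_margin_loss(model, pos, neg, margin=1.0):
    # pos/neg: (h, r, t) index tensors for observed and corrupted facts.
    s_pos = model.score(*pos)
    s_neg = model.score(*neg)
    return torch.clamp(margin + s_neg - s_pos, min=0).mean()

# Tiny usage example with random indices (tails corrupted for negatives).
model = JointEmbedding(n_entities=100, n_relations=10)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
pos = (torch.randint(0, 100, (32,)),
       torch.randint(0, 10, (32,)),
       torch.randint(0, 100, (32,)))
neg = (pos[0], pos[1], torch.randint(0, 100, (32,)))
opt.zero_grad()
loss = max_margin_loss(model, pos, neg)
loss.backward()
opt.step()
```

The max-margin objective only enforces a relative ordering between true and corrupted facts, which is what lets both entity and relation embeddings be optimized jointly in one ranking loss.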