Abstract
Existing session-based recommendation methods fail to exploit items' additional attribute information and ignore interactions between items at the global level. To address these problems, a dual graph neural network model fusing global and attribute information for session-based recommendation is proposed. The model captures both explicit and implicit item information in the session sequence by constructing item interactions as a global graph and an attribute graph: a gating mechanism captures explicit information on the global graph, while a self-attention mechanism embedded in a graph attention network learns implicit item information on the attribute graph. The two kinds of information are fused through a pooling operation, and prediction scores are computed from the final embeddings. Experimental results on three public datasets (Diginetica, Tmall, and 30Music) show that the model outperforms recent baseline models in precision and mean reciprocal rank, verifying its effectiveness.
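The abstract outlines the model's pipeline (global graph with gating for explicit information, attribute graph with self-attention embedded in graph attention for implicit information, pooling fusion, and score prediction) without giving its equations. The PyTorch sketch below is a minimal, hypothetical reading of that pipeline; all class names, dimensions, the concrete gating and attention formulas, and the mean-pooling fusion are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the abstract does not specify the model's
# equations, so every layer below is an assumption for exposition.
import torch
import torch.nn as nn

class GlobalGatedLayer(nn.Module):
    """Propagation over the global item graph; a sigmoid gate decides how much
    neighbor information (explicit co-occurrence signal) each item absorbs."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        # x: (N, dim) item embeddings; adj: (N, N) row-normalized global graph
        neigh = adj @ x                               # aggregate neighbors
        g = torch.sigmoid(self.gate(torch.cat([x, neigh], dim=-1)))
        return g * x + (1 - g) * neigh                # gated update

class AttributeAttnLayer(nn.Module):
    """Graph attention over the attribute graph, followed by a self-attention
    step to model implicit relations between non-adjacent items."""
    def __init__(self, dim, heads=2):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x, edge_mask):
        # edge_mask: (N, N) 0/1 attribute-graph edges, self-loops included
        n, d = x.shape
        pair = torch.cat([x.unsqueeze(1).expand(n, n, d),
                          x.unsqueeze(0).expand(n, n, d)], dim=-1)
        logits = self.score(pair).squeeze(-1)
        logits = logits.masked_fill(edge_mask == 0, float("-inf"))
        h = torch.softmax(logits, dim=-1) @ x         # GAT-style aggregation
        out, _ = self.self_attn(h.unsqueeze(0), h.unsqueeze(0), h.unsqueeze(0))
        return out.squeeze(0)

class DualGraphSessionRec(nn.Module):
    def __init__(self, n_items, dim=64):
        super().__init__()
        self.items = nn.Embedding(n_items, dim)
        self.global_layer = GlobalGatedLayer(dim)
        self.attr_layer = AttributeAttnLayer(dim)

    def forward(self, session, global_adj, attr_mask):
        x = self.items.weight
        h = (self.global_layer(x, global_adj)         # explicit information
             + self.attr_layer(x, attr_mask)) / 2     # fusion via mean pooling
        s = h[session].mean(dim=0)                    # session representation
        return s @ h.t()                              # scores over all items

# Toy usage with random graphs (checks shapes only, not real data).
n = 10
model = DualGraphSessionRec(n_items=n, dim=16)
adj = torch.softmax(torch.rand(n, n), dim=-1)
mask = ((torch.rand(n, n) > 0.5).float() + torch.eye(n)).clamp(max=1)
scores = model(torch.tensor([1, 3, 5]), adj, mask)    # (n,) next-item scores
```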
Authors
YANG Xing-yao (杨兴耀); QI Zheng (齐正); ZHANG Zu-lian (张祖莲); YU Jiong (于炯); CHEN Jia-ying (陈嘉颖); WANG Dong-xiao (王东晓)
(School of Software, Xinjiang University, Urumqi 830091, China; Xinjiang Xingnong Network Information Center, Meteorological Bureau of Xinjiang Uygur Autonomous Region, Urumqi 830002, China)
Source
Computer Engineering and Design (《计算机工程与设计》)
Peking University Core Journal (北大核心)
2025, No. 3, pp. 770-778 (9 pages)
Funding
Natural Science Foundation of Xinjiang Uygur Autonomous Region, General Program (2023D01C17, 2023D01A123, 2022D01C692)
Natural Science Foundation of Xinjiang Uygur Autonomous Region, Resource Sharing Platform Construction Program (PT2323)
National Natural Science Foundation of China (62262064, 61862060)
Guiding Fund Project of the Xinjiang Meteorological Bureau (YD202212)
Labor Dispatch Management Information System Fund Project (202212140030)
Keywords
recommender system
session-based recommendation
graph neural network
attention mechanism
gating mechanism
graph attention network
self-attention mechanism