
Multi-task Contrastive Learning Model With Semantic Enhancement for Sequential Recommendation
Abstract: To address the data sparsity problem in sequential recommendation, a multi-task contrastive learning model with semantic enhancement for sequential recommendation (MCLM-SE4SRec) is proposed. The method employs multi-task joint training to combine two contrastive learning tasks with the recommendation task. The data-level augmentation contrastive task performs augmentation operations on user sequences by jointly considering item correlation and sequence length. The semantic-level clustering contrastive task mines users' latent sequence semantics from the perspective of high-dimensional semantic information, clustering semantic representations to learn better vector features. In addition, a negative-sample selection optimization strategy is applied in the data-level augmentation task: by identifying false negatives, a more reasonable negative-sample set is obtained, further improving model performance. Experiments on three public datasets show that the model achieves excellent performance.
Authors: DU Yongping, XU Yudong, ZHOU Tao, WANG Yuxin (Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China)
Source: Journal of Beijing University of Technology (Peking University Core Journal), 2025, No. 5, pp. 560-572 (13 pages)
Funding: National Natural Science Foundation of China (92267107); Beijing Natural Science Foundation (4212013).
Keywords: sequential recommendation; contrastive learning; semantic clustering; data augmentation; self-supervised learning; social network
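The abstract states that augmentation operations are chosen by combining item correlation and sequence length, but does not give the exact operators or thresholds. The following is a minimal illustrative sketch, assuming CL4SRec-style crop and mask operators and a hypothetical `short_threshold` parameter, of how generating two contrastive views of a user sequence might be conditioned on its length (short sequences are only masked, since cropping would discard too much signal):

```python
import random

MASK_TOKEN = 0  # assumed placeholder id for masked items


def crop(seq, ratio=0.6):
    """Keep a random contiguous sub-sequence covering `ratio` of the items."""
    n = max(1, int(len(seq) * ratio))
    start = random.randrange(len(seq) - n + 1)
    return seq[start:start + n]


def mask(seq, ratio=0.3):
    """Replace a random fraction of items with the mask token (length preserved)."""
    seq = list(seq)
    k = max(1, int(len(seq) * ratio))
    for i in random.sample(range(len(seq)), k):
        seq[i] = MASK_TOKEN
    return seq


def augment(seq, short_threshold=5):
    """Length-aware augmentation: return two views for a contrastive pair.

    Short sequences are only masked; longer ones may be cropped or masked.
    """
    ops = [mask] if len(seq) < short_threshold else [crop, mask]
    return random.choice(ops)(seq), random.choice(ops)(seq)


# Example: two views of the same user sequence form a positive pair;
# views of other users' sequences in the batch serve as negatives.
view_a, view_b = augment([3, 17, 42, 8, 51, 23, 9, 14])
```

This sketch covers only the view-generation step; the paper's item-correlation criterion, semantic clustering task, and false-negative filtering are not reproduced here.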