
A Script Event Prediction Method Combining Adversarial Training Augmentation and Joint Loss Fine-tuning

Script Event Prediction Based on Adversarial Training Augmentation and Joint Loss Fine-tuning
Abstract: Script event prediction aims to predict the most likely subsequent event given the historical events in a script, which requires sufficient event samples and the ability to learn richer contextual information. Previous methods focus mainly on local information, so their event representations are inadequate. This paper proposes a pre-training and fine-tuning method that obtains more comprehensive semantic information through adversarial training augmentation and joint loss fine-tuning. First, masked language modeling (MLM) and the fast gradient method (FGM) are used so that the model can complete interactions between pieces of information and extract more event information from limited input. To ensure that texts with identical inputs yield consistently distributed outputs, R-Drop regularization is applied in the fine-tuning stage, further improving model performance. Experimental results on the widely used New York Times corpus show that the proposed method improves the performance of script event prediction.
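As a rough illustration of the FGM step described in the abstract (a minimal NumPy sketch; the function name and the default epsilon are my own assumptions, not the paper's settings), the adversarial perturbation added to the word embeddings is simply the embedding gradient rescaled to a fixed L2 norm:

```python
import numpy as np

def fgm_perturbation(embedding_grad, epsilon=1.0):
    """FGM: perturb embeddings along the gradient direction,
    rescaled to L2 norm epsilon (r_adv = eps * g / ||g||_2)."""
    norm = np.linalg.norm(embedding_grad)
    if norm == 0 or np.isnan(norm):
        return np.zeros_like(embedding_grad)
    return epsilon * embedding_grad / norm

# One adversarial training step (sketch):
#   1. forward + backward on clean input -> gradient on embeddings
#   2. add fgm_perturbation(grad) to the embedding matrix
#   3. forward + backward again on the perturbed embeddings
#   4. restore the original embeddings, then take the optimizer step
grad = np.array([3.0, 4.0])
r_adv = fgm_perturbation(grad, epsilon=1.0)  # unit-norm step in direction g/||g||
```

Because the perturbation follows the gradient of the loss, it is (locally) the most damaging change of that size, so training on it encourages representations that are robust to small input shifts.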
Authors: LIU Yuting, DING Kun, LIU Ming, WANG Baowei (The Sixty-Third Research Institute of National University of Defense Technology, Nanjing 210007, China; School of Computer Science, Nanjing University of Information Science & Technology, Nanjing 210044, China; China Laboratory for Big Data and Decision, National University of Defense Technology, Changsha 410073, China)
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), Peking University Core Journal, 2025, No. 2, pp. 274-279 (6 pages)
Funding: China Postdoctoral Science Foundation (2021MD703983); National Natural Science Foundation of China, General Program (61972207); National Natural Science Foundation of China, Joint Fund Key Program (U1836208)
Keywords: script event prediction; context information; adversarial training; joint loss fine-tuning
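The R-Drop regularization named in the keywords can be sketched as follows (a minimal NumPy illustration; the weighting coefficient alpha and the function names are assumptions, not the paper's notation): the same input is passed through the model twice with different dropout masks, and a symmetric KL term penalizes divergence between the two output distributions on top of the usual cross-entropy.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def r_drop_loss(logits_a, logits_b, labels, alpha=1.0):
    """Joint loss: mean cross-entropy over the two dropout forward
    passes plus a symmetric KL consistency term weighted by alpha."""
    p, q = softmax(logits_a), softmax(logits_b)
    rows = np.arange(len(labels))
    ce = -0.5 * (np.log(p[rows, labels]).mean() + np.log(q[rows, labels]).mean())
    kl = 0.5 * ((p * (np.log(p) - np.log(q))).sum(-1).mean()
                + (q * (np.log(q) - np.log(p))).sum(-1).mean())
    return ce + alpha * kl
```

When the two passes agree exactly the KL term vanishes and the loss reduces to plain cross-entropy; any disagreement between the dropout sub-models adds a positive penalty, which is what pushes the model toward consistently distributed outputs for identical inputs.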