

Argument Graph Construction for Military Commentary Texts Based on Large Language Models
Abstract: In military public-opinion analysis and expert assessment scenarios, analysts must derive traceable, interpretable, and uncertainty-graded judgments from multi-source, heterogeneous, and stance-intertwined open information. However, existing workflows rely mainly on sentence-level extraction and voting aggregation, and thus fail to represent complete evidence chains and the relative strength of sources. This paper presents ARGUS-LLM, an end-to-end pipeline based on large language models (LLMs) that combines prompt-based extraction, cross-document alignment, and credibility fusion to construct document-level argument graphs, and that realizes graded expression of uncertainty through fixed-point solving and a probability lexicon. Experimental results show that, compared with traditional sentence-level extraction methods, ARGUS-LLM better aggregates multi-source support while maintaining information compression, suppresses isolated adverse claims, and avoids misjudging correlation as causality, demonstrating low hallucination and strong robustness.
Authors: LI Wenjun (李文骏); ZHANG Zhizheng (张志政); TONG Tong (童佟) (School of Software, Southeast University, Nanjing 211189, China; Key Laboratory of New Generation Artificial Intelligence Technology and Its Interdisciplinary Applications (Southeast University), Ministry of Education, Nanjing 211189, China)
Source: Command Information System and Technology (《指挥信息系统与技术》), 2025, No. 6, pp. 11-22 (12 pages)
Funding: Supported by the Fundamental Research Funds for the Central Universities (2242025K30024).
Keywords: large language model (LLM); computational argumentation; argument mining (AM); weighted bipolar argumentation graph
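The abstract describes fixed-point solving over a weighted bipolar argumentation graph to obtain graded uncertainty. As a minimal sketch of what such a solver can look like, the following uses DF-QuAD-style semantics; the paper's actual update rule, probability lexicon, and node names are not given here, so the functions, edge lists, and scores below are illustrative assumptions:

```python
def aggregate(strengths):
    """Probabilistic-sum aggregation of attacker/supporter strengths (0 if none)."""
    prod = 1.0
    for s in strengths:
        prod *= 1.0 - s
    return 1.0 - prod

def combine(w, va, vs):
    """DF-QuAD influence: attacks pull the base score toward 0, supports toward 1."""
    if va >= vs:
        return w - w * (va - vs)
    return w + (1.0 - w) * (vs - va)

def solve(base, attacks, supports, eps=1e-9, max_iter=1000):
    """Iterate strength updates over a weighted bipolar argument graph
    until a fixed point is reached; edges are (source, target) pairs."""
    strength = dict(base)
    for _ in range(max_iter):
        new = {
            a: combine(
                w,
                aggregate(strength[x] for x, y in attacks if y == a),
                aggregate(strength[x] for x, y in supports if y == a),
            )
            for a, w in base.items()
        }
        if max(abs(new[a] - strength[a]) for a in base) < eps:
            return new
        strength = new
    return strength
```

For example, a claim with base score 0.5 backed by two evidence nodes (strengths 0.8 and 0.6) and attacked by one counter-claim (0.4) settles at strength 0.76: the supports aggregate to 0.92, outweighing the attack. On acyclic graphs this converges quickly; cyclic graphs rely on the iteration cap.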