Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation
Authors: Jiaqi SHI, Xulong ZHANG, Xiaoyang QU, Junfei XIE, Jianzong WANG. Frontiers of Information Technology & Electronic Engineering, 2025, No. 10, pp. 1793-1808 (16 pages)
Financial large language models (FinLLMs) offer immense potential for financial applications, but excessive deployment costs and considerable inference latency remain major obstacles to their adoption. Knowledge distillation (KD), a prominent model compression methodology, offers an effective remedy for these difficulties. This work presents a comprehensive survey of how KD interacts with FinLLMs, covering three core aspects: strategy, application, and evaluation. At the strategy level, the review introduces a structured taxonomy for comparatively analyzing existing distillation pathways. At the application level, it proposes an upstream-midstream-downstream framework that systematically explains the practical value of distilled models in the financial field. At the evaluation level, to address the absence of standards in the financial domain, it constructs a comprehensive evaluation framework spanning dimensions such as financial accuracy, reasoning fidelity, and robustness. Overall, this research aims to provide a clear roadmap for this interdisciplinary field and to accelerate the development of distilled FinLLMs.
Keywords: Financial large language models (FinLLMs); knowledge distillation; model compression; quantitative trading
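As background for the record above, the following is a minimal sketch of the canonical soft-target distillation objective (Hinton-style logit distillation), one of the distillation pathways a survey like this typically covers. It is not the paper's specific method; the function name, temperature, and loss weighting below are illustrative assumptions.

```python
# Minimal sketch of logit-based knowledge distillation (soft-target KD).
# Hyperparameters T and alpha are illustrative assumptions, not values
# taken from the surveyed paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label CE."""
    # Soften both distributions with temperature T; the T*T factor keeps
    # gradient magnitudes comparable as T varies.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage with random tensors standing in for a FinLLM teacher/student pair:
student_logits = torch.randn(4, 32000)   # batch of 4, 32k-token vocabulary
teacher_logits = torch.randn(4, 32000)
labels = torch.randint(0, 32000, (4,))
loss = kd_loss(student_logits, teacher_logits, labels)
```

A smaller student trained with this combined loss inherits the teacher's softened output distribution rather than only the hard labels, which is the basic mechanism behind the deployment-cost and latency reductions the abstract describes.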