
Research on an Efficient Training Scheme for Large AI Models Based on Remote Storage and Lossless Networks
Abstract: With the rapid development of large AI models, the demand for computing power has risen sharply. Targeting two pain points widely reported by automotive enterprises, namely the low utilization of computing resources caused by the conventional transmit-first-then-train workflow and the security concerns raised when sensitive data leaves the enterprise campus, this paper proposes a solution based on remote storage and lossless network technology. By building an intelligent-computing experimental network and adopting an innovative simultaneous transmission and training mode, the scheme achieves real-time data processing and rapid model updates. Experimental results show that the scheme significantly improves computing-power utilization: single-task training time is reduced by 50%, the data transmission rate reaches 7.3 Gbps, and efficient operation is maintained even with storage located 200 km away. Challenges such as data security and privacy protection still need to be addressed in future work. This study provides new ideas and methods for meeting the computing-power demands of the large-model era.
Authors: QI Yu, HUANG Jia, WANG Li-qiu, TU Yan-li, CHEN Zi-yu (China Mobile Communications Group Design Institute Co., Ltd., Chongqing 401120, China)
Source: Computer & Telecommunication, 2025, No. 1, pp. 9-13
Keywords: AI large models; remote storage; simultaneous transmission and training; lossless network
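
The abstract contrasts the transmit-first-then-train workflow with a simultaneous transmission and training mode, but gives no implementation details. As one way to picture the overlap it describes, the minimal Python sketch below assumes a simple prefetch pipeline: a background thread pulls data shards from remote storage while training consumes shards that have already arrived. The shard counts, timings, and function names are hypothetical, not the authors' design.

# Minimal illustrative sketch (not the paper's implementation): data shards are
# prefetched from remote storage in a background thread while training consumes
# shards that have already arrived, so transfer and compute overlap instead of
# running strictly one after the other ("transmit first, then train").
import queue
import threading
import time

SHARD_COUNT = 8       # hypothetical number of data shards to move
PREFETCH_DEPTH = 2    # how many shards may be buffered ahead of training


def fetch_shard(shard_id: int) -> bytes:
    """Stand-in for reading one shard from remote storage over the network."""
    time.sleep(0.2)               # simulated transfer latency
    return bytes([shard_id])      # placeholder payload


def train_step(shard: bytes) -> None:
    """Stand-in for one training pass over a shard on the local GPU cluster."""
    time.sleep(0.3)               # simulated compute time


def transfer_worker(out_q: queue.Queue) -> None:
    # Producer: put() blocks once PREFETCH_DEPTH shards are already buffered,
    # so transfer naturally stays just ahead of training.
    for shard_id in range(SHARD_COUNT):
        out_q.put(fetch_shard(shard_id))
    out_q.put(None)               # sentinel: no more shards


def main() -> None:
    shard_q = queue.Queue(maxsize=PREFETCH_DEPTH)
    threading.Thread(target=transfer_worker, args=(shard_q,), daemon=True).start()

    start = time.time()
    while (shard := shard_q.get()) is not None:
        train_step(shard)         # overlaps with the fetch of the next shard
    print(f"pipelined run finished in {time.time() - start:.1f} s")


if __name__ == "__main__":
    main()

With the simulated numbers, each shard costs roughly the larger of its transfer and compute times rather than their sum, which is the effect the simultaneous transmission and training mode targets; the paper's reported gains additionally depend on the remote-storage and lossless-network setup that this sketch does not model.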