

Comparison Study of Four Transfer Learning Architectures for Degree of Polymerization Assessment of Insulating Paper Across Different Instruments
Abstract: Near-infrared spectroscopy (NIRS) has become an important nondestructive technique for assessing the degree of polymerization (DP) of insulating paper, as an alternative to traditional chemical methods. As a typical data-driven chemometric approach, NIRS often suffers from amplitude shifts in spectral data collected across different instruments, which severely limit its generalizability and large-scale deployment in engineering applications. In this work, we construct a model transfer scenario involving four representative spectrometers to analyze inter-instrument spectral discrepancies. We systematically compare the transfer-learning fine-tuning performance of four mainstream deep neural network architectures under different parameter configurations, investigate the effect of layer-freezing strategies on transfer performance, and identify the optimal network structure and hyperparameter combination within a predefined search space. The results show that significant spectral amplitude differences exist among instruments, leading to drastic degradation of predictive performance when a source-domain model is applied directly to the target domain, rendering the model nearly ineffective. Freezing strategies improve fine-tuning performance by stabilizing the network; specifically, freezing the front-end feature-extraction layers while fine-tuning the higher-level decision layers enhances transferability without compromising stability. Among the four tested architectures (MLP, 1D-CNN, EOT, and ResNet), EOT achieved the lowest error in the source domain but performed worst after fine-tuning in the target domain, whereas ResNet exhibited higher source-domain error than EOT but better fine-tuning performance, indicating that source-domain training error is not strongly correlated with transfer effectiveness. Overall, a three-branch ResNet incorporating multi-scale Inception modules achieved the best target-domain performance, with an RMSE of 78.5 and a MAPE of 8.6% after fine-tuning, significantly outperforming the other models. These findings provide a scientific basis for constructing NIRS modeling frameworks with cross-instrument generalizability.
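The freeze-then-fine-tune strategy the abstract describes can be sketched with a toy NumPy model (the two-layer network, all shapes, and the synthetic "spectra" below are illustrative assumptions, not the paper's architectures or data): the front-end weights `W1` act as a frozen feature extractor, gradient steps update only the decision-layer weights `W2`, and RMSE/MAPE are the two metrics quoted above.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error, as reported in the abstract."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

rng = np.random.default_rng(0)

# "Pretrained" source-domain front end: W1 plays the frozen
# feature-extraction layer; it is never updated below.
W1 = rng.normal(size=(8, 4))

def features(X):
    # Frozen feature extraction plus a bias column.
    return np.hstack([np.tanh(X @ W1), np.ones((len(X), 1))])

# Synthetic target-domain data on a DP-like scale (hundreds),
# with an amplitude shift relative to the source domain.
X = rng.normal(size=(64, 8)) * 1.5
v = rng.normal(size=(5, 1))
y = 600.0 + 50.0 * (features(X) @ v)

# Fine-tuning loop: gradient steps touch only the decision-layer
# weights W2; the feature extractor W1 stays frozen throughout.
W2 = rng.normal(size=(5, 1))
lr = 0.05
for _ in range(2000):
    H = features(X)
    grad = H.T @ (H @ W2 - y) / len(X)
    W2 -= lr * grad

print(f"RMSE={rmse(y, features(X) @ W2):.1f}, "
      f"MAPE={mape(y, features(X) @ W2):.1f}%")
```

In a deep-learning framework the same idea is usually expressed by disabling gradients on the front-end layers (e.g. `requires_grad = False` in PyTorch) so the optimizer only sees the high-level decision layers.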
Authors: LI Han, SUN Wei-zhe, CHEN Xi-yuan, ZHANG Guan-jun, LI Yuan (School of Electrical Engineering, Xi'an Jiaotong University, Xi'an 710049, China)
Source: Spectroscopy and Spectral Analysis (《光谱学与光谱分析》, Peking University Core Journal), 2025, Issue 11, pp. 3145-3152 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (52477159).
Keywords: Near-infrared spectroscopy; Insulating paper; Transfer learning; Neural network; Fine-tuning