Abstract
Although existing deep-learning-based compressed sensing magnetic resonance imaging (CS-MRI) methods have achieved good results, their interpretability remains a challenge and the transition from theoretical analysis to network design is not sufficiently natural. To address these problems, a deep dual-domain geometry distillation feature adaptive network (DDGD-FANet) is proposed. This deep unfolding network iteratively unrolls the MRI reconstruction optimization problem into three sub-modules: a data consistency module, a dual-domain geometry distillation module, and an adaptive network module. The network not only compensates for the contextual information lost in the reconstructed image and recovers more texture detail, but also removes global artifacts, further improving reconstruction quality. Experiments on a public dataset with three different sampling patterns show that DDGD-FANet achieves higher peak signal-to-noise ratio (PSNR) and structural similarity index under all three patterns. At a Cartesian 10% compressed sensing (CS) ratio, its PSNR exceeds that of the iterative shrinkage-thresholding algorithm network (ISTA-Net+), FISTA-Net (fast ISTA), and the DGDN model by 5.01 dB, 4.81 dB, and 3.34 dB, respectively.
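In standard deep unfolding CS-MRI formulations, a data consistency module of the kind named in the abstract re-imposes the measured k-space samples on the current image estimate between learned stages. The sketch below assumes that conventional formulation only; the function name data_consistency, the optional softness weight lam, and the toy sampling setup are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a k-space data-consistency step as commonly used in
# deep unfolding CS-MRI networks (hedged: not the paper's exact module).
import numpy as np

def data_consistency(x, y, mask, lam=None):
    """Enforce agreement with the measured k-space samples.

    x    : (H, W) complex image estimate from the previous stage
    y    : (H, W) complex undersampled k-space data (zeros where unsampled)
    mask : (H, W) binary sampling mask (1 = sampled location)
    lam  : optional softness weight; None means hard replacement
    """
    k = np.fft.fft2(x, norm="ortho")           # current estimate in k-space
    if lam is None:
        k_dc = (1 - mask) * k + mask * y       # overwrite sampled entries with measurements
    else:
        # Soft data consistency: weighted combination at sampled locations
        k_dc = (1 - mask) * k + mask * (k + lam * y) / (1 + lam)
    return np.fft.ifft2(k_dc, norm="ortho")    # back to the image domain

# Toy usage: 64x64 complex image, ~25% random sampling
rng = np.random.default_rng(0)
gt = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
mask = (rng.random((64, 64)) < 0.25).astype(float)
y = mask * np.fft.fft2(gt, norm="ortho")
x0 = np.fft.ifft2(y, norm="ortho")             # zero-filled starting reconstruction
x1 = data_consistency(x0, y, mask)             # one data-consistency step
```

In an unrolled network, a step like this would typically alternate with the learned modules at every stage, so each stage ends with an estimate that is consistent with the acquired measurements.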
Authors
DUO Lin; REN Yong; XU Boyu; YANG Xin (Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China)
Source
Journal of Beijing University of Aeronautics and Astronautics (北京航空航天大学学报), Peking University Core Journal
2025, No. 6, pp. 1946-1954 (9 pages)
Funding
National Natural Science Foundation of China (61962032).
Keywords
magnetic resonance imaging
image reconstruction
deep learning
geometry distillation
adaptive network
attention mechanism