Abstract
To address the difficulty that existing multi-focus image fusion methods face in simultaneously achieving good performance in both focused-defocused boundary regions and regions far from the boundary, this paper proposes a multi-task learning network with an attention mechanism, comprising three stages: feature extraction, feature fusion, and feature reconstruction. In the feature extraction stage, a multi-scale feature extraction module captures detail information more comprehensively, and an efficient channel attention module preserves key features. In the feature reconstruction stage, the fusion and classification tasks are performed concurrently, with a bilateral information exchange module enabling information exchange between the two tasks so that each task can extract the important features it needs. Experimental results show that the proposed algorithm outperforms the comparison algorithms on two datasets, achieving the best results in five of the eight selected evaluation metrics. Improvements are particularly significant on the MFFW dataset, where the average gradient (AG), peak signal-to-noise ratio (PSNR), standard deviation (SD), spatial frequency (SF), and structural similarity index (SSIM) improve over the second-best results by 1.26%, 1.22%, 0.97%, 0.71%, and 0.46%, respectively. The proposed algorithm also demonstrates superior subjective visual quality.
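As context for the reported gains, two of the cited metrics, average gradient (AG) and spatial frequency (SF), have simple pixel-difference definitions. The following is a minimal NumPy sketch using the formulas commonly found in the image-fusion literature (the exact metric implementations used in the paper are an assumption, not taken from the source):

```python
import numpy as np

def average_gradient(img):
    """AG: mean magnitude of local intensity gradients (higher = sharper)."""
    img = img.astype(np.float64)
    gx = img[:-1, 1:] - img[:-1, :-1]   # horizontal first differences
    gy = img[1:, :-1] - img[:-1, :-1]   # vertical first differences
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))

def spatial_frequency(img):
    """SF: root-mean-square of row and column frequencies (higher = more detail)."""
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean((img[:, 1:] - img[:, :-1]) ** 2))  # row frequency
    cf = np.sqrt(np.mean((img[1:, :] - img[:-1, :]) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

Both metrics are zero for a constant image and grow with local contrast, which is why they are used to reward fused images that retain detail from the in-focus regions of both inputs.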
Authors
Xu Weike
Li Weitong
Xu Weike; Li Weitong (School of Information Engineering, Guangdong University of Technology, Guangzhou 510006, Guangdong, China)
Source
《激光与光电子学进展》
PKU Core Journal (Peking University core journal list)
2025, No. 22, pp. 427-435 (9 pages)
Laser & Optoelectronics Progress
Funding
Guangdong Provincial Science and Technology Plan Project (2017A010101016).
Keywords
multi-focus image fusion
deep learning
multi-task learning network
attention mechanism