Abstract
Semantic segmentation networks based on deep learning have large parameter counts that make deployment on mobile platforms difficult, and their receptive fields are small, leaving low-level semantic information underused. To address these problems, an improved neural network based on DeepLabV3+ is proposed. In the encoder, the backbone is replaced with the lightweight MobileNetV2 and a channel attention module is added, greatly reducing the network's computation time. The original ASPP is replaced with a densely connected ASPP whose dilation rates are set from 1 to 8, enlarging the receptive field while keeping the computational cost under control, and strip pooling is added to the densely connected ASPP, improving accuracy with essentially unchanged computation. The resulting feature maps are further modeled for semantic information through spatial and channel attention mechanisms. The decoder fuses the semantic features from the second and third downsampling stages of the encoder to improve segmentation accuracy. Experimental results on the CamVid and Cityscapes datasets show that the network remedies the shortcomings of DeepLabV3+, reaching mean intersection over union (mIoU) of 72.3% and 66.41%, respectively, and reducing inference time on CamVid by 33.8%.
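Two mechanisms named in the abstract can be illustrated with a short numeric sketch (illustrative only; function names are assumptions, and the strip pooling shown is a simplified average-and-broadcast version rather than the paper's exact module): how dilation rates of 1 to 8 compose into a large receptive field when ASPP branches are densely stacked, and what strip pooling computes on a feature map.

```python
import numpy as np

def stacked_rf(dilations, k=3):
    """Receptive field of serially stacked k x k dilated convolutions.

    A k x k conv with dilation d spans k + (k - 1) * (d - 1) pixels;
    each stacked layer adds (span - 1) to the total. In a densely
    connected ASPP, later branches consume earlier outputs, so the
    small rates 1..8 from the abstract compose into a wide field.
    """
    rf = 1
    for d in dilations:
        span = k + (k - 1) * (d - 1)
        rf += span - 1
    return rf

def strip_pool(x):
    """Simplified strip pooling on a single-channel (H, W) map.

    Average over each row (H x 1 strips) and each column (1 x W strips),
    then broadcast both back to (H, W) and fuse by addition, capturing
    long-range banded context that square pooling windows miss.
    """
    h_strip = x.mean(axis=1, keepdims=True)  # (H, 1): one value per row
    w_strip = x.mean(axis=0, keepdims=True)  # (1, W): one value per column
    return h_strip + w_strip                 # broadcasts to (H, W)

print(stacked_rf(range(1, 9)))               # field of stacked rates 1..8
print(strip_pool(np.arange(12.0).reshape(3, 4)))
```

With rates 1 through 8 the stacked receptive field reaches 73 pixels across, versus only 17 for a single rate-8 branch, which is consistent with the abstract's claim of a larger receptive field at controlled computational cost.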
Authors
SONG Yu; LIU Weida; GUO Yang; LIANG Chao
(School of Computer Science & Engineering, Changchun University of Technology, Changchun 130102, China; Changchun Branch Company, China Mobile Communications Group Jilin Co., Ltd., Changchun 130000, China)
Source
Journal of Changchun University of Technology
2025, No. 3, pp. 222-232 (11 pages)
Funding
Jilin Province Science and Technology Development Plan Project (20220201030GX).