Breast cancer has become a major threat to women's health. To exploit the representational capacity of deep models more fully, we propose a multi-model fusion strategy. Specifically, we combine two structurally different deep learning models, ResNet101 and Swin Transformer (SwinT), and add the Convolutional Block Attention Module (CBAM) attention mechanism, making full use of SwinT's ability to model global context and ResNet101's ability to extract local features. In addition, the cross-entropy loss function is replaced by the focal loss function to address the class imbalance of breast cancer datasets. The multi-class recognition accuracies of the proposed fusion model on the 40X, 100X, 200X and 400X subsets of the BreakHis dataset are 97.50%, 96.60%, 96.30% and 96.10%, respectively. Compared with a single SwinT or ResNet101 model, the fusion model achieves higher accuracy and better generalization, providing a more effective method for the screening, diagnosis and pathological classification of female breast cancer.
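The focal loss mentioned in the abstract down-weights easy examples so that training focuses on hard, minority-class samples. A minimal NumPy sketch for illustration (the paper does not state its α/γ settings, so the common defaults γ=2, α=0.25 from the focal loss literature are assumed):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, alpha=0.25):
    """Multi-class focal loss.

    probs:  (N, C) softmax probabilities.
    labels: (N,) integer class ids.
    """
    eps = 1e-12
    # Probability assigned to the true class of each sample.
    p_t = np.clip(probs[np.arange(len(labels)), labels], eps, 1.0)
    # The (1 - p_t)^gamma factor shrinks the loss of well-classified
    # samples, so gradients concentrate on hard / minority examples.
    return np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t))
```

With γ=0 the expression reduces to α-scaled cross entropy, which is why the focal loss is usually described as a generalization of it.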
To address the low detection accuracy and the frequent false and missed detections of machinery-related external-damage hazards under complex backgrounds, large scale variation and occlusion, this paper proposes an improved YOLOv7 (you only look once version 7) detection algorithm. A Swin Transformer attention mechanism is added to the detection-head network to strengthen multi-scale feature extraction; some convolution modules in the backbone are replaced with depthwise separable convolutions to reduce computational cost; the Focal-EIOU (focal and efficient intersection over union) loss function is adopted to optimize the predicted boxes; and the Mish activation function is introduced to enhance the network's generalization, improving detection performance under complex backgrounds and partial occlusion. Experimental results show that, compared with the original YOLOv7, the improved algorithm raises precision, recall and mean average precision by 5.2%, 10.6% and 5.2%, respectively, and holds clear advantages over other mainstream algorithms in detection accuracy and model size, validating the effectiveness of the improvements and providing algorithmic support for edge-side recognition of machinery external-damage hazards in complex scenes.
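Two of the modifications above are easy to make concrete. A depthwise separable convolution factors a standard k×k convolution into a per-channel depthwise convolution plus a 1×1 pointwise convolution, cutting the parameter count; and Mish is a smooth, non-monotonic activation. A small sketch for illustration (parameter counts omit biases; the paper's exact layer shapes are not given, so the 256-channel example below is an assumption):

```python
import math

def conv_params(c_in, c_out, k):
    """Parameter count of a standard k x k convolution (bias omitted)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise k x k convolution (one filter per input channel)
    followed by a 1 x 1 pointwise convolution."""
    return c_in * k * k + c_in * c_out

def mish(x):
    """Mish activation: x * tanh(softplus(x))."""
    return x * math.tanh(math.log1p(math.exp(x)))
```

For a hypothetical 256-in/256-out 3×3 layer, the standard convolution needs 589,824 weights versus 67,840 for the depthwise separable version, roughly an 8.7× reduction, which is where the claimed drop in model cost comes from.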
Funding: Supported by the National Natural Science Foundation of China (NSFC) (No. 61772358), the National Key R&D Program Funded Project (No. 2021YFE0105500), and the Jiangsu University 'Blue Project'.