Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (NRF-2022R1A2C2012243).
Abstract: Accurately identifying crop pests and diseases ensures agricultural productivity and safety. Although current YOLO-based detection models offer real-time capabilities, their conventional convolutional layers involve high computational redundancy and a fixed receptive field, making it challenging to capture local details and global semantics simultaneously in complex scenarios. This leads to significant issues such as missed detections of small targets and heightened sensitivity to background interference. To address these challenges, this paper proposes a lightweight adaptive detection network, StarSpark-AdaptiveNet (SSANet), which optimizes features through a dual-module collaborative mechanism. Specifically, the StarNet module uses depthwise separable convolutions (DW-Conv) and dynamic star operations to establish multi-stage feature extraction pathways, enhancing local detail perception within a lightweight framework. Moreover, the Multi-scale Adaptive Spatial Attention Gate (MASAG) module integrates cross-layer feature fusion and dynamic weight allocation to capture multi-scale global contextual information, effectively suppressing background noise. Together, these modules form a "local enhancement-global calibration" bidirectional optimization mechanism that significantly improves the model's adaptability to complex disease patterns. Furthermore, the proposed Scale-based Dynamic Loss (SD Loss) dynamically adjusts the weighting of scale and localization losses, improving regression stability and localization accuracy, especially for small targets. Experiments on the eggplant fruit disease dataset show that SSANet achieves an mAP50 of 83.9% and a detection speed of 273.5 FPS with only 2.11 M parameters and 5.1 GFLOPs of computation, outperforming the baseline YOLO11 model by reducing parameters by 18.1%, increasing mAP50 by 1.3%, and improving inference speed by 9.1%. Ablation studies further confirm the effectiveness and complementarity of the modules. SSANet offers a high-accuracy, low-cost solution for real-time crop pest and disease detection, facilitating edge deployment and advancing precision agriculture.
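The abstract does not provide implementation details, but as a rough illustration of the idea behind the StarNet module, the following PyTorch sketch combines a depthwise convolution for local detail extraction with a "star" (element-wise product) interaction between two pointwise branches. All layer names, the expansion factor, and the residual structure here are assumptions for illustration only, not SSANet's actual code.

```python
# Minimal sketch of a StarNet-style block, assuming the common pattern of
# depthwise spatial mixing followed by an element-wise product of two
# pointwise branches. Hypothetical layout; not the paper's implementation.
import torch
import torch.nn as nn

class StarBlockSketch(nn.Module):
    def __init__(self, channels: int, expansion: int = 3):
        super().__init__()
        hidden = channels * expansion
        # Depthwise convolution: cheap spatial mixing for local details
        self.dw = nn.Conv2d(channels, channels, 3, padding=1, groups=channels)
        # Two pointwise branches whose outputs interact multiplicatively
        self.f1 = nn.Conv2d(channels, hidden, 1)
        self.f2 = nn.Conv2d(channels, hidden, 1)
        self.act = nn.ReLU6()
        # Project back to the input width and refine locally once more
        self.proj = nn.Conv2d(hidden, channels, 1)
        self.dw2 = nn.Conv2d(channels, channels, 3, padding=1, groups=channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x
        x = self.dw(x)
        x = self.act(self.f1(x)) * self.f2(x)  # the "star" operation
        x = self.dw2(self.proj(x))
        return x + identity                     # residual connection

# Usage example (shapes are arbitrary):
# feat = StarBlockSketch(64)(torch.randn(1, 64, 80, 80))
```

The element-wise product lets two inexpensive linear branches approximate higher-order feature interactions, which is why such blocks can stay lightweight while improving local detail perception; the MASAG module and SD Loss described above are not sketched here because the abstract gives too little detail to reconstruct them.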