Abstract
Body deformation, occlusion and other factors arise during pig aggression, making it difficult to recognize the identities of the pigs involved. This study proposed a deep learning algorithm based on an improved YOLOv9 to recognize pig identities during aggressive interactions. A dataset of 18,000 frames was generated from 600 labelled 1 s aggression videos. Firstly, DualConv was used to replace the downsampling modules of the YOLOv9 network, reducing computation while maintaining accuracy. Secondly, the neck of YOLOv9 was improved by integrating a bidirectional feature pyramid network (BiFPN) to strengthen feature extraction in aggression scenes. Then, a partial self-attention (PSA) mechanism was introduced after the RepNCSPELAN4 layers of the backbone to enhance the model's ability to capture local features. Finally, the improved YOLOv9 was used to recognize pig identities. The results showed that the mean average precision of the improved YOLOv9 model reached 93.6%, 3.7 percentage points higher than that of the baseline model, with a detection speed of 31.58 frames·s^(-1). These results indicate that the improved YOLOv9 can effectively improve the accuracy of pig identification in aggression scenes and helps refine aggression recognition from the group level to the individual level.
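The computation saving claimed for DualConv can be illustrated with a simple parameter count. The sketch below is a hypothetical illustration, not the authors' code: it compares a standard 3×3 convolution with a DualConv-style layer that runs a 3×3 group convolution and a 1×1 pointwise convolution in parallel over the same input channels. The channel counts and the group number G = 2 are assumed for illustration only.

```python
def conv3x3_params(c_in, c_out):
    # weight count of a standard 3x3 convolution (bias terms ignored)
    return 3 * 3 * c_in * c_out

def dualconv_params(c_in, c_out, groups=2):
    # DualConv-style layer: parallel 3x3 group convolution + 1x1 pointwise convolution
    assert c_in % groups == 0 and c_out % groups == 0
    return 3 * 3 * (c_in // groups) * c_out + 1 * 1 * c_in * c_out

# example: a hypothetical 64 -> 64 channel downsampling layer with G = 2 groups
standard = conv3x3_params(64, 64)          # 36864 weights
dual = dualconv_params(64, 64, groups=2)   # 18432 + 4096 = 22528 weights
print(standard, dual)
```

With these assumed channel counts, the DualConv-style layer uses roughly 39% fewer weights than the plain 3×3 convolution, which is the kind of saving that motivates swapping it into the downsampling stages.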
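The BiFPN neck mentioned in the abstract fuses feature maps from different scales with learnable, normalized weights. The abstract does not give the authors' exact fusion rule, so the sketch below shows BiFPN's published "fast normalized fusion" on flattened toy features: weights are clamped non-negative and normalized so the fused output is a weighted average. All values here are illustrative.

```python
def fast_normalized_fusion(features, weights, eps=1e-4):
    # BiFPN-style fusion: ReLU-clamped weights, normalized to sum to ~1
    w = [max(0.0, wi) for wi in weights]
    total = sum(w) + eps
    return [sum(wi * f[i] for wi, f in zip(w, features)) / total
            for i in range(len(features[0]))]

# example: fuse two (flattened) toy feature maps with equal weights
fused = fast_normalized_fusion([[1.0, 2.0], [3.0, 6.0]], [1.0, 1.0])
print(fused)  # approximately the element-wise mean [2.0, 4.0]
```

A negative learned weight is clamped to zero, so an unhelpful input scale can be effectively switched off during training without destabilizing the normalization.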
Authors
CHEN Chen; LIU Haoran; NORTON Tomas (School of Electrical and Information Engineering, Jiangsu University, Zhenjiang 212013, Jiangsu, China; Division of Measure, Model & Manage Bioresponses (M3-Biores), KU Leuven, Leuven 303001, Belgium)
Source
《中国农业科技导报(中英文)》
PKU Core (Peking University Core Journal)
2025, Issue 10, pp. 134-143 (10 pages)
Journal of Agricultural Science and Technology
Funding
National Natural Science Foundation of China (32102598).