This paper presents the design, algorithm development, and experimental verification of a precise spray perception system based on LiDAR, addressing the low navigation-line extraction accuracy of self-propelled sprayers during field operations, which causes wheels to roll over ridges and wastes excessive pesticide. A data processing framework was established for the precision spray perception system, comprising data preprocessing, adaptive segmentation of crops and ditches, navigation line extraction, and crop positioning, all derived from the raw LiDAR point cloud. Data collection and analysis in cabbage field environments across different growth cycles were conducted to verify the stability of the precision spraying system. A controllable constant-speed experimental setup was built to compare the performance of LiDAR and a depth camera in the same field environment. The experimental results show that at sprayer speeds of 0.5 and 1 m/s, the maximum lateral error is 0.112 m in a cabbage ridge environment with inter-row weeds, with a mean absolute lateral error of 0.059 m. The processing time per frame does not exceed 43 ms. Compared with the machine vision algorithm, this method reduces the average processing time by 122 ms. The proposed system demonstrates superior accuracy, processing time, and robustness in crop identification and navigation line extraction compared to the machine vision system.
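The abstract's pipeline (segment crop rows, fit a navigation line, evaluate lateral error) can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's algorithm: it assumes segmented crop clusters have already been reduced to 2D top-down centroids, and uses a simple least-squares line fit; the function names, the line model, and the example coordinates are all assumptions for illustration.

```python
# Hypothetical sketch: fit a navigation line to crop-row centroids from a
# top-down 2D projection of segmented LiDAR points, then compute the
# sprayer's lateral (perpendicular) offset from that line.
import numpy as np

def fit_navigation_line(centroids):
    """Least-squares fit of the line x = a*y + b through (x, y) centroids.

    Fitting x as a function of y suits crop rows roughly aligned with the
    direction of travel (the y axis)."""
    x, y = centroids[:, 0], centroids[:, 1]
    a, b = np.polyfit(y, x, 1)  # slope a, intercept b
    return a, b

def lateral_error(a, b, sprayer_xy):
    """Perpendicular distance from the sprayer position to x = a*y + b."""
    x0, y0 = sprayer_xy
    return abs(a * y0 - x0 + b) / np.hypot(a, 1.0)

# Example: centroids of segmented crop clusters along one ridge (metres)
centroids = np.array([[0.02, 0.5], [0.03, 1.0], [0.01, 1.5], [0.04, 2.0]])
a, b = fit_navigation_line(centroids)
err = lateral_error(a, b, sprayer_xy=(0.10, 1.2))
```

In practice the paper's adaptive segmentation would supply the crop/ditch separation; this sketch only shows how a fitted row line yields a lateral-error figure comparable to the 0.112 m maximum reported above.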
Funding: This work was funded by the Hunan Provincial Science and Technology Department Major Project, Ten Major Technical Research Projects (2023NK1020); the Hunan Provincial Department of Science and Technology Key Areas R&D Program (2023NK2010); the Hunan Provincial Department of Education Key Projects (299054); the Chenzhou National Sustainable Development Agenda Innovation Demonstration Zone Construction Special Project (2022sfq20); the National Key R&D Program (2022YFD2002001); the 2023 High-level Guangdong Agricultural Science and Technology Demonstration City Construction Fund Municipal-Academy Cooperation Project (2320060002384); the Changsha Science and Technology Bureau Natural Science Foundation Project (kq2402110); and the 2023 Intelligent Agricultural Machinery Equipment Innovation Research and Development Project "Southern Paddy Field Green Manure Ditcher R&D, Manufacturing, Promotion and Application" in Hunan Province.