Crop leaf area index (LAI) and biomass are two major biophysical parameters used to measure crop growth and health. Measuring LAI and biomass directly in field experiments is destructive. We therefore focused on the application of unmanned aerial vehicles (UAVs) in agriculture, a cost- and labor-efficient approach in which UAV-captured multispectral images are used to monitor crop growth and identify plant biophysical conditions. In this study, we monitored soybean crops using UAV flights and field experiments. The experiment was conducted at the MAFES (Mississippi Agricultural and Forestry Experiment Station) Pontotoc Ridge-Flatwoods Branch Experiment Station. It followed a randomized block design with five cover crop treatments: cereal rye, vetch, wheat, MC (mixed mustard and cereal rye), and native vegetation. Cover crops were planted in the fall, and three fertilizer treatments (synthetic fertilizer, poultry litter, and none) were applied before soybean planting in a full factorial combination. We monitored the soybean reproductive phases R3 (initial pod development), R5 (initial seed development), R6 (full seed development), and R7 (initial maturity) and used UAV multispectral remote sensing for soybean LAI and biomass estimation. The major goal of this study was to assess LAI and biomass estimation from UAV multispectral images in the reproductive stages, when leaf and biomass development had stabilized. We derived about fourteen vegetation indices (VIs) from the UAV multispectral images at these stages and modeled LAI and biomass from these remotely sensed VIs and ground-truth measurements using machine learning methods, including linear regression, Random Forest (RF), and support vector regression (SVR). The models were then applied to estimate LAI and biomass. According to the model results, LAI was best estimated at the R6 stage and biomass at the R3 stage. Compared to the other models, the RF models gave the better estimates: an R² of about 0.58–0.68 with a root mean square error (RMSE) of 0.52–0.60 m²/m² for LAI, and an R² of about 0.44–0.64 with an RMSE of 21–26 g dry weight per 5 plants for biomass. We also performed leave-one-out cross-validation. Based on the cross-validated models, the R6 stage again proved best for estimating LAI and the R3 stage for estimating crop biomass. The cross-validated RF model achieved an R² of about 0.25–0.44 and an RMSE of 0.65–0.85 m²/m² for LAI, and an R² of about 0.1–0.31 and an RMSE of about 28–35 g dry weight per 5 plants for crop biomass. These results should help promote non-destructive remote sensing methods for determining crop LAI and biomass status, which may lead to more efficient crop production and management.
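The modeling pipeline described above, VIs computed from multispectral bands, an RF regressor, and leave-one-out cross-validation, can be sketched as follows. This is an illustrative sketch, not the authors' code: the band values, plot count, and ground-truth LAI below are synthetic placeholders, and only three of the roughly fourteen VIs are shown.

```python
# Sketch: estimating LAI from UAV vegetation indices with a Random Forest
# and leave-one-out cross-validation. All data here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_plots = 30  # hypothetical number of field plots

# Synthetic per-plot mean reflectances for four multispectral bands.
red, green, nir, red_edge = (rng.uniform(0.02, 0.6, n_plots) for _ in range(4))

# A few common vegetation indices (a small subset of the ~14 used).
ndvi = (nir - red) / (nir + red)
gndvi = (nir - green) / (nir + green)
ndre = (nir - red_edge) / (nir + red_edge)
X = np.column_stack([ndvi, gndvi, ndre])

# Synthetic "ground-truth" LAI loosely tied to NDVI, plus noise.
y = 1.5 + 4.0 * ndvi + rng.normal(0, 0.3, n_plots)

# Leave-one-out: each plot is predicted by a model trained on the others.
rf = RandomForestRegressor(n_estimators=200, random_state=0)
y_hat = cross_val_predict(rf, X, y, cv=LeaveOneOut())
print(f"LOOCV R^2  = {r2_score(y, y_hat):.2f}")
print(f"LOOCV RMSE = {mean_squared_error(y, y_hat) ** 0.5:.2f} m^2/m^2")
```

The same skeleton covers the linear-regression and SVR baselines by swapping the estimator, which is why cross-validated R² and RMSE are directly comparable across the three model families.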
Accurate recognition of maize seedlings at the plot scale under weed disturbance is crucial for early seedling replenishment and weed removal. Currently, UAV-based maize seedling recognition depends primarily on RGB images. The main purpose of this study was to compare the performance of unmanned aerial vehicle (UAV) multispectral images with that of RGB images for maize seedling recognition using deep learning algorithms. Additionally, we aimed to assess how different levels of weed coverage disturb maize seedling recognition. First, principal component analysis (PCA) was used to transform the multispectral images. Second, by introducing the CARAFE upsampling operator and a small target detection layer (SLAY), we extracted the contextual information of each pixel to retain weak features in the maize seedling images. Third, a global attention mechanism (GAM) was employed to capture maize seedling features through its dual spatial and channel attention. Together, these changes form the CGS-YOLO algorithm. Finally, we compared the improved algorithm against a series of deep learning detectors, including YOLOv3, v5, v6, and v8. The results show that after PCA transformation, the recognition mAP for maize seedlings reaches 82.6%, a 3.1-percentage-point improvement over RGB images. Compared with YOLOv8, YOLOv6, YOLOv5, and YOLOv3, the CGS-YOLO algorithm improves mAP by 3.8, 4.2, 4.5, and 6.6 percentage points, respectively. As weed coverage increased, the recognition of maize seedlings gradually degraded. When weed coverage exceeded 70%, the mAP differences became significant, but CGS-YOLO still maintained a recognition mAP of 72%. Therefore, for maize seedling recognition, UAV-based multispectral images perform better than RGB images, and applying the CGS-YOLO deep learning algorithm to UAV multispectral images is beneficial for recognizing maize seedlings under weed disturbance.
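The PCA step that lets a multispectral image feed a standard 3-channel detector can be sketched as follows. This is an illustrative sketch, not the paper's pipeline: the tile size, band count, and reflectance cube below are synthetic placeholders.

```python
# Sketch: collapsing a multispectral UAV tile to three principal-component
# channels so it matches the 3-channel input of a YOLO-style detector.
# The image dimensions and band count are hypothetical.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
h, w, n_bands = 64, 64, 5           # hypothetical 5-band multispectral tile
cube = rng.random((h, w, n_bands))  # stand-in for a real reflectance cube

# Flatten pixels to (n_pixels, n_bands) and keep the first 3 components,
# which capture most of the cross-band variance.
pixels = cube.reshape(-1, n_bands)
pca = PCA(n_components=3)
pc = pca.fit_transform(pixels).reshape(h, w, 3)

# Rescale each component to 0-255 so the result behaves like an RGB image.
lo, hi = pc.min(axis=(0, 1)), pc.max(axis=(0, 1))
pseudo_rgb = ((pc - lo) / (hi - lo) * 255).astype(np.uint8)
print(pseudo_rgb.shape)  # (64, 64, 3), ready for a 3-channel detector
```

Because the detector's input contract is unchanged, the same training and evaluation code can be run on RGB and PCA-transformed multispectral tiles, which is what makes the mAP comparison in the study direct.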
Funding: This research was supported in part by a postdoctoral research fellow appointment to the Agricultural Research Service (ARS) Research Participation Program administered by the Oak Ridge Institute for Science and Education (ORISE) through an interagency agreement between the U.S. Department of Energy (DOE) and the U.S. Department of Agriculture (USDA).
Funding: Supported by the Major Science and Technology Project of Guizhou Province ([2024]004) and the Science and Technology Program Project of Guizhou Provincial Tobacco Company of CNTC (2024520000240087).