Funding: This work was supported by the Science and Technology Major Project of Guangxi, China (Gui Ke Nong AB24153010, Gui Ke AA22117004), the National Natural Science Foundation of China (31760342), and the Innovation Project of Guangxi Graduate Education (YCSW2023028).
Abstract: Organ instance segmentation of 3D plant point clouds is a crucial prerequisite for organ-level phenotype estimation. However, most current point cloud segmentation methods are designed for a specific crop and hardly fit both monocotyledonous and dicotyledonous crops, which have significant structural differences. This study therefore proposes a two-stage method with higher generalization ability for single-plant organ instance segmentation based on PointNeXt and Quickshift++. The effectiveness of this method was tested on different types of crops. The dataset includes point clouds of 122 self-acquired sugarcane, 49 open-access maize, and 77 open-access tomato plants. The improved PointNeXt model was trained to perform semantic segmentation of stems and leaves; the average mOA and mIoU on the test set reach 96.96% and 87.15%, respectively. The Quickshift++ algorithm was then applied to encode the global spatial structure and local connections of plants for rapid localization and segmentation of leaf instances. Our approach outperformed four SOTA methods (ASIS, JSNet, DFSP, and PSegNet) in terms of both quantitative and qualitative segmentation results, achieving average values for mPrec, mRec, mF1, and mIoU of 93.32%, 85.60%, 87.94%, and 81.46%, respectively. The proposed method also yields excellent results for several other plants in their early growth stages, indicating its generalization ability and applicability to organ instance segmentation of different plants, thus providing a powerful tool for plant phenotypic research.
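As a worked illustration of the instance-level metrics reported in this abstract (mPrec, mRec, mF1, mIoU), the sketch below computes precision, recall, F1, and IoU for a single predicted leaf instance against its matched ground-truth instance, each represented as a set of point indices. The point sets here are hypothetical toy data, not taken from the paper; in practice these scores would be computed per matched instance pair and averaged.

```python
def instance_metrics(pred, gt):
    """Precision, recall, F1, and IoU between one predicted and one
    ground-truth organ instance, each given as a set of point indices."""
    tp = len(pred & gt)   # points assigned to both instances
    fp = len(pred - gt)   # predicted points absent from ground truth
    fn = len(gt - pred)   # ground-truth points the prediction missed
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
    return prec, rec, f1, iou

# Hypothetical toy leaf instance: 8 of 10 predicted points are correct,
# and 2 ground-truth points are missed.
pred = set(range(10))
gt = set(range(2, 12))
prec, rec, f1, iou = instance_metrics(pred, gt)
```

Note that IoU is stricter than F1 for the same overlap, since false positives and false negatives both appear undivided in its denominator.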
Abstract: The Sentinel-2 satellites provide an unparalleled wealth of high-resolution remotely sensed information with a short revisit cycle, which is ideal for mapping burned areas both accurately and in a timely manner. This paper proposes an automated methodology for mapping burn scars from pairs of Sentinel-2 images, exploiting the state-of-the-art eXtreme Gradient Boosting (XGB) machine learning framework. A large database of 64 reference wildfire perimeters in Greece from 2016 to 2019 is used to train the classifier. An empirical methodology for appropriately sampling the training patterns from this database is formulated, which guarantees the effectiveness of the approach and its computational efficiency. A difference (pre-fire minus post-fire) spectral index is used for this purpose, upon which the clear and fuzzy value ranges are identified. To reduce the data volume, a super-pixel segmentation of the images is also employed, implemented via the QuickShift algorithm. The cross-validation results showcase the effectiveness of the proposed algorithm: the average commission and omission errors are 9% and 2%, respectively, and the average Matthews correlation coefficient (MCC) equals 0.93.
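The accuracy figures quoted in this abstract can be reproduced from a binary burned/unburned confusion matrix. The sketch below computes the commission error, omission error, and Matthews correlation coefficient from hypothetical pixel counts (the counts are illustrative only, chosen to land near the reported values, and are not from the paper).

```python
import math

def burn_map_scores(tp, fp, fn, tn):
    """Commission error, omission error, and Matthews correlation
    coefficient from a binary burned/unburned confusion matrix."""
    commission = fp / (tp + fp)   # mapped as burned, actually unburned
    omission = fn / (tp + fn)     # actually burned, missed by the map
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return commission, omission, mcc

# Hypothetical pixel counts for one fire scene (not from the paper):
# commission ~9%, omission ~2%, MCC ~0.93, matching the reported averages.
commission, omission, mcc = burn_map_scores(tp=900, fp=90, fn=20, tn=8990)
```

Unlike overall accuracy, MCC remains informative here despite the heavy class imbalance typical of burn mapping, where unburned pixels vastly outnumber burned ones.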