Journal Articles
27 articles found.
1. Efficient sampling strategy driven surrogate-based multi-objective optimization for broadband microwave metamaterial absorbers (Cited by: 1)
Authors: LIU Sixing, PEI Changbao, YE Xiaodong, WANG Hao, WU Fan, TAO Shifei. Journal of Systems Engineering and Electronics (CSCD), 2024, Issue 6, pp. 1388-1396 (9 pages).
Multi-objective optimization (MOO) for the microwave metamaterial absorber (MMA) normally adopts evolutionary algorithms, and these optimization algorithms require many objective function evaluations. To remedy this issue, a surrogate-based MOO algorithm is proposed in this paper, where Kriging models are employed to approximate the objective functions. An efficient sampling strategy is presented to sequentially capture promising samples in the design region for exact evaluations. Firstly, new sample points are generated by the MOO on the surrogate models. Then, new samples are captured by exploiting each objective function. Furthermore, a weighted sum of the improvement of hypervolume (IHV) and the distance to sampled points is calculated to select the new sample. The proposed algorithm is validated on benchmark problems in comparison with two well-known MOO algorithms. In addition, two broadband MMAs are used to verify the feasibility and efficiency of the proposed algorithm.
Keywords: multi-objective optimization (MOO); Kriging model; microwave metamaterial absorber (MMA); surrogate models; sampling strategy
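The selection rule described in this abstract, a weighted sum of the improvement of hypervolume (IHV) and the distance to already-sampled points, can be illustrated with a short sketch. It assumes two minimization objectives, a fixed hypervolume reference point `ref`, and a hypothetical weight `w`; the two terms are not normalized here, and this is not the authors' implementation.

```python
import numpy as np

def hypervolume_2d(front, ref):
    """Hypervolume of a two-objective (minimization) front w.r.t. reference point ref."""
    pts = front[np.argsort(front[:, 0])]
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                          # only non-dominated strips add area
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def select_new_sample(candidates, front, sampled_x, ref, w=0.5):
    """Score each candidate by w * IHV + (1 - w) * distance to the nearest sampled point."""
    base_hv = hypervolume_2d(front, ref)
    scores = []
    for x, f in candidates:                       # (design vector, predicted objectives)
        ihv = hypervolume_2d(np.vstack([front, f]), ref) - base_hv
        dist = np.min(np.linalg.norm(sampled_x - x, axis=1))
        scores.append(w * ihv + (1 - w) * dist)   # in practice both terms would be rescaled
    return int(np.argmax(scores))                 # index of the candidate to evaluate exactly
```

The returned index points at the candidate that would then be sent to the expensive full-wave simulation for an exact evaluation.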
2. Sampling Strategy Within a Wild Soybean Population Based on Its Genetic Variation Detected by ISSR Markers (Cited by: 29)
Authors: 金燕, 张文驹, 傅大煦, 卢宝荣. Acta Botanica Sinica (CSCD), 2003, Issue 8, pp. 995-1002 (8 pages).
In order to determine an appropriate sampling strategy for the effective conservation of wild soybean (Glycine soja Sieb. et Zucc.) in China, a natural population from Jiangwan Airport in Shanghai was studied for its genetic diversity through inter-simple sequence repeat (ISSR) marker analysis of a sample set consisting of 100 randomly collected individuals. A relatively large genetic diversity was detected among the samples based on estimation of DNA products amplified from 15 selected ISSR primers, with the similarity coefficient varying from 0.17 to 0.89. The mean expected heterozygosity (He) was 0.1714 per locus, and the Shannon index (I) was 0.2714. Principal coordinate analysis (PCA) further indicated that the genetic diversity of the Jiangwan wild soybean population was not evenly distributed but instead showed a mosaic or clustered distribution pattern. A correlation study between genetic diversity and sample number demonstrated that genetic diversity increased dramatically as the sample number grew up to about 40 individuals, but the increase slowed and rapidly reached a plateau when more than 40 individuals were included in the analysis. It is concluded that (i) a sample set of approximately 35-45 individuals should be included to represent as much genetic diversity as possible when ex situ conservation of a wild soybean population is undertaken; and (ii) the collection of wild soybean samples should be spread as widely as possible within a population, and a certain distance should be kept between sampled individuals.
Keywords: Glycine soja; genetic diversity; molecular markers; population structure; sampling strategy
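A minimal sketch of the two diversity statistics quoted above: expected heterozygosity (He, computed here as Nei's gene diversity 1 − Σp²) and the Shannon index (I = −Σ p ln p), averaged over loci. The allele frequencies below are hypothetical examples, not the Jiangwan data.

```python
import numpy as np

def nei_gene_diversity(allele_freqs):
    """Expected heterozygosity per locus: He = 1 - sum(p_i^2)."""
    p = np.asarray(allele_freqs, dtype=float)
    return float(1.0 - np.sum(p ** 2))

def shannon_index(allele_freqs):
    """Shannon information index per locus: I = -sum(p_i * ln p_i)."""
    p = np.asarray(allele_freqs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Hypothetical allele frequencies for three loci (not the Jiangwan data set).
loci = [[0.9, 0.1], [0.6, 0.4], [0.5, 0.3, 0.2]]
print("mean He:", np.mean([nei_gene_diversity(l) for l in loci]))
print("mean I :", np.mean([shannon_index(l) for l in loci]))
```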
3. Multi-Distributed Sampling Method to Optimize Physical-Informed Neural Networks for Solving Optical Solitons
Authors: Huasen Zhou, Zhiyang Zhang, Muwei Liu, Fenghua Qi, Wenjun Liu. Chinese Physics Letters, 2025, Issue 7, pp. 1-9 (9 pages).
Optical solitons, as self-sustaining waveforms in a nonlinear medium where dispersion and nonlinear effects are balanced, have key applications in ultrafast laser systems and optical communications. Physics-informed neural networks (PINN) provide a new way to solve the nonlinear Schrödinger equation describing soliton evolution by fusing data-driven and physical constraints. However, the grid-point sampling strategy of traditional PINN suffers from high computational complexity and unstable gradient flow, which makes it difficult to capture physical details efficiently. In this paper, we propose a residual-based adaptive multi-distribution (RAMD) sampling method to optimize the PINN training process by dynamically constructing a multi-modal loss distribution. With a 50% reduction in the number of grid points, RAMD significantly reduces the relative error of PINN and, in particular, reduces the solution error of the (2+1) Ginzburg-Landau equation from 4.55% to 1.98%. By innovatively combining multi-modal distribution modeling with autonomous sampling control, RAMD overcomes the lack of physical constraints in purely data-driven models and provides a high-precision numerical simulation tool for the design of all-optical communication devices, the optimization of nonlinear laser devices, and related studies.
Keywords: multi-distributed sampling; nonlinear Schrödinger equation; soliton evolution; residual-based adaptive sampling; grid-point sampling strategy; optical solitons; optical communications; physics-informed neural networks; ultrafast laser systems
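Residual-based adaptive placement of collocation points is the general idea behind RAMD. The sketch below shows only a generic residual-weighted resampling step (selection probability proportional to a power of the PDE residual plus a floor term); it does not reproduce the paper's multi-distribution modeling, and `fake_residual` is a stand-in for a real PDE residual.

```python
import numpy as np

def residual_adaptive_resample(candidates, residual_fn, n_new, k=2.0, c=1.0, rng=None):
    """Draw new collocation points with probability proportional to |residual|^k plus a floor."""
    rng = np.random.default_rng(rng)
    r = np.abs(residual_fn(candidates))            # residual magnitude at each candidate
    weights = r ** k
    prob = weights / weights.mean() + c            # the floor keeps low-residual regions reachable
    prob = prob / prob.sum()
    idx = rng.choice(len(candidates), size=n_new, replace=False, p=prob)
    return candidates[idx]

# Synthetic 1-D example: pretend the residual peaks near x = 0.5.
cand = np.linspace(0.0, 1.0, 10000).reshape(-1, 1)
fake_residual = lambda x: np.exp(-200.0 * (x[:, 0] - 0.5) ** 2)
new_points = residual_adaptive_resample(cand, fake_residual, n_new=200, rng=0)
```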
4. Two Performance Indicators Assisted Infill Strategy for Expensive Many-Objective Optimization
Authors: Yi Zhao, Jianchao Zeng, Ying Tan. Journal of Harbin Institute of Technology (New Series), 2025, Issue 5, pp. 24-40 (17 pages).
In recent years, surrogate models derived from genuine data samples have proven to be efficient in addressing optimization challenges that are costly or time-intensive. However, individuals in the population become indistinguishable as the dimensionality of the objective space grows and surrogate approximation errors accumulate. Therefore, in this paper, each objective function is modeled using a radial basis function approach, and the optimal solution set of the surrogate model is located by a multi-objective evolutionary algorithm with a strengthened dominance relation. The original objective function values of the true evaluations are converted to two indicator values, and surrogate models are then set up for the two performance indicators. Finally, an adaptive infill sampling strategy that relies on the approximate performance indicators is proposed to assist in selecting individuals for real evaluations from the potential optimal solution set. The algorithm is compared against several advanced surrogate-assisted evolutionary algorithms on two suites of test cases, and the experimental findings show that the approach is competitive in solving expensive many-objective optimization problems.
Keywords: expensive multi-objective optimization problems; infill sample strategy; evolutionary optimization algorithm
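As a loose illustration of the surrogate step described above, the sketch below fits one radial basis function surrogate per objective with SciPy and ranks candidates by a simple dominance count over the evaluated set. The dominance count is a stand-in indicator, not the two performance indicators proposed in the paper, and all data are placeholders.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def fit_objective_surrogates(X, F):
    """One RBF surrogate per objective column of F (n_samples x n_objectives)."""
    return [RBFInterpolator(X, F[:, j]) for j in range(F.shape[1])]

def predict(surrogates, X_new):
    return np.column_stack([s(X_new) for s in surrogates])

def dominance_count(f, F):
    """How many evaluated points f dominates (minimization); a stand-in indicator."""
    return int(np.sum(np.all(f <= F, axis=1) & np.any(f < F, axis=1)))

# Placeholder data: 40 evaluated designs in 5-D with 3 objectives.
rng = np.random.default_rng(0)
X, F = rng.random((40, 5)), rng.random((40, 3))
models = fit_objective_surrogates(X, F)
cand = rng.random((500, 5))
scores = [dominance_count(f, F) for f in predict(models, cand)]
next_design = cand[int(np.argmax(scores))]    # candidate sent for a true (expensive) evaluation
```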
5. Face Expression Recognition on Uncertainty-Based Robust Sample Selection Strategy
Authors: Yuqi Wang, Wei Jiang. Journal of Electronic Research and Application, 2025, Issue 2, pp. 211-215 (5 pages).
In the task of facial expression recognition (FER), data uncertainty has been a critical factor affecting performance, typically arising from the ambiguity of facial expressions, low-quality images, and the subjectivity of annotators. Tracking the training history reveals that misclassified samples often exhibit high confidence and excessive uncertainty in the early stages of training. To address this issue, we propose an uncertainty-based robust sample selection strategy, which combines confidence error with RandAugment to improve image diversity, effectively reducing the overfitting caused by uncertain samples during deep learning model training. To validate the effectiveness of the proposed method, extensive experiments were conducted on public FER benchmarks. The accuracies obtained were 89.08% on RAF-DB, 63.12% on AffectNet, and 88.73% on FERPlus.
Keywords: facial expression recognition; uncertainty; sample selection strategy
6. A Heterogeneous Sampling Strategy to Model Earthquake-Triggered Landslides (Cited by: 4)
Authors: Hui Yang, Peijun Shi, Duncan Quincey, Wenwen Qi, Wentao Yang. International Journal of Disaster Risk Science (SCIE, CSCD), 2023, Issue 4, pp. 636-648 (13 pages).
Regional modeling of landslide hazards is an essential tool for the assessment and management of risk in mountain environments. Previous studies that have focused on modeling earthquake-triggered landslides report high prediction accuracies. However, it is common to use a validation strategy with an equal number of landslide and non-landslide samples, scattered homogeneously across the study area. Consequently, there are overestimations in the epicenter area, and the spatial pattern of modeled locations does not agree well with real events. In order to improve landslide hazard mapping, we proposed a spatially heterogeneous non-landslide sampling strategy by considering local ratios of landslide to non-landslide area. Coseismic landslides triggered by the 2008 Wenchuan Earthquake on the eastern Tibetan Plateau were used as an example. To assess the performance of the new strategy, we trained two random forest models that shared the same hyperparameters. The first was trained using samples from the new heterogeneous strategy, and the second used the traditional approach. In each case the spatial match between modeled and measured (interpreted) landslides was examined by scatterplot, with a 2 km-by-2 km fishnet. Although the traditional approach achieved higher AUC_ROC accuracy (0.95) than the proposed one (0.85), the coefficient of determination (R²) for the new strategy (0.88) was much higher than for the traditional strategy (0.55). Our results indicate that the proposed strategy outperforms the traditional one when comparing against landslide inventory data. Our work demonstrates that higher prediction accuracies in landslide hazard modeling may be deceptive, and validation of the modeled spatial pattern should be prioritized. The proposed method may also be used to improve the mapping of precipitation-induced landslides. Application of the proposed strategy could benefit precise assessment of landslide risks in mountain environments.
Keywords: earthquake-triggered landslides; landslide hazard modeling; machine learning; model validation; sampling strategy; Tibetan Plateau
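The proposed idea, drawing non-landslide samples heterogeneously according to local landslide/non-landslide ratios rather than uniformly, can be sketched as below. The allocation rule used here (samples per grid cell proportional to the cell's share of non-landslide pixels) is an assumption about the method, and the raster data are synthetic.

```python
import numpy as np

def heterogeneous_non_landslide_samples(cell_ids, landslide_mask, n_total, rng=None):
    """
    Allocate non-landslide sample points per grid cell in proportion to the cell's
    share of non-landslide pixels, then draw them from that cell's non-landslide pixels.

    cell_ids       : (N,) grid-cell index of every candidate pixel
    landslide_mask : (N,) True where the pixel is mapped as landslide
    """
    rng = np.random.default_rng(rng)
    cells = np.unique(cell_ids)
    weights = np.array([np.sum((cell_ids == c) & ~landslide_mask) for c in cells], dtype=float)
    alloc = np.floor(n_total * weights / weights.sum()).astype(int)
    chosen = []
    for c, k in zip(cells, alloc):
        pool = np.where((cell_ids == c) & ~landslide_mask)[0]
        if k > 0 and len(pool) > 0:
            chosen.append(rng.choice(pool, size=min(k, len(pool)), replace=False))
    return np.concatenate(chosen) if chosen else np.array([], dtype=int)

# Synthetic raster: 25 cells of 400 pixels each, with varying landslide density per cell.
rng = np.random.default_rng(0)
cells = np.repeat(np.arange(25), 400)
slides = rng.random(10000) < np.repeat(rng.random(25) * 0.5, 400)
negatives = heterogeneous_non_landslide_samples(cells, slides, n_total=500, rng=1)
```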
7. Adaptive inter-intradomain alignment network with class-aware sampling strategy for rolling bearing fault diagnosis (Cited by: 1)
Authors: GAO QinHe, HUANG Tong, ZHAO Ke, SHAO HaiDong, JIN Bo, LIU ZhiHao, WANG Dong. Science China (Technological Sciences) (SCIE, EI, CAS, CSCD), 2023, Issue 10, pp. 2862-2870 (9 pages).
Existing unsupervised domain adaptation approaches primarily focus on reducing the data distribution gap between the source and target domains, often neglecting the influence of class information, which leads to inaccurate alignment outcomes. Guided by this observation, this paper proposes an adaptive inter-intra-domain discrepancy method to quantify the intra-class and inter-class discrepancies between the source and target domains. Furthermore, an adaptive factor is introduced to dynamically assess their relative importance. Building upon the proposed adaptive inter-intra-domain discrepancy approach, we develop an inter-intra-domain alignment network with a class-aware sampling strategy (IDAN-CSS) to distill the feature representations. The class-aware sampling strategy, integrated within IDAN-CSS, facilitates more efficient training. Through multiple transfer diagnosis cases, we comprehensively demonstrate the feasibility and effectiveness of the proposed IDAN-CSS model.
Keywords: unsupervised domain adaptation; inter-class domain discrepancy; intra-class domain discrepancy; class-aware sampling strategy
8. Sampling strategy for wild soybean (Glycine soja) populations based on their genetic diversity and fine-scale spatial genetic structure (Cited by: 1)
Authors: ZHU Weiyue, ZHOU Taoying, ZHONG Ming, LU Baorong. Frontiers in Biology (CSCD), 2007, Issue 4, pp. 397-402 (6 pages).
A total of 892 individuals sampled from a wild soybean population in a natural reserve near the Yellow River estuary, located in Kenli of Shandong Province (China), were investigated. Seventeen SSR (simple sequence repeat) primer pairs from cultivated soybeans were used to estimate the genetic diversity of the population and its variation pattern versus changes of the sample size (sub-samples), in addition to investigating the fine-scale spatial genetic structure within the population. The results showed relatively high genetic diversity of the population, with a mean allele number (A) of 2.88, expected heterozygosity (He) of 0.431, Shannon diversity index (I) of 0.699, and percentage of polymorphic loci (P) of 100%. Sub-samples of different sizes (ten groups) were randomly drawn from the population and their genetic diversity was calculated by computer simulation. A regression model of the four diversity indexes against sample size was then computed. As a result, 27-52 individuals can capture 95% of the total genetic variability of the population. Spatial autocorrelation analysis revealed that the genetic patch size of this wild soybean population is about 18 m. The study provides a scientific basis for the sampling strategy of wild soybean populations.
Keywords: sampling strategy; genetic diversity; fine-scale spatial structure; wild soybean; simple sequence repeat (SSR)
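The sub-sampling experiment described above, drawing sub-samples of increasing size and finding the smallest size whose diversity reaches 95% of the whole-population value, can be sketched generically. The diversity measure below is Nei's gene diversity on a simplified haploid allele-code matrix, and the data are synthetic, not the Kenli population.

```python
import numpy as np

def gene_diversity(alleles):
    """Mean Nei gene diversity over loci; alleles is (n_individuals, n_loci) of allele codes."""
    div = []
    for locus in alleles.T:
        _, counts = np.unique(locus, return_counts=True)
        p = counts / counts.sum()
        div.append(1.0 - np.sum(p ** 2))
    return float(np.mean(div))

def minimum_sample_size(alleles, sizes, target=0.95, n_rep=100, rng=None):
    """Smallest sub-sample size whose mean diversity reaches target x total diversity."""
    rng = np.random.default_rng(rng)
    total = gene_diversity(alleles)
    for n in sorted(sizes):
        reps = [gene_diversity(alleles[rng.choice(len(alleles), n, replace=False)])
                for _ in range(n_rep)]
        if np.mean(reps) >= target * total:
            return n
    return None

# Synthetic haploid-coded data: 200 individuals x 17 loci, up to 4 alleles per locus.
data = np.random.default_rng(1).integers(0, 4, size=(200, 17))
print(minimum_sample_size(data, sizes=range(10, 201, 10), rng=2))
```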
9. How do the landslide and non-landslide sampling strategies impact landslide susceptibility assessment? A catchment-scale case study from China (Cited by: 3)
Authors: Zizheng Guo, Bixia Tian, Yuhang Zhu, Jun He, Taili Zhang. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2024, Issue 3, pp. 877-894 (18 pages).
The aim of this study is to investigate the impacts of the landslide and non-landslide sampling strategy on the performance of landslide susceptibility assessment (LSA). The study area is the Feiyun catchment in Wenzhou City, Southeast China. Two types of landslide samples, combined with seven non-landslide sampling strategies, resulted in a total of 14 scenarios. The corresponding landslide susceptibility map (LSM) for each scenario was generated using the random forest model. The receiver operating characteristic (ROC) curve and statistical indicators were calculated and used to assess the impact of the dataset sampling strategy. The results showed that higher accuracies were achieved when using the landslide core as positive samples, combined with non-landslide sampling from the very low zone or buffer zone. The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA, which provides a reference for subsequent researchers aiming to obtain a more reasonable LSM.
Keywords: landslide susceptibility; sampling strategy; machine learning; random forest; China
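A minimal scikit-learn sketch of the evaluation pattern described above: train one random forest per landslide/non-landslide sampling scenario and compare ROC AUC on a held-out split. The scenario data and conditioning factors are placeholders, not the Feiyun catchment dataset, and the simple train/test split stands in for the study's validation design.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def evaluate_scenario(X, y, seed=0):
    """Train a random forest on one sampling scenario and return its ROC AUC."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=seed)
    model = RandomForestClassifier(n_estimators=500, random_state=seed)
    model.fit(X_tr, y_tr)
    return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

# Placeholder conditioning factors (slope, lithology code, rainfall, ...) for two scenarios.
rng = np.random.default_rng(0)
scenarios = {"core + very-low zone": (rng.random((2000, 8)), rng.integers(0, 2, 2000)),
             "core + buffer zone":   (rng.random((2000, 8)), rng.integers(0, 2, 2000))}
for name, (X, y) in scenarios.items():
    print(name, round(evaluate_scenario(X, y), 3))
```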
10. A simulation study of sampling time point designing for therapeutic drug monitoring based on a population pharmacokinetic model
Authors: 张扬, 周颖, 张相林, 刘晓, 崔一民, 卢炜. Journal of Chinese Pharmaceutical Sciences (CAS), 2007, Issue 4, pp. 241-251 (11 pages).
Aim: To develop a method to estimate population pharmacokinetic parameters with the limited sampling time points provided clinically during therapeutic drug monitoring. Methods: Various simulations were attempted using a one-compartment open model with first-order absorption to determine PK parameter estimates under different sampling strategies, as a validation of the method. The estimated parameters were further verified by comparison with the observed values. Results: Samples collected at a single time point close to the non-informative sampling time point designed by this method led to biased and inaccurate parameter estimations. Furthermore, the relationship between the estimated non-informative sampling time points and the parameter values was examined. The non-informative sampling time points were derived under some typical conditions and the results were plotted to show the tendency. As a result, one non-informative time point was demonstrated to be appropriate for clearance, and two for both volume of distribution and the absorption rate constant in the present study. It was found that the estimates of the non-informative sampling time points developed in the method increase with increases of volume of distribution and with decreases of clearance and the absorption rate constant. Conclusion: A rational sampling strategy during therapeutic drug monitoring can be established using the method presented in this study.
Keywords: therapeutic drug monitoring; sampling strategy; non-informative time point; population pharmacokinetics
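The one-compartment open model with first-order absorption mentioned in the Methods is commonly written C(t) = F·D·ka / (V·(ka − ke)) · (e^(−ke·t) − e^(−ka·t)). The sketch below simply evaluates that standard expression at candidate sampling times; it assumes ka ≠ ke, and the parameter values are illustrative, not taken from the study.

```python
import numpy as np

def conc_one_compartment(t, dose, F, ka, V, CL):
    """Plasma concentration for a one-compartment model with first-order absorption (ka != ke)."""
    ke = CL / V                                    # first-order elimination rate constant
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# Illustrative parameters: 500 mg oral dose, F = 0.9, ka = 1.2 /h, V = 35 L, CL = 4 L/h.
t = np.array([0.5, 1, 2, 4, 8, 12, 24], dtype=float)   # candidate TDM sampling times (h)
print(np.round(conc_one_compartment(t, dose=500, F=0.9, ka=1.2, V=35, CL=4), 2))
```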
11. Comparison of different sampling strategies for debris flow susceptibility mapping: A case study using the centroids of the scarp area, flowing area and accumulation area of debris flow watersheds (Cited by: 3)
Authors: GAO Rui-yuan, WANG Chang-ming, LIANG Zhu. Journal of Mountain Science (SCIE, CSCD), 2021, Issue 6, pp. 1476-1488 (13 pages).
The quality of debris flow susceptibility mapping varies with sampling strategies. This paper aims at comparing three sampling strategies and determining the optimal one for sampling debris flow watersheds. The three sampling strategies studied were the centroid of the scarp area (COSA), the centroid of the flowing area (COFA), and the centroid of the accumulation area (COAA) of debris flow watersheds. An inventory consisting of 150 debris flow watersheds and 12 conditioning factors was prepared for the research. Firstly, the information gain ratio (IGR) method was used to analyze the predictive ability of the conditioning factors. Subsequently, the 12 conditioning factors were involved in the modeling of an artificial neural network (ANN), random forest (RF) and support vector machine (SVM). Then, receiver operating characteristic (ROC) curves and the area under the curve (AUC) were used to evaluate model performance. Finally, a scoring system was used to score the quality of the debris flow susceptibility maps. Samples obtained from the accumulation area have the strongest predictive ability and enable the models to achieve the best performance. The AUC values corresponding to the best model performance on the validation dataset were 0.861, 0.804 and 0.856 for SVM, ANN and RF, respectively. The sampling strategy of the centroid of the scarp area is optimal, with the highest quality of debris flow susceptibility maps, having scores of 373470, 393241 and 362485 for SVM, ANN and RF, respectively.
Keywords: debris flow; artificial neural network; support vector machine; random forest; susceptibility; sampling strategy
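The information gain ratio (IGR) used to screen conditioning factors is a standard quantity, IGR(X) = (H(Y) − H(Y|X)) / H(X). The sketch below computes it for one discretised factor against a binary debris-flow label; it is a generic implementation with synthetic data, not the authors' code.

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain_ratio(factor, target):
    """IGR of one discretised conditioning factor with respect to the debris-flow label."""
    h_y = entropy(target)
    values, counts = np.unique(factor, return_counts=True)
    h_y_given_x = sum((c / len(factor)) * entropy(target[factor == v])
                      for v, c in zip(values, counts))
    h_x = entropy(factor)                          # intrinsic information of the factor
    return (h_y - h_y_given_x) / h_x if h_x > 0 else 0.0

# Synthetic example: a three-class slope factor vs. a binary debris-flow label.
rng = np.random.default_rng(0)
slope_class = rng.integers(0, 3, 300)
label = ((slope_class == 2) ^ (rng.random(300) < 0.1)).astype(int)
print(round(information_gain_ratio(slope_class, label), 3))
```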
12. A study on sampling strategies in the figure cognitive process
Authors: 曹立人, 苏昊, 曹珍副. Journal of Zhejiang University Science (CSCD), 2004, Issue 9, pp. 1160-1164 (5 pages).
This study was aimed at investigating the sampling strategies for two types of figures: 3-D cubes and human faces. The research focused on: (a) where the sampling process started; and (b) in what order the figures' features were sampled. The study consisted of two experiments: (a) sampling strategies for 3-D cubes; and (b) sampling strategies for human faces. The results showed that: (a) for 3-D cubes, the first sampling was mostly located at the outline parts and rarely at the center part, while for human faces, the first sampling was mostly located at the hair and outline parts and rarely at the mouth or cheek parts; in most cases, the first sampling position had no significant effect on cognitive performance; and (b) the sampling order, both for 3-D cubes and for human faces, was determined by the degree of difference among the sampled features.
Keywords: sampling strategy; figure cognition; 3-D cube figures; human faces
13. GD-YOLO: A Network with Gather and Distribution Mechanism for Infrared Image Detection of Electrical Equipment
Authors: Junpeng Wu, Xingfan Jiang. Computers, Materials & Continua, 2025, Issue 4, pp. 897-915 (19 pages).
As technologies related to power equipment fault diagnosis and infrared temperature measurement continue to advance, the classification and identification of infrared temperature measurement images have become crucial for the effective intelligent fault diagnosis of various electrical equipment. In response to the increasing demand for sufficient feature fusion in current real-time detection and the low detection accuracy of existing networks for substation fault diagnosis, we introduce an innovative method known as Gather and Distribution Mechanism-You Only Look Once (GD-YOLO). Firstly, a partial convolution group is designed based on different convolution kernels. We combine the partial convolution group with deep convolution to propose a new Grouped Channel-wise Spatial Convolution (GCSConv) that compensates for the information loss caused by spatial channel convolution. Secondly, the Gather and Distribute Mechanism, which addresses the fusion of features of different dimensions, is implemented by aligning and sharing information through aggregation and distribution mechanisms. Thirdly, considering the limitations of current bounding box regression and the imbalance between complex and simple samples, Maximum Possible Distance Intersection over Union (MPDIoU) and Adaptive SlideLoss are incorporated into the loss function, allowing samples near the Intersection over Union (IoU) to receive more attention through the dynamic variation of the mean IoU. The GD-YOLO algorithm surpasses YOLOv5, YOLOv7, and YOLOv8 in infrared image detection for electrical equipment, achieving a mean Average Precision (mAP) of 88.9%, with accuracy improvements of 3.7%, 4.3%, and 3.1%, respectively. Additionally, the model delivers a frame rate of 48 FPS, which meets the precision and speed criteria necessary for the detection of infrared images of power equipment.
Keywords: infrared image detection; aggregation and distribution mechanism; sample imbalance strategy; lightweight structure
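The MPDIoU term mentioned above is, in the loss it is borrowed from, an IoU penalised by the squared distances between the matching top-left and bottom-right corners of the predicted and ground-truth boxes, normalised by the image dimensions. The abstract itself does not give the formula, so the sketch below follows that published definition as an assumption and omits the Adaptive SlideLoss part.

```python
def mpdiou(box_pred, box_gt, img_w, img_h):
    """MPDIoU = IoU - d1^2/(w^2 + h^2) - d2^2/(w^2 + h^2); boxes are (x1, y1, x2, y2)."""
    ix1 = max(box_pred[0], box_gt[0]); iy1 = max(box_pred[1], box_gt[1])
    ix2 = min(box_pred[2], box_gt[2]); iy2 = min(box_pred[3], box_gt[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_p = (box_pred[2] - box_pred[0]) * (box_pred[3] - box_pred[1])
    area_g = (box_gt[2] - box_gt[0]) * (box_gt[3] - box_gt[1])
    iou = inter / (area_p + area_g - inter + 1e-9)
    d1 = (box_pred[0] - box_gt[0]) ** 2 + (box_pred[1] - box_gt[1]) ** 2   # top-left corners
    d2 = (box_pred[2] - box_gt[2]) ** 2 + (box_pred[3] - box_gt[3]) ** 2   # bottom-right corners
    norm = img_w ** 2 + img_h ** 2                                         # image diagonal squared
    return iou - d1 / norm - d2 / norm

# Regression loss for one hypothetical prediction/ground-truth pair on a 640 x 512 image.
loss = 1.0 - mpdiou((40, 30, 120, 110), (50, 40, 130, 120), img_w=640, img_h=512)
```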
14. Semi-Supervised New Intention Discovery for Syntactic Elimination and Fusion in Elastic Neighborhoods
Authors: Di Wu, Liming Feng, Xiaoyu Wang. Computers, Materials & Continua, 2025, Issue 4, pp. 977-999 (23 pages).
Semi-supervised new intent discovery is a significant research focus in natural language understanding. To address the limitations of current semi-supervised training data and the underutilization of implicit information, a Semi-supervised New Intent Discovery model for Elastic Neighborhood Syntactic Elimination and Fusion (SNID-ENSEF) is proposed. Syntactic elimination contrastive learning leverages verb-dominant syntactic features, systematically replacing specific words to enhance data diversity. The radius of the positive sample neighborhood is elastically adjusted to eliminate invalid samples and improve training efficiency. A neighborhood sample fusion strategy, based on sample distribution patterns, dynamically adjusts the neighborhood size and fuses sample vectors to reduce noise and improve implicit information utilization and discovery accuracy. Experimental results show that SNID-ENSEF achieves average improvements of 0.88%, 1.27%, and 1.30% in normalized mutual information (NMI), accuracy (ACC), and adjusted Rand index (ARI), respectively, outperforming the PTJN, DPN, MTP-CLNN, and DWG models on the Banking77, StackOverflow, and Clinc150 datasets. The code is available at https://github.com/qsdesz/SNID-ENSEF, accessed on 16 January 2025.
Keywords: natural language understanding; semi-supervised new intent discovery; syntactic elimination contrastive learning; neighborhood sample fusion strategies; bidirectional encoder representations from transformers (BERT)
15. Sampling Strategies for Soil Available K and P at Field Scale (Cited by: 10)
Authors: SHI Zhou, J.S. BAILEY. Pedosphere (SCIE, CAS, CSCD), 2000, Issue 4, pp. 309-315 (7 pages).
Field nutrient distribution maps obtained from the study of soil variations within fields are the basis of precision agriculture. The quality of these maps for management depends on the accuracy of the predicted values, which in turn depends on the initial sampling. To produce reliable predictions efficiently, the minimal sampling size and combination should be decided first, which avoids misspent funds on field sampling work. A 7.9 hectare silage field close to the Agricultural Research Institute at Hillsborough, Northern Ireland, was selected for the study. Soil samples were collected from the field at 25 m intervals in a rectangular grid to provide a database of selected soil properties. Different data combinations were subsequently abstracted from this database for comparison purposes, and ordinary kriging was used to produce interpolated soil maps. The predicted data groups were compared using the least significant difference (LSD) test method. The results showed that a sampling size of 62 in a triangular arrangement was sufficient to reach the required accuracy for soil available K. The triangular sample combination proved to be superior to a rectangular one of similar sample size.
Keywords: grass field; interpolation; sampling strategies; spatial variability
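A minimal sketch of the ordinary-kriging interpolation step used to produce the nutrient maps. It assumes the third-party PyKrige package and a spherical variogram, and the sample coordinates and available-K values are placeholders rather than the Hillsborough data.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging   # assumes the third-party PyKrige package is installed

# Placeholder sample locations (m) and soil available-K values (mg/kg).
rng = np.random.default_rng(0)
x = rng.uniform(0, 300, 62)
y = rng.uniform(0, 260, 62)
k_avail = 120 + 30 * np.sin(x / 50) + rng.normal(0, 5, 62)

ok = OrdinaryKriging(x, y, k_avail, variogram_model="spherical")
gridx = np.arange(0.0, 300.0, 25.0)                 # 25 m prediction grid, matching the survey spacing
gridy = np.arange(0.0, 260.0, 25.0)
z_pred, z_var = ok.execute("grid", gridx, gridy)    # interpolated map and kriging variance
```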
16. Effects of sampling strategies and DNA extraction methods on eDNA metabarcoding: A case study of estuarine fish diversity monitoring (Cited by: 4)
Authors: Hui-Ting Ruan, Rui-Li Wang, Hong-Ting Li, Li Liu, Tian-Xu Kuang, Min Li, Ke-Shu Zou. Zoological Research (SCIE, CAS, CSCD), 2022, Issue 2, pp. 192-204 (13 pages).
Environmental DNA (eDNA) integrated with metabarcoding is a promising and powerful tool for species composition and biodiversity assessment in aquatic ecosystems and is increasingly applied to evaluate fish diversity. To date, however, no standardized eDNA-based protocol has been established to monitor fish diversity. In this study, we investigated and compared two filtration methods and three DNA extraction methods using three filtration water volumes to determine a suitable approach for eDNA-based fish diversity monitoring in the Pearl River Estuary (PRE), a highly anthropogenically disturbed estuarine ecosystem. Compared to filtration-based precipitation, direct filtration was a more suitable method for eDNA metabarcoding in the PRE. The combined use of the DNeasy Blood and Tissue Kit (BT) and traditional phenol/chloroform (PC) extraction produced higher DNA yields, more amplicon sequence variants (ASVs), and higher Shannon diversity indices, and generated more homogeneous and consistent community composition among replicates. Compared to the other combined protocols, the PC and BT methods obtained better species detection, higher fish diversity, and greater consistency for filtration water volumes of 1000 and 2000 mL, respectively. All eDNA metabarcoding protocols were more sensitive than bottom trawling in the PRE fish surveys, and combining the two techniques yielded greater taxonomic diversity. Furthermore, combining traditional methods with eDNA analysis enhanced accuracy. These results indicate that methodological decisions related to eDNA metabarcoding should be made with caution for fish community monitoring in estuarine ecosystems.
Keywords: eDNA metabarcoding; fish diversity; sampling strategies; DNA extraction; estuarine ecosystem
17. Evaluation of sampling strategies to estimate crown biomass (Cited by: 4)
Authors: Krishna P Poudel, Hailemariam Temesgen, Andrew N Gray. Forest Ecosystems (SCIE, CAS, CSCD), 2015, Issue 1, pp. 20-30 (11 pages).
Background: Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass of a tree. Crown biomass estimation is useful for different purposes, including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and the effect of sample size on the estimates. Methods: Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results: Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced results approximately similar to simple random sampling, but it further decreased the RMSE when information on branch diameter was used in the design and estimation phases. Conclusions: Use of auxiliary information in the design or estimation phase reduces the RMSE produced by a sampling strategy; however, this is attained by having to sample a larger amount of biomass. Based on our findings, we recommend sampling nine branches per tree as reasonably efficient while limiting the amount of fieldwork.
Keywords: aboveground biomass; crown sampling strategies; Pacific Northwest
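Two pieces of the methodology above lend themselves to a short sketch: the tree-specific log-linear branch model ln(biomass) = b0 + b1·ln(diameter) + b2·ln(length), and the bias/RMSE statistics used to compare strategies. The data and fitted coefficients below are placeholders, and no log-bias correction is applied on back-transformation.

```python
import numpy as np

def fit_branch_model(diam, length, biomass):
    """Least-squares fit of ln(biomass) = b0 + b1*ln(diam) + b2*ln(length) for one tree."""
    X = np.column_stack([np.ones_like(diam), np.log(diam), np.log(length)])
    coef, *_ = np.linalg.lstsq(X, np.log(biomass), rcond=None)
    return coef

def predict_branch_biomass(coef, diam, length):
    # back-transformed prediction (no log-bias correction in this sketch)
    return np.exp(coef[0] + coef[1] * np.log(diam) + coef[2] * np.log(length))

def bias_rmse(estimated, true):
    err = np.asarray(estimated) - np.asarray(true)
    return float(err.mean()), float(np.sqrt(np.mean(err ** 2)))

# Placeholder branch measurements for one tree: diameter (cm), length (m), biomass (kg).
rng = np.random.default_rng(0)
d = rng.uniform(1, 8, 30)
l = rng.uniform(0.5, 4, 30)
w = 0.05 * d ** 2.1 * l ** 0.8 * np.exp(rng.normal(0, 0.1, 30))
coef = fit_branch_model(d, l, w)
print(bias_rmse(predict_branch_biomass(coef, d, l), w))
```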
18. SGT-Net: A Transformer-Based Stratified Graph Convolutional Network for 3D Point Cloud Semantic Segmentation
Authors: Suyi Liu, Jianning Chi, Chengdong Wu, Fang Xu, Xiaosheng Yu. Computers, Materials & Continua (SCIE, EI), 2024, Issue 6, pp. 4471-4489 (19 pages).
In recent years, semantic segmentation on 3D point cloud data has attracted much attention. Unlike 2D images, where pixels are distributed regularly in the image domain, 3D point clouds in non-Euclidean space are irregular and inherently sparse. Therefore, it is very difficult to extract long-range contexts and effectively aggregate local features for semantic segmentation in 3D point cloud space. Most current methods focus either on local feature aggregation or on long-range context dependency, but fail to directly establish a global-local feature extractor for point cloud semantic segmentation tasks. In this paper, we propose a Transformer-based stratified graph convolutional network (SGT-Net), which enlarges the effective receptive field and builds direct long-range dependency. Specifically, we first propose a novel dense-sparse sampling strategy that provides dense local vertices and sparse long-distance vertices for the subsequent graph convolutional network (GCN). Secondly, we propose a multi-key self-attention mechanism based on the Transformer to further weight crucial neighboring relationships and enlarge the effective receptive field. In addition, to further improve the efficiency of the network, we propose a similarity measurement module to determine whether the neighborhood near the center point is effective. We demonstrate the validity and superiority of our method on the S3DIS and ShapeNet datasets. Through ablation experiments and segmentation visualization, we verify that the SGT model improves the performance of point cloud semantic segmentation.
Keywords: 3D point cloud semantic segmentation; long-range contexts; global-local feature; graph convolutional network; dense-sparse sampling strategy
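The dense-sparse sampling idea, keeping the k nearest neighbours for local detail plus a sparse random subset of distant points for long-range context, can be sketched with plain NumPy. The split sizes are arbitrary and this is not the SGT-Net implementation.

```python
import numpy as np

def dense_sparse_neighbors(points, center_idx, k_dense=16, k_sparse=16, rng=None):
    """Indices of the k_dense nearest points plus k_sparse randomly chosen distant points."""
    rng = np.random.default_rng(rng)
    dist = np.linalg.norm(points - points[center_idx], axis=1)
    order = np.argsort(dist)
    dense = order[1:k_dense + 1]                   # skip the center point itself
    far_pool = order[k_dense + 1:]
    sparse = rng.choice(far_pool, size=min(k_sparse, len(far_pool)), replace=False)
    return np.concatenate([dense, sparse])         # vertices fed to the graph convolution

# Hypothetical point cloud of 4096 points in 3-D.
pts = np.random.default_rng(0).random((4096, 3))
neighbor_idx = dense_sparse_neighbors(pts, center_idx=0)
```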