Big Basal Area Factor (Big BAF) and Point-3P are two-stage sampling methods. In the first stage, the sampling units in both methods are Bitterlich points, where trees are selected with probability proportional to their basal area. In the second stage, the sampling units are trees forming a subset of the first-stage trees. In the Big BAF method, the probability of selecting trees in the second stage is proportional to the ratio of the two BAFs, with a second-stage basal area factor larger than that of the first stage. In the Point-3P method, the probability of selecting trees in the second stage is based on a height prediction and the use of a specific random number table. Estimates of the forest stands' volume and their sampling errors are based on the theory of the product of two random variables. The additional error introduced by the second stage is small, while the total cost of measuring the trees is much smaller than measuring all trees in the first stage. Overall, the two sampling methods are modern, cost-effective approaches that can be applied in forest stand inventories for forest management purposes and have received growing interest from researchers over the past decade.
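The Big BAF estimator described above can be sketched in a few lines. This is a minimal illustration with hypothetical tallies and volume-to-basal-area ratios (VBARs); the function name and numbers are ours, not the paper's:

```python
def big_baf_volume(counts_per_point, vbars, baf_count):
    """Big BAF stand-volume sketch (hypothetical data).

    counts_per_point: number of 'in' trees tallied at each Bitterlich
    point with the small (count) BAF.
    vbars: volume-to-basal-area ratios (m^3 per m^2) of the subsample
    of trees selected with the big BAF.
    baf_count: the first-stage (count) basal area factor in m^2/ha.
    """
    mean_count = sum(counts_per_point) / len(counts_per_point)
    basal_area_ha = baf_count * mean_count   # estimated basal area, m^2/ha
    mean_vbar = sum(vbars) / len(vbars)      # mean VBAR, m^3/m^2
    # Volume/ha is the product of the two stage estimates,
    # matching the product-of-two-random-variables theory above.
    return basal_area_ha * mean_vbar


print(big_baf_volume([5, 7, 6], [8.0, 9.0], 2.0))  # m^3/ha
```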
Predictive maintenance (PdM) is vital for ensuring the reliability, safety, and cost efficiency of heavy-duty vehicle fleets. However, real-world sensor data are often highly imbalanced, noisy, and temporally irregular, posing significant challenges to model robustness and deployment. Using multivariate time-series data from Scania trucks, this study proposes a novel PdM framework that integrates efficient feature summarization with cost-sensitive hierarchical classification. First, the proposed last_k_summary method transforms recent operational records into compact statistical and trend-based descriptors while preserving missingness, allowing LightGBM to leverage its inherent split rules without ad-hoc imputation. Then, a two-stage LightGBM framework is developed for fault detection and severity classification: Stage A performs safety-prioritized fault screening (normal vs. fault) with a false-negative-weighted objective, and Stage B refines the detected faults into four severity levels through a cascaded hierarchy of binary classifiers. Under the official cost matrix of the IDA Industrial Challenge, the framework achieves total misclassification costs of 36,113 (validation) and 36,314 (test), outperforming XGBoost and Bi-LSTM by 3.8%-13.5% while maintaining high recall for the safety-critical class (0.83 validation, 0.77 test). These results demonstrate that the proposed approach not only improves predictive accuracy but also provides a practical and deployable PdM solution that reduces maintenance cost, enhances fleet safety, and supports data-driven decision-making in industrial environments.
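The abstract does not list the exact descriptors produced by last_k_summary; the sketch below is our guess at the idea, summarizing the last k readings of one channel while keeping missing values out of the statistics rather than imputing them (the statistic set is an assumption):

```python
def last_k_summary(values, k=5):
    """Summarize the last k readings of one sensor channel:
    mean, range, and a simple first-to-last trend.
    None entries (missing readings) are excluded from the stats,
    mirroring the missingness-preserving idea in the paper."""
    window = values[-k:]
    present = [v for v in window if v is not None]
    if not present:
        # All readings missing: leave the descriptors missing too,
        # so a downstream learner like LightGBM can route them
        # with its native missing-value split rules.
        return {"mean": None, "range": None, "trend": None}
    mean = sum(present) / len(present)
    rng = max(present) - min(present)
    trend = present[-1] - present[0]
    return {"mean": mean, "range": rng, "trend": trend}


print(last_k_summary([1.0, None, 3.0, 5.0], k=4))
```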
Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed–Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
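The core intuition behind any random-symbol DAS scheme, ISTIRDA included, is that each independent query of an erasure-coded block catches a withholding adversary with fixed probability. This is the generic back-of-envelope calculation, not ISTIRDA's specific protocol:

```python
import math


def samples_needed(unavail_fraction, target_confidence):
    """Number of independent random symbol queries a light client needs
    so that, if at least `unavail_fraction` of the codeword positions
    are withheld, at least one query fails with probability at least
    `target_confidence`.  P(all s queries succeed) <= (1 - f)^s."""
    f = unavail_fraction
    return math.ceil(math.log(1 - target_confidence) / math.log(1 - f))


# With half the codeword withheld, 99% detection confidence:
print(samples_needed(0.5, 0.99))
```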
Background: Remote sensing-based inventories are essential in estimating forest cover in tropical and subtropical countries, where ground inventories cannot be performed periodically at a large scale owing to high costs and forest inaccessibility (e.g. REDD projects), and are mandatory for constructing historical records that can be used as forest cover baselines. Given the conditions of such inventories, the survey area is partitioned into a grid of imagery segments of pre-fixed size, where the proportion of forest cover can be measured within segments using a combination of unsupervised (automated or semi-automated) classification of satellite imagery and manual (i.e. visual on-screen) enhancements. Because visual on-screen operations are time-expensive procedures, manual classification can be performed only for a sample of imagery segments selected at a first stage, while forest cover within each selected segment is estimated at a second stage from a sample of pixels selected within the segment. Because forest cover data arising from unsupervised satellite imagery classification may be freely available (e.g. Landsat imagery) over the entire survey area (wall-to-wall data) and are likely to be good proxies of manually classified cover data (sample data), they can be adopted as suitable auxiliary information. Methods: The question is how to choose the sample areas where manual classification is carried out. We investigated the efficiency of one-per-stratum stratified sampling for selecting the segments and pixels where manual classification is carried out, and the efficiency of the difference estimator for exploiting auxiliary information at the estimation level.
The performance of this strategy is compared with simple random sampling without replacement. Results: Our results were obtained theoretically from three artificial populations constructed from the Landsat classification (forest/non-forest) available at pixel level for a study area located in central Italy, assuming three levels of error rates in the unsupervised classification of satellite imagery. The exploitation of map data as auxiliary information in the difference estimator proves to be highly effective with respect to the Horvitz-Thompson estimator, in which no auxiliary information is exploited. The use of one-per-stratum stratified sampling provides a relevant improvement with respect to simple random sampling without replacement. Conclusions: The use of one-per-stratum stratified sampling with many imagery segments selected at the first stage and few pixels within each segment at the second stage, jointly with a difference estimator, proves to be a suitable strategy for estimating forest cover by remote sensing-based inventories.
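The difference estimator used above has a simple closed form: take the map (auxiliary) mean over the whole area and correct it by the mean difference between manual and map cover observed on the sampled segments. A minimal sketch with hypothetical cover proportions:

```python
def difference_estimator(aux_mean_all, sample_true, sample_aux):
    """Difference estimator of mean forest cover.

    aux_mean_all: mean map-based (wall-to-wall) cover over the whole area.
    sample_true:  manually classified cover of the sampled segments.
    sample_aux:   map-based cover of those same segments.
    """
    n = len(sample_true)
    mean_diff = sum(t - a for t, a in zip(sample_true, sample_aux)) / n
    # Map mean corrected by the sample mean of (true - map) differences.
    return aux_mean_all + mean_diff


print(difference_estimator(0.40, [0.5, 0.6], [0.45, 0.5]))
```

If the map is a good proxy, the differences are small and stable, which is exactly why the estimator beats Horvitz-Thompson in the study.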
Two-stage adaptive cluster sampling and two-stage conventional sampling designs were used to estimate the population total of Fringe-Eared Oryx, which are clustered and sparsely distributed. The study region was the Amboseli-West Kilimanjaro and Magadi-Natron cross-border landscape between Kenya and Tanzania. The study region was partitioned into primary sampling units containing secondary sampling units of different sizes. Results show that the two-stage adaptive cluster sampling design is more efficient than simple random sampling and the conventional two-stage sampling design, and exhibits lower variance than the conventional two-stage design.
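The defining mechanic of adaptive cluster sampling is neighborhood expansion: whenever a sampled unit satisfies a condition (here, animal count at or above a threshold), its neighbors are added to the sample, so the design "follows" the clusters. A toy sketch on a 2-D count grid (grid, seeds, and threshold are illustrative, not the study's data):

```python
def adaptive_cluster(grid, seeds, threshold=1):
    """Grow an adaptive cluster sample on a 2-D count grid.

    Starting from the initially sampled cells (`seeds`), the 4-neighbours
    of any cell meeting the condition (count >= threshold) are added,
    repeatedly, until the networks plus their edge units are exhausted.
    """
    rows, cols = len(grid), len(grid[0])
    sampled, stack = set(), list(seeds)
    while stack:
        r, c = stack.pop()
        if (r, c) in sampled or not (0 <= r < rows and 0 <= c < cols):
            continue
        sampled.add((r, c))
        if grid[r][c] >= threshold:
            # Condition met: expand to the 4-neighbourhood.
            stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return sampled


cells = adaptive_cluster([[0, 2, 0], [0, 3, 0], [0, 0, 0]], [(0, 1)])
print(sorted(cells))
```

Unbiased estimation from such a sample then needs network-based (Hansen-Hurwitz or Horvitz-Thompson type) weighting, which the paper's two-stage design builds on.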
A new unified constitutive model was developed to predict the two-stage creep-aging (TSCA) behavior of Al-Zn-Mg-Cu alloys. The particular bimodal precipitation feature was analyzed and modeled by considering the evolution of the primary micro-variables at different temperatures and their interaction. The dislocation density was incorporated into the model to capture the effect of creep deformation on precipitation. Quantitative transmission electron microscopy and experimental data obtained from a previous study were used to calibrate the model. Subsequently, the developed constitutive model was implemented in the finite element (FE) software ABAQUS via user subroutines for TSCA process simulation and the springback prediction of an integral panel. A TSCA test was performed. The results show that the maximum radius deviation between the formed plate and the simulation results is less than 0.4 mm, validating the effectiveness of the developed constitutive and FE models.
One of the detection objectives of the Chinese Asteroid Exploration mission is to investigate the space environment near the Main-Belt Comet (MBC, Active Asteroid) 311P/PANSTARRS. This paper outlines the scientific objectives, measurement targets, and measurement requirements for the proposed Gas and Ion Analyzer (GIA). The GIA is designed for in-situ mass spectrometry of neutral gases and low-energy ions, such as hydrogen, carbon, and oxygen, in the vicinity of 311P. Ion sampling techniques are essential for the GIA's Time-of-Flight (TOF) mass analysis capabilities. In this paper, we present an enhanced ion sampling technique through the development of an ion attraction model and an ion source model. The ion attraction model demonstrates that adjusting the attraction grid voltage can enhance the detection efficiency of low-energy ions and mitigate the repulsive force on ions during sampling, which is influenced by positive charging of the satellite's surface. The ion source model simulates the processes of gas ionization and ion multiplication. Simulation results indicate that the GIA can achieve a lower pressure limit below 10^(-13) Pa and a dynamic range exceeding 10^(9). These performances ensure the generation of ions with a stable and consistent current, which is crucial for high-resolution, broad-dynamic-range mass spectrometric analysis. Preliminary testing experiments have verified the GIA's capability to detect gas compositions such as H2O and N2. In-situ measurements near 311P using the GIA are expected to significantly contribute to our understanding of asteroid activity mechanisms, the evolution of the atmospheric and ionized environments of main-belt comets, their interactions with the solar wind, and the origin of Earth's water.
The Electro-Optic Sampling (EOS) detection technique has been widely used in terahertz science and technology, and it can also measure the field time waveform of few-cycle laser pulses. Its frequency response and band limitation are determined directly by the electro-optic crystal and the duration of the probe laser pulse. Here, we investigate the performance of EOS with a thin GaSe crystal in the measurement of mid-infrared few-cycle laser pulses. The shift of the central frequency and the change of the bandwidth induced by EOS detection are calculated, and the pulse distortions introduced in this detection process are discussed. It is found that this technique produces a red-shift of the central frequency and a narrowing of the bandwidth. These changes diminish as the laser wavelength increases from 2 μm to 10 μm. This work can help estimate the performance of the EOS detection technique in the mid-infrared band and offers a reference for related experiments.
Nonperiodic interrupted sampling repeater jamming (ISRJ) against inverse synthetic aperture radar (ISAR) can achieve two-dimensional blanket jamming through joint fast- and slow-time-domain interrupted modulation, which is clearly different from conventional multi-false-target deception jamming. In this paper, a suppression method against this kind of novel jamming is proposed based on the inter-pulse energy function and compressed sensing theory. By exploiting the discontinuity of the jamming in the slow time domain, the unjammed pulses are separated using the intra-pulse energy function difference. On this basis, the two-dimensional orthogonal matching pursuit (2D-OMP) algorithm is proposed, and the ISAR image is reconstructed from the obtained unjammed pulse sequence. The validity of the proposed method is demonstrated via simulations on Yak-42 plane data.
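The first step above, separating unjammed pulses by an energy criterion, rests on the fact that ISRJ repeats add energy only to the jammed pulses in slow time. A toy sketch of that screening step (the median-based threshold is our simplification, not the paper's exact statistic):

```python
def separate_unjammed(pulse_energies, factor=3.0):
    """Flag pulses whose energy exceeds `factor` times the median
    energy as jammed (ISRJ repeats inject extra energy), and return
    the indices of the remaining, presumably unjammed pulses."""
    s = sorted(pulse_energies)
    median = s[len(s) // 2]
    return [i for i, e in enumerate(pulse_energies) if e <= factor * median]


# Pulse 2 carries jamming energy and is screened out:
print(separate_unjammed([1.0, 1.2, 9.0, 1.1]))
```

The surviving pulse indices then form the (sparse) slow-time observations handed to the 2D-OMP reconstruction.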
Critical Height Sampling (CHS) estimates stand volume free from any model or tree form assumptions. Despite its introduction more than four decades ago, CHS has not been widely applied in the field due to perceived measurement challenges. The objectives of this study were to compare estimated stand volume between CHS and sampling methods that use volume or taper models, test the equivalence of the sampling methods, and assess their relative efficiency. We established 65 field plots in planted forests of two coniferous tree species and estimated stand volume for a range of Basal Area Factors (BAFs). Results showed that CHS produced the most similar mean stand volume across BAFs and tree species, with maximum differences between BAFs of 5-18 m^(3)·ha^(-1). Horizontal Point Sampling (HPS) using volume models produced very large variability in mean stand volume across BAFs, with differences up to 126 m^(3)·ha^(-1). However, CHS was less precise and less efficient than HPS. Furthermore, none of the sampling methods were statistically interchangeable with CHS at an allowable tolerance of ≤55 m^(3)·ha^(-1). About 72% of critical height measurements were below the crown base, indicating that critical height was more accessible to measurement than expected. Our study suggests that the consistency of the CHS mean estimates is a major advantage when planning a forest inventory. When checked against CHS, the results hint that HPS estimates may contain model bias. These strengths of CHS could outweigh its lower precision. Our study also has serious financial implications for the choice of a sampling method. Lastly, CHS could benefit forest management as an alternative option for estimating stand volume when volume or taper models are lacking or unreliable.
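The reason CHS needs no volume or taper model is its estimator's simplicity: at each point, volume per hectare is the BAF times the sum of the critical heights of the 'in' trees. A minimal sketch with hypothetical critical heights:

```python
def chs_volume(critical_heights_per_point, baf):
    """Critical Height Sampling estimate of stand volume (m^3/ha).

    critical_heights_per_point: for each Bitterlich point, the list of
    critical heights (m) of the trees tallied 'in' at that point.
    baf: basal area factor (m^2/ha).
    """
    per_point = [baf * sum(heights) for heights in critical_heights_per_point]
    # Average the per-point volume estimates over all points.
    return sum(per_point) / len(per_point)


# Two points: one with two 'in' trees, one with a single 'in' tree.
print(chs_volume([[10.0, 12.0], [8.0]], 2.0))
```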
Lexical analysis is a fundamental task in natural language processing that involves several subtasks, such as word segmentation (WS), part-of-speech (POS) tagging, and named entity recognition (NER). Recent works have shown that taking advantage of the relatedness between these subtasks can be beneficial. This paper proposes a unified neural framework to address these subtasks simultaneously. Departing from the sequence tagging paradigm, the proposed method tackles multitask lexical analysis via two-stage sequence span classification. First, the model detects word and named entity boundaries by multi-label classification over character spans in a sentence. Then, the authors assign POS labels and entity labels to words and named entities by multi-class classification, respectively. Furthermore, a Gated Task Transformation (GTT) is proposed to encourage the model to share valuable features between tasks. The performance of the proposed model was evaluated on Chinese and Thai public datasets, demonstrating state-of-the-art results.
The weighted exponential distribution WED(α, λ) with shape parameter α and scale parameter λ possesses some good properties and can provide a better fit to survival time data than other distributions such as the gamma, Weibull, or generalized exponential distribution. In this article, we proved the existence and uniqueness of the maximum likelihood estimator (MLE) of the parameters of WED(α, λ) in simple random sampling (SRS) and provided explicit expressions for the Fisher information number in SRS. Moreover, we also proved the existence and uniqueness of the MLE of the parameters of WED(α, λ) in ranked set sampling (RSS) and provided explicit expressions for the Fisher information number in RSS. Simulation studies show that the MLEs in RSS can be real competitors to those in SRS.
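For the SRS case, the log-likelihood that the MLE maximizes follows directly from the standard WED density, f(x) = ((α+1)/α) λ e^(-λx) (1 - e^(-αλx)) for x > 0. A sketch of that log-likelihood (we assume this Gupta-Kundu-style density; the article's exact parameterization may differ):

```python
import math


def wed_loglik(data, alpha, lam):
    """Log-likelihood of WED(alpha, lam) under simple random sampling,
    using the density
        f(x) = ((alpha+1)/alpha) * lam * exp(-lam*x) * (1 - exp(-alpha*lam*x)).
    """
    n = len(data)
    ll = n * math.log((alpha + 1) / alpha * lam)
    for x in data:
        ll += -lam * x + math.log(1 - math.exp(-alpha * lam * x))
    return ll


print(wed_loglik([1.0], 1.0, 1.0))
```

Maximizing this in (α, λ), numerically or via the score equations, yields the SRS MLE whose existence and uniqueness the article proves; the RSS likelihood additionally weights each observation by its rank-order density.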
Applying bio-oxidation waste solution (BOS) in a chemical-biological two-stage oxidation process can significantly improve the bio-oxidation efficiency of arsenopyrite. This study aims to clarify the enhanced oxidation mechanism of arsenopyrite by evaluating the effects of the physical and chemical changes of arsenopyrite in the BOS chemical oxidation stage on mineral dissolution kinetics, as well as on microbial growth activity and community structure composition in the bio-oxidation stage. The results showed that the chemical oxidation contributed to destroying the physical and chemical structure of the arsenopyrite surface and reducing the particle size, and led to the formation of nitrogenous substances on the mineral surface. These chemical oxidation behaviors effectively promoted Fe^(3+) cycling in the bio-oxidation system and weakened the inhibitory effect of the sulfur film on ionic diffusion, thereby enhancing the dissolution kinetics of the arsenopyrite. Consequently, the bio-oxidation efficiency of arsenopyrite was significantly increased in the two-stage oxidation process. After 18 days, the two-stage oxidation process achieved total extraction rates of (88.8±2.0)%, (86.7±1.3)%, and (74.7±3.0)% for As, Fe, and S, respectively. These values represent significant increases of (50.8±3.4)%, (47.1±2.7)%, and (46.0±0.7)%, respectively, compared to the one-stage bio-oxidation process.
The selection of negative samples significantly influences landslide susceptibility assessment, especially when establishing the relationship between landslides and environmental factors in regions with complex geological conditions. Traditional sampling strategies commonly used in landslide susceptibility models can misrepresent the distribution of negative samples, causing a deviation from actual geological conditions. This, in turn, degrades the discriminative ability and generalization performance of the models. To address this issue, we propose a novel approach for selecting negative samples to enhance the quality of machine learning models. We chose the Liangshan Yi Autonomous Prefecture, located in southwestern Sichuan, China, as the case study. This area, characterized by complex terrain, frequent tectonic activity, and steep slope erosion, experiences recurrent landslides, making it an ideal setting for validating the proposed method. We calculate the contribution values of environmental factors using the relief algorithm to construct the feature space, apply the Target Space Exteriorization Sampling (TSES) method to select negative samples, calculate landslide probability values by Random Forest (RF) modeling, and then create regional landslide susceptibility maps. We evaluate the performance of the RF model optimized by the Environmental Factor Selection-based TSES (EFSTSES) method using standard performance metrics. The results indicate that the model achieved an accuracy (ACC) of 0.962, a precision (PRE) of 0.961, and an area under the curve (AUC) of 0.962. These findings demonstrate that the EFSTSES-based model effectively mitigates the negative-sample imbalance issue, enhances the differentiation between landslide and non-landslide samples, and reduces misclassification, particularly in geologically complex areas. These improvements offer valuable insights for disaster prevention, land use planning, and risk mitigation strategies.
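The exteriorization idea, selecting negatives whose feature vectors lie outside the region occupied by known landslides, can be sketched as a simple distance filter in the constructed feature space. This is our toy simplification of the concept, not the TSES algorithm itself:

```python
def exteriorized_negatives(candidates, positives, min_dist):
    """Keep candidate non-landslide cells whose feature vector lies
    outside the positive (landslide) region of the feature space:
    farther than `min_dist` (Euclidean) from every landslide sample."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [c for c in candidates
            if all(d2(c, p) > min_dist ** 2 for p in positives)]


# Two candidates are too close to a known landslide and are rejected:
print(exteriorized_negatives([(0, 0), (5, 5), (1, 1)], [(0, 0)], 2.0))
```

Negatives chosen this way are less likely to be mislabeled landslide-prone cells, which is the quality gain the abstract attributes to TSES.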
Exploring the factors driving the decoupling of China's sulfur dioxide (SO_(2)) emissions from economic growth (DEI) is crucial for achieving sustainable development. By analyzing the decoupling indicators and driving factors at both the generation and treatment stages of SO_(2), more effective targeted mitigation strategies can be developed. We employ the Tapio decoupling model and propose a two-stage method to examine the decoupling issues related to SO_(2). Our findings indicate that: (1) DEI shows a steady and significant improvement, with SO_(2) emission intensity identified as the primary driver. (2) For the decoupling of economic growth and SO_(2) generation, energy scale serves as the largest stimulator, while the effect of energy intensity changes from negative to positive, and that of pollution intensity is first positive and then negative. (3) For the decoupling of SO_(2) generation and SO_(2) removal, treatment efficiency is the largest promoter, followed by treatment intensity. Based on these results, this study recommends that China focus more on enhancing clean energy utilization and the effectiveness of treatment processes.
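The Tapio decoupling model used above reduces to a single elasticity: the percentage change in emissions divided by the percentage change in GDP. A minimal sketch with made-up numbers:

```python
def tapio_index(e0, e1, g0, g1):
    """Tapio decoupling elasticity between emissions and GDP:
        t = (%change in emissions) / (%change in GDP).
    With growing GDP, t < 0 is strong decoupling and 0 < t < 0.8
    is usually read as weak decoupling."""
    return ((e1 - e0) / e0) / ((g1 - g0) / g0)


# Emissions fall 10% while GDP grows 10%: strong decoupling.
print(tapio_index(100.0, 90.0, 100.0, 110.0))
```

The paper's two-stage variant applies the same elasticity separately to SO_(2) generation vs. GDP and to SO_(2) removal vs. generation.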
BACKGROUND Two-stage revision is the most common treatment for chronic periprosthetic joint infection of the hip, involving a resection arthroplasty with or without placement of an antibiotic-loaded spacer, followed by antibiotic therapy before reimplantation. AIM To compare the outcomes and complications of two consecutive treatment protocols for two-stage revision arthroplasty of the infected hip: one using Girdlestone with an antibiotic holiday, the other using custom-made articulating spacers (CUMARS) without an antibiotic holiday. METHODS In this retrospective study, two consecutive cohorts were compared. Group A (2017-2020) underwent two-stage revision with a Girdlestone and an antibiotic holiday before reimplantation, while Group B (2020-2023) received CUMARS whenever possible, with no antibiotic holiday, or a Girdlestone if indicated. The primary outcome was successful infection eradication after one year. Secondary outcomes included surgical duration, length of hospital stay, weight-bearing allowance, discharge destination, and complications. RESULTS A total of 98 patients were included: 39 in Group A and 59 in Group B. Successful infection eradication after one year was achieved in 69% of Group A and 83% of Group B (P=0.164). Patients in Group B were more frequently allowed to bear weight (64% vs 18%, P<0.001), had a shorter in-hospital stay (9 vs 16 days, P<0.001), and were more often discharged home after the first surgery (48% vs 24%, P=0.048). No significant differences were found in (mechanical) complications. CONCLUSION A protocol including CUMARS is a safe and effective treatment, offering faster recovery and a shorter length of hospital stay, and enabling more patients to return home during the interval. This reduces the strain on patients and the healthcare system, potentially saving costs, without compromising infection control or increasing (mechanical) complications.
The Micro-nano Earth Observation Satellite (MEOS) constellation has the advantages of low construction cost, a short revisit cycle, and high functional density, and is considered a promising solution for serving rapidly growing observation demands. The observation Scheduling Problem in the MEOS constellation (MEOSSP) is a challenging issue due to the large number of satellites and tasks, as well as complex observation constraints. To address the large-scale and complicated MEOSSP, we develop a Two-Stage Scheduling Algorithm based on the Pointer Network with Attention mechanism (TSSA-PNA). In TSSA-PNA, MEOS observation scheduling is decomposed into a task allocation stage and a single-MEOS scheduling stage. In the task allocation stage, an adaptive task allocation algorithm with four problem-specific allocation operators is proposed to reallocate unscheduled tasks to new MEOSs. For the single-MEOS scheduling stage, we design a pointer network based on the encoder-decoder architecture to learn the optimal single-MEOS scheduling solution and introduce an attention mechanism into the encoder to improve learning efficiency. The Pointer Network with Attention mechanism (PNA) can generate the single-MEOS scheduling solution quickly in an end-to-end manner. The two stages are performed iteratively to search for a high-profit solution, and a greedy local search algorithm is developed to improve the profit further. The performance of PNA and TSSA-PNA on single-MEOS and multi-MEOS scheduling problems is evaluated in the experiments. The experimental results demonstrate that PNA can obtain an approximate solution for the single-MEOS scheduling problem in a short time. Moreover, TSSA-PNA achieves higher observation profits than existing scheduling algorithms within acceptable computational time for the large-scale MEOS scheduling problem.
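To make the single-MEOS subproblem concrete: given candidate observation windows with profits, the satellite must pick a conflict-free subsequence maximizing profit. A profit-greedy baseline (our toy illustration of the subproblem the pointer network learns to solve, not the learned method itself):

```python
def greedy_schedule(tasks):
    """Greedy baseline for single-satellite observation scheduling.

    tasks: list of (start, end, profit) observation windows.
    Sort by profit and accept a task if its window [start, end)
    overlaps no already-accepted window; return the total profit.
    """
    chosen = []
    for s, e, p in sorted(tasks, key=lambda t: -t[2]):
        if all(e <= cs or s >= ce for cs, ce, _ in chosen):
            chosen.append((s, e, p))
    return sum(p for _, _, p in chosen)


# The middle task conflicts with the most profitable one and is dropped:
print(greedy_schedule([(0, 2, 5), (1, 3, 4), (3, 4, 2)]))
```

Unscheduled tasks (here the rejected middle window) are exactly what TSSA-PNA's allocation stage would hand to another satellite in the next iteration.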
Task-oriented point cloud sampling aims to select a representative subset from the input, tailored to specific application scenarios and task requirements. However, existing approaches rarely tackle the redundancy caused by local structural similarities in 3D objects, which limits sampling performance. To address this issue, this paper introduces a novel task-oriented point cloud masked autoencoder-based sampling network (Point-MASNet), inspired by the masked autoencoder mechanism. Point-MASNet employs a voxel-based random non-overlapping masking strategy, which allows the model to selectively learn and capture distinctive local structural features from the input data. This approach effectively mitigates redundancy and enhances the representativeness of the sampled subset. In addition, we propose a lightweight, symmetrically structured keypoint reconstruction network, designed as an autoencoder. This network is optimized to efficiently extract latent features while enabling refined reconstructions. Extensive experiments demonstrate that Point-MASNet achieves competitive sampling performance across classification, registration, and reconstruction tasks.
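The voxel-based random non-overlapping masking step can be sketched without any neural machinery: bucket points into voxels, hide a random fraction of the voxels, and keep the rest. The voxel size, mask ratio, and seeding below are our illustrative choices, not the paper's settings:

```python
import random


def voxel_mask(points, voxel=1.0, mask_ratio=0.5, seed=0):
    """Voxel-based random non-overlapping masking of a 3-D point cloud.

    Points are grouped by the voxel they fall in; a `mask_ratio`
    fraction of voxels is masked at random, and only points from
    the unmasked voxels are returned (the visible set)."""
    buckets = {}
    for p in points:
        key = tuple(int(c // voxel) for c in p)
        buckets.setdefault(key, []).append(p)
    keys = sorted(buckets)                     # deterministic order
    rng = random.Random(seed)
    masked = set(rng.sample(keys, int(len(keys) * mask_ratio)))
    return [p for k in keys if k not in masked for p in buckets[k]]


pts = [(0.1, 0.1, 0.1), (1.5, 0.2, 0.3), (0.2, 1.4, 0.1), (1.1, 1.1, 0.9)]
print(voxel_mask(pts))  # points from the two unmasked voxels
```

Because whole voxels (not individual points) are hidden, the masked regions are non-overlapping local structures, which is what forces the autoencoder to learn distinctive local features.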
A comprehensive fishery-independent survey generally incorporates various specialized surveys and integrates different survey objectives to maximize benefits while accounting for cost limitations. It is important to evaluate the adaptability of such a comprehensive survey to different taxa in order to obtain an optimal design. However, little is known about the validity and adaptability of ichthyoplankton sampling incorporated in a comprehensive fishery-independent survey program for estimating the abundance of ichthyoplankton species. This study included ichthyoplankton sampling in an integrated survey and assessed the appropriateness of the survey design. Kriging interpolation based on Gaussian models was used to estimate values at unsurveyed locations from the original ichthyoplankton survey data in Haizhou Bay, which served as the “true” values. The sampling performances of the ongoing stratified random sampling (StRS), simple random sampling (SRS), cluster sampling (CS), hexagonal systematic sampling (SYS_h), and regular systematic sampling (SYS_r) with different sample sizes in estimating ichthyoplankton abundance were compared in terms of relative estimation error (REE), relative bias (RB), and coefficient of variation (CV) by computer simulation. The ongoing StRS performed better than CS and SRS, but not as well as the two systematic sampling methods, and the current sample size in the StRS design was insufficient to estimate ichthyoplankton abundance. The mean REE values were significantly smaller in the two systematic sampling designs than in the other three, and the two systematic designs maintained good inter-annual stability of sampling performance. The results suggest that incorporating an ichthyoplankton survey directly into stratified random fishery-independent surveys cannot achieve the desired level of accuracy for the survey objectives, but the accuracy can be improved by adding stations. The assessment framework presented in this study serves as a reference for evaluating the adaptability of integrated surveys to different objectives in other waters.
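The three comparison metrics above have standard definitions that are easy to compute from the repeated simulated estimates against the Kriged "true" value. A minimal sketch (toy numbers, population formulas):

```python
def sampling_metrics(estimates, true_value):
    """Design-comparison metrics from repeated simulated estimates:
    REE: relative estimation error, RMSE as a % of the true value;
    RB:  relative bias, mean error as a % of the true value;
    CV:  coefficient of variation of the estimates, in %."""
    n = len(estimates)
    mean = sum(estimates) / n
    rmse = (sum((e - true_value) ** 2 for e in estimates) / n) ** 0.5
    sd = (sum((e - mean) ** 2 for e in estimates) / n) ** 0.5
    return {"REE": 100 * rmse / true_value,
            "RB": 100 * (mean - true_value) / true_value,
            "CV": 100 * sd / mean}


print(sampling_metrics([90.0, 110.0], 100.0))
```

A design with small REE and RB but large CV is accurate on average yet unstable between replicates, which is why the study reports all three.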
Funding: Supported by the GRRC program of Gyeonggi Province [GRRC KGU 2023-B01, Research on Intelligent Industrial Data Analytics].
Abstract: Predictive maintenance (PdM) is vital for ensuring the reliability, safety, and cost efficiency of heavy-duty vehicle fleets. However, real-world sensor data are often highly imbalanced, noisy, and temporally irregular, posing significant challenges to model robustness and deployment. Using multivariate time-series data from Scania trucks, this study proposes a novel PdM framework that integrates efficient feature summarization with cost-sensitive hierarchical classification. First, the proposed last_k_summary method transforms recent operational records into compact statistical and trend-based descriptors while preserving missingness, allowing LightGBM to leverage its inherent split rules without ad hoc imputation. Then, a two-stage LightGBM framework is developed for fault detection and severity classification: Stage A performs safety-prioritized fault screening (normal vs. fault) with a false-negative-weighted objective, and Stage B refines the detected faults into four severity levels through a cascaded hierarchy of binary classifiers. Under the official cost matrix of the IDA Industrial Challenge, the framework achieves total misclassification costs of 36,113 (validation) and 36,314 (test), outperforming XGBoost and Bi-LSTM by 3.8%-13.5% while maintaining high recall for the safety-critical class (0.83 validation, 0.77 test). These results demonstrate that the proposed approach not only improves predictive accuracy but also provides a practical and deployable PdM solution that reduces maintenance cost, enhances fleet safety, and supports data-driven decision-making in industrial environments.
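The last_k_summary idea can be sketched in plain Python. The specific descriptors below (mean, spread, slope, missing count) are our assumptions; the abstract does not specify the paper's exact feature set:

```python
import math
from statistics import mean, stdev

def last_k_summary(series, k=5):
    """Summarize the last k readings of one sensor channel into compact
    descriptors, keeping missing values as NaN instead of imputing them
    (so a tree model with native missing handling can route on them)."""
    window = series[-k:]
    valid = [x for x in window if not math.isnan(x)]
    n_missing = len(window) - len(valid)
    if len(valid) < 2:
        nan = float("nan")
        return {"mean": nan, "std": nan, "slope": nan, "n_missing": n_missing}
    # crude trend descriptor: least-squares slope over the valid readings
    n = len(valid)
    xbar = (n - 1) / 2
    ybar = mean(valid)
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(valid))
    den = sum((i - xbar) ** 2 for i in range(n))
    return {"mean": ybar, "std": stdev(valid),
            "slope": num / den, "n_missing": n_missing}
```

One feature vector per channel, concatenated across channels, is what a model like LightGBM would then consume.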
Funding: Supported in part by the Research Fund of the Key Lab of Education Blockchain and Intelligent Technology, Ministry of Education (EBME25-F-08).
Abstract: Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed–Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
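The core DAS intuition, that a handful of random samples gives a light client high confidence, can be seen with a one-line calculation. This is the generic DAS back-of-envelope bound, not ISTIRDA's specific soundness analysis:

```python
def das_confidence(num_samples, hidden_fraction=0.5):
    """Probability that uniform random sampling hits at least one
    withheld symbol when a fraction `hidden_fraction` of the coded
    symbols is unavailable."""
    return 1.0 - (1.0 - hidden_fraction) ** num_samples
```

With erasure coding forcing an adversary to withhold at least half the symbols, 30 samples already push the miss probability below one in a billion.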
Abstract: Background: Remote sensing-based inventories are essential in estimating forest cover in tropical and subtropical countries, where ground inventories cannot be performed periodically at a large scale owing to high costs and forest inaccessibility (e.g., REDD projects), and are mandatory for constructing historical records that can be used as forest cover baselines. Given the conditions of such inventories, the survey area is partitioned into a grid of imagery segments of pre-fixed size, where the proportion of forest cover can be measured within segments using a combination of unsupervised (automated or semi-automated) classification of satellite imagery and manual (i.e., visual on-screen) enhancements. Because visual on-screen operations are time-expensive procedures, manual classification can be performed only for a sample of imagery segments selected at a first stage, while forest cover within each selected segment is estimated at a second stage from a sample of pixels selected within the segment. Because forest cover data arising from unsupervised satellite imagery classification may be freely available (e.g., Landsat imagery) over the entire survey area (wall-to-wall data) and are likely to be good proxies of manually classified cover data (sample data), they can be adopted as suitable auxiliary information. Methods: The question is how to choose the sample areas where manual classification is carried out. We investigated the efficiency of one-per-stratum stratified sampling for selecting the segments and pixels where manual classification is carried out, and the efficiency of the difference estimator for exploiting auxiliary information at the estimation level. The performance of this strategy is compared with simple random sampling without replacement. Results: Our results were obtained theoretically from three artificial populations constructed from the Landsat classification (forest/non-forest) available at pixel level for a study area located in central Italy, assuming three levels of error rates of the unsupervised classification of satellite imagery. The exploitation of map data as auxiliary information in the difference estimator proves to be highly effective with respect to the Horvitz-Thompson estimator, in which no auxiliary information is exploited. The use of one-per-stratum stratified sampling provides relevant improvement with respect to simple random sampling without replacement. Conclusions: The use of one-per-stratum stratified sampling with many imagery segments selected at the first stage and few pixels within each segment at the second stage, jointly with a difference estimator, proves to be a suitable strategy to estimate forest cover by remote sensing-based inventories.
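A minimal sketch of the difference estimator described above: the wall-to-wall (map) total is corrected by the mean discrepancy observed on the manually classified sample. Names and numbers are illustrative:

```python
def difference_estimate(map_total, y_manual, x_map, num_segments):
    """Difference estimator: the wall-to-wall map total corrected by the
    mean discrepancy between manual (y) and map-derived (x) cover on the
    sampled segments, expanded to all num_segments segments."""
    n = len(y_manual)
    d_bar = sum(y - x for y, x in zip(y_manual, x_map)) / n
    return map_total + num_segments * d_bar
```

If the map cover is a good proxy, the discrepancies are small and nearly constant, which is exactly when this estimator has low variance.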
Abstract: Two-stage adaptive cluster sampling and two-stage conventional sampling designs were used to estimate the population total of Fringe-Eared Oryx, which are clustered and sparsely distributed. The study region was the Amboseli-West Kilimanjaro and Magadi-Natron cross-border landscape between Kenya and Tanzania. The study region was partitioned into primary sampling units containing secondary sampling units of different sizes. Results show that the two-stage adaptive cluster sampling design is efficient compared to simple random sampling and the conventional two-stage sampling design, and is less variable than the conventional two-stage design.
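The conventional two-stage estimator of a population total, used as the baseline here, can be sketched as follows (simple random sampling at both stages; variable names are assumptions):

```python
def two_stage_total(primary_units, N):
    """Two-stage estimate of a population total under SRS at both
    stages: expand each sampled primary unit's sample mean by its size
    M_i, then expand over the N primary units in the population.
    primary_units is a list of (M_i, secondary_sample) pairs."""
    n = len(primary_units)
    expanded = sum(M_i * sum(ys) / len(ys) for M_i, ys in primary_units)
    return N * expanded / n
```

The adaptive variant differs in how the secondary sample grows around high-count units, not in this basic expansion logic.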
Funding: Supported by the National Key R&D Program of China (No. 2021YFB3400900), the National Natural Science Foundation of China (Nos. 52175373, 52205435), the Natural Science Foundation of Hunan Province, China (No. 2022JJ40621), and the Innovation Fund of the National Commercial Aircraft Manufacturing Engineering Technology Center, China (No. COMACSFGS-2022-1875).
Abstract: A new unified constitutive model was developed to predict the two-stage creep-aging (TSCA) behavior of Al-Zn-Mg-Cu alloys. The particular bimodal precipitation feature was analyzed and modeled by considering the evolution of the primary micro-variables at different temperatures and their interaction. The dislocation density was incorporated into the model to capture the effect of creep deformation on precipitation. Quantitative transmission electron microscopy and experimental data obtained from a previous study were used to calibrate the model. Subsequently, the developed constitutive model was implemented in the finite element (FE) software ABAQUS via user subroutines for TSCA process simulation and springback prediction of an integral panel. A TSCA test was performed. The result shows that the maximum radius deviation between the formed plate and the simulation results is less than 0.4 mm, validating the effectiveness of the developed constitutive and FE models.
基金Supported by the National Natural Science Foundation of China(42474239,41204128)China National Space Administration(Pre-research project on Civil Aerospace Technologies No.D010301)Strategic Priority Research Program of the Chinese Academy of Sciences(XDA17010303)。
Abstract: One of the detection objectives of the Chinese Asteroid Exploration mission is to investigate the space environment near the Main-belt Comet (MBC, Active Asteroid) 311P/PANSTARRS. This paper outlines the scientific objectives, measurement targets, and measurement requirements for the proposed Gas and Ion Analyzer (GIA). The GIA is designed for in-situ mass spectrometry of neutral gases and low-energy ions, such as hydrogen, carbon, and oxygen, in the vicinity of 311P. Ion sampling techniques are essential for the GIA's Time-of-Flight (TOF) mass analysis capabilities. In this paper, we present an enhanced ion sampling technique through the development of an ion attraction model and an ion source model. The ion attraction model demonstrates that adjusting the attraction grid voltage can enhance the detection efficiency of low-energy ions and mitigate the repulsive force on ions during sampling, which is influenced by the satellite's surface positive charging. The ion source model simulates the processes of gas ionization and ion multiplication. Simulation results indicate that the GIA can achieve a lower pressure limit below 10^(−13) Pa and a dynamic range exceeding 10^(9). These performances ensure the generation of ions with a stable and consistent current, which is crucial for high-resolution, broad-dynamic-range mass spectrometry. Preliminary testing experiments have verified the GIA's capability to detect gas compositions such as H2O and N2. In-situ measurements near 311P using the GIA are expected to contribute significantly to our understanding of asteroid activity mechanisms, the evolution of the atmospheric and ionized environments of main-belt comets, interactions with the solar wind, and the origin of Earth's water.
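The relation underlying any TOF mass analysis is the ideal drift equation t = L·sqrt(m/(2qU)); inverting it recovers the ion mass from its flight time. Constants are CODATA values; the geometry and voltage below are placeholders, not GIA instrument parameters:

```python
import math

E_CHARGE = 1.602176634e-19    # elementary charge, C
AMU = 1.66053906660e-27       # atomic mass unit, kg

def tof_mass(flight_time, path_length, accel_voltage, charge_state=1):
    """Invert the ideal TOF relation t = L * sqrt(m / (2 q U)) to get
    the ion mass in atomic mass units."""
    q = charge_state * E_CHARGE
    m_kg = 2.0 * q * accel_voltage * (flight_time / path_length) ** 2
    return m_kg / AMU
```

For water (18 u) over a 1 m path at 1 kV, the round trip time-to-mass inversion is exact up to floating-point rounding.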
Funding: Supported by the National Natural Science Foundation of China (12064028) and the Jiangxi Provincial Natural Science Foundation (20232BAB201045).
Abstract: The Electro-Optic Sampling (EOS) detection technique has been widely used in terahertz science and technology, and it can also measure the field-time waveform of few-cycle laser pulses. Its frequency response and bandwidth limitation are determined directly by the electro-optic crystal and the duration of the probe laser pulse. Here, we investigate the performance of EOS with a thin GaSe crystal in the measurement of mid-infrared few-cycle laser pulses. The shift of the central frequency and the change of the bandwidth induced by EOS detection are calculated, and the pulse distortions introduced in this detection process are discussed. It is found that this technique produces a red-shift of the central frequency and a narrowing of the bandwidth. These changes decrease as the laser wavelength increases from 2 μm to 10 μm. This work can help to estimate the performance of the EOS detection technique in the mid-infrared band and offers a reference for related experiments.
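The red-shift and bandwidth narrowing have a simple Gaussian picture: if the EOS response is modeled as a Gaussian spectral filter centred at zero frequency, the measured spectrum is a product of two Gaussians. This toy model is our illustration of the effect's direction, not the paper's full calculation:

```python
def eos_measured_center(f0, sig_w, resp_w):
    """Centre of the product of a Gaussian signal spectrum (centre f0,
    width sig_w) and a Gaussian detector response centred at zero
    (width resp_w): always pulled toward zero, i.e. red-shifted."""
    return f0 * resp_w ** 2 / (sig_w ** 2 + resp_w ** 2)

def eos_measured_width(sig_w, resp_w):
    """Width of the product Gaussian: always narrower than sig_w."""
    return (sig_w ** 2 * resp_w ** 2 / (sig_w ** 2 + resp_w ** 2)) ** 0.5
```

As the response width grows relative to the signal width (longer wavelengths, broader detector bandwidth), both distortions shrink, matching the trend reported above.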
Funding: Supported by the National Natural Science Foundation of China (62001481, 61890542, 62071475), the Natural Science Foundation of Hunan Province (2022JJ40561), and the Research Program of the National University of Defense Technology (ZK22-46).
Abstract: Nonperiodic interrupted sampling repeater jamming (ISRJ) against inverse synthetic aperture radar (ISAR) can achieve two-dimensional blanket jamming through joint fast- and slow-time-domain interrupted modulation, which differs markedly from conventional multi-false-target deception jamming. In this paper, a suppression method against this novel jamming is proposed based on an inter-pulse energy function and compressed sensing theory. By exploiting the discontinuity of the jamming in the slow time domain, the unjammed pulses are separated using the difference in the energy function between pulses. On this basis, the two-dimensional orthogonal matching pursuit (2D-OMP) algorithm is proposed to reconstruct the ISAR image from the obtained unjammed pulse sequence. The validity of the proposed method is demonstrated via simulations with Yak-42 plane data.
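The slow-time screening step can be caricatured as an energy threshold: ISRJ inflates the energy of jammed pulses, so pulses near the typical (median) energy are kept. The specific threshold rule below is an assumption for illustration, not the paper's energy function:

```python
def select_unjammed(pulse_energies, factor=2.0):
    """Keep the indices of pulses whose energy stays below a multiple of
    the median pulse energy; ISRJ-contaminated pulses carry extra
    repeated-sample energy and are discarded."""
    ordered = sorted(pulse_energies)
    median = ordered[len(ordered) // 2]
    return [i for i, e in enumerate(pulse_energies) if e < factor * median]
```

The surviving pulse indices would then feed a sparse-reconstruction stage such as 2D-OMP.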
Abstract: Critical Height Sampling (CHS) estimates stand volume free from any model or tree-form assumptions. Despite its introduction more than four decades ago, CHS has not been widely applied in the field due to perceived challenges in measurement. The objectives of this study were to compare estimated stand volume between CHS and sampling methods that used volume or taper models, to test the equivalence of the sampling methods, and to assess their relative efficiency. We established 65 field plots in planted forests of two coniferous tree species and estimated stand volume for a range of Basal Area Factors (BAFs). Results showed that CHS produced the most similar mean stand volume across BAFs and tree species, with maximum differences between BAFs of 5-18 m^(3)·ha^(−1). Horizontal Point Sampling (HPS) using volume models produced very large variability in mean stand volume across BAFs, with differences of up to 126 m^(3)·ha^(−1). However, CHS was less precise and less efficient than HPS. Furthermore, none of the sampling methods were statistically interchangeable with CHS at an allowable tolerance of ≤55 m^(3)·ha^(−1). About 72% of critical height measurements were below crown base, indicating that critical height was more accessible to measurement than expected. Our study suggests that the consistency of the CHS mean estimates is a major advantage when planning a forest inventory. When checked against CHS, the results hint that HPS estimates might contain model bias. These strengths of CHS could outweigh its lower precision. Our study also carries serious financial implications for the choice of sampling method. Lastly, CHS could benefit forest management as an alternative way of estimating stand volume when volume or taper models are lacking or unreliable.
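For reference, the CHS estimator itself is strikingly simple: per-hectare volume is the BAF times the sum of critical heights of all tallied trees, divided by the number of sample points. A sketch with made-up numbers:

```python
def chs_volume_per_ha(critical_heights, baf, num_points):
    """Critical Height Sampling: per-hectare volume equals the BAF times
    the sum of critical heights of all tallied trees, divided by the
    number of sample points; no volume or taper model is involved."""
    return baf * sum(critical_heights) / num_points
```

The model-freedom that the abstract emphasizes is visible here: only an angle gauge tally and the critical heights enter the estimate.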
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 62266028, 62266027, U21B2027, and U24A20334), the Major Science and Technology Programs in Yunnan Province (Grant Nos. 202302AD080003, 202402AG050007, and 202303AP140008), the Yunnan Province Basic Research Program (Grant Nos. 202301AS070047, 202301AT070471, and 202401BC070021), and the Kunming University of Science and Technology “Double First-rate” Construction Joint Project (Grant No. 202201BE070001-021).
Abstract: Lexical analysis is a fundamental task in natural language processing that involves several subtasks, such as word segmentation (WS), part-of-speech (POS) tagging, and named entity recognition (NER). Recent works have shown that taking advantage of the relatedness between these subtasks can be beneficial. This paper proposes a unified neural framework to address these subtasks simultaneously. Departing from the sequence-tagging paradigm, the proposed method tackles multitask lexical analysis via two-stage sequence span classification. First, the model detects word and named-entity boundaries by multilabel classification over character spans in a sentence. Then, the authors assign POS labels and entity labels to words and named entities by multi-class classification, respectively. Furthermore, a Gated Task Transformation (GTT) is proposed to encourage the model to share valuable features between tasks. The performance of the proposed model was evaluated on Chinese and Thai public datasets, demonstrating state-of-the-art results.
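Stage one of the span-classification view enumerates candidate character spans, which a classifier then scores as word or entity boundaries. A minimal sketch (the maximum span length is an assumed hyperparameter):

```python
def enumerate_spans(sentence, max_len=4):
    """Enumerate candidate character spans (i, j) with j exclusive and
    length up to max_len; stage one would score each span as a word or
    named-entity boundary, stage two would label the accepted spans."""
    return [(i, j) for i in range(len(sentence))
            for j in range(i + 1, min(i + max_len, len(sentence)) + 1)]
```

Capping the span length keeps the candidate set linear in sentence length rather than quadratic, which is the usual practical compromise.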
Funding: Supported by the National Science Foundation of China (11901236, 12261036), the Scientific Research Fund of Hunan Provincial Education Department (21A0328), the Provincial Natural Science Foundation of Hunan (2022JJ30469), the Young Core Teacher Foundation of Hunan Province ([2020]43), and the Provincial Postgraduate Innovation Foundation of Hunan (CX20221113).
Abstract: The weighted exponential distribution WED(α, λ), with shape parameter α and scale parameter λ, possesses some good properties and can provide a good fit to survival-time data compared to other distributions such as the gamma, Weibull, or generalized exponential distributions. In this article, we proved the existence and uniqueness of the maximum likelihood estimator (MLE) of the parameters of WED(α, λ) under simple random sampling (SRS) and provided explicit expressions for the Fisher information number in SRS. Moreover, we also proved the existence and uniqueness of the MLE of the parameters of WED(α, λ) under ranked set sampling (RSS) and provided explicit expressions for the Fisher information number in RSS. Simulation studies show that the MLEs in RSS can be real competitors to those in SRS.
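The RSS design itself is easy to sketch: for each rank r, draw k units, rank them (perfect ranking is assumed here), and keep the r-th order statistic; repeat for several cycles. This generic RSS sketch illustrates the design only, not the paper's WED likelihood results:

```python
import random

def ranked_set_sample(population, k, cycles, rng):
    """Draw a ranked-set sample: for each rank r in 1..k, draw k units,
    rank them (perfect ranking assumed), keep the r-th order statistic;
    repeat for the given number of cycles (k * cycles values in total)."""
    sample = []
    for _ in range(cycles):
        for r in range(k):
            group = sorted(rng.sample(population, k))
            sample.append(group[r])
    return sample
```

The extra information in RSS comes from the ranking step: each retained value is an order statistic, which is what drives the Fisher-information gains reported above.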
Funding: Project (52274348) supported by the National Natural Science Foundation of China; Project (2022JH1/10400024) supported by the Major Projects for the “Revealed Top” Science and Technology of Liaoning Province, China.
Abstract: Applying bio-oxidation waste solution (BOS) in a chemical-biological two-stage oxidation process can significantly improve the bio-oxidation efficiency of arsenopyrite. This study aims to clarify the enhanced oxidation mechanism of arsenopyrite by evaluating how the physical and chemical changes of arsenopyrite during the BOS chemical oxidation stage affect mineral dissolution kinetics, as well as microbial growth activity and community structure composition in the bio-oxidation stage. The results showed that the chemical oxidation helped destroy the physical and chemical structure of the arsenopyrite surface, reduced the particle size, and led to the formation of nitrogenous substances on the mineral surface. These chemical oxidation behaviors effectively promoted Fe^(3+) cycling in the bio-oxidation system and weakened the inhibitory effect of the sulfur film on ionic diffusion, thereby enhancing the dissolution kinetics of the arsenopyrite. Therefore, the bio-oxidation efficiency of arsenopyrite was significantly increased in the two-stage oxidation process. After 18 d, the two-stage oxidation process achieved total extraction rates of (88.8±2.0)%, (86.7±1.3)%, and (74.7±3.0)% for As, Fe, and S, respectively. These values represent significant increases of (50.8±3.4)%, (47.1±2.7)%, and (46.0±0.7)%, respectively, compared to the one-stage bio-oxidation process.
Funding: Supported by the Natural Science Research Project of Anhui Educational Committee (2023AH030041), the National Natural Science Foundation of China (42277136), and the Anhui Province Young and Middle-aged Teacher Training Action Project (DTR2023018).
Abstract: The selection of negative samples significantly influences landslide susceptibility assessment, especially when establishing the relationship between landslides and environmental factors in regions with complex geological conditions. Traditional sampling strategies commonly used in landslide susceptibility models can misrepresent the distribution of negative samples, causing a deviation from actual geological conditions. This, in turn, degrades the discriminative ability and generalization performance of the models. To address this issue, we propose a novel approach for selecting negative samples to enhance the quality of machine learning models. We chose the Liangshan Yi Autonomous Prefecture, located in southwestern Sichuan, China, as the case study. This area, characterized by complex terrain, frequent tectonic activity, and steep slope erosion, experiences recurrent landslides, making it an ideal setting for validating the proposed method. We calculate the contribution values of environmental factors using the Relief algorithm to construct the feature space, apply the Target Space Exteriorization Sampling (TSES) method to select negative samples, calculate landslide probability values by Random Forest (RF) modeling, and then create regional landslide susceptibility maps. We evaluate the performance of the RF model optimized by the Environmental Factor Selection-based TSES (EFSTSES) method using standard performance metrics. The results indicate that the model achieved an accuracy (ACC) of 0.962, a precision (PRE) of 0.961, and an area under the curve (AUC) of 0.962. These findings demonstrate that the EFSTSES-based model effectively mitigates the negative-sample imbalance issue, enhances the differentiation between landslide and non-landslide samples, and reduces misclassification, particularly in geologically complex areas. These improvements offer valuable insights for disaster prevention, land use planning, and risk mitigation strategies.
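The exteriorization idea, selecting negatives far from the landslide class in feature space, can be sketched as a centroid-distance rule. The median cutoff below is our assumption; the paper's actual TSES rule may differ:

```python
import math

def exteriorized_negatives(candidates, positives, quantile=0.5):
    """Keep candidate negatives whose feature-space distance to the
    positive-class centroid exceeds the given quantile of candidate
    distances (a sketch of target-space exteriorization)."""
    dim = len(positives[0])
    centroid = [sum(p[d] for p in positives) / len(positives) for d in range(dim)]
    dists = sorted(math.dist(c, centroid) for c in candidates)
    cutoff = dists[int(quantile * (len(dists) - 1))]
    return [c for c in candidates if math.dist(c, centroid) > cutoff]
```

Candidates lying close to the landslide class in feature space are exactly the ambiguous ones that, if labeled negative, would blur the decision boundary.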
Funding: Supported by the National Natural Science Foundation of China [Grant No. 52270183].
Abstract: Exploring the factors driving the decoupling of China's sulfur dioxide (SO_(2)) emissions from economic growth (DEI) is crucial for achieving sustainable development. By analyzing the decoupling indicators and driving factors at both the generation and treatment stages of SO_(2), more effective targeted mitigation strategies can be developed. We employ the Tapio decoupling model and propose a two-stage method to examine the decoupling issues related to SO_(2). Our findings indicate that: (1) DEI shows a steady and significant improvement, with SO_(2) emission intensity identified as the primary driver; (2) for the decoupling of economic growth and SO_(2) generation, energy scale serves as the largest stimulator, while the effect of energy intensity changes from negative to positive, and that of pollution intensity is first positive and then negative; (3) for the decoupling of SO_(2) generation and SO_(2) removal, treatment efficiency leads as the largest promoter, followed by treatment intensity. Based on these results, this study recommends that China focus more on enhancing clean energy utilization and the effectiveness of treatment processes.
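The Tapio indicator at the heart of this analysis is the elasticity of emission change with respect to GDP change; under the usual Tapio bands, values below 0.8 with positive growth indicate weak-to-strong decoupling. A sketch with illustrative numbers:

```python
def tapio_elasticity(emis_0, emis_1, gdp_0, gdp_1):
    """Tapio decoupling elasticity: relative change in SO2 emissions
    divided by the relative change in GDP over the same period."""
    return ((emis_1 - emis_0) / emis_0) / ((gdp_1 - gdp_0) / gdp_0)
```

The two-stage version applies the same ratio twice, once between GDP and SO2 generation and once between generation and removal.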
Abstract: BACKGROUND: Two-stage revision is the most common treatment for chronic periprosthetic joint infection of the hip, involving a resection arthroplasty with or without placement of an antibiotic-loaded spacer, followed by antibiotic therapy before reimplantation. AIM: To compare the outcomes and complications of two consecutive treatment protocols for two-stage revision arthroplasty of the infected hip: one using Girdlestone with an antibiotic holiday, the other using custom-made articulating spacers (CUMARS) without an antibiotic holiday. METHODS: In this retrospective study, two consecutive cohorts were compared. Group A (2017-2020) underwent two-stage revision with a Girdlestone and an antibiotic holiday before reimplantation, while Group B (2020-2023) received CUMARS whenever possible and no antibiotic holiday, or a Girdlestone if indicated. The primary outcome was successful infection eradication after one year. Secondary outcomes included surgical duration, length of hospital stay, weight-bearing allowance, discharge destination, and complications. RESULTS: A total of 98 patients were included: 39 in Group A and 59 in Group B. Successful infection eradication after one year was achieved in 69% of Group A and 83% of Group B (P=0.164). Patients in Group B were more frequently allowed to bear weight (64% vs 18%, P<0.001), had a shorter in-hospital stay (9 vs 16 days, P<0.001), and were more often discharged home after the first surgery (48% vs 24%, P=0.048). No significant differences were found in (mechanical) complications. CONCLUSION: A protocol including CUMARS is a safe and effective treatment, offering faster recovery and a shorter hospital stay, and enabling more patients to return home during the interval. This reduces the strain on patients and the healthcare system, potentially saving costs, without compromising infection control or increasing (mechanical) complications.
Funding: Supported by the National Natural Science Foundation of China (No. 62101587) and the National Funded Postdoctoral Researcher Program of China (No. GZC20233578).
Abstract: A micro-nano Earth Observation Satellite (MEOS) constellation has the advantages of low construction cost, short revisit cycle, and high functional density, and is considered a promising solution for serving rapidly growing observation demands. The observation Scheduling Problem in the MEOS constellation (MEOSSP) is challenging due to the large number of satellites and tasks, as well as complex observation constraints. To address the large-scale and complicated MEOSSP, we develop a Two-Stage Scheduling Algorithm based on the Pointer Network with Attention mechanism (TSSA-PNA). In TSSA-PNA, MEOS observation scheduling is decomposed into a task allocation stage and a single-MEOS scheduling stage. In the task allocation stage, an adaptive task allocation algorithm with four problem-specific allocation operators is proposed to reallocate unscheduled tasks to new MEOSs. For the single-MEOS scheduling stage, we design a pointer network based on the encoder-decoder architecture to learn the optimal single-MEOS scheduling solution and introduce the attention mechanism into the encoder to improve learning efficiency. The Pointer Network with Attention mechanism (PNA) can generate the single-MEOS scheduling solution quickly in an end-to-end manner. These two stages are performed iteratively to search for a high-profit solution, and a greedy local search algorithm is developed to improve profits further. The performance of PNA and TSSA-PNA on single-MEOS and multi-MEOS scheduling problems is evaluated in experiments. The results demonstrate that PNA can obtain an approximate solution for the single-MEOS scheduling problem in a short time, and that TSSA-PNA achieves higher observation profits than existing scheduling algorithms within acceptable computational time for the large-scale MEOS scheduling problem.
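A greedy profit-ordered baseline for the single-satellite stage gives a feel for the scheduling problem the pointer network learns to beat. The task tuples and the non-overlap rule below are illustrative assumptions, not the paper's constraint model:

```python
def greedy_schedule(tasks):
    """Greedy baseline for one satellite: take candidate observations in
    decreasing profit order, accepting each task (start, end, profit)
    whose time window does not overlap an already accepted one; return
    the total accepted profit."""
    accepted = []
    for start, end, profit in sorted(tasks, key=lambda t: -t[2]):
        if all(end <= s or start >= e for s, e, _ in accepted):
            accepted.append((start, end, profit))
    return sum(p for _, _, p in accepted)
```

Real MEOS constraints (slew time, energy, memory) make the problem much harder than this interval-packing caricature, which is why a learned scheduler pays off.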
Funding: Supported by the National Key Research and Development Program of China (2022YFB3103500), the National Natural Science Foundation of China (62473033, 62571027), the Beijing Natural Science Foundation (L231012), and the State Scholarship Fund from the China Scholarship Council.
Abstract: Task-oriented point cloud sampling aims to select a representative subset from the input, tailored to specific application scenarios and task requirements. However, existing approaches rarely tackle the problem of redundancy caused by local structural similarities in 3D objects, which limits the performance of sampling. To address this issue, this paper introduces a novel task-oriented point cloud masked autoencoder-based sampling network (Point-MASNet), inspired by the masked autoencoder mechanism. Point-MASNet employs a voxel-based random non-overlapping masking strategy, which allows the model to selectively learn and capture distinctive local structural features from the input data. This approach effectively mitigates redundancy and enhances the representativeness of the sampled subset. In addition, we propose a lightweight, symmetrically structured keypoint reconstruction network, designed as an autoencoder. This network is optimized to efficiently extract latent features while enabling refined reconstructions. Extensive experiments demonstrate that Point-MASNet achieves competitive sampling performance across classification, registration, and reconstruction tasks.
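The voxel-based random non-overlapping masking strategy can be sketched as: bucket points into voxels, hide a random fraction of the voxels, and feed only the visible points to the encoder. Parameter names are assumptions:

```python
import random

def voxel_random_mask(points, voxel_size, mask_ratio, rng):
    """Voxel-based random non-overlapping masking: bucket points into
    voxels, mask a random fraction of the voxels, and return only the
    points in the visible (unmasked) voxels."""
    voxels = {}
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)
        voxels.setdefault(key, []).append(p)
    keys = sorted(voxels)
    n_masked = int(mask_ratio * len(keys))
    masked = set(rng.sample(keys, n_masked))
    return [p for k in keys if k not in masked for p in voxels[k]]
```

Because voxels partition space, the masked regions never overlap the visible ones, which forces the model to infer hidden local structure rather than copy nearby points.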
Funding: Supported by the National Key R&D Program of China (No. 2022YFD2401301) and the Special Financial Fund of the Spawning Ground Survey in the Bohai Sea and the Yellow Sea from the Ministry of Agriculture and Rural Affairs, China (No. 125C0505).
Abstract: A comprehensive fishery-independent survey generally incorporates various specialized surveys and integrates different survey objectives to maximize benefits while accounting for cost limitations. It is important to evaluate the adaptability of the comprehensive survey for different taxa to obtain the optimal design. However, the validity and adaptability of ichthyoplankton sampling incorporated in a comprehensive fishery-independent survey program for estimating the abundance of ichthyoplankton species remain little known. This study included ichthyoplankton sampling in an integrated survey and assessed the appropriateness of the survey design. Kriging interpolation based on Gaussian models was used to estimate values at unsurveyed locations from the original ichthyoplankton survey data in Haizhou Bay, which served as the “true” values. The sampling performances of the ongoing stratified random sampling (StRS), simple random sampling (SRS), cluster sampling (CS), hexagonal systematic sampling (SYSh), and regular systematic sampling (SYSr) designs with different sample sizes in estimating ichthyoplankton abundance were compared in terms of relative estimation error (REE), relative bias (RB), and coefficient of variation (CV) by computer simulation. The ongoing StRS performed better than CS and SRS but not as well as the two systematic sampling methods, and the current sample size in the StRS design was insufficient to estimate ichthyoplankton abundance. The average REE values (meanREE) were significantly smaller in the two systematic sampling designs than in the other three, and the two systematic designs maintained good inter-annual stability of sampling performance. It is suggested that incorporating an ichthyoplankton survey directly into stratified random fishery-independent surveys cannot achieve the desired level of accuracy for the survey objectives, but accuracy can be improved by setting additional stations. The assessment framework presented in this study serves as a reference for evaluating the adaptability of integrated surveys to different objectives in other waters.
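The three performance metrics used in the simulation comparison (REE, RB, CV) can be computed directly from the set of design estimates against the Kriging "true" value:

```python
import math
from statistics import mean, stdev

def sampling_metrics(estimates, true_value):
    """REE, RB, and CV (all in percent) for a set of simulated design
    estimates against a known 'true' value: REE is the root-mean-square
    relative error, RB the relative bias, CV the relative spread."""
    n = len(estimates)
    ree = 100.0 * math.sqrt(sum((e - true_value) ** 2 for e in estimates) / n) / true_value
    rb = 100.0 * (mean(estimates) - true_value) / true_value
    cv = 100.0 * stdev(estimates) / mean(estimates)
    return ree, rb, cv
```

REE mixes bias and variance into one accuracy score, while RB and CV separate them, which is why the study reports all three.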