In this study we have proposed a modified ratio-type estimator for the population variance of the study variable y under simple random sampling without replacement, making use of the coefficient of kurtosis and the median of an auxiliary variable x. The estimator's properties have been derived up to the first order of Taylor series expansion. Efficiency conditions under which the proposed estimator performs better than existing estimators have been derived theoretically. Empirical studies have been done using real populations to demonstrate the performance of the developed estimator in comparison with the existing estimators. As illustrated by the empirical studies, the proposed estimator performs better than the existing estimators under some specified conditions, i.e., it has the smallest Mean Squared Error and the highest Percentage Relative Efficiency. The developed estimator is therefore suitable for situations in which the variable of interest has a positive correlation with the auxiliary variable.
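A minimal sketch of one common kurtosis-and-median ratio-type variance estimator from this literature follows; the exact functional form proposed in the paper may differ, and `S2_x`, `beta2_x`, and `M_x` (the known population variance, coefficient of kurtosis, and median of x) are assumed inputs.

```python
import numpy as np

def modified_ratio_variance_estimator(y_sample, x_sample, S2_x, beta2_x, M_x):
    """Hedged sketch of a kurtosis-and-median ratio-type variance
    estimator; the exact form proposed in the paper may differ.
    S2_x, beta2_x, M_x: known population variance, coefficient of
    kurtosis, and median of the auxiliary variable x."""
    s2_y = np.var(y_sample, ddof=1)  # sample variance of study variable y
    s2_x = np.var(x_sample, ddof=1)  # sample variance of auxiliary variable x
    # Ratio adjustment: scales s2_y by how far the sample variance of x
    # falls from its known population variance.
    return s2_y * (S2_x * beta2_x + M_x) / (s2_x * beta2_x + M_x)
```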
This paper deals with Bayesian inference and prediction problems of the Burr type XII distribution based on progressive first-failure censored data. We consider the Bayesian inference under a squared error loss function. We propose to apply a Gibbs sampling procedure to draw Markov Chain Monte Carlo (MCMC) samples, which in turn have been used to compute the Bayes estimates with the help of the importance sampling technique. We have performed a simulation study in order to compare the proposed Bayes estimators with the maximum likelihood estimators. We further consider two-sample Bayes prediction to predict future order statistics and upper record values from the Burr type XII distribution based on progressive first-failure censored data. The predictive densities are obtained and used to determine prediction intervals for unobserved order statistics and upper record values. A real-life data set is used to illustrate the results derived.
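For orientation, here is a stripped-down sketch of the Bayes estimate under squared error loss for the much simpler complete-sample case with the first Burr XII shape parameter c known and a Gamma(a, b) prior (rate b) on k; the paper's progressive first-failure censoring and MCMC machinery are not reproduced here.

```python
import numpy as np

def bayes_estimate_burr_k(x, c, a, b, n_draws=10_000, rng=None):
    """Simplified complete-sample case: the Burr XII likelihood is
    proportional to k^n * exp(-k * sum(log(1 + x^c))), so a Gamma(a, b)
    prior on k gives a Gamma(a + n, b + sum(log(1 + x^c))) posterior;
    the Bayes estimate under squared error loss is the posterior mean."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    shape = a + x.size
    rate = b + np.log1p(x ** c).sum()
    draws = rng.gamma(shape, 1.0 / rate, size=n_draws)  # posterior draws
    return draws.mean()  # Monte Carlo posterior mean, approx. shape / rate
```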
Adaptive cluster sampling (ACS) has been a very important tool in the estimation of population parameters of rare and clustered populations. The fundamental idea behind this sampling plan is to decide on an initial sample from a defined population and to keep on sampling within the vicinity of the units that satisfy the condition that at least one characteristic of interest exists in a unit selected in the initial sample. Despite being an important tool for sampling rare and clustered populations, the adaptive cluster sampling design is unable to control the final sample size when no prior knowledge of the population is available. Thus adaptive cluster sampling with a data-driven stopping rule (ACS') was proposed to control the final sample size when prior knowledge of the population structure is not available. This study examined the behavior of the HT and HH estimators under the ACS design and the ACS' design using an artificial population designed to have all the characteristics of a rare and clustered population. The efficiencies of the HT and HH estimators were used to determine the most efficient design for estimating the population mean in rare and clustered populations. Results of both the simulated data and the real data show that adaptive cluster sampling with a stopping rule is more efficient for estimating rare and clustered populations than ordinary adaptive cluster sampling.
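As a reference point, a minimal sketch of the modified Horvitz–Thompson (HT) estimator of the population mean under ACS is shown below; the distinct-network totals and sizes, the population size `N`, and the initial SRS size `n` are assumed to be known from the survey.

```python
from math import comb

def ht_mean_acs(network_totals, network_sizes, N, n):
    """Modified Horvitz-Thompson estimator of the population mean under
    ACS: distinct-network totals y*_k are weighted by the probability
    that network k (of size m_k) intersects an initial SRS of size n."""
    total = 0.0
    for y_star, m_k in zip(network_totals, network_sizes):
        pi_k = 1.0 - comb(N - m_k, n) / comb(N, n)  # network inclusion prob.
        total += y_star / pi_k
    return total / N
```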
This study analyzes the sample influx (samples per case file) into a forensic science laboratory (FSL) and the corresponding analysis costs, and uses arbitrary re-sampling plans to establish the minimum cost function. The demand for forensic analysis increased for all disciplines, especially biology/DNA, between 2014 and 2015. While the average distribution of case files was about 42.5%, 40.6% and 17% for the three disciplines, the distribution of samples was rather different, being 12%, 82.5% and 5.5% for samples requiring forensic biology, chemistry and toxicology analysis, respectively. Results show that most of the analysis workload was on forensic chemistry analysis. The cost of analysis for case files and the corresponding sample influx varied in the ratios of 35:6:1 and 28:12:1 for forensic chemistry, biology/DNA and toxicology for the years 2014 and 2015, respectively. In the two consecutive years, the cost of forensic chemistry analysis was comparatively very high, necessitating re-sampling. The time series of sample influx in all disciplines are strongly stochastic, with higher magnitude for chemistry, biology/DNA and toxicology, in that order. The PDFs of the sample influx data are highly skewed to the right, especially for forensic toxicology and biology/DNA, with peaks at 1 and 3 samples per case file. The arbitrary re-sampling plans were best suited to forensic chemistry case files (where re-sampling conditions apply). The locus of the arbitrary number of samples to take from the submitted forensic samples was used to establish the minimum scientifically acceptable number of samples by applying the minimization function developed in this paper. The cost minimization function was also developed based on the average cost per sample and the choice of re-sampling plans depending on the range of sample influx, from which the savings were determined and maximized. Thus, the study gives a forensic scientist a business model and a scientific decision-making tool on the minimum number of samples to analyze, focusing on savings in analysis cost.
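The general shape of such a cost-savings calculation can be sketched as follows; the `plan` function and the square-root rule below are purely hypothetical stand-ins for the paper's re-sampling plans, and the numbers are illustrative only.

```python
import math

def analysis_cost_and_savings(sample_influx, avg_cost_per_sample, plan):
    """Hypothetical sketch: 'plan' maps the per-case sample influx to the
    number of samples actually analyzed; savings are measured against
    analyzing every submitted sample."""
    n_analyzed = plan(sample_influx)
    full_cost = sample_influx * avg_cost_per_sample
    reduced_cost = n_analyzed * avg_cost_per_sample
    return reduced_cost, full_cost - reduced_cost

# Illustrative plan only: analyze all samples for small influx,
# otherwise a square-root sub-sample (not the paper's plan).
plan = lambda s: s if s <= 3 else math.ceil(math.sqrt(s))
cost, savings = analysis_cost_and_savings(20, 150.0, plan)
```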
Critical Height Sampling (CHS) estimates stand volume free from any model or tree-form assumptions. Despite its introduction more than four decades ago, CHS has not been widely applied in the field due to perceived challenges in measurement. The objectives of this study were to compare estimated stand volume between CHS and sampling methods that use volume or taper models, to test the equivalence of the sampling methods, and to assess their relative efficiency. We established 65 field plots in planted forests of two coniferous tree species. We estimated stand volume for a range of Basal Area Factors (BAFs). Results showed that CHS produced the most similar mean stand volume across BAFs and tree species, with maximum differences between BAFs of 5–18 m^(3)·ha^(−1). Horizontal Point Sampling (HPS) using volume models produced very large variability in mean stand volume across BAFs, with differences up to 126 m^(3)·ha^(−1). However, CHS was less precise and less efficient than HPS. Furthermore, none of the sampling methods were statistically interchangeable with CHS at an allowable tolerance of ≤55 m^(3)·ha^(−1). About 72% of critical height measurements were below crown base, indicating that critical height was more accessible to measurement than expected. Our study suggests that the consistency in the mean estimates of CHS is a major advantage when planning a forest inventory. When checked against CHS, the results hint that HPS estimates might contain potential model bias. These strengths of CHS could outweigh its lower precision. Our study also shows that the choice of sampling method carries serious financial implications. Lastly, CHS could benefit forest management as an alternative option for estimating stand volume when volume or taper models are lacking or unreliable.
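The appeal of CHS is that the estimator itself is model-free: at each point-sample location, stand volume is the basal area factor times the sum of the tallied trees' critical heights. A minimal sketch, with illustrative numbers:

```python
def chs_stand_volume(critical_heights_by_point, baf):
    """Minimal sketch of the CHS estimator: at each point-sample
    location, stand volume (m^3/ha) is the basal area factor
    (m^2/ha per tallied tree) times the sum of the critical heights (m)
    of the 'in' trees; the estimate is the mean over sample points."""
    per_point = [baf * sum(heights) for heights in critical_heights_by_point]
    return sum(per_point) / len(per_point)

# e.g. two points with BAF 2 m^2/ha: 2*(11.3+7.9) and 2*16.4 m^3/ha
print(chs_stand_volume([[11.3, 7.9], [16.4]], baf=2.0))
```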
Field sampling faces the challenge of reconciling statistical requirements with logistical constraints when explicitly estimating macrobenthos species richness in heterogeneous intertidal wetlands. To solve this problem, this study tried to design an optimal, efficient and practical sampling strategy by comprehensively focusing on the three main parts of the entire process (optimizing the sampling method, determining the minimum sampling effort and exploring the proper sampling interval) in a typical intertidal wetland of the Changjiang (Yangtze) Estuary, China. Transect sampling was selected and optimized by stratification based on pronounced habitat types (tidal flat, tidal creek, salt marsh vegetation). This type of sampling is also termed within-transect stratification sampling. The optimal sampling intervals and the minimum sampling effort were determined by two useful numerical methods: Monte Carlo simulations and accumulative species curves. The results show that within-transect stratification sampling with typical habitat types was effective in encompassing 81% of the species, suggesting that this type of sampling design can largely reduce the sampling effort and labor. The optimal sampling intervals and minimum sampling efforts for the three habitats were determined: sampling effort must exceed 1.8 m^2 at 10 m intervals in the salt marsh vegetation, 2 m^2 at 10 m intervals in the tidal flat, and 3 m^2 at 1 m intervals in the tidal creek habitat. It is suggested that the differences were influenced by the mobility range of the dominant species and the habitats' physical differences (e.g., tidal water, substrate, vegetation cover). The optimized sampling strategy could provide good precision in the richness estimation of macrobenthos and balance the sampling effort. Moreover, the conclusions presented here provide a reference for recommendations to consider before macrobenthic surveys take place in estuarine wetlands. The sampling strategy, focusing on the three key parts of the sampling design, had a good operational effect and could be used as a guide for field sampling for habitat management or ecosystem assessment.
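One of the two numerical tools named here, the accumulative (species accumulation) curve, can be sketched with a Monte Carlo permutation of samples; the incidence-matrix layout (rows = samples, columns = species) is an assumed convention.

```python
import numpy as np

def species_accumulation_curve(incidence, n_perm=1000, rng=None):
    """Monte Carlo accumulative species curve: 'incidence' is a
    (samples x species) matrix of counts or presences; the curve gives,
    for each number of pooled samples k, the expected number of distinct
    species, averaged over random sample orderings."""
    rng = np.random.default_rng() if rng is None else rng
    n = incidence.shape[0]
    curve = np.zeros(n)
    for _ in range(n_perm):
        order = rng.permutation(n)
        seen = np.cumsum(incidence[order] > 0, axis=0) > 0  # seen by sample k?
        curve += seen.sum(axis=1)  # distinct species after k samples
    return curve / n_perm  # minimum effort: read off where the curve flattens
```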
This paper is an extension of the Hanif, Hamad and Shahbaz estimator [1] to two-phase sampling. The aim of this paper is to develop a regression-type estimator with two auxiliary variables for two-phase sampling when we do not have any information about the auxiliary variables at the population level. To avoid multicollinearity, it is assumed that the two auxiliary variables have minimum correlation. The mean square error and bias of the proposed estimator in two-phase sampling are derived. The mean square error of the proposed estimator shows an improvement over other well-known estimators under the same case.
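A generic sketch of the two-auxiliary regression idea in two-phase sampling is given below; the paper's actual coefficients may differ, and the OLS slopes here are an assumed choice.

```python
import numpy as np

def two_phase_regression_estimate(y2, x2, z2, x1_mean, z1_mean):
    """Generic sketch of a regression-type estimator with two auxiliary
    variables in two-phase sampling: the second-phase mean of y is
    adjusted by per-variable OLS slopes times the gap between first-
    and second-phase auxiliary means (population means unknown)."""
    y2, x2, z2 = map(np.asarray, (y2, x2, z2))
    b_x = np.cov(y2, x2, ddof=1)[0, 1] / np.var(x2, ddof=1)
    b_z = np.cov(y2, z2, ddof=1)[0, 1] / np.var(z2, ddof=1)
    return y2.mean() + b_x * (x1_mean - x2.mean()) + b_z * (z1_mean - z2.mean())
```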
One of the detection objectives of the Chinese Asteroid Exploration mission is to investigate the space environment near the Main-belt Comet (MBC, Active Asteroid) 311P/PANSTARRS. This paper outlines the scientific objectives, measurement targets, and measurement requirements for the proposed Gas and Ion Analyzer (GIA). The GIA is designed for in-situ mass spectrometry of neutral gases and low-energy ions, such as hydrogen, carbon, and oxygen, in the vicinity of 311P. Ion sampling techniques are essential for the GIA's Time-of-Flight (TOF) mass analysis capabilities. In this paper, we present an enhanced ion sampling technique through the development of an ion attraction model and an ion source model. The ion attraction model demonstrates that adjusting the attraction grid voltage can enhance the detection efficiency of low-energy ions and mitigate the repulsion of ions during sampling, which is influenced by the satellite's surface positive charging. The ion source model simulates the processes of gas ionization and ion multiplication. Simulation results indicate that the GIA can achieve a lower pressure limit below 10^(−13) Pa and possess a dynamic range exceeding 10^9. These performances ensure the generation of ions with a stable and consistent current, which is crucial for high-resolution, broad-dynamic-range mass spectrometry. Preliminary testing experiments have verified the GIA's capability to detect gas compositions such as H2O and N2. In-situ measurements near 311P using the GIA are expected to significantly contribute to our understanding of asteroid activity mechanisms, the evolution of the atmospheric and ionized environments of main-belt comets, the interactions with the solar wind, and the origin of Earth's water.
A composite random variable is a product (or sum of products) of statistically distributed quantities. Such a variable can represent the solution to a multi-factor quantitative problem submitted to a large, diverse, independent, anonymous group of non-expert respondents (the “crowd”). The objective of this research is to examine the statistical distribution of solutions from a large crowd to a quantitative problem involving image analysis and object counting. Theoretical analysis by the author, covering a range of conditions and types of factor variables, predicts that composite random variables are distributed log-normally to an excellent approximation. If the factors in a problem are themselves distributed log-normally, then their product is rigorously log-normal. A crowdsourcing experiment devised by the author and implemented with the assistance of a BBC (British Broadcasting Corporation) television show yielded a sample of approximately 2000 responses consistent with a log-normal distribution. The sample mean was within ~12% of the true count. However, a Monte Carlo simulation (MCS) of the experiment, employing either normal or log-normal random variables as factors to model the processes by which a crowd of 1 million might arrive at their estimates, resulted in a visually perfect log-normal distribution with a mean response within ~5% of the true count. The results of this research suggest that a well-modeled MCS, by simulating a sample of responses from a large, rational, and incentivized crowd, can provide a more accurate solution to a quantitative problem than might be attainable by direct sampling of a smaller crowd or an uninformed crowd, irrespective of size, that guesses randomly.
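The composite-variable MCS idea can be sketched in a few lines; the two-factor decomposition and the factor means and sigmas below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def crowd_estimate_mcs(n_respondents=1_000_000, seed=0):
    """Sketch: each simulated respondent estimates a count as a product
    of log-normally distributed factors (e.g. objects per unit area
    times number of units), so the responses are exactly log-normal."""
    rng = np.random.default_rng(seed)
    density = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n_respondents)
    units = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=n_respondents)
    responses = density * units  # product of log-normals is log-normal
    return responses.mean(), np.median(responses)
```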
There exists a great variety of posturographic parameters, which complicates the evaluation of center of pressure (COP) data. Hence, recommendations were given to use a set of complementary parameters to explain most of the variance. However, it is unknown whether a dual-task paradigm leads to different parametrization sets. On account of this problem, an exploratory factor analysis approach was conducted in a dual-task experiment. 16 healthy subjects stood on a force plate performing a posture-cognition dual task (DT, focus of attention on a secondary task) with respect to different sampling durations. The subjects were not aware of being measured, in contrast to the baseline task condition (BT, internal focus of attention) in the previously published part I. In comparison to BT, a different factor loading pattern appears. In addition, factor loadings are strongly affected by different sampling durations. DT reveals a change of factor loading structure with longer sampling durations compared to BT. Specific recommendations concerning a framework of posturographic parametrization are given.
Optical solitons, as self-sustaining waveforms in a nonlinear medium where dispersion and nonlinear effects are balanced, have key applications in ultrafast laser systems and optical communications. Physics-informed neural networks (PINN) provide a new way to solve the nonlinear Schrödinger equation describing soliton evolution by fusing data-driven and physical constraints. However, the grid-point sampling strategy of traditional PINN suffers from high computational complexity and unstable gradient flow, which makes it difficult to capture physical details efficiently. In this paper, we propose a residual-based adaptive multi-distribution (RAMD) sampling method to optimize the PINN training process by dynamically constructing a multi-modal loss distribution. With a 50% reduction in the number of grid points, RAMD significantly reduces the relative error of PINN and, in particular, reduces the solution error of the (2+1)-dimensional Ginzburg–Landau equation from 4.55% to 1.98%. Through the innovative combination of multi-modal distribution modeling and autonomous sampling control, RAMD breaks through the limitations of purely data-driven models lacking physical constraints, and provides a high-precision numerical simulation tool for the design of all-optical communication devices, the optimization of nonlinear laser devices, and other studies.
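A simplified stand-in for residual-based adaptive collocation sampling is sketched below; this is the generic technique, not the paper's exact RAMD multi-distribution scheme, and `residual_fn` (the current PDE residual evaluated at candidate points) is an assumed interface.

```python
import numpy as np

def residual_adaptive_resample(candidates, residual_fn, n_new, power=2.0,
                               rng=None):
    """Generic residual-based adaptive sampling for PINN training:
    draw new collocation points from a candidate pool with probability
    proportional to |PDE residual|^power, concentrating points where
    the current network violates the physics most."""
    rng = np.random.default_rng() if rng is None else rng
    r = np.abs(residual_fn(candidates)) ** power
    p = r / r.sum()
    idx = rng.choice(len(candidates), size=n_new, replace=False, p=p)
    return candidates[idx]
```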
Background: The local pivotal method (LPM), utilizing auxiliary data in sample selection, has recently been proposed as a sampling method for national forest inventories (NFIs). Its performance compared to simple random sampling (SRS) and LPM with geographical coordinates has produced promising results in simulation studies. In this simulation study we compared all these sampling methods to systematic sampling. The LPM samples were selected solely using the coordinates (LPMxy) or, in addition to that, auxiliary remote sensing-based forest variables (RS variables). We utilized field measurement data (NFI-field) and Multi-Source NFI (MS-NFI) maps as target data, and independent MS-NFI maps as auxiliary data. The designs were compared using relative efficiency (RE): the ratio of the mean squared error of the reference sampling design to that of the studied design. Applying a method in an NFI also requires a proven estimator for the variance. Therefore, three different variance estimators were evaluated against the empirical variance of replications: 1) an estimator corresponding to SRS; 2) a Grafström-Schelin estimator repurposed for LPM; and 3) a Matérn estimator applied in the Finnish NFI for the systematic sampling design. Results: LPMxy was nearly comparable with the systematic design for most target variables. The REs of the LPM designs utilizing auxiliary data compared to the systematic design varied between 0.74 and 1.18, according to the studied target variable. The SRS estimator for variance was, expectedly, the most biased and conservative estimator. Similarly, the Grafström-Schelin estimator gave overestimates in the case of LPMxy. When the RS variables were utilized as auxiliary data, the Grafström-Schelin estimates tended to underestimate the empirical variance. In systematic sampling the Matérn and Grafström-Schelin estimators performed, for practical purposes, equally well. Conclusions: LPM optimized for a specific variable tended to be more efficient than systematic sampling, but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables. The Grafström-Schelin estimator could be used as such with LPMxy, or instead of the Matérn estimator in systematic sampling. Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.
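A compact sketch of the LPM1-style selection step follows; the `coords` array can hold geographic coordinates (LPMxy) or standardized auxiliary RS variables, and the tolerance constants are implementation conveniences rather than part of the method.

```python
import numpy as np

def local_pivotal_sample(coords, pi, rng=None):
    """Sketch of LPM1-style selection: repeatedly pair a random
    undecided unit with its nearest undecided neighbour in the auxiliary
    space and resolve their inclusion probabilities pivotally, so
    similar units rarely enter the sample together."""
    rng = np.random.default_rng() if rng is None else rng
    pi = np.asarray(pi, dtype=float).copy()
    undecided = lambda: np.flatnonzero((pi > 1e-9) & (pi < 1 - 1e-9))
    while len(u := undecided()) > 1:
        i = rng.choice(u)
        others = u[u != i]
        j = others[np.argmin(((coords[others] - coords[i]) ** 2).sum(axis=1))]
        s = pi[i] + pi[j]
        if s < 1:  # one of the pair drops to 0
            if rng.random() < pi[j] / s:
                pi[i], pi[j] = 0.0, s
            else:
                pi[i], pi[j] = s, 0.0
        else:      # one of the pair is fixed at 1
            if rng.random() < (1 - pi[j]) / (2 - s):
                pi[i], pi[j] = 1.0, s - 1
            else:
                pi[i], pi[j] = s - 1, 1.0
    u = undecided()
    if u.size == 1:  # a single leftover unit is decided by a coin flip
        pi[u[0]] = float(rng.random() < pi[u[0]])
    return np.flatnonzero(pi > 1 - 1e-9)
```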
After reading the above-mentioned article [1], we identified a mistake concerning the results of the paragraph "3.6. Nonlinear Parameters AP" and the related Table 5 (both on p. 512). Unfortunately, the published Table 5 is a duplicate of Table 4, and it is therefore not possible for the reader to comprehend any underlying interrelations. To correct this mistake, we would like to offer the corrected table (Table 5) as follows.
This study investigates the choice of posturographic parameter sets with respect to the influence of different sampling durations (30 s, 60 s, 300 s). Center of pressure (COP) data are derived from 16 healthy subjects standing quietly on a force plate. They were advised to focus on the postural control process (i.e., internal focus of attention). 33 common linear and 10 nonlinear parameters are calculated and grouped into five classes. The component structure in each group is obtained via exploratory factor analysis. We demonstrate that COP evaluation—irrespective of sampling duration—necessitates a set of diverse parameters to explain more of the variance of the data. Furthermore, the parameter sets are uniformly invariant towards sampling durations and display a consistent factor loading pattern. These findings provide a structure for COP parametrization. Hence, specific recommendations are presented in order to avoid redundancy or a misleading basis for inter-study comparisons. The choice of 11 parameters from the groups is recommended as a framework for future research in posturography.
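The exploratory factor analysis step can be sketched as below; the function name, the standardization choice, and the default of five factors (matching the five parameter classes) are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def cop_factor_loadings(params, n_factors=5):
    """Illustrative sketch: 'params' is a (subjects x posturographic
    parameters) matrix; returns the (parameters x factors) loading
    pattern. Standardizing first makes loadings comparable across
    parameters with different units."""
    X = (params - params.mean(axis=0)) / params.std(axis=0)
    fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(X)
    return fa.components_.T  # inspect for a consistent loading pattern
```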
Launched at 1:31 am GMT+8 on May 29 by a Long March-3B carrier rocket from the Xichang Satellite Launch Center in Sichuan province, China, Tianwen-2, the second mission of China's Planetary Exploration Program, correctly entered the transfer trajectory toward the asteroid 2016 HO3 after flying for 18 minutes. Its solar wings unfolded properly, signaling a successful start and priming the spacecraft for the next stage of its mission.
Studies in tobacco fields were conducted in 1993. The results showed that the distribution pattern of the larva was aggregative, and the aggregation did not change with the population densities of the larva. Regarding the vertical distribution of the larva on tobacco plants, more larvae occurred on the lower leaves than on the upper ones. The difference in population density between tobacco fields at elevations of 490 meters and 900 meters was not significant. The required number of samples was given for different precision levels using a two-stage sampling technique. The average leaf area loss caused by the larva in tobacco fields was 12.654 cm^2.
Strong-field terahertz (THz) radiation holds significant potential in non-equilibrium state manipulation, electron acceleration, and biomedical effects. However, distortion-free detection of strong-field THz waveforms remains an essential challenge in THz science and technology. To address this issue, we propose a ferromagnetic detection scheme based on Zeeman torque sampling, achieving distortion-free strong-field THz waveform detection in Py films. Thickness-dependent characterization (3–21 nm) identifies peak detection performance at 21 nm within the investigated range. Furthermore, by structurally engineering the Py ferromagnetic layer, we demonstrate strong-field THz detection in a symmetric Ta(3 nm)/Py(9 nm)/Ta(3 nm) heterostructure while simultaneously resolving Zeeman torque responses and collective spin-wave dynamics in an asymmetric W(4 nm)/Py(9 nm)/Pt(2 nm) heterostructure. We calculated spin-wave excitations and spin-orbit torque distributions in the asymmetric heterostructure, along with spin-wave excitations in the symmetric mode. This approach overcomes the sensitivity limitations of conventional techniques under strong-field conditions.
In recent years, deep learning has been introduced into the field of single-pixel imaging (SPI), garnering significant attention. However, conventional networks still exhibit limitations in preserving image details. To address this issue, we integrate Large Kernel Convolution (LKconv) into the U-Net framework, proposing an enhanced network structure named the U-LKconv network, which significantly enhances the capability to recover image details even under low sampling conditions.
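Since the abstract does not spell out the U-LKconv design, the sketch below shows only the generic large-kernel idea it builds on; the kernel size, the depthwise/pointwise split, and the residual connection are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class LKConvBlock(nn.Module):
    """Illustrative large-kernel block: a depthwise 7x7 convolution
    gathers wide spatial context cheaply, a pointwise 1x1 convolution
    mixes channels, and a residual connection helps preserve detail."""
    def __init__(self, channels, kernel_size=7):
        super().__init__()
        self.dw = nn.Conv2d(channels, channels, kernel_size,
                            padding=kernel_size // 2, groups=channels)
        self.pw = nn.Conv2d(channels, channels, 1)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.act(self.pw(self.dw(x)))

# e.g. a drop-in at a U-Net encoder stage with 64 feature channels
y = LKConvBlock(64)(torch.randn(1, 64, 32, 32))
```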
Fine needle aspiration (FNA) is currently the standard of care for sampling pancreatic solid masses using endoscopic ultrasound (EUS). The accuracy of the technique is reported to be high, especially if coupled with rapid on-site evaluation (ROSE), and it has a high safety profile. However, FNA presents some limitations, such as the small amount of tissue that can be collected and the inability to obtain a core of tissue with intact histological architecture, which is relevant for performing immunohistochemical analysis, molecular profiling and, therefore, targeted therapies. Moreover, the presence of ROSE by an expert cytopathologist is very important to maximize the diagnostic yield of the FNA technique; however, it is not widely available, especially in small centers. Hence, the introduction of EUS fine needle biopsy (FNB) with a new generation of needles, which also show a high safety profile and a satisfying diagnostic accuracy even in the absence of ROSE, could be the key to overcoming the limitations of FNA. However, FNB has not yet shown diagnostic superiority over FNA. Considering all the technical aspects of FNA and FNB, the different types of needle currently available, comparisons in terms of diagnostic yield, and the different sampling techniques, a tailored approach should be used to determine the needle that is most appropriate for each specific scenario.
Objectives: To classify community pharmacies (CPs) in Riyadh, Saudi Arabia, in terms of the quality of medicines sold by them, using the lot quality assurance sampling (LQAS) technique with a predefined threshold. Methods: Riyadh CPs were divided into 2 categories ("lots" for the purpose of LQAS), i.e., chain and independent CPs. The upper and lower rate thresholds for CPs that sell low-quality medicines were predefined as 20% and 5%, respectively. Consumer and provider risks were predefined as 0.05 and 0.10, respectively. The calculated number of randomly selected CPs required in each lot was 36; then, the sale of low-quality medicines in >3 CPs implies a prevalence of >20% of such CPs according to LQAS. A randomly selected brand of amoxicillin (selected as a quality indicator of medicines because it is both widely counterfeited and heat-sensitive) was purchased from each pharmacy by a "mystery shopper", checked for authenticity, and analyzed for drug content and content uniformity using a validated HPLC method. Results: Substandard amoxicillin was purchased in 9 pharmacies (4 chain and 5 independent). Both lots were thus rejected as unacceptable, which may indicate that consumers in Riyadh are at risk of purchasing substandard medicines at CPs. Conclusions: The quality of medicines sold in CPs in Riyadh did not meet our acceptability criterion, and appropriate intervention by decision makers is recommended. LQAS proved to be a practical, economical, and statistically valid sampling method for surveying the quality of medicines. It should enable decision makers to allocate resources for improvement more efficiently.
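The decision rule reported here (sample 36 CPs per lot, reject the lot if more than 3 sell low-quality medicine) can be checked against the predefined risks with a plain binomial calculation; running the sketch below reproduces consumer and provider risks of roughly 0.05 and 0.10.

```python
from math import comb

def lqas_risks(n, d, p_upper, p_lower):
    """Operating characteristics of the LQAS plan 'sample n CPs, accept
    the lot if at most d sell low-quality medicine': consumer risk is
    P(accept | true rate = p_upper), provider risk is
    P(reject | true rate = p_lower)."""
    binom_cdf = lambda k, n, p: sum(comb(n, i) * p**i * (1 - p)**(n - i)
                                    for i in range(k + 1))
    return binom_cdf(d, n, p_upper), 1 - binom_cdf(d, n, p_lower)

# The paper's plan: n = 36, decision value d = 3, thresholds 20% and 5%
print(lqas_risks(36, 3, 0.20, 0.05))  # approx. (0.052, 0.104)
```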