Funding: the Ningbo University of Technology Science Foundation and the Ningbo Natural Science Foundation (No. 2013A610108).
Abstract: This paper introduces a new method, the E-Bayesian estimation method, to estimate reliability from zero-failure data. The definition of the E-Bayesian estimate of the reliability is given. Based on this definition, formulas for the E-Bayesian and hierarchical Bayesian estimates of the reliability are provided, and a property of the E-Bayesian estimate, namely its relation to the hierarchical Bayesian estimate, is discussed. Calculations on practical problems show that the proposed method is feasible and easy to apply.
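The estimation formulas themselves are not reproduced in this abstract. As a minimal sketch of the general E-Bayesian idea only, assuming (these are illustrative assumptions, not the paper's specification) a Beta(1, b) prior on the failure probability, zero failures in n trials, squared-error loss, and a uniform hyperprior for b on (1, c), one averages the ordinary Bayes estimate over the hyperprior:

```python
import numpy as np
from scipy.integrate import quad

def bayes_failure_prob(n, b):
    """Posterior mean of p under a Beta(1, b) prior after n trials with zero failures."""
    return 1.0 / (1.0 + b + n)          # (a + s) / (a + b + n) with a = 1, s = 0

def e_bayes_reliability(n, c=5.0):
    """E-Bayesian reliability: average the Bayes estimate of p over b ~ Uniform(1, c)."""
    p_eb, _ = quad(lambda b: bayes_failure_prob(n, b) / (c - 1.0), 1.0, c)
    return 1.0 - p_eb

# Example: 20 units tested with no failures (illustrative numbers only).
print(e_bayes_reliability(n=20))
```

In the hierarchical Bayesian approach, by contrast, the hyperparameter is integrated out of the joint posterior before estimating; the relation between these two estimates is the property the paper discusses.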
Funding: the Natural Science Foundation of Heze University of Shandong Province of China under Grant Nos. XY07WL01 and XY05WL01, and the University Experimental Technology Foundation of Shandong Province of China under Grant No. S04W138.
Abstract: Using the technique of integration within an ordered product (IWOP) of operators, we derive the Wigner function of the density operator for the negative binomial distribution of a radiation field in the mixed-state case; we then derive the Wigner function of the squeezed number state, which yields a negative binomial distribution, by virtue of the entangled state representation and the entangled Wigner operator.
Abstract: About 170 nations have been affected by the coronavirus disease 2019 (COVID-19) epidemic. COVID-19 places great stress on governing bodies across the globe, as the count of patients testing positive keeps rising and the situation is difficult to manage. In this context, most researchers concentrate on COVID-19 data analysis using the machine learning paradigm. In previous work, Long Short-Term Memory (LSTM) networks were used to predict future COVID-19 cases; according to the LSTM results, the outbreak was expected to finish by June 2020. However, LSTM is prone to over-fitting and may not produce the required true-positive results, and the existing systems show lower accuracy and a higher error rate on the COVID-19 dataset. The proposed method is introduced to overcome these issues. For COVID-19 prediction, a Linear Decreasing Inertia Weight-based Cat Swarm Optimization with Half Binomial Distribution based Convolutional Neural Network (LDIWCSO-HBDCNN) approach is presented. In this study, the COVID-19 prediction dataset is used as input and is normalized with min-max normalization. Optimal features are selected using the Linear Decreasing Inertia Weight-based Cat Swarm Optimization (LDIWCSO) algorithm, enhancing classification accuracy; the inertia weight improves the convergence of the Cat Swarm Optimization (CSO) algorithm, and the essential features are selected using the best fitness function values. For a specified time span across India, deaths and confirmed cases are predicted from the selected features using the Half Binomial Distribution based Convolutional Neural Network (HBDCNN) technique. Empirical observations demonstrate that the proposed system performs well in terms of f-measure, recall, precision, and accuracy.
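Two ingredients named in the abstract, min-max normalization and a linearly decreasing inertia weight schedule, can be illustrated with a minimal sketch; the function names, the bounds w_max = 0.9 and w_min = 0.4, and the sample array below are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

def min_max_normalize(x):
    """Scale each column of x into [0, 1]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))

def linear_decreasing_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Inertia weight that decreases linearly from w_max to w_min over t_max iterations."""
    return w_max - (w_max - w_min) * t / t_max

daily_cases = np.array([[120.0], [340.0], [560.0], [980.0]])   # toy input series
print(min_max_normalize(daily_cases).ravel())
print([round(linear_decreasing_inertia(t, 100), 3) for t in (0, 50, 100)])
```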
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61372156 and 61405053) and the Natural Science Foundation of Zhejiang Province of China (Grant No. LZ13F04001).
Abstract: Random telegraph signal noise in the pixel source follower MOSFET is the principal component of noise in a CMOS image sensor under low light. In this paper, a physical and statistical model of the random telegraph signal noise in the pixel source follower, based on the binomial distribution, is set up. The number of electrons captured or released by the oxide traps per unit time is described by random variables that obey a binomial distribution. As a result, the output states and the corresponding probabilities of the first and second samples of the correlated double sampling circuit are acquired, and the standard deviation of the output states after the correlated double sampling circuit can be obtained accordingly. In the simulation section, one hundred thousand samples of the source follower MOSFET are simulated, and the results show that the proposed model has statistical characteristics similar to existing models with respect to the effects of channel length and oxide-trap density. Moreover, the noise histogram of the proposed model is evaluated at different ambient temperatures.
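The model's parameters are not given in the abstract. The following toy Monte Carlo, with made-up values for the number of traps and the capture/release probabilities, only illustrates the basic mechanism described: drawing binomial counts of trapped electrons for the two correlated double sampling (CDS) samples and examining the spread of their difference.

```python
import numpy as np

rng = np.random.default_rng(0)

n_traps = 10          # assumed number of oxide traps per source follower
p_capture = 0.3       # assumed per-trap capture probability in one sampling window
p_release = 0.3       # assumed per-trap release probability in one sampling window
n_pixels = 100_000    # number of simulated source followers, as in the abstract

# Net trapped charge seen at the first and second CDS samples (binomial counts).
sample1 = rng.binomial(n_traps, p_capture, n_pixels) - rng.binomial(n_traps, p_release, n_pixels)
sample2 = rng.binomial(n_traps, p_capture, n_pixels) - rng.binomial(n_traps, p_release, n_pixels)

# The CDS output is the difference of the two samples; its standard deviation is the quantity of interest.
cds_output = sample2 - sample1
print("CDS output standard deviation:", cds_output.std())
```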
Funding: Supported by the National Natural Science Foundation of China under Grant No. 6107311.
Abstract: In the reputation modeling of wireless sensor networks (WSNs), much of the literature has proposed creative reputation indirect-update methods, such as reputation integration, discounting, aging to eliminate, and filtering of malicious reputation information. However, few works have discussed the reputation direct update. In this paper, based on sound statistical theory, a negative binomial distribution method for the reputation direct update in WSNs is proposed. Results show that the proposed method is more suitable and time-saving for the reputation update of resource-constrained WSNs and can also improve computational efficiency.
Abstract: In this article, the zero-inflated non-central negative binomial (ZINNB) distribution is introduced. Some of its basic properties are obtained. In addition, we use the maximum likelihood estimation method to estimate the parameters of the ZINNB distribution and illustrate its application by fitting real data sets.
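The ZINNB probability mass function is not reproduced in the abstract. As a hedged illustration of the zero-inflation construction only, the sketch below inflates a count pmf at zero, using an ordinary negative binomial as a stand-in for the non-central variant studied in the paper; the parameter values are arbitrary.

```python
from scipy.stats import nbinom

def zero_inflated_pmf(k, pi, base_pmf):
    """P(X = k) for a zero-inflated count model: extra mass pi at zero plus (1 - pi) times the base pmf."""
    return (pi if k == 0 else 0.0) + (1.0 - pi) * base_pmf(k)

base = lambda k: nbinom.pmf(k, 3, 0.4)   # stand-in base distribution (ordinary negative binomial)
print([round(zero_inflated_pmf(k, pi=0.2, base_pmf=base), 4) for k in range(5)])
```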
Funding: Supported by the Postdoctoral Science and Technology Project of Hubei Province (2018Z27) and the Fundamental Research Funds for the Central Universities (CCNU19ZN019).
Abstract: It has been proposed that the ratio of the sixth-order to the second-order cumulant (C_6/C_2) of conserved quantities is sensitive to the chiral crossover transition. Recently, a negative C_6/C_2 was obtained both in lattice QCD and in experiments at ■. In this study, we investigate the behavior of the net-proton C_6/C_2 under a statistical binomial distribution (BD) at ■ in Au+Au collisions. With the BD parameters extracted from RHIC/STAR, it is found that C_6/C_2 can be negative. Furthermore, the obtained C_6/C_2 becomes smaller when the same magnitude of experimental statistics and the same calculation method are applied to the simulations. In 0-10% centrality, there is a significant difference between the simulated result and the theoretical expectation. Based on the extracted parameters and the experimentally collected statistics, the baseline of the net-proton C_6/C_2 under the BD is presented.
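For reference, the cumulants of a binomial baseline follow from its cumulant generating function K(t) = n log(1 - p + p e^t). The short symbolic check below (not taken from the paper; the numeric n and p are placeholders) computes C_2, C_6 and their ratio.

```python
import sympy as sp

t, n, p = sp.symbols('t n p', positive=True)
K = n * sp.log(1 - p + p * sp.exp(t))          # cumulant generating function of Binomial(n, p)

def cumulant(order):
    """k-th cumulant = k-th derivative of K(t) at t = 0."""
    return sp.simplify(sp.diff(K, t, order).subs(t, 0))

C2, C6 = cumulant(2), cumulant(6)
ratio = sp.simplify(C6 / C2)
print(sp.factor(C2))                            # second cumulant: n*p*(1 - p)
print(sp.simplify(ratio.subs({n: 40, p: 0.8}))) # example numeric baseline value of C6/C2
```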
Abstract: The secure and normal operation of distributed networks is crucial for accurate parameter estimation. However, distributed networks are frequently susceptible to Byzantine attacks. Considering real-life scenarios, this paper investigates a probability Byzantine (PB) attack, using a Bernoulli distribution to model the attack probability. Historically, additional detection mechanisms have been used to mitigate such attacks, leading to increased energy consumption and burdens on distributed nodes and consequently diminishing operational efficiency. Differing from these approaches, an adaptive updating distributed estimation algorithm is proposed to mitigate the impact of PB attacks. In the proposed algorithm, a penalty strategy is first incorporated during data updates to weaken the influence of the attack; then, an adaptive fusion weight is employed during data fusion to merge the estimates. In addition, the reason why the penalty term weakens the attack is analyzed, and the performance of the proposed algorithm is validated through simulation experiments.
Funding: the National Advanced Research Project of China (No. 51319030302) and the National Advanced Research Foundation of China (No. 9140A 19030506KG0166).
Abstract: A Bayesian demonstration test plan for a complex mechatronic system is studied based on the mixed beta distribution. Various kinds of information from product design and improvement are appropriately incorporated by introducing an inheritance factor; moreover, the inheritance factor is treated as a random variable, the Bayesian decision for the qualification test plan is obtained, and the correctness of the presented Bayesian model is verified. The results show that the required quantity of tests is too conservative under classical methods with small binomial samples. Although traditional Bayesian analysis can use test information from related or similar products, it ignores the differences between such products; the proposed method solves this problem. Furthermore, considering the requirements of many practical projects, the differences among this method, the classical method, and Bayesian analysis with a single beta distribution are compared with respect to the reliability acceptance test plan.
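The paper's mixed-beta construction is not given in the abstract. The sketch below only shows the standard conjugate update for a two-component beta mixture prior on a binomial success probability, which is the kind of calculation such a plan rests on; the component parameters, weights, and data counts are placeholders, not values from the paper.

```python
import numpy as np
from scipy.special import betaln

def posterior_mixture(weights, alphas, betas, successes, failures):
    """Conjugate update of a Beta-mixture prior on a binomial success probability."""
    weights, alphas, betas = (np.asarray(v, dtype=float) for v in (weights, alphas, betas))
    a_post, b_post = alphas + successes, betas + failures
    # New component weights: prior weight times the marginal likelihood of each component.
    log_w = np.log(weights) + betaln(a_post, b_post) - betaln(alphas, betas)
    w_post = np.exp(log_w - log_w.max())
    return w_post / w_post.sum(), a_post, b_post

w, a, b = posterior_mixture([0.6, 0.4], [8.0, 2.0], [2.0, 2.0], successes=18, failures=2)
print(w, "posterior mean:", np.sum(w * a / (a + b)))
```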
Abstract: This paper presents four methods of constructing the confidence interval for the proportion p of the binomial distribution. Evidence in the literature indicates that the standard Wald confidence interval for the binomial proportion is inaccurate, especially for extreme values of p; even for moderately large sample sizes, its coverage probabilities prove to be erratic for extreme p. Three alternative confidence intervals, namely the Wilson confidence interval, the Clopper-Pearson interval, and the likelihood interval, are compared with the Wald confidence interval on the basis of coverage probability and expected length by means of simulation.
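As a hedged sketch of the kind of comparison described (not the paper's exact simulation design: the sample size, the value of p, and the confidence level below are arbitrary), the code estimates the coverage of the Wald, Wilson, and Clopper-Pearson intervals by Monte Carlo.

```python
import numpy as np
from scipy.stats import norm, beta, binom

def wald(x, n, z):
    p = x / n
    h = z * np.sqrt(p * (1 - p) / n)
    return p - h, p + h

def wilson(x, n, z):
    p = x / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    h = z * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
    return centre - h, centre + h

def clopper_pearson(x, n, alpha):
    lo = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lo, hi

def coverage(method, p_true, n=50, alpha=0.05, reps=20_000, seed=1):
    """Fraction of simulated samples whose interval contains the true proportion."""
    z = norm.ppf(1 - alpha / 2)
    xs = binom.rvs(n, p_true, size=reps, random_state=seed)
    hits = 0
    for x in xs:
        lo, hi = method(x, n, alpha) if method is clopper_pearson else method(x, n, z)
        hits += lo <= p_true <= hi
    return hits / reps

for m in (wald, wilson, clopper_pearson):
    print(m.__name__, round(coverage(m, p_true=0.05), 3))
```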
Funding: Project supported by the National Natural Science Foundation of China (Nos. 10471008, 19831020).
Abstract: The marginal recursive equations on an excess-of-loss reinsurance treaty are investigated, under the assumptions that the number of claims belongs to the family consisting of the Poisson, binomial and negative binomial distributions, and that the severity distribution has a bounded continuous density function. Conditional on the numbers of claims associated with the reinsurer and the cedent, recursive equations are obtained for the marginal distributions of the total payments of the reinsurer and the cedent.
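The paper's marginal recursions are not reproduced in the abstract. For orientation only, here is the classical Panjer recursion for an aggregate-claims distribution with a Poisson claim count and an integer-valued severity pmf, which is a simplified, discrete stand-in for the bounded-density setting of the paper; lam and the severity pmf are placeholders.

```python
import numpy as np

def panjer_poisson(lam, severity_pmf, k_max):
    """Aggregate-claims pmf g[0..k_max] via the Panjer recursion for a Poisson(lam) claim count.

    severity_pmf[j] = P(single claim = j) for j = 0 .. len(severity_pmf) - 1.
    """
    f = np.zeros(k_max + 1)
    f[:len(severity_pmf)] = severity_pmf
    g = np.zeros(k_max + 1)
    g[0] = np.exp(lam * (f[0] - 1.0))                 # P(aggregate claims = 0)
    for k in range(1, k_max + 1):
        j = np.arange(1, k + 1)
        g[k] = (lam / k) * np.sum(j * f[j] * g[k - j])
    return g

g = panjer_poisson(lam=2.0, severity_pmf=[0.0, 0.5, 0.3, 0.2], k_max=10)
print(g.round(4), "mass up to k_max:", g.sum())       # sum < 1 because the support is truncated at k_max
```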
Abstract: This paper begins with the overthrow of the concept of combining ability in crossbreeding by the concept of heritability. The reason is that general combining ability changes with the number and kind of pure strains in the foundation stock, and hence special combining ability changes as well, so that work with different kinds of pure strains in the foundation stock cannot be compared. Hence combining ability is useless as a parameter for predicting the amount of heterosis expected in the next generation. On the other hand, since each cross has a separate heritability, heritability can be applied to a cross population just as successfully as in purebreeding. Since the same concept holds in both cases, resort to any other concept would be superfluous; that is why combining ability must be rejected. Another reason (not given in the full text) is that an infinite number of pure strains would be required in the foundation stock for its results to be comparable with those of the heritability theory, which disposes of its utility altogether. The main content of the thesis is that the centennial enigma of heterosis can be resolved by Descartes' deductive method. Accordingly, we start from the definition of heterosis, H = F1 - MP, where H is heterosis, F1 is the first-generation offspring, and MP is the mean of the parents, or midparent, and from a binomial random variable and its extension to the multinomial case derive the basic relations of heterosis with its components. Starting with second-degree statistics, we obtain V_H = V_F1 - 2 cov(F1, MP) + V_MP, where V and cov stand for variance and covariance. The equations of heterosis are V_F1 = (1/2)N a^2 + (1/4)N d^2 + V_I(F1) (additive, dominance, F1 epistasis), V_MP = (1/2)N a^2 + (1/2)V_I (additive, parental epistasis), and V_H = (1/4)N d^2 + V_I(F1) + (1/2)V_I (dominance, F1 epistasis, parental epistasis), where N is the number of genes controlling a trait, a = (P1 - P2)/2, d is the deviation from the midparent, and the variance components are indicated by the names listed after the respective terms. It turns out that all of these can easily be computed from the data, so the problem becomes a simple one that any college student may solve; in other words, the right answers are found when the right questions are asked. Who had ever shown that the heritability principle is inapplicable in crossbreeding, e.g., in a cross of two pure strains? From this cue arose the realization that the F1 of a cross of two pure strains must also be a Mendelian population, with p and q both equal to 1/2, which simplifies the algebra outright. This Heritability Theory of Heterosis (HTH) rests on two initial arguments: 1) since 0.5 + 0.5 = 1, crossing two pure strains gives a population which is only a special case of purebreeding, and therefore a heritability coefficient must exist for the F1; 2) our problem reduces to finding that coefficient, and the answer is given by the additive component divided by the phenotypic variance, i.e., (1/2)N a^2 / V_P, which is readily found from the solution of the heterosis equations. Thus the centennial enigma of heterosis is resolved. This happened at the end of the 20th century. We now come to the second point of the discovery, the new genetic parameter cross-heritability, which will rise in size with the number of times it is used and which forms the link between breeding and evolution. The advent of the Age of Evolution Engineering in the 21st century marks a totally new era, showing that artificial selection will ultimately supersede natural selection, with the long span of time eliminated. For agriculture at least, it means there is no limit to the increase of food supply by the new method, with the concentration of desirable genes by hybridization in place of the old theory of their fixation. Genetic gain is achieved through artificial selection, with an 80% saving of time, labor and cost by adoption of the new method. Applied to a further increase in all kinds of agricultural products, including hybrid rice, it means that a huge escalation, in fact a New Green Revolution on a much larger scale than any before, is in view, provided it is adopted in our research and educational institutions as early as possible, before its spread elsewhere. The possibilities from the evolutionary point of view can only be pictured by science fiction.
Abstract: The photon polarization law p(θ) = sin²θ is derived from a simple informational consideration by two methods: the first via an intuitive principle of minimum Fisher information, the second via a symmetry and invariance argument. The results demonstrate that in photon polarization, Nature has a tendency to hide herself as deeply as possible while obeying some regular conditions.
Abstract: Background: In this paper, a regression model for predicting the spatial distribution of forest cockchafer larvae in the Hessian Ried region (Germany) is presented. The forest cockchafer, a native biotic pest, is a major cause of damage in forests in this region, particularly during the regeneration phase. The model developed in this study is based on a systematic sample inventory of forest cockchafer larvae by excavation across the Hessian Ried. These forest cockchafer larvae data were characterized by excess zeros and overdispersion. Methods: Using specific generalized additive regression models, different discrete distributions, including the Poisson, negative binomial and zero-inflated Poisson distributions, were compared. The methodology employed allowed the simultaneous estimation of non-linear model effects of causal covariates and, to account for spatial autocorrelation, of a 2-dimensional spatial trend function. In the validation of the models, both the Akaike information criterion (AIC) and more detailed graphical procedures based on randomized quantile residuals were used. Results: The negative binomial distribution was superior to the Poisson and the zero-inflated Poisson distributions, providing a near perfect fit to the data, which was proven in an extensive validation process. The causal predictors found to affect the density of larvae significantly were distance to the water table and the percentage of pure clay layer in the soil to a depth of 1 m. Model predictions showed that larva density increased with an increase in distance to the water table up to almost 4 m, after which it remained constant, and with a reduction in the percentage of pure clay layer. However, this latter correlation was weak and requires further investigation. The 2-dimensional trend function indicated a strong spatial effect, and thus explained by far the highest proportion of variation in larva density. Conclusions: As such, the model can be used to support forest practitioners in their decision making for regeneration and forest protection planning in the Hessian Ried. However, the application of the model for predicting future spatial patterns of larva density is still somewhat limited because the causal effects are comparatively weak.
Abstract: With the support of the Fundamental Reliability Theoretical Research (FRTR) Foundation of the Quality Control Bureau of the Ministry of Astronautics (MOA), PRC, nine Chinese institutes and universities have worked for years on reliability statistics problems that remain to be solved in space research and development. This paper gives a brief review of our main research results, including (1) results on normal distributions; (2) results on Weibull distributions; (3) results on the synthesis of system reliability, theoretical method; (4) results on the synthesis of system reliability, approximation method: binomial distribution, exponential distribution, Weibull distribution, parallel systems, general cases; (5) structural reliability; (6) zero-failure reliability estimation; (7) storage life and others. All these results can be acquired from the Quality Control Bureau of the Ministry of Aero-Space Industry (MAS).
Abstract: In this work, we show that when there is insufficient equipment for detecting a disease whose prevalence is t% in a sub-population of size N, it is optimal to divide the N samples into n groups of size r each; the optimal value of r (given in the original by a closed-form expression involving the floor function ⌊x⌋ of x ∈ R) then allows systematic screening of all N individuals by performing fewer than N tests. Based on this result and on certain functions of the R software, we subsequently propose a probabilistic strategy capable of optimizing the screening tests under certain conditions.
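The closed-form group size is not reproduced above. As a hedged illustration of the underlying trade-off only, the sketch below numerically searches for the group size that minimizes the expected number of tests under classical Dorfman two-stage pooling (one pool test per group, plus a retest of every member of a positive pool); the prevalence value is a placeholder, and this rule need not coincide with the paper's exact expression.

```python
def expected_tests_per_person(r, p):
    """Dorfman two-stage pooling: 1/r pool tests per person, plus one retest each if the pool is positive."""
    prob_pool_positive = 1.0 - (1.0 - p) ** r
    return 1.0 / r + prob_pool_positive

def best_group_size(p, r_max=100):
    """Group size minimizing the expected number of tests per person."""
    return min(range(2, r_max + 1), key=lambda r: expected_tests_per_person(r, p))

p = 0.02                      # assumed prevalence (2%)
r_star = best_group_size(p)
print(r_star, round(expected_tests_per_person(r_star, p), 3))   # < 1 test per person, i.e. fewer than N tests overall
```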
Abstract: Various models have been proposed in the literature to study non-negative integer-valued time series. In this paper, we study estimators for the generalized Poisson autoregressive process of order 1, a model developed by Alzaid and Al-Osh [1]. We compare three estimation methods, the method of moments, quasi-likelihood, and conditional maximum likelihood, and study their asymptotic properties. To compare the bias of the estimators in small samples, we perform a simulation study for various parameter values. Using the theory of estimating equations, we obtain expressions for the variance-covariance matrices of these three estimators and compare their asymptotic efficiency. Finally, we apply the methods derived in the paper to a real time series.
Abstract: The theory of evolution was advanced by Darwin in 1859, prior to Mendel's experiments demonstrating the particulate nature of inheritance. The modern synthesis was formulated in the early 1940s, well before the concept of coded information was understood. This paper outlines four mathematical challenges to the modern synthesis, based on the current understanding of the proposed mechanisms of evolutionary change within the constraints of experimental molecular biology.
Abstract: In this paper an effort has been made to study the general characteristics of slow particles produced in 32S-Em interactions at 200 A GeV in order to extract information about the mechanism of particle production. The results are compared with experimental results obtained by other workers. The multiplicity distributions of the slow target-associated particles (black, grey and heavy tracks) produced by the 32S beam with different targets are studied, and several types of correlations among them are investigated. The variation of the produced particles with projectile mass number and target size is also studied. In addition, the multiplicity distributions of slow particles with negative binomial distribution (NBD) fits are presented, and the scaled multiplicity distributions of the produced slow particles are studied in order to check the validity of KNO scaling.
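For reference, the negative binomial multiplicity distribution commonly fitted in such analyses (the specific parameterization used in the paper is not given in the abstract) can be written in terms of the mean multiplicity ⟨n⟩ and the shape parameter k as:

```latex
P(n) = \frac{\Gamma(n+k)}{\Gamma(n+1)\,\Gamma(k)}
       \left(\frac{\langle n\rangle}{\langle n\rangle + k}\right)^{\!n}
       \left(\frac{k}{\langle n\rangle + k}\right)^{\!k},
\qquad n = 0, 1, 2, \dots
```

KNO scaling is then checked by plotting ⟨n⟩P(n) against z = n/⟨n⟩ and asking whether the points collapse onto a single universal curve.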
Funding: Supported by the National Natural Science Foundation of China (Nos. 11871027, 11731015), the Science and Technology Developing Plan of Jilin Province (No. 20170101057JC), and the Cultivation Plan for Excellent Young Scholar Candidates of Jilin University.
Abstract: In this paper, we study a robust estimation method for observation-driven integer-valued time-series models in which the conditional probability mass of the current observation is assumed to follow a negative binomial distribution. The maximum likelihood estimator is highly affected by outliers, so we resort to the minimum density power divergence estimator as a robust estimator and show that it is strongly consistent and asymptotically normal under some regularity conditions. Simulation results are provided to illustrate the performance of the estimator. An application is performed on data for campylobacteriosis infections.
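The abstract does not state the estimator's objective function. As a reminder of the general minimum density power divergence construction (in the sense of Basu et al.), on which such estimators are based, with tuning parameter α > 0 and conditional pmf f_θ(· | F_{t-1}), the estimator minimizes an empirical objective of the form:

```latex
\hat{\theta}_{\alpha} \;=\; \arg\min_{\theta}\;
\frac{1}{n}\sum_{t=1}^{n}\Bigg[
\sum_{y=0}^{\infty} f_{\theta}\big(y \mid \mathcal{F}_{t-1}\big)^{1+\alpha}
\;-\;\Big(1+\tfrac{1}{\alpha}\Big)\, f_{\theta}\big(X_t \mid \mathcal{F}_{t-1}\big)^{\alpha}
\Bigg], \qquad \alpha > 0 .
```

The limiting case α → 0 corresponds to the conditional maximum likelihood estimator, so the tuning parameter trades efficiency for robustness to outliers.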