The process of entrainment-mixing between cumulus clouds and the ambient air is important for the development of cumulus clouds. Accurately obtaining the entrainment rate (λ) is particularly important for its parameterization within the overall cumulus parameterization scheme. In this study, an improved bulk-plume method is proposed that solves the equations of two conserved variables simultaneously to calculate λ for cumulus clouds in a large-eddy simulation. The results demonstrate that the improved bulk-plume method is more reliable than the traditional bulk-plume method, because λ calculated with the improved method falls within the range of λ values obtained from the traditional method using different conserved variables. The probability density functions of λ for all data, for different times, and for different heights are well fitted by a log-normal distribution, which supports the stochastic entrainment process assumed in previous studies. Further analysis demonstrates that λ is more closely related to vertical velocity than to other thermodynamic/dynamical properties; thus, vertical velocity is recommended as the primary influencing factor for the parameterization of λ in the future. The results of this study enhance the theoretical understanding of λ and its influencing factors and shed new light on the development of λ parameterization.
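For illustration only (not the authors' code), the following minimal sketch fits a log-normal distribution to a sample of entrainment-rate values and checks the fit quality; the sample array `lam` and all parameter values are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical sample of entrainment rates (km^-1) diagnosed from a simulation.
rng = np.random.default_rng(0)
lam = rng.lognormal(mean=-0.5, sigma=0.6, size=5000)

# Fit a log-normal PDF; fixing loc=0 keeps the usual two-parameter form.
shape, loc, scale = stats.lognorm.fit(lam, floc=0)
mu, sigma = np.log(scale), shape  # mean and std of log(lambda)

# Goodness of fit can be checked, e.g., with a Kolmogorov-Smirnov test.
ks_stat, p_value = stats.kstest(lam, "lognorm", args=(shape, loc, scale))
print(f"mu={mu:.3f}, sigma={sigma:.3f}, KS p-value={p_value:.3f}")
```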
In distributed fusion, when one or more sensors are disturbed by faults, a common problem is that their local estimates are inconsistent with those of the other, fault-free sensors. Most existing fault-tolerant distributed fusion algorithms, such as the Covariance Union (CU) and the Fault-tolerant Generalized Convex Combination (FGCC), are only applicable to the point-estimation case, where local estimates and their associated error covariances are provided. A treatment focusing on the fault-tolerant distributed fusion of arbitrary local Probability Density Functions (PDFs) is lacking. For this problem, we first propose Kullback-Leibler Divergence (KLD)- and reversed-KLD-induced functional Fuzzy c-Means (FCM) clustering algorithms to softly cluster all local PDFs. On this basis, two fault-tolerant distributed fusion algorithms for arbitrary local PDFs are developed. Each selects the representative PDF of the cluster with the largest sum of memberships as the fused PDF. Numerical examples verify the improved fault tolerance of the two developed distributed fusion algorithms.
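As a rough illustration of the general idea only (not the paper's algorithm), the sketch below computes the Kullback-Leibler divergence between univariate Gaussian local PDFs and converts the divergences into FCM-style memberships; the data, cluster representatives, and fuzzifier value are all assumptions.

```python
import numpy as np

def kld_gauss(m0, s0, m1, s1):
    """KL divergence D(N(m0,s0^2) || N(m1,s1^2)) between two univariate Gaussians."""
    return np.log(s1 / s0) + (s0**2 + (m0 - m1) ** 2) / (2 * s1**2) - 0.5

# Hypothetical local PDFs (mean, std) from three sensors, one of them faulty.
local_pdfs = [(0.1, 1.0), (0.2, 1.1), (5.0, 1.0)]
centers = [(0.15, 1.05), (5.0, 1.0)]  # assumed cluster-representative PDFs
m_fuzz = 2.0                          # FCM fuzzifier (assumed)

# FCM-style memberships induced by the KLD as the dissimilarity measure.
u = np.zeros((len(local_pdfs), len(centers)))
for i, (m, s) in enumerate(local_pdfs):
    d = np.array([kld_gauss(m, s, mc, sc) + 1e-12 for mc, sc in centers])
    for k in range(len(centers)):
        u[i, k] = 1.0 / np.sum((d[k] / d) ** (2.0 / (m_fuzz - 1.0)))

# The cluster with the largest column sum of memberships would supply the fused PDF.
best = np.argmax(u.sum(axis=0))
print(u.round(3), "fused cluster:", best)
```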
The probability distributions of small-sample data are difficult to determine, and a large proportion of samples occur in the early failure period, so it is particularly important to make full use of these data in statistical analysis. Based on the gamma distribution, four methods of probability density function (PDF) reconstruction using early failure data are proposed, and expressions for evaluating the mean time between failures (MTBF) are then derived from the reconstructed PDFs. Both theoretical analysis and an example show that method 2 is the best evaluation method for dealing with early-failure, small-sample data. The PDF reconstruction methods also provide guidance for other distribution types.
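As an illustrative sketch only (the paper's four reconstruction methods are not reproduced here), the snippet fits a gamma distribution to hypothetical time-between-failure data and reports the implied MTBF as the fitted mean.

```python
import numpy as np
from scipy import stats

# Hypothetical times between failures (hours), dominated by early failures.
tbf = np.array([12.0, 18.5, 7.2, 30.1, 9.8, 15.4, 22.0, 11.3])

# Fit a two-parameter gamma distribution (location fixed at zero).
shape, loc, scale = stats.gamma.fit(tbf, floc=0)

# For a gamma distribution the mean is shape * scale, which serves as an MTBF estimate.
mtbf = shape * scale
print(f"shape={shape:.2f}, scale={scale:.2f}, MTBF ~ {mtbf:.1f} h")
```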
The probability density functions (pdf's) and the first-order structure functions (SF's) of the pairwise Euclidean distances between scaled multichannel human EEG signals at different time lags are estimated under hypoxia and in the resting state at different ages. It is found that the hyper gamma distribution is a good fit for the empirically derived pdf in all cases. This means that only two parameters (the sample mean of the EEG Euclidean distances at a given time lag and the corresponding coefficient of variation) are needed for an approximate classification of the empirical pdf's. Both parameters tend to increase during the first twenty years of life and tend to decrease as healthy adults grow older. Our findings indicate that this age-related dependence of the parameters resembles the age-related dependence of total brain white-matter volume. It is shown that 15 min of hypoxia (8% oxygen in nitrogen) causes a significant (about 50%) decrease in the mean relative EEG displacement value typical of the resting state. In this sense, the impact of the oxygen deficit resembles short-term aging of the subject.
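The "hyper gamma" family mentioned here is often written as a generalized gamma distribution; as a hedged illustration only, the sketch below fits scipy's generalized gamma to a hypothetical sample of pairwise distances and reports the sample mean and coefficient of variation, the two summary parameters the abstract mentions. The data and parameter values are made up.

```python
import numpy as np
from scipy import stats

# Hypothetical pairwise Euclidean distances between scaled EEG segments.
rng = np.random.default_rng(1)
dist = rng.gamma(shape=4.0, scale=0.5, size=2000) ** 0.9  # made-up sample

mean = dist.mean()
cv = dist.std(ddof=1) / mean  # coefficient of variation

# Generalized gamma ("hyper gamma"-like) fit, with the location fixed at zero.
a, c, loc, scale = stats.gengamma.fit(dist, floc=0)
print(f"mean={mean:.3f}, CV={cv:.3f}, gengamma a={a:.2f}, c={c:.2f}, scale={scale:.2f}")
```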
Traditionally, it is widely accepted that measurement error obeys the normal distribution. However, this paper proposes a new idea: the error in digitized data, a major derived data source in GIS, does not obey the normal distribution but rather a p-norm distribution with a determinate parameter. Assuming that the error is random and has the same statistical properties, the probability density functions of the normal distribution, the Laplace distribution, and the p-norm distribution are derived based on the arithmetic-mean axiom, the median axiom, and the p-median axiom, respectively, which shows that the normal distribution is only one of these distributions and not the only possible one. Based on this, distribution fitness tests for digitized data, such as the skewness and kurtosis coefficient tests, the Pearson chi-square (χ²) test, and the Kolmogorov test, are conducted. The results show that the error in map digitization obeys the p-norm distribution with a parameter close to 1.60. Least p-norm estimation and least-squares estimation of digitized data are further analyzed, showing that the least p-norm adjustment is better than the least-squares adjustment for digitized data processing in GIS.
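As an illustrative sketch (not the paper's adjustment procedure), the snippet estimates a location parameter by minimizing the sum of absolute residuals raised to the power p ≈ 1.6 and compares it with the least-squares (p = 2) estimate; the data and all names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical digitization errors around a true coordinate value of 10.0.
rng = np.random.default_rng(2)
x = 10.0 + rng.laplace(scale=0.05, size=200)

def p_norm_cost(mu, data, p):
    """Sum of |residual|^p, the objective of least p-norm estimation."""
    return np.sum(np.abs(data - mu) ** p)

p = 1.6  # parameter reported for map-digitization error
res = minimize_scalar(p_norm_cost, args=(x, p), bounds=(x.min(), x.max()), method="bounded")

print(f"least p-norm estimate: {res.x:.4f}")
print(f"least-squares estimate: {x.mean():.4f}")  # p = 2 reduces to the arithmetic mean
```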
Based on the maximum entropy principle, a probability density function (PDF) is derived for the distribution of wave heights in a random wave field without any further hypothesis. The present PDF, which has a non-Rayleigh form, involves two parameters: the average wave height H̄ and the state parameter γ. The role of γ in the distribution of wave heights is examined, and it is found that γ may serve as a measure of sea state. A least-squares method for determining γ from measured data is proposed. Using this method, the values of γ are determined for three sea states from data measured in the East China Sea. The present PDF is compared with the well-known Rayleigh PDF of wave height and is shown to fit the data much better than the Rayleigh PDF. It is expected that the present PDF would also fit some other wave variables, since its derivation is not restricted to wave height.
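The paper's maximum-entropy PDF form is not reproduced here, so the sketch below only illustrates the general least-squares idea: a two-parameter candidate PDF (a Weibull stand-in, which is an assumption, not the paper's form) is fitted to an empirical wave-height histogram and compared against the Rayleigh PDF. All data are synthetic.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

# Hypothetical wave-height record (m); a Weibull PDF stands in for the candidate form.
rng = np.random.default_rng(3)
h = stats.weibull_min.rvs(c=1.8, scale=2.0, size=3000, random_state=rng)

# Empirical PDF from a histogram.
density, edges = np.histogram(h, bins=40, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Least-squares fit of the candidate PDF's parameters to the empirical PDF.
def candidate_pdf(x, c, scale):
    return stats.weibull_min.pdf(x, c, scale=scale)

(c_fit, scale_fit), _ = curve_fit(candidate_pdf, centers, density, p0=(2.0, h.mean()))

# Rayleigh PDF with the scale implied by the sample mean, for comparison.
rayleigh_scale = h.mean() / np.sqrt(np.pi / 2)
sse_candidate = np.sum((candidate_pdf(centers, c_fit, scale_fit) - density) ** 2)
sse_rayleigh = np.sum((stats.rayleigh.pdf(centers, scale=rayleigh_scale) - density) ** 2)
print(f"SSE candidate={sse_candidate:.4f}, SSE Rayleigh={sse_rayleigh:.4f}")
```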
New distributions of the statistics of wave groups, based on the maximum entropy principle, are presented. The maximum entropy distributions appear to be superior to conventional distributions when applied to a limited amount of information, and their application to wave-group properties shows the effectiveness of the maximum entropy distribution. An FFT filtering method is employed to obtain the wave envelope quickly and efficiently. Comparisons of both the maximum entropy distribution and the distribution of Longuet-Higgins (1984) with laboratory wind-wave data show that the former gives a better fit.
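As a hedged illustration of envelope extraction (not necessarily the paper's exact filtering scheme), the sketch below obtains the wave envelope from a synthetic surface-elevation record via the analytic signal, which scipy computes with the FFT.

```python
import numpy as np
from scipy.signal import hilbert

# Hypothetical wind-wave surface-elevation record sampled at 10 Hz.
fs = 10.0
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(4)
eta = (np.sin(2 * np.pi * 0.2 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.01 * t))
       + 0.1 * rng.standard_normal(t.size))

# Envelope via the analytic signal (Hilbert transform computed with the FFT).
envelope = np.abs(hilbert(eta - eta.mean()))
print(f"mean envelope ~ {envelope.mean():.3f}")
```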
Adaptive digital filtering has traditionally been developed based on the minimum mean square error (MMSE) criterion and has found ever-increasing applications in communications. This paper presents an alternative adaptive filtering design based on the minimum symbol error rate (MSER) criterion for communication applications. It is shown that MSER filtering is more effective, as it exploits the non-Gaussian distribution of the filter output; consequently, it provides a significant performance gain, in terms of a smaller symbol error rate, over the MMSE approach. Adopting Parzen-window (kernel) density estimation of the probability density function, a block-data gradient adaptive MSER algorithm is derived. A stochastic gradient adaptive MSER algorithm, referred to as the least symbol error rate algorithm, is further developed for sample-by-sample adaptive implementation of MSER filtering. Two applications, single-user channel equalization and a beamforming-assisted receiver, are included to demonstrate the effectiveness and generality of the proposed adaptive MSER filtering approach.
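As a hedged illustration of the underlying idea (not the paper's algorithm), the sketch below uses a Gaussian Parzen window to smooth the empirical distribution of soft filter outputs for BPSK symbols and estimates the symbol error probability from it; all names, the channel gain, and the bandwidth rule are assumptions.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical soft outputs of a linear filter for transmitted BPSK symbols (+1/-1).
rng = np.random.default_rng(5)
symbols = rng.choice([-1.0, 1.0], size=1000)
y = 0.8 * symbols + 0.3 * rng.standard_normal(symbols.size)  # filter output

# Signed outputs: an error occurs when sign(y) differs from the transmitted symbol.
z = y * symbols
h = 1.06 * z.std(ddof=1) * z.size ** (-0.2)  # Silverman-style Parzen bandwidth

# Parzen-window (kernel) estimate of P(z < 0), i.e., the symbol error probability.
ser_kernel = norm.cdf((0.0 - z) / h).mean()
ser_count = np.mean(z < 0)
print(f"kernel SER estimate={ser_kernel:.4f}, empirical SER={ser_count:.4f}")
```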
The paper considers the theoretical basis and the specific mathematical techniques developed for stochastic data analysis within the Rice statistical model, in which the output signal's amplitude is composed of the sought-for initial value plus random Gaussian noise. The Rician signal's characteristics, such as the average value and the noise dispersion, are shown to depend nonlinearly on the Rice distribution's parameters, which motivates a new approach to stochastic Rician data analysis based on jointly and accurately evaluating the signal and the noise. Jointly computing the Rice distribution's parameters allows efficient reconstruction of the signal's informative component against the noise background. A meaningful advantage of the proposed approach is the absence of restrictions connected with the a priori assumptions inherent in traditional techniques. Results of numerical experiments are provided that confirm the efficiency of the elaborated approach to stochastic data analysis within the Rice statistical model.
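As a rough illustration only (not the authors' technique), the sketch below jointly recovers a signal amplitude and the Gaussian noise level from Rician-distributed samples with scipy's maximum-likelihood fit; the "true" values are assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical Rician magnitude data: true signal amplitude nu=2.0, noise sigma=0.7.
nu_true, sigma_true = 2.0, 0.7
rng = np.random.default_rng(6)
r = stats.rice.rvs(b=nu_true / sigma_true, scale=sigma_true, size=5000, random_state=rng)

# Joint maximum-likelihood fit of the Rice parameters (location fixed at zero).
b, loc, scale = stats.rice.fit(r, floc=0)
nu_hat, sigma_hat = b * scale, scale

print(f"estimated signal amplitude ~ {nu_hat:.3f} (true {nu_true})")
print(f"estimated noise level     ~ {sigma_hat:.3f} (true {sigma_true})")
```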
Load balance is a critical issue for distributed hash tables (DHTs), and previous work shows that there exists an O(log n) load imbalance in Chord. This paper analyzes the load distribution of Chord, Pastry, and the virtual servers (VS) balancing scheme, and deduces closed-form expressions for the probability density function (PDF) and cumulative distribution function (CDF) of the load in these DHTs. The analysis and simulation show that the load in all these DHTs obeys a gamma distribution with similarly formed parameters.
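For illustration only (the paper's closed-form parameters are not reproduced), the sketch below evaluates a gamma PDF and CDF for a hypothetical per-node load distribution and checks the fraction of nodes expected to exceed twice the mean load; the shape and scale values are assumptions.

```python
from scipy import stats

# Hypothetical gamma parameters for the per-node load (mean load = shape * scale = 100 keys).
shape, scale = 4.0, 25.0
load = stats.gamma(a=shape, scale=scale)

mean_load = load.mean()
pdf_at_mean = load.pdf(mean_load)
p_overloaded = load.sf(2 * mean_load)  # probability a node carries more than twice the mean

print(f"mean load={mean_load:.0f}, PDF at mean={pdf_at_mean:.4f}, P(load > 2*mean)={p_overloaded:.4f}")
```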