In distributed fusion, when one or more sensors are disturbed by faults, a common problem is that their local estimates are inconsistent with those of the other, fault-free sensors. Most existing fault-tolerant distributed fusion algorithms, such as the Covariance Union (CU) and the Fault-tolerant Generalized Convex Combination (FGCC), apply only to the point-estimation case, where local estimates and their associated error covariances are provided. A treatment focused on the fault-tolerant distributed fusion of arbitrary local Probability Density Functions (PDFs) is lacking. For this problem, we first propose Kullback–Leibler Divergence (KLD) and reversed-KLD induced functional Fuzzy c-Means (FCM) clustering algorithms to softly cluster all local PDFs. On this basis, two fault-tolerant distributed fusion algorithms for arbitrary local PDFs are developed. They select the representative PDF of the cluster with the largest sum of memberships as the fused PDF. Numerical examples verify the improved fault tolerance of the two developed distributed fusion algorithms.
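The clustering step can be sketched for the special case of one-dimensional Gaussian local PDFs, where the KLD has a closed form. Everything below is an illustrative assumption rather than the paper's algorithm: the center-update rule (moment-matched weighted averaging), the deterministic initialization, and the toy sensor values are all made up for the sketch.

```python
import math

def kld_gauss(m0, s0, m1, s1):
    """Closed-form KL(N(m0, s0^2) || N(m1, s1^2))."""
    return math.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

def fcm_pdfs(pdfs, m=2.0, iters=30):
    """Two-cluster fuzzy c-means over Gaussian local PDFs with the KLD as
    dissimilarity. Centers are updated by moment-matched weighted averaging,
    a simplifying assumption rather than the paper's exact update rule."""
    means = [p[0] for p in pdfs]
    centers = [pdfs[means.index(min(means))], pdfs[means.index(max(means))]]
    u = []
    for _ in range(iters):
        d = [[max(kld_gauss(*p, *c), 1e-12) for c in centers] for p in pdfs]
        u = []
        for dk in d:                       # fuzzy memberships per local PDF
            inv = [x ** (-1.0 / (m - 1.0)) for x in dk]
            s = sum(inv)
            u.append([v / s for v in inv])
        for i in range(2):                 # weighted moment-matched centers
            w = [uk[i] ** m for uk in u]
            sw = sum(w)
            mu = sum(wk * p[0] for wk, p in zip(w, pdfs)) / sw
            ex2 = sum(wk * (p[1] ** 2 + p[0] ** 2) for wk, p in zip(w, pdfs)) / sw
            centers[i] = (mu, math.sqrt(ex2 - mu * mu))
    return u, centers

# four consistent sensors and one faulty one (hypothetical numbers)
pdfs = [(0.0, 1.0), (0.1, 1.1), (-0.1, 0.9), (0.05, 1.0), (5.0, 1.0)]
u, centers = fcm_pdfs(pdfs)
sums = [sum(uk[i] for uk in u) for i in range(2)]
fused = centers[sums.index(max(sums))]   # representative of the winning cluster
```

With the faulty sensor centered at 5, the cluster holding the four consistent PDFs accumulates the larger membership sum, so the fused PDF stays near the fault-free consensus.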
Starting from the Aalen (1989) version of Cox's (1972) regression model, we show a method for constructing "any" joint survival function from given marginal survival functions. We restrict ourselves, however, to modeling positive stochastic dependence, under the general assumption that the two underlying marginal random variables are concentrated on the set of nonnegative real values. With only these assumptions we obtain a general characterization of bivariate probability distributions that may play a role similar to that of the copula methodology. Examples of reliability and biomedical applications are given.
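As a point of comparison with the copula methodology the abstract mentions, a joint survival function with positive dependence can be built from two marginals via a standard survival copula. The Gumbel-Hougaard family below is a stand-in chosen purely for illustration, not the paper's Aalen-based construction; the exponential rates are made-up numbers.

```python
import math

def joint_survival(s1, s2, theta=2.0):
    """Gumbel-Hougaard survival copula applied to marginal survival values.
    theta = 1 gives independence; theta > 1 gives positive dependence."""
    a = (-math.log(s1)) ** theta + (-math.log(s2)) ** theta
    return math.exp(-a ** (1.0 / theta))

# exponential marginals on the nonnegative reals (illustrative rates)
S1 = lambda t: math.exp(-0.1 * t)
S2 = lambda t: math.exp(-0.2 * t)

t = 5.0
independent = S1(t) * S2(t)
dependent = joint_survival(S1(t), S2(t), theta=2.0)
```

Two sanity properties of any such construction: setting one argument to 1 recovers the other marginal, and positive dependence makes joint survival at least as large as under independence.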
Turbulent gas-particle flows are studied by a kinetic description using a probability density function (PDF). Unlike other investigators, who derive particle Reynolds stress equations from the PDF equations, we solve the particle PDF transport equations directly, using either a finite-difference method for two-dimensional (2D) problems or a Monte-Carlo (MC) method for three-dimensional (3D) problems. The proposed differential stress model together with the PDF (DSM-PDF) is used to simulate turbulent swirling gas-particle flows. The simulation results are compared with experimental results and with second-order moment (SOM) two-phase modeling results. All of these simulation results are in agreement with the experimental results, implying that the PDF approach validates the SOM two-phase turbulence modeling. The PDF model with the SOM-MC method is used to simulate evaporating gas-droplet flows, and the simulation results are in good agreement with the experimental results.
The probability distributions of small-sample data are difficult to determine, while a large proportion of samples occur in the early failure period, so it is particularly important to make full use of these data in the statistical analysis. Based on the gamma distribution, four methods of probability density function (PDF) reconstruction with early failure data are proposed, and mean time between failures (MTBF) evaluation expressions are then derived from the reconstructed PDFs. Both theoretical analysis and an example show that method 2 is the best evaluation method for early-failure small-sample data. The PDF reconstruction methods also offer guidance for other distribution types.
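The abstract does not spell out the four reconstruction methods, so as a baseline sketch: a gamma PDF can be fitted to failure-time data by the method of moments, after which the MTBF is the mean of the fitted distribution, shape × scale. The synthetic data and the true parameters below are assumptions for illustration only.

```python
import random

def gamma_mom(samples):
    """Method-of-moments fit of a gamma distribution: returns shape k, scale theta."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean * mean / var, var / mean   # k = mean^2/var, theta = var/mean

random.seed(1)
# synthetic failure times from a gamma(k=2, theta=100) process (true MTBF = 200 h)
data = [random.gammavariate(2.0, 100.0) for _ in range(2000)]
k, theta = gamma_mom(data)
mtbf = k * theta      # mean of the fitted gamma, i.e. the MTBF estimate
```

With censored or early-failure-only data, these plain moment estimates would have to be replaced by the paper's reconstruction formulas; the point here is only the gamma-to-MTBF step.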
A new identification method for the neuro-fuzzy Hammerstein model based on the probability density function (PDF) is presented, which differs from traditional identification methods that employ the mean squared error (MSE) as the index function. First, a neuro-fuzzy-based Hammerstein model is constructed to describe the nonlinearity of the Hammerstein process without any prior process knowledge. Second, a special test signal is used to separate the linear and nonlinear parts of the Hammerstein model. More specifically, the concept of the PDF is introduced to solve the identification problem of the neuro-fuzzy Hammerstein model. The antecedent parameters are estimated by a clustering algorithm, while the consequent parameters of the model are identified by designing a virtual PDF control system in which the PDF of the modeling error is estimated and controlled to converge to a target. The proposed method not only guarantees the accuracy of the model but also shapes the spatial distribution of the PDF of the model error, improving the generalization ability of the model. Simulation results show the effectiveness of the proposed method.
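A Hammerstein process is a static nonlinearity in series with linear dynamics. The toy simulator below, with an assumed tanh nonlinearity and a first-order linear block (both made-up choices), only illustrates the model structure being identified, not the PDF-based identification procedure itself.

```python
import math

def hammerstein(u, a=0.6, b=0.4):
    """Hammerstein structure: v_k = tanh(u_k) (static nonlinear block),
    then x_k = a*x_{k-1} + b*v_k (first-order linear dynamic block)."""
    x, y = 0.0, []
    for uk in u:
        v = math.tanh(uk)      # assumed static nonlinearity
        x = a * x + b * v      # linear dynamics
        y.append(x)
    return y

# a constant input settles at b*tanh(1)/(1 - a), which is tanh(1) for a=0.6, b=0.4
y = hammerstein([1.0] * 200)
```

Identification amounts to recovering the nonlinearity and the (a, b) dynamics from input-output data; the paper does this by driving the PDF of the modeling error toward a target shape instead of minimizing MSE.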
For performance optimization tasks such as placement, interconnect synthesis, and routing, an efficient and accurate interconnect delay metric is critical, even in the development of design tools for design for yield (DFY) and design for manufacture (DFM). In the nanometer regime, recently proposed delay models for RLC interconnects based on a statistical probability density function (PDF) interpretation, such as PRIMO, H-gamma, WED, and RLD, bridge the gap between accuracy and efficiency. However, these models always require table look-ups in operation. In this paper, a novel delay model based on the Birnbaum-Saunders distribution (BSD) is presented. The BSD model estimates interconnect delay quickly and accurately without table look-up operations. Furthermore, it needs only the first two moments to match. Experimental results in a 90 nm technology show that BSD is robust, easy to implement, efficient, and accurate.
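A minimal sketch of the two-moment matching idea: given the mean and variance of a delay distribution (made-up numbers here, not moments extracted from a real RLC circuit), the BSD parameters α and β can be found by bisection, and the 50% delay point is then the BSD median β. This is a generic illustration, not the paper's fitting procedure.

```python
import math

def bs_fit(mean, var):
    """Match Birnbaum-Saunders (alpha, beta) to a given mean and variance:
    mean = beta*(1 + alpha^2/2), var = (alpha*beta)^2 * (1 + 5*alpha^2/4)."""
    def var_of(a):
        b = mean / (1 + a * a / 2)         # beta from the mean equation
        return b, (a * b) ** 2 * (1 + 1.25 * a * a)
    lo, hi = 1e-9, 1.0
    for _ in range(100):                   # bisection on alpha (var is increasing here)
        a = 0.5 * (lo + hi)
        if var_of(a)[1] < var:
            lo = a
        else:
            hi = a
    return a, var_of(a)[0]

def bs_cdf(t, a, b):
    """Birnbaum-Saunders CDF: Phi((sqrt(t/b) - sqrt(b/t)) / a)."""
    z = (math.sqrt(t / b) - math.sqrt(b / t)) / a
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

alpha, beta = bs_fit(100.0, 400.0)   # e.g. mean 100 ps, variance 400 ps^2 (assumed)
delay_50 = beta                      # the BSD median is exactly beta
```

Because the median is available in closed form, no table look-up is needed once the two moments are matched, which is the efficiency argument of the abstract.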
Draxler and Zessin [1] derived the power function for a class of conditional tests of the assumptions of a psychometric model known as the Rasch model, and suggested an MCMC approach developed by Verhelst [2] for the numerical approximation of the power of the tests. In this contribution, the precision of the Verhelst approach is investigated and compared with an exact sampling procedure proposed by Miller and Harrison [3], for which the discrete probability distribution to be sampled from is exactly known. Results show no substantial differences between the two numerical procedures and quite accurate power computations. Regarding computing time, however, the Verhelst approach must be considered much more efficient.
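The comparison of an exact power computation with a Monte-Carlo approximation can be illustrated on a much simpler test than the Rasch-model case — a one-sided binomial test, chosen purely as a stand-in; none of the numbers below come from the paper.

```python
import math, random

def binom_tail(n, p, c):
    """P(X >= c) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c, n + 1))

def exact_power(n, p0, p1, alpha):
    """Exact power of the one-sided test of p = p0 vs p = p1 at level alpha."""
    c = next(c for c in range(n + 1) if binom_tail(n, p0, c) <= alpha)
    return binom_tail(n, p1, c), c

def mc_power(n, p1, c, reps=20000, seed=3):
    """Monte-Carlo estimate of the same power by simulating the alternative."""
    random.seed(seed)
    hits = sum(sum(random.random() < p1 for _ in range(n)) >= c
               for _ in range(reps))
    return hits / reps

power_exact, c = exact_power(50, 0.5, 0.7, 0.05)
power_mc = mc_power(50, 0.7, c)
```

The two routes agree to within Monte-Carlo error; the exact route is feasible here because the binomial tail is cheap to evaluate, which is the analogue of Miller and Harrison's exactly known sampling distribution.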
This paper presents the Advanced Observer Model (AOM), a groundbreaking conceptual framework designed to clarify the complex and often enigmatic nature of quantum mechanics. The AOM serves as a metaphorical lens, bringing the elusive quantum realm into sharper focus by transforming its inherent uncertainty into a coherent, structured ‘Frame Stream’ that aids in the understanding of quantum phenomena. While the AOM offers conceptual simplicity and clarity, it recognizes the necessity of a rigorous theoretical foundation to address the fundamental uncertainties that lie at the core of quantum mechanics. This paper seeks to illuminate those theoretical ambiguities, bridging the gap between the abstract insights of the AOM and the intricate mathematical foundations of quantum theory. By integrating the conceptual clarity of the AOM with the theoretical intricacies of quantum mechanics, this work aspires to deepen our understanding of this fascinating and elusive field.
To effectively evaluate the influence of uncertainty in measured responses on the identification of structural parameters, an inverse uncertainty computation method based on the λ probability density function (λ-PDF) and the first-order second-moment method is proposed. The PDFs of the uncertain parameters to be identified are modeled with quadratically derived λ-PDFs. In the inner loop, the forward problem is solved by applying the first-order second-moment method to performance functions of the λ-PDF-distributed parameters, yielding the probability distribution of the computed response. In the outer loop, the inverse uncertainty problem is converted into a deterministic optimization problem by minimizing the discrepancy between the distribution characteristics of the measured and computed responses, and an intergeneration projection genetic algorithm is used to identify the parameters of the unknown λ-PDFs. The method not only effectively estimates the PDFs of the unknown structural parameters but is also computationally more efficient than traditional sampling-based statistical methods. Numerical examples and an engineering application verify its feasibility and effectiveness.
Highly versatile machines, such as wheel loaders, forklifts, and mining haulers, are subject to many kinds of working conditions, as well as indefinite factors that make the load complex. The load probability distribution function (PDF) of transmission gears has many distribution centers, so the PDF cannot be well represented by a single-peak function. To represent the distribution characteristics of this complicated phenomenon accurately, this paper proposes a novel method for establishing a mixture model. Based on linear regression models and correlation coefficients, the proposed method automatically selects the best-fitting function in the mixture model. The coefficient of determination, the mean square error, and the maximum deviation are chosen as judging criteria to describe the fitting precision between the theoretical distribution and the corresponding histogram of the available load data. The applicability of the modeling method is illustrated with field-test data from a wheel loader, and load spectra based on the mixture model are compiled. The comparison results show that the mixture model is more suitable for describing the load-distribution characteristics. The proposed research improves the flexibility and intelligence of the modeling, reduces the statistical error, and enhances the fitting accuracy, and the load spectra compiled by this method better reflect the actual load characteristics of the gear components.
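The multi-center load PDF can be illustrated with the simplest mixture machinery: plain EM for a two-component Gaussian mixture in one dimension. The component count, the EM details, and the synthetic "load" data are all assumptions for this sketch; the paper's method additionally selects the best-fitting component functions automatically, which is not shown.

```python
import math, random

def em_gmm2(x, iters=100):
    """Plain EM for a two-component 1-D Gaussian mixture."""
    span = max(x) - min(x)
    mu = [min(x), max(x)]
    sd = [span / 4, span / 4]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        r = []
        for xi in x:
            p = [w[k] / (sd[k] * math.sqrt(2 * math.pi))
                 * math.exp(-0.5 * ((xi - mu[k]) / sd[k]) ** 2) for k in (0, 1)]
            s = p[0] + p[1]
            r.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and standard deviations
        for k in (0, 1):
            nk = sum(ri[k] for ri in r)
            w[k] = nk / len(x)
            mu[k] = sum(ri[k] * xi for ri, xi in zip(r, x)) / nk
            sd[k] = math.sqrt(sum(ri[k] * (xi - mu[k]) ** 2
                                  for ri, xi in zip(r, x)) / nk) or 1e-6
    return w, mu, sd

random.seed(4)
# synthetic two-regime load data: light duty near 5, heavy duty near 20
load = [random.gauss(5, 1) for _ in range(300)] + \
       [random.gauss(20, 2) for _ in range(300)]
w, mu, sd = em_gmm2(load)
```

The two fitted means land on the two distribution centers, which a single-peak fit would smear into one meaningless middle value.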
The steady states and transient properties of an insect outbreak model driven by Gaussian colored noise are studied in this paper. Using the Fokker-Planck equation in the unified colored-noise approximation, we analyse the stationary probability distribution and the mean first-passage time of the model. Numerical analysis shows that the self-correlation times of the insect birth rate and of the predation rate each produce a manifest population divergence in the insect density. The decrease of the mean first-passage time indicates that colored noise with a large self-correlation time enhances the divergence of the density in the insect outbreak model.
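A sketch of the stationary-distribution computation, with two loud simplifications: the noise is taken as white (the paper's unified colored-noise approximation is not reproduced), and the drift is the standard budworm-type outbreak model with made-up parameter values.

```python
import math

def drift(x, r=0.5, K=8.0):
    """Logistic growth minus a saturating predation term (budworm-type model)."""
    return r * x * (1 - x / K) - x * x / (1 + x * x)

def stationary_pdf(xs, D=0.05):
    """Stationary density of dx = drift(x) dt + sqrt(D) dW:
    p(x) ∝ exp((2/D) ∫ drift), integrated with the trapezoid rule."""
    logp = [0.0]
    for i in range(1, len(xs)):
        h = xs[i] - xs[i - 1]
        logp.append(logp[-1] + (2.0 / D) * 0.5 * h * (drift(xs[i]) + drift(xs[i - 1])))
    top = max(logp)                       # shift before exponentiating
    p = [math.exp(v - top) for v in logp]
    z = sum(p) * (xs[1] - xs[0])          # normalisation on the uniform grid
    return [v / z for v in p]

xs = [0.01 * i for i in range(1, 1001)]   # grid on (0, 10]
p = stationary_pdf(xs)
mode = xs[max(range(len(p)), key=lambda i: p[i])]
```

The model is bistable for these parameters, so the density peaks at a stable zero of the drift (a refuge state or an outbreak state); the colored-noise correlation time would reshape and shift these peaks.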
This paper performs a critical analysis of the problems that arise in matching the classical models of statistical and phenomenological thermodynamics. The analysis shows that some concepts of the statistical and phenomenological methods of describing classical systems do not fully agree with each other. In particular, the two methods employ different caloric ideal-gas equations of state, and the possibility, present in thermodynamic cyclic processes, of obtaining the same distributions either through a change of particle concentration or through a change of temperature is not allowed for in the statistical methods. The above-mentioned difference between the equations of state disappears when the statistical functions corresponding to the canonical Gibbs equations use, instead of Planck's constant, a new scale factor that depends on the parameters of the system and coincides with Planck's constant as the system approaches the degenerate state. Under this approach, the statistical entropy is transformed into one of the forms of heat capacity. In turn, reconciling the two methods on the question of how the molecular distributions depend on the particle concentration will apparently require further refinement of the physical model of the ideal gas and of the techniques for its statistical description.
Based on Maxwell’s constraint counting theory, rigidity percolation in GexSe1-x glasses occurs when the mean coordination number reaches the value of 2.4, which corresponds to Ge0.20Se0.80 glass. At this composition, the number of constraints experienced by an atom equals the number of degrees of freedom in three dimensions; hence the network changes from a floppy phase to a rigid phase, and rigidity starts to percolate. In this work, we use reverse Monte Carlo (RMC) modeling to model the structure of Ge0.20Se0.80 glass by simulating its experimental total atomic pair distribution function (PDF) obtained via high-energy synchrotron radiation. A three-dimensional configuration of 2836 atoms was obtained, from which we extracted the partial atomic pair distribution functions associated with Ge-Ge, Ge-Se, and Se-Se real-space correlations, which are hard to extract experimentally from total scattering methods. Bond-angle distributions, coordination numbers, mean coordination numbers, and the number of floppy modes were also extracted and discussed, giving further structural insight into the network topology at this composition. The results indicate that in Ge0.20Se0.80 glass, Ge atoms break up and cross-link the Se chain structure, forming four-fold-coordinated structural units (GeSe4 tetrahedra). These tetrahedra form the basic building blocks and are connected via shared Se atoms or short Se chains. The extent of the intermediate-range oscillations in real space (as extracted from the width of the first sharp diffraction peak) was found to be around 19.6 Å. The bonding schemes in this glass are consistent with the so-called “8-N” rule and can be interpreted in terms of a chemically ordered network model.
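Extracting a coordination number from a configuration amounts to counting neighbours inside the first-shell cutoff (equivalently, integrating the first peak of the partial PDF). The toy GeSe4 unit below — a single ideal tetrahedron with an assumed ~2.37 Å Ge-Se bond length, not the 2836-atom RMC configuration — shows just the counting step.

```python
import math

def coordination(centers, others, rcut):
    """Number of atoms in `others` within rcut of each atom in `centers`."""
    return [sum(1 for o in others if 0 < math.dist(c, o) <= rcut)
            for c in centers]

# toy GeSe4 unit: Ge at the origin, four Se at the tetrahedron corners,
# placed so the Ge-Se distance is 2.37 Å (an assumed typical bond length)
d = 2.37 / math.sqrt(3)
ge = [(0.0, 0.0, 0.0)]
se = [(d, d, d), (d, -d, -d), (-d, d, -d), (-d, -d, d)]
cn = coordination(ge, se, rcut=2.8)      # cutoff just past the first shell
```

Applied to a full RMC configuration with periodic boundaries, the same count over all Ge centers would give the mean Ge coordination number that enters the 2.4 rigidity threshold.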
Funding: supported in part by the Open Fund of Intelligent Control Laboratory, China (No. ICL-2023–0202), and in part by the National Key R&D Program of China (Nos. 2021YFC2202600, 2021YFC2202603).
Funding: supported by the National Natural Science Foundation of China (No. 51390493).
Funding: National Science and Technology Major Project of China (No. 2016ZX04003001).
Funding: National Natural Science Foundation of China (No. 61374044); Shanghai Municipal Science and Technology Commission, China (No. 15510722100); Shanghai Municipal Education Commission, China (No. 14ZZ088); Shanghai Talent Development Plan, China; Shanghai Baoshan Science and Technology Commission, China (No. bkw2013120).
Abstract: Optimal control of batch processes depends on an accurate mathematical model of the process, and data-driven modeling is currently a hot topic in batch-process modeling research. Breaking with the traditional idea of using the mean squared error (MSE) as the criterion function in data-driven modeling, this paper proposes a novel data-driven modeling method for batch processes that introduces the concept of probability density function (PDF) control. A control system for the batch-process modeling error is constructed in which the adjustable parameters of the model serve as the control system's input and the shape of the model-error PDF as its output, so that the open-loop parameter identification problem is converted into a closed-loop control problem on the shape of the model-error PDF. Controlling the spatial distribution of the model-error PDF through the adjustable parameters not only guarantees model accuracy but also shapes the spatial distribution of the model error, thereby eliminating colored noise from the model. Simulation experiments show that the batch-process data-driven model based on the shape of the model-error PDF has good modeling accuracy, robustness, and generalization ability, providing a new route for data-driven modeling of batch processes.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 50805065, 51075179).
Funding: supported by the Natural Science Basic Research Plan in Shaanxi Province, China (Grant No. SJ08A12); the Science Foundation of the Education Bureau of Shaanxi Province, China (Grant No. 12JK0962); and the Science Foundation of Baoji University of Science and Arts, China (Grant No. ZK11053).