Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61203147, 61374047, and 61403168).
Abstract: This paper investigates the consensus tracking problem of second-order multi-agent systems with a virtual leader via event-triggered control. A novel distributed event-triggered transmission scheme is proposed, in which the triggering condition is examined intermittently at constant sampling instants. Only partial neighbor information and local measurements are required for event detection. The corresponding event-triggered consensus tracking protocol is then presented, which guarantees that the second-order multi-agent systems achieve consensus tracking. Numerical simulations are given to illustrate the effectiveness of the proposed strategy.
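To illustrate how such a scheme operates, the following minimal Python sketch simulates second-order followers tracking a constant-velocity virtual leader, with the event condition examined only at constant sampling instants and the control law built from the states broadcast at the most recent events. The graph, the gains k1 and k2, the threshold sigma, and the sampling period h are illustrative assumptions, not the values or the exact triggering function of the paper.

```python
# Hedged sketch of an event-triggered consensus tracking protocol for
# second-order agents with a virtual leader.  Gains, threshold, sampling
# period, and the graph are illustrative choices only.
import numpy as np

h, T = 0.01, 10.0                       # sampling period and horizon
N = 4                                   # number of followers
A = np.array([[0, 1, 0, 1],             # adjacency matrix of the follower graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], float)
b = np.array([1, 0, 0, 1], float)       # followers that observe the virtual leader
k1, k2, sigma = 1.0, 1.5, 0.1           # protocol gains and trigger threshold

x = np.random.randn(N); v = np.random.randn(N)   # follower positions and velocities
x0, v0 = 0.0, 1.0                                # virtual leader (constant velocity)
xb, vb = x.copy(), v.copy()                      # last broadcast (event-sampled) states

for step in range(int(T / h)):
    # event detection, examined only at the sampling instants
    for i in range(N):
        e = np.hypot(xb[i] - x[i], vb[i] - v[i])     # measurement error since last event
        z = abs(x[i] - x0) + abs(v[i] - v0)          # local tracking measurement
        if e > sigma * z:                            # trigger: refresh broadcast state
            xb[i], vb[i] = x[i], v[i]
    # consensus tracking protocol built from the broadcast states
    u = np.zeros(N)
    for i in range(N):
        u[i] = -sum(A[i, j] * (k1 * (xb[i] - xb[j]) + k2 * (vb[i] - vb[j]))
                    for j in range(N))
        u[i] += -b[i] * (k1 * (xb[i] - x0) + k2 * (vb[i] - v0))
    # second-order integrator dynamics (Euler discretisation)
    x, v = x + h * v, v + h * u
    x0 += h * v0

print("tracking errors:", np.round(x - x0, 3), np.round(v - v0, 3))
```

In this sketch an agent re-broadcasts its state only when its measurement error exceeds a fraction of its local tracking measurement, so communication occurs at a subset of the sampling instants.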
Funding: Supported in part by the National Science Foundation Project of China (61931001, 61873026) and the National Key R&D Program of China (2017YFC0820700).
Abstract: The industrial Internet of Things (IoT) is a trend in factory development and a basic condition for the intelligent factory, so ensuring the security of data transmission in the industrial IoT is very important. The main contribution of this paper is the application of a new chaotic secure communication scheme to this data-transmission security problem. The scheme is proposed and studied on the basis of the synchronization of fractional-order chaotic systems with different structures and different orders. Lyapunov stability theory is used to prove the synchronization between the fractional-order drive system and the response system. The encryption and decryption of the main data signals are implemented using the n-shift encryption principle. We calculate and analyze the key space of the scheme. Numerical simulations are presented to show the effectiveness of the proposed theoretical approach.
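A minimal sketch of the n-shift encryption principle mentioned above is given below. The shift function, the number of shifts n, the folding bound h, and the logistic-map key are illustrative stand-ins; in the paper the key signal comes from a synchronized fractional-order chaotic system, which the receiver reconstructs via the Lyapunov-based synchronization scheme.

```python
# Hedged sketch of n-shift encryption/decryption driven by a chaotic key signal.
# The logistic-map key stands in for the fractional-order chaotic drive signal.
import numpy as np

def shift(x, k, h):
    """One shift: fold x + k back into the interval (-h, h]."""
    y = x + k
    y = np.where(y > h, y - 2 * h, y)
    y = np.where(y <= -h, y + 2 * h, y)
    return y

def n_shift(signal, key, n, h):
    """Apply the shift n times; encryption uses key, decryption uses -key."""
    out = signal
    for _ in range(n):
        out = shift(out, key, h)
    return out

# toy plaintext and a logistic-map key standing in for the chaotic drive signal
t = np.linspace(0, 1, 500)
plain = 0.5 * np.sin(2 * np.pi * 5 * t)
key = np.empty_like(t); key[0] = 0.37
for i in range(1, len(t)):
    key[i] = 3.99 * key[i - 1] * (1 - key[i - 1])
key = key - 0.5                            # center the key around zero

h, n = 1.0, 3                              # h must bound |plain| + |key| after folding
cipher = n_shift(plain, key, n, h)         # transmitter side
recovered = n_shift(cipher, -key, n, h)    # receiver side, using the synchronized key

print("max recovery error:", np.max(np.abs(recovered - plain)))
```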
Funding: Supported by the National Natural Science Foundation of China (Nos. 61403198, 61079013 and 61079014), the Youth Foundation of Jiangsu Province in China (No. BK20140827), the Key Programs of the Natural Science Foundation of China, the China Civil Aviation Joint Fund (No. 60939003), and the Natural Science Foundation of Jiangsu Province in China (No. BK2011737).
Abstract: For the purpose of enhancing the reliability of the aileron of the Airbus new-generation A350 XWB, an evaluation of aileron reliability based on maintenance data is presented in this paper. Practical maintenance data contain a large number of censored samples, whose information uncertainty makes it hard to evaluate the reliability of the aileron actuator. Considering that the true lifetime of a censored sample follows the same distribution as the complete samples, a censored sample can be converted into a complete sample, and its conversion frequency can be estimated from the frequency of the complete samples. On the one hand, the standard life-table estimate and the product-limit method are improved on the basis of this conversion frequency, enabling accurate estimation for various kinds of censored samples. On the other hand, by taking this frequency as one of the weight factors and incorporating the variances of order statistics under the standard distribution, a weighted least squares estimate is formed that accurately handles various censored samples. Extensive experiments and simulations show that the reliabilities obtained by the improved life table and the improved product-limit method are closer to the true values and more conservative; moreover, the weighted least squares estimate (WLSE), with the conversion frequency of censored samples and the variances of order statistics as weights, remains accurate even when the proportion of censored data in the samples is high. The algorithm in this paper performs well and can accurately estimate the reliability of the aileron actuator even with small samples and a high censoring rate. This research has significance for both theory and engineering practice.
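For orientation, the following Python sketch shows the standard product-limit (Kaplan-Meier) reliability estimate that serves as the starting point for the improvements described above; the conversion-frequency weighting itself is not reproduced here, and the toy lifetimes and censoring flags are illustrative, not the aileron-actuator maintenance data.

```python
# Hedged sketch of the standard product-limit (Kaplan-Meier) reliability estimate
# for right-censored lifetime data.
import numpy as np

def product_limit(times, observed):
    """Return failure times and the estimated reliability R(t) just after each one.

    times    : lifetimes (failure or censoring times)
    observed : 1 for an observed failure, 0 for a right-censored sample
    """
    order = np.argsort(times)
    times, observed = np.asarray(times)[order], np.asarray(observed)[order]
    n_at_risk = len(times)
    event_times, reliability, R = [], [], 1.0
    for t, d in zip(times, observed):
        if d == 1:                      # observed failure: step the estimate down
            R *= (n_at_risk - 1) / n_at_risk
            event_times.append(t)
            reliability.append(R)
        n_at_risk -= 1                  # failed and censored units both leave the risk set
    return np.array(event_times), np.array(reliability)

# toy maintenance record: flight-hour lifetimes, ~40% right-censored
t = [120, 340, 560, 610, 700, 820, 900, 950, 1010, 1100]
d = [1,   0,   1,   1,   0,   1,   0,   0,   1,    1]
et, R = product_limit(t, d)
for ti, Ri in zip(et, R):
    print(f"R({ti:4.0f} h) = {Ri:.3f}")
```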
Abstract: Partially ordered data are data that contain both ordinal and nominal features, and they are widespread in real life. Traditional ordinal classification methods either treat all features as ordinal or handle ordinal and nominal features separately, ignoring the relationship between the two, so they cannot effectively solve classification problems on partially ordered data. To address this problem, a feature-fusion-based deep forest model for partially ordered data, called FFDF (feature fusion-based deep forest), is proposed. Drawing on the idea of canonical correlation analysis, a contribution-degree computation method for feature fusion is designed, which fuses the ordinal and nominal features into a single feature space and measures the relationship between them in a unified way. The fused feature space is then granulated to reduce the model's complexity in handling continuous variables. The feature matrices in the fused space are fed into a cascade forest, yielding a partially ordered deep forest model. Comparative experiments with six methods, including partially monotonic decision trees, ordinal classification models, and deep forest models, on 13 public data sets from UCI and WEKA show that the proposed method outperforms the compared methods in both accuracy and mean absolute error; runtime comparisons with the ensemble deep forest models gcForest and DF21 show that the proposed method is also superior in time performance.
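The sketch below illustrates the flavor of the fusion step with canonical correlation analysis: an ordinal feature block and a one-hot-encoded nominal block are projected onto mutually correlated directions and concatenated into a single fused space. The data, dimensions, and the simple concatenation of projected views are illustrative assumptions; they are not the FFDF contribution-degree computation, granulation, or cascade-forest stages.

```python
# Hedged sketch of CCA-based fusion of ordinal and nominal feature blocks,
# loosely in the spirit of the fusion used by FFDF (not its actual algorithm).
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 200
X_ord = rng.integers(1, 6, size=(n, 4)).astype(float)   # ordinal features (ranked levels)
X_nom = rng.integers(0, 3, size=(n, 3))                  # nominal features (category codes)

X_nom_oh = OneHotEncoder().fit_transform(X_nom).toarray()   # encode the nominal block

cca = CCA(n_components=2)                  # project both views onto correlated directions
Z_ord, Z_nom = cca.fit_transform(X_ord, X_nom_oh)

Z_fused = np.hstack([Z_ord, Z_nom])        # fused representation fed to a downstream forest
print(Z_fused.shape)                       # (200, 4)
```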
Abstract: This work presents a comprehensive fourth-order predictive modeling (PM) methodology that uses the MaxEnt principle to incorporate fourth-order moments (means, covariances, skewness, kurtosis) of model parameters and of computed and measured model responses, as well as fourth- (and higher-) order sensitivities of computed model responses to model parameters. This new methodology is designated by the acronym 4th-BERRU-PM, which stands for "fourth-order best-estimate results with reduced uncertainties." The results predicted by the 4th-BERRU-PM methodology incorporate, as particular cases, the results previously predicted by the second-order predictive modeling methodology 2nd-BERRU-PM, and vastly generalize the results produced by extant data assimilation and data adjustment procedures.
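For reference, the four kinds of moments referred to above can be written generically as follows, for a parameter vector α with joint density p(α); the notation here is generic, not the paper's:

\[
\langle \alpha_i \rangle = \int \alpha_i \, p(\boldsymbol{\alpha}) \, d\boldsymbol{\alpha}, \qquad
\operatorname{cov}(\alpha_i,\alpha_j) = \big\langle (\alpha_i-\langle\alpha_i\rangle)(\alpha_j-\langle\alpha_j\rangle)\big\rangle,
\]
\[
\mu_3^{ijk} = \big\langle (\alpha_i-\langle\alpha_i\rangle)(\alpha_j-\langle\alpha_j\rangle)(\alpha_k-\langle\alpha_k\rangle)\big\rangle, \qquad
\mu_4^{ijkl} = \big\langle (\alpha_i-\langle\alpha_i\rangle)(\alpha_j-\langle\alpha_j\rangle)(\alpha_k-\langle\alpha_k\rangle)(\alpha_l-\langle\alpha_l\rangle)\big\rangle,
\]

where the third- and fourth-order central moments carry the skewness and kurtosis information; analogous moments are used for the measured and computed responses.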
Abstract: This work presents a comprehensive second-order predictive modeling (PM) methodology designated by the acronym 2nd-BERRU-PMD. The attribute "2nd" indicates that this methodology incorporates second-order uncertainties (means and covariances) and second-order sensitivities of computed model responses to model parameters. The acronym BERRU stands for "Best-Estimate Results with Reduced Uncertainties," and the last letter ("D") in the acronym indicates "deterministic," referring to the deterministic inclusion of the computational model responses. The 2nd-BERRU-PMD methodology is fundamentally based on the maximum entropy (MaxEnt) principle. This principle stands in contradistinction to the fundamental principle underlying extant data assimilation and/or adjustment procedures, which minimize, in a least-squares sense, a subjective user-defined functional meant to represent the discrepancies between measured and computed model responses. It is shown that the 2nd-BERRU-PMD methodology generalizes and extends current data assimilation and/or data adjustment procedures while overcoming their fundamental limitations. In the accompanying work (Part II), the alternative framework for developing the second-order MaxEnt predictive modeling methodology is presented by incorporating the computed model responses probabilistically (as opposed to deterministically).
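As a point of comparison, a generic example of the user-defined least-squares functional minimized by extant data adjustment procedures (generic notation, not the paper's) is

\[
Q(\boldsymbol{\alpha}) = \big[\mathbf{r}(\boldsymbol{\alpha})-\mathbf{r}_m\big]^{\top} \mathbf{C}_m^{-1} \big[\mathbf{r}(\boldsymbol{\alpha})-\mathbf{r}_m\big]
+ \big[\boldsymbol{\alpha}-\boldsymbol{\alpha}_0\big]^{\top} \mathbf{C}_{\alpha}^{-1} \big[\boldsymbol{\alpha}-\boldsymbol{\alpha}_0\big],
\]

where r(α) denotes the computed responses, r_m the measured responses with covariance C_m, and α_0 the nominal parameters with covariance C_α; the 2nd-BERRU-PMD methodology avoids postulating such a functional by invoking the MaxEnt principle instead.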
Abstract: This work presents a comprehensive second-order predictive modeling (PM) methodology based on the maximum entropy (MaxEnt) principle for obtaining best-estimate mean values and correlations for model responses and parameters. This methodology is designated by the acronym 2nd-BERRU-PMP, where the attribute "2nd" indicates that it incorporates second-order uncertainties (means and covariances) and second- (and higher-) order sensitivities of computed model responses to model parameters. The acronym BERRU stands for "Best-Estimate Results with Reduced Uncertainties," and the last letter ("P") in the acronym indicates "probabilistic," referring to the MaxEnt probabilistic inclusion of the computational model responses. This is in contradistinction to the 2nd-BERRU-PMD methodology, which deterministically combines the computed model responses with the experimental information, as presented in the accompanying work (Part I). Although both the 2nd-BERRU-PMP and the 2nd-BERRU-PMD methodologies yield expressions that include second- (and higher-) order sensitivities of responses to model parameters, the respective expressions for the predicted responses, the calibrated predicted parameters, and their predicted uncertainties (covariances) are not identical to each other. Nevertheless, the results predicted by both methodologies encompass, as particular cases, the results produced by extant data assimilation and data adjustment procedures, which rely on the minimization, in a least-squares sense, of a user-defined functional meant to represent the discrepancies between measured and computed model responses.
Abstract: This work (in two parts) presents a novel predictive modeling methodology aimed at obtaining "best-estimate results with reduced uncertainties" for the first four moments (mean values, covariance, skewness and kurtosis) of the optimally predicted distribution of model results and calibrated model parameters, by combining fourth-order experimental and computational information, including fourth- (and higher-) order sensitivities of computed model responses to model parameters. Underlying the construction of this fourth-order predictive modeling methodology is the maximum entropy principle, which is initially used to obtain a novel closed-form expression for the (moments-constrained) fourth-order Maximum Entropy (MaxEnt) probability distribution constructed from the first four moments (means, covariances, skewness, kurtosis), which are assumed to be known, of an otherwise unknown distribution of a high-dimensional multivariate uncertain quantity of interest. This fourth-order MaxEnt distribution provides optimal compatibility with the available information while ensuring minimal spurious information content, yielding an estimate of the probability density with the highest uncertainty among all densities satisfying the known moment constraints. Since this novel generic fourth-order MaxEnt distribution is of interest in its own right for applications beyond predictive modeling, its construction is presented separately, in this first part of a two-part work. The fourth-order predictive modeling methodology constructed by particularizing this generic fourth-order MaxEnt distribution is presented in the accompanying work (Part 2).
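For orientation, a moments-constrained fourth-order MaxEnt density has the generic exponential-polynomial structure shown below; this is only the generic form, not the closed-form expression derived in the paper, and u denotes the multivariate uncertain quantity of interest:

\[
p(\mathbf{u}) = \frac{1}{Z(\boldsymbol{\lambda})}\exp\!\Bigg(-\sum_{i}\lambda_i u_i-\sum_{i\le j}\lambda_{ij}\, u_i u_j-\sum_{i\le j\le k}\lambda_{ijk}\, u_i u_j u_k-\sum_{i\le j\le k\le l}\lambda_{ijkl}\, u_i u_j u_k u_l\Bigg),
\]

where Z(λ) is the normalization constant and the Lagrange multipliers λ are determined by requiring that p(u) reproduce the known means, covariances, skewness, and kurtosis.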
Funding: Supported by projects of the National Key R&D Program of China (Nos. 2017YFC0602203, 2017YFC0601606), the National Science and Technology Major Project (No. 2016ZX05027-002-03), the National Natural Science Foundation of China (Nos. 41604098, 41404089), and the State Key Program of the National Natural Science Foundation of China (No. 41430322).
Abstract: Edge location is important information about the source and can be obtained from potential field data. Most edge detection methods for potential field data are functions of the horizontal and vertical derivatives. The authors provide a new strategy for establishing edge detection filters that improve the resolution for identifying small bodies, using ratio functions of different-order derivatives to recognize the edges of the sources. The new filter is named the advanced derivative ratio (ADR) filter, and balanced outputs can be produced for different forms of ADR filters. The ADR filters are tested on synthetic data and real potential field data. Their advantage is that they detect the edges of the causative sources more precisely and clearly, and the model tests show that the resolution of the ADR filters is higher than that of other existing filters. When the ADR filters were applied to real data, more subtle details were obtained.
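The following Python sketch illustrates the general idea of a ratio-of-derivatives edge detector on gridded potential field data, using the well-known tilt-angle form (arctangent of the vertical derivative over the total horizontal derivative) purely as an example; it is not the specific ADR expression proposed in the paper, and the synthetic anomaly is illustrative.

```python
# Hedged sketch of a ratio-of-derivatives (tilt-angle-style) edge detector
# for gridded potential field data.
import numpy as np

def derivatives(field, dx, dy):
    """Horizontal derivatives by finite differences, vertical derivative via FFT."""
    gy, gx = np.gradient(field, dy, dx)
    ny, nx = field.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    K = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    gz = np.real(np.fft.ifft2(np.fft.fft2(field) * K))   # vertical derivative: |k| filter
    return gx, gy, gz

def tilt_angle(field, dx=1.0, dy=1.0):
    """Balanced ratio filter: arctangent of vertical over total horizontal derivative."""
    gx, gy, gz = derivatives(field, dx, dy)
    thdr = np.sqrt(gx ** 2 + gy ** 2) + 1e-12
    return np.arctan2(gz, thdr)

# synthetic field: a smooth anomaly standing in for a causative body
y, x = np.mgrid[-50:50, -50:50].astype(float)
field = np.exp(-((x / 15) ** 2 + (y / 10) ** 2))
edges = tilt_angle(field)
print(edges.shape, float(edges.min()), float(edges.max()))
```

Because the output is the arctangent of a ratio, it stays within a fixed range regardless of the anomaly amplitude, which is what gives ratio-based filters their balanced response to shallow and deep (large and small) bodies.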