Monitoring sensors in complex engineering environments often record abnormal data, leading to significant positioning errors. To reduce the influence of abnormal arrival times, we introduce an innovative, outlier-robust localization method that integrates kernel density estimation (KDE) with damping linear correction to enhance the precision of microseismic/acoustic emission (MS/AE) source positioning. Our approach systematically addresses abnormal arrival times through a three-step process: initial location by 4-arrival combinations, elimination of outliers based on three-dimensional KDE, and refinement using a linear correction with an adaptive damping factor. We validate our method through lead-breaking experiments, demonstrating over a 23% improvement in positioning accuracy with a maximum error of 9.12 mm (relative error of 15.80%), outperforming four existing methods. Simulations under various system errors, outlier scales, and ratios substantiate our method's superior performance. Field blasting experiments also confirm the practical applicability, with an average positioning error of 11.71 m (relative error of 7.59%), compared to 23.56, 66.09, 16.95, and 28.52 m for other methods. This research is significant as it enhances the robustness of MS/AE source localization when confronted with data anomalies. It also provides a practical solution for real-world engineering and safety monitoring applications.
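The outlier-elimination step lends itself to a short illustration. The sketch below is a minimal reconstruction rather than the authors' implementation: it pools candidate source locations obtained from many 4-arrival subsets, fits a three-dimensional KDE over them, and keeps only the candidates lying in the densest region; the function names and the keep_fraction threshold are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def screen_candidates(candidates, keep_fraction=0.7):
    """Keep candidate source locations lying in the densest region of a 3-D KDE."""
    pts = np.asarray(candidates)            # shape (n, 3): x, y, z of each trial location
    kde = gaussian_kde(pts.T)               # Gaussian kernel, Scott's-rule bandwidth
    dens = kde(pts.T)                       # density of the candidate cloud at each point
    cutoff = np.quantile(dens, 1.0 - keep_fraction)
    kept = pts[dens >= cutoff]              # drop low-density (likely outlier-driven) fixes
    return kept.mean(axis=0), kept          # crude robust location and surviving candidates
```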
In real-world applications, datasets frequently contain outliers, which can hinder the generalization ability of machine learning models. Bayesian classifiers, a popular supervised learning method, rely on accurate probability density estimation for classifying continuous datasets. However, achieving precise density estimation with datasets containing outliers poses a significant challenge. This paper introduces a Bayesian classifier that utilizes optimized robust kernel density estimation to address this issue. Our proposed method enhances the accuracy of probability density distribution estimation by mitigating the impact of outliers on the training sample's estimated distribution. Unlike the conventional kernel density estimator, our robust estimator can be seen as a weighted kernel mapping summary for each sample. This kernel mapping performs the inner product in the Hilbert space, allowing the kernel density estimate to be considered the average of the samples' mappings in the Hilbert space using a reproducing kernel. M-estimation techniques are used to obtain accurate mean values and solve for the weights. Meanwhile, complete cross-validation is used as the objective function to search for the optimal bandwidth, which impacts the estimator. Harris Hawks Optimisation optimizes this objective function to improve the estimation accuracy. The experimental results show that Harris Hawks Optimisation outperforms other optimization algorithms regarding convergence speed and objective function value during the bandwidth search. The optimal robust kernel density estimator achieves better fitness performance than the traditional kernel density estimator when the training data contain outliers. The naïve Bayesian classifier with optimal robust kernel density estimation improves generalization in classification with outliers.
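As a hedged sketch of the weighted-KDE idea (an illustrative reconstruction in the spirit of kernelized M-estimation, not the paper's exact estimator), the code below finds per-sample weights by iteratively reweighted least squares with a Huber-type loss in the reproducing kernel Hilbert space, then evaluates the resulting weighted density; the fixed bandwidth stands in for the Harris-Hawks-optimized one.

```python
import numpy as np

def gaussian_kernel_matrix(X, h):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2))

def robust_kde_weights(X, h, c=1.0, iters=30):
    n = len(X)
    K = gaussian_kernel_matrix(X, h)
    w = np.full(n, 1.0 / n)                      # start from the ordinary KDE weights
    for _ in range(iters):
        # squared RKHS distance of each mapped sample to the current weighted mean
        d2 = np.diag(K) - 2 * K @ w + w @ K @ w
        d = np.sqrt(np.maximum(d2, 1e-12))
        psi = np.where(d <= c, d, c)             # Huber influence function
        w = psi / d                              # IRLS weights: outliers get small weight
        w /= w.sum()
    return w

def robust_kde(x_query, X, w, h):
    d2 = ((x_query[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    dim = X.shape[1]
    kern = np.exp(-d2 / (2 * h ** 2)) / ((2 * np.pi) ** (dim / 2) * h ** dim)
    return kern @ w                              # weighted kernel sum at the query points
```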
The sixth-generation fighter has superior stealth performance, but for the traditional kernel density estimation (KDE), precision requirements are difficult to satisfy when dealing with the fluctuation characteristics of a complex radar cross section (RCS). To solve this problem, this paper studies the KDE algorithm for the F/AXX stealth fighter. To address the limited accuracy of existing fixed-bandwidth algorithms, a novel adaptive kernel density estimation (AKDE) algorithm equipped with least-squares cross-validation and an integrated squared error criterion is proposed to optimize the bandwidth. Meanwhile, an adaptive RCS density estimate can be obtained from the optimized bandwidth. Finally, simulations verify that the estimation accuracy of the adaptive-bandwidth RCS density estimation algorithm is more than 50% higher than that of the traditional algorithm. Based on the proposed algorithm (i.e., AKDE), statistical characteristics of the considered fighter are acquired more accurately, and the significant advantages of the AKDE algorithm in estimating the cumulative distribution function of RCS below 1 m² are then analyzed.
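The least-squares cross-validation criterion named in the abstract can be written down directly. The sketch below is a minimal one-dimensional illustration under assumed sample and bandwidth-grid placeholders: it scores candidate bandwidths by an unbiased estimate of the integrated squared error and picks the minimizer.

```python
import numpy as np
from scipy.stats import norm

def lscv_score(h, x):
    """Estimate ISE(h) up to a constant: integral(fhat^2) minus twice the leave-one-out mean."""
    n = len(x)
    diff = x[:, None] - x[None, :]
    # integral of fhat^2 for a Gaussian kernel: the convolution K*K is a N(0, 2h^2) density
    term1 = norm.pdf(diff, scale=np.sqrt(2) * h).sum() / n ** 2
    # leave-one-out term: exclude the diagonal i == j
    loo = norm.pdf(diff, scale=h)
    np.fill_diagonal(loo, 0.0)
    term2 = 2.0 * loo.sum() / (n * (n - 1))
    return term1 - term2

def lscv_bandwidth(x, grid=np.linspace(0.05, 2.0, 60)):
    return min(grid, key=lambda h: lscv_score(h, x))
```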
In the process of large-scale, grid-connected wind power operations, it is important to establish an accurate probability distribution model for wind farm fluctuations. In this study, a wind power fluctuation modeling method is proposed based on the method of moving average and an adaptive nonparametric kernel density estimation (NPKDE) method. Firstly, the method of moving average is used to reduce the fluctuation of the sampled wind power component, and the probability characteristics of the model are then determined based on the NPKDE. Secondly, the model is improved adaptively, and is then solved by using constraint-order optimization. The simulation results show that this method has better accuracy and applicability compared with the modeling method based on traditional parameter estimation, and solves the local adaptation problem of traditional NPKDE.
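A minimal sketch of the two-stage idea follows, assuming a uniformly sampled power series: a moving average separates the slow component, and a KDE (with Scott's-rule bandwidth standing in for the adaptive NPKDE) models the distribution of the remaining fluctuation. The window length is an illustrative assumption.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fluctuation_density(power, window=12):
    kernel = np.ones(window) / window
    trend = np.convolve(power, kernel, mode="same")   # moving-average component
    fluctuation = power - trend                       # what the distribution model targets
    return gaussian_kde(fluctuation)                  # fixed bandwidth via Scott's rule here

# usage sketch: pdf = fluctuation_density(measured_power); pdf(np.linspace(-50, 50, 200))
```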
A new algorithm for linear instantaneous independent component analysis is proposed based on maximizing the log-likelihood contrast function, which can be changed into a gradient equation. An iterative method is introduced to solve this equation efficiently. The unknown probability density functions, as well as their first and second derivatives in the gradient equation, are estimated by the kernel density method. Computer simulations on artificially generated signals and gray-scale natural scene images confirm the efficiency and accuracy of the proposed algorithm.
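The kernel-density ingredient can be illustrated compactly. The sketch below is an assumption-laden stand-in rather than the paper's estimator: it computes a Gaussian-kernel estimate of a source density and its first derivative, from which the score function used in maximum-likelihood ICA updates (-f'/f) follows; Silverman's rule sets the bandwidth.

```python
import numpy as np

def kde_and_derivative(u, samples):
    n = samples.size
    h = 1.06 * samples.std() * n ** (-1 / 5)          # Silverman's rule-of-thumb bandwidth
    z = (u[:, None] - samples[None, :]) / h
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    f = phi.mean(axis=1) / h                          # density estimate at the points u
    fprime = (-z * phi).mean(axis=1) / h ** 2         # first derivative of the estimate
    return f, fprime                                  # score function: -fprime / f
```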
An improved method using kernel density estimation (KDE) and confidence levels is presented for model validation with small samples. Decision making is a challenging problem because of input uncertainty, and only small samples can be used due to the high cost of experimental measurements. However, model validation provides more confidence for decision makers while improving prediction accuracy at the same time. The confidence level method is introduced, and the optimum sample variance is determined using a new method in kernel density estimation to increase the credibility of model validation. As a numerical example, the static frame model validation challenge problem presented by Sandia National Laboratories is chosen. The optimum bandwidth is selected in kernel density estimation in order to build the probability model based on the calibration data. The model assessment is achieved using validation and accreditation experimental data, respectively, based on the probability model. Finally, the target structure prediction is performed using the validated model, and the results are consistent with those obtained by other researchers. The results demonstrate that the method using the improved confidence level and kernel density estimation is an effective approach to the model validation problem with small samples.
In order to improve the performance of the probability hypothesis density (PHD) algorithm based on the particle filter (PF) in terms of number estimation and state extraction of multiple targets, a new probability hypothesis density filter algorithm based on marginalized particles and kernel density estimation is proposed, which utilizes the idea of the marginalized particle filter to enhance the estimating performance of the PHD. The state variables are decomposed into linear and nonlinear parts. The particle filter is adopted to predict and estimate the nonlinear states of the multi-target after dimensionality reduction, while the Kalman filter is applied to estimate the linear parts under the linear Gaussian condition. Embedding the information of the linear states into the estimated nonlinear states helps to reduce the estimating variance and improve the accuracy of target number estimation. Mean-shift kernel density estimation, which inherently searches for peak values via an adaptive gradient-ascent iteration, is introduced to cluster particles and extract target states; it is independent of the target number and can converge to the local peak positions of the PHD distribution while avoiding errors due to inaccuracy in modeling and parameter estimation. Experiments show that the proposed algorithm can obtain higher tracking accuracy when using fewer sampling particles and has lower computational complexity compared with the PF-PHD.
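The mean-shift step can be sketched on its own. The code below is an illustrative reconstruction with an assumed bandwidth and tolerance: it runs weighted Gaussian mean-shift on a particle cloud so that particles climb to local peaks of the weighted density, and near-coincident modes are then read off as extracted target states.

```python
import numpy as np

def mean_shift_modes(particles, weights, h=1.0, iters=50, tol=1e-4):
    modes = particles.copy()                            # one gradient-ascent trajectory per particle
    for _ in range(iters):
        d2 = ((modes[:, None, :] - particles[None, :, :]) ** 2).sum(-1)
        k = weights[None, :] * np.exp(-0.5 * d2 / h ** 2)          # weighted Gaussian kernel
        new = (k[:, :, None] * particles[None, :, :]).sum(1) / k.sum(1, keepdims=True)
        converged = np.abs(new - modes).max() < tol
        modes = new
        if converged:
            break
    return modes       # near-duplicate rows correspond to the same extracted target state
```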
In this paper, we propose a new method that combines collage error in the fractal domain and Hu moment invariants for image retrieval with a statistical method: variable bandwidth kernel density estimation (KDE). The proposed method is called CHK (KDE of collage error and Hu moments) and is tested on the Vistex texture database with 640 natural images. Experimental results show that the average retrieval rate (ARR) can reach 78.18%, which demonstrates that the proposed method performs better than using either feature alone, as well as the commonly used histogram method, in both retrieval rate and retrieval time.
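The variable-bandwidth ingredient can be illustrated with an Abramson-style sample-point estimator: a fixed-bandwidth pilot density sets a local bandwidth per sample, so sparse feature regions get wider kernels. This is a hedged, one-dimensional stand-in for the KDE used in CHK, not the retrieval pipeline itself.

```python
import numpy as np
from scipy.stats import gaussian_kde

def variable_bandwidth_kde(x_query, samples, h0=None):
    pilot = gaussian_kde(samples)                      # fixed-bandwidth pilot estimate
    f_pilot = pilot(samples)
    g = np.exp(np.mean(np.log(f_pilot)))               # geometric mean of pilot densities
    lam = (f_pilot / g) ** -0.5                        # local bandwidth factors (Abramson)
    h0 = h0 or pilot.factor * samples.std()            # base bandwidth
    z = (x_query[:, None] - samples[None, :]) / (h0 * lam[None, :])
    kern = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return (kern / (h0 * lam[None, :])).mean(axis=1)   # density at the query points
```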
A novel diversity-sampling based nonparametric multi-modal background model is proposed. Using the samples having the most popular and varied intensity values in the training sequence, a nonparametric model is built for background subtraction. According to the related intensities, different weights are given to the distinct samples in kernel density estimation. This avoids repeated computation using all samples, and makes computation more efficient in the evaluation phase. Experimental results show the validity of the diversity-sampling scheme and the robustness of the proposed model in moving object segmentation. The proposed algorithm can be used in outdoor surveillance systems.
One-class support vector machine (OCSVM) and support vector data description (SVDD) are two main domain-based one-class (kernel) classifiers. To reveal their relationship with density estimation in the case of the Gaussian kernel, OCSVM and SVDD are first unified into the framework of kernel density estimation, and the essential relationship between them is explicitly revealed. It is then proved that the density estimate induced by OCSVM or SVDD is in agreement with the true density. Meanwhile, it can also reduce the integrated squared error (ISE). Finally, experiments on several simulated datasets verify the revealed relationships.
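A small numerical illustration of that connection is sketched below (an assumption-laden demonstration, not the paper's derivation): with a Gaussian kernel, the OCSVM decision value is a weighted sum of kernels centred at the support vectors, so after normalization it can be compared directly with an ordinary KDE on the same data. The nu and gamma values are arbitrary.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1))

ocsvm = OneClassSVM(kernel="rbf", nu=0.1, gamma=2.0).fit(X)
grid = np.linspace(-3, 3, 200).reshape(-1, 1)

# weighted kernel sum induced by the OCSVM (dual coefficients act as kernel weights)
alphas = ocsvm.dual_coef_.ravel()
sv = ocsvm.support_vectors_
ocsvm_density = np.exp(-2.0 * (grid - sv.T) ** 2) @ alphas
ocsvm_density /= np.trapz(ocsvm_density, grid.ravel())    # normalize to integrate to one

kde_density = gaussian_kde(X.ravel())(grid.ravel())        # ordinary KDE for comparison
```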
Identifying precise egg attachment areas and tracking trends of spawning magnitude (the total amount of spawned eggs) are critical for accurate habitat assessment and effective conservation efforts, especially for lithophilic spawning fishes. However, accurate measurement of spawning conditions across both spatial and temporal dimensions poses significant challenges. We conducted a fourteen-year field study below the Gezhouba Dam, the main spawning ground for the Chinese sturgeon, using the kernel density estimation (KDE) method and catch per unit of effort (CPUE) to refine knowledge of egg attachment areas relative to previous assessments. In addition, our analysis documented shifts in spawning locations within these four areas over the past fourteen years, revealing a worrying trend of decreasing spawning magnitude. This approach not only enabled the incorporation of the density distribution of eggs into the assessment of spawning magnitude trends, but also underscored the potential of KDE as a framework for identifying egg attachment areas and estimating spawning magnitude trends. Our results provide valuable insights into the spawning degradation of the Chinese sturgeon and inform conservation strategies to protect their fragile spawning grounds.
Assume that $f_n$ is the nonparametric kernel density estimator of directional data based on a kernel function $K$ and a sequence of independent and identically distributed random variables taking values in the $d$-dimensional unit sphere $S^{d-1}$. We establish that the large deviation principle for $\{\sup_{x\in S^{d-1}}|f_n(x)-f_n(-x)|,\ n\ge 1\}$ holds if the kernel function has bounded variation and the density function $f$ of the random variables is continuous and symmetric.
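For concreteness, a commonly used form of the directional kernel density estimator on the unit sphere is written out below; this is an assumption about which estimator $f_n$ denotes, since the abstract does not spell out its exact normalization.

```latex
% A standard directional kernel density estimator (assumed form):
f_n(x) \;=\; \frac{c_{h,K}}{n}\sum_{i=1}^{n} K\!\left(\frac{1 - x^{\top}X_i}{h_n^{2}}\right),
\qquad x \in S^{d-1},
% where K is a kernel of bounded variation, h_n is the bandwidth, and c_{h,K}
% is the normalizing constant that makes f_n integrate to one over S^{d-1}.
```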
A kernel density estimator is proposed when the data are subject to censoring in the multivariate case. The asymptotic normality, strong convergence, and asymptotically optimal bandwidth which minimizes the mean square error of the estimator are studied.
Let {X_n, n ≥ 1} be a strictly stationary sequence of random variables which are either associated or negatively associated, and let f(·) be their common density. In this paper, the author shows a central limit theorem for a kernel estimate of f(·) under certain regularity conditions.
Beijing Xianyukou Hutong (hutong refers to a historical and cultural block in Chinese) occupies an important geographical location with a unique urban fabric, and after years of renewal and protection, the commercial space of Xianyukou Street has gained some recognition. Taking this commercial hutong in Beijing as an example, this article carries out spatial analysis using methods such as GIS kernel density estimation and space syntax after site investigation and research. Based on the street space problems found, the paper then puts forward strategies to improve and upgrade Xianyukou Street's commercial space and the businesses in Xianyukou Street and other similar hutong.
Mechanical properties are critical to the quality of hot-rolled steel pipe products. Accurately understanding the relationship between rolling parameters and mechanical properties is crucial for effective prediction and control. To address this, an industrial big data platform was developed to collect and process multi-source heterogeneous data from the entire production process, providing a complete dataset for mechanical property prediction. The adaptive bandwidth kernel density estimation (ABKDE) method was proposed to adjust the bandwidth dynamically based on data density. Combining long short-term memory neural networks with ABKDE offers robust prediction interval capabilities for mechanical properties. The proposed method was deployed in a large-scale steel plant and demonstrated superior prediction interval performance compared to lower upper bound estimation, mean variance estimation, and extreme learning machine-adaptive bandwidth kernel density estimation, achieving a prediction interval normalized average width of 0.37, a prediction interval coverage probability of 0.94, and the lowest coverage width-based criterion of 1.35. Notably, Shapley additive explanations (SHAP)-based analysis significantly improved the proposed model's credibility by providing a clear analysis of feature impacts.
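One way such interval capabilities arise is sketched below, with assumed names and a fixed-bandwidth gaussian_kde standing in for ABKDE: a KDE fitted to historical forecast errors supplies quantiles that widen a point forecast (e.g. from an LSTM) into a prediction interval at the chosen coverage.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_prediction_interval(point_forecasts, past_errors, alpha=0.1):
    """Shift point forecasts by the alpha/2 and 1-alpha/2 quantiles of the error density."""
    kde = gaussian_kde(past_errors)
    spread = 3 * past_errors.std()
    grid = np.linspace(past_errors.min() - spread, past_errors.max() + spread, 2000)
    cdf = np.cumsum(kde(grid))
    cdf /= cdf[-1]                                       # discrete CDF of the error KDE
    lo = grid[np.searchsorted(cdf, alpha / 2)]
    hi = grid[np.searchsorted(cdf, 1 - alpha / 2)]
    return point_forecasts + lo, point_forecasts + hi    # lower and upper interval bounds
```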
The development of digital construction management is an important initiative to promote the digital transformation of the construction industry, but regional differences in the development level of digital construction management in China have received relatively little attention at the industry level. In this paper, the combination assignment method, Dagum's Gini coefficient, and kernel density estimation are used to explore the regional differences in China's digital construction management development level and their dynamic evolution trends. The study finds that the overall development level in China's construction industry is on the rise but is still relatively low. The overall Gini coefficient has increased, which is mainly due to uneven development between regions. There are large development differences between the eastern region and the other three regions. The interregional Gini coefficients for the Central-Northeastern and Central-Western regions are both growing at a higher rate.
Urban air pollution has brought great trouble to physical and mental health, economic development, environmental protection, and other aspects. Predicting the changes and trends of air pollution can provide a scientific basis for governance and prevention efforts. In this paper, we propose an interval prediction method that considers the spatio-temporal characteristic information of PM2.5 signals from multiple stations. A K-nearest neighbor (KNN) algorithm interpolates the signals lost in the process of collection, transmission, and storage to ensure the continuity of the data. A graph generative network (GGN) is used to process time-series meteorological data with complex structures. The graph U-Nets framework is introduced into the GGN model to enhance its controllability over the graph generation process, which helps improve the efficiency and robustness of the model. In addition, sparse Bayesian regression is incorporated to mitigate the curse-of-dimensionality limitation of traditional kernel density estimation (KDE) interval prediction. With the support of the sparse strategy, sparse Bayesian regression kernel density estimation (SBR-KDE) is very efficient in processing high-dimensional, large-scale data. PM2.5 data of spring, summer, autumn, and winter from 34 air quality monitoring sites in Beijing verified the accuracy, generalization, and superiority of the proposed model in interval prediction.
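The gap-filling step admits a tiny illustration. The snippet below is a hedged sketch using scikit-learn's KNNImputer (not necessarily the paper's implementation); the station layout, the values, and the choice of k are invented.

```python
import numpy as np
from sklearn.impute import KNNImputer

# rows = time steps, columns = monitoring stations; NaNs mark lost PM2.5 readings
readings = np.array([[35.0, 40.0, np.nan],
                     [36.0, np.nan, 52.0],
                     [np.nan, 41.0, 50.0]])
filled = KNNImputer(n_neighbors=2).fit_transform(readings)   # continuous series for the models
```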
The reliable, rapid, and accurate Remaining Useful Life (RUL) prognostics of an aircraft power supply and distribution system are essential for enhancing the reliability and stability of the system and reducing life-cycle costs. To achieve reliable, rapid, and accurate RUL prognostics, the balance between accuracy and computational burden deserves more attention. In addition, uncertainty is intrinsically present in the RUL prognostic process. Because of the limitations of uncertainty quantification, a point-wise prognostics strategy is not trustworthy. A Dual Adaptive Sliding-window Hybrid (DASH) RUL probabilistic prognostics strategy is proposed to tackle these deficiencies. The DASH strategy contains two adaptive mechanisms: the adaptive Long Short-Term Memory-Polynomial Regression (LSTM-PR) hybrid prognostics mechanism and the adaptive sliding-window Kernel Density Estimation (KDE) probabilistic prognostics mechanism. Owing to the dual adaptive mechanisms, the DASH strategy can balance accuracy and computational burden and obtain trustworthy probabilistic prognostics. Based on a degradation dataset of aircraft electromagnetic contactors, the superiority of the DASH strategy is validated. In terms of probabilistic, point-wise, and integrated prognostics performance, the proposed strategy improves by 66.89%, 81.73%, and 25.84% on average compared with the baseline methods and their variants.
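The sliding-window KDE mechanism can be sketched in a few lines (an illustrative reconstruction with an assumed window length and interval level, not the DASH implementation): only the most recent window of point RUL predictions feeds a KDE, so the resulting distribution and credible band adapt as degradation evolves.

```python
import numpy as np
from scipy.stats import gaussian_kde

def sliding_window_rul_distribution(rul_predictions, window=30, level=0.9):
    recent = np.asarray(rul_predictions[-window:])      # most recent point predictions only
    kde = gaussian_kde(recent)                          # probabilistic RUL estimate
    lo, hi = np.quantile(recent, [(1 - level) / 2, (1 + level) / 2])
    return kde, (lo, hi)                                # density plus a crude credible band
```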
As a production quality index of the hematite grinding process, particle size (PS) is hard to measure in real time. To achieve PS estimation, this paper proposes a novel data-driven model of PS using a stochastic configuration network (SCN) with a robust technique, namely, the robust SCN (RSCN). Firstly, this paper proves the universal approximation property of the RSCN with the weighted least squares technique. Secondly, three robust algorithms are presented by employing M-estimation with the Huber loss function, M-estimation with the interquartile range (IQR), and the nonparametric kernel density estimation (NKDE) function, respectively, to set the penalty weights. Comparison experiments are first carried out on the UCI standard data sets to verify the effectiveness of these methods, and then the data-driven PS models based on the robust algorithms are established and verified. Experimental results show that the RSCN has excellent performance for PS estimation.
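One of the three weighting schemes listed above, Huber M-estimation, is sketched below as iteratively reweighted least squares that down-weights samples with large residuals when solving for output weights. The design matrix H stands for the hidden-layer outputs of a randomized network; the names are illustrative, not the paper's code.

```python
import numpy as np

def huber_irls(H, y, c=1.345, iters=20):
    """Robust output weights via Huber-weighted least squares (IRLS)."""
    w = np.ones(len(y))                                   # per-sample penalty weights
    beta = np.linalg.lstsq(H, y, rcond=None)[0]           # ordinary least-squares start
    for _ in range(iters):
        r = y - H @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)                  # Huber weights: outliers shrink
        W = np.sqrt(w)[:, None]
        beta = np.linalg.lstsq(W * H, W.ravel() * y, rcond=None)[0]
    return beta, w
```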