Funding: Supported by the Hunan Provincial Natural Science Foundation of China (2022JJ30103), "the 14th Five-Year" Key Disciplines and Application-Oriented Special Disciplines of Hunan Province (Xiangjiaotong [2022], No. 351), and the Science and Technology Innovation Program of Hunan Province (2016TP1020).
Abstract: Correlation power analysis (CPA) combined with genetic algorithms (GA) now achieves greater attack efficiency and can recover all subkeys simultaneously. However, two issues in GA-based CPA still need to be addressed: key degeneration and slow evolution within populations. These challenges significantly hinder key recovery. This paper proposes a screening correlation power analysis framework combined with a genetic algorithm, named SFGA-CPA, to address these issues. SFGA-CPA introduces three operations designed to exploit CPA characteristics: a propagative operation, constrained crossover, and constrained mutation. Firstly, the propagative operation accelerates population evolution by maximizing the number of correct bytes in each individual. Secondly, the constrained crossover and mutation operations effectively address key degeneration by preventing the compromise of correct bytes. Finally, an intelligent search method is proposed to identify optimal parameters, further improving attack efficiency. Experiments were conducted in both simulated environments and on real power traces collected from the SAKURA-G platform. In simulation, SFGA-CPA reduces the number of traces by 27.3% and 60% compared to CPA based on multiple screening methods (MS-CPA) and CPA based on a simple GA (SGA-CPA), respectively, at a 90% success rate. Moreover, real experimental results on the SAKURA-G platform demonstrate that our approach outperforms the other methods.
Funding: Supported by the Natural Science Foundation of Tianjin (No. 22JCQNJC01070), the National Natural Science Foundation of China (No. 42404079), and the Key Project of the Tianjin Earthquake Agency (No. Zd202402).
Abstract: Disaster mitigation necessitates scientific and accurate aftershock forecasting during the critical 2 h after an earthquake. However, this task faces immense challenges due to the lack of early postearthquake data and the unreliability of forecasts. To obtain foundational data on sequence parameters of the land-sea adjacent zone and to establish a reliable and operational aftershock forecasting framework, we combined initial sequence parameters extracted from envelope functions with small-earthquake information to construct a Bayesian algorithm for the early postearthquake stage. We performed parameter fitting, early postearthquake aftershock occurrence-rate forecasting, and effectiveness evaluation for 36 earthquake sequences with M ≥ 4.0 in the Bohai Rim region since 2010. The results show that, during the early stage after the mainshock, the sequence parameters exhibited relatively drastic fluctuations with significant errors. Integrating prior information can mitigate these fluctuations and reduce the errors. The initial and stable sequence parameters generally display favorable distribution characteristics: each parameter's distribution is relatively concentrated, with good symmetry and remarkable consistency. The sequence-parameter p-values were relatively small, indicating comparatively slow attenuation of significant earthquake events in the Bohai Rim region. A certain positive correlation was observed between the sequence parameters b and p. However, the sequence parameters are unrelated to the mainshock magnitude, which implies that their statistical characteristics and trends are universal. The Bayesian algorithm showed good forecasting capability for aftershocks in the early postearthquake period (2 h) in the Bohai Rim region, with an overall forecasting efficacy rate of 76.39%. The proportion of "too low" failures exceeded that of "too high" failures, and the number of forecasting failures for the next three days was greater than that for the next day.
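The sequence parameter p above is conventionally the decay exponent of the modified Omori (Omori-Utsu) law, under which the aftershock rate falls off as a power law and the expected count over a forecast window is the integral of that rate. A hedged sketch under that assumption; the symbols K, c, p and the functions are illustrative, not the paper's Bayesian fitting code:

```python
import math

def omori_rate(t, K, c, p):
    """Modified Omori (Omori-Utsu) aftershock rate at time t after the mainshock."""
    return K / (t + c) ** p

def expected_count(t1, t2, K, c, p):
    """Expected number of aftershocks in [t1, t2]: the integral of the rate,
    with the special case p = 1 handled via the logarithm."""
    if abs(p - 1.0) < 1e-12:
        return K * (math.log(t2 + c) - math.log(t1 + c))
    return K * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
```

Comparing a forecast count of this form with the observed count is one common way to score "too low" versus "too high" failures.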
Funding: This research was supported by the National Natural Science Foundation of China grants to Xu Chenwu (39900080, 30270724, and 30370758).
Abstract: Based on the major gene plus polygene mixed inheritance model for multiple correlated quantitative traits, the authors propose a new joint segregation analysis method for a major gene controlling multiple correlated quantitative traits, which includes major gene detection and estimation of its effect and variation. The effect and variation of the major gene are estimated by maximum likelihood, implemented via the expectation-maximization (EM) algorithm, and the major gene is tested with the likelihood ratio (LR) test statistic. Extensive simulation studies showed that joint analysis not only increases the statistical power of major gene detection but also improves the precision and accuracy of the major gene effect estimates. An example of plant height and tiller number in an F2 population of the rice cross Duonieai × Zhonghua 11 is used as an illustration. The results indicate that the genetic difference in these two traits in this cross is attributable to a single pleiotropic major gene. The additive and dominance effects of the major gene are estimated as -21.3 and 40.6 cm for plant height, and 22.7 and -25.3 for tiller number, respectively. The major gene shows overdominance for plant height and close to complete dominance for tiller number.
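Segregation analysis of this kind typically models a trait as a finite normal mixture whose components correspond to major-gene genotype classes, with the component parameters fitted by EM. A minimal EM sketch for a two-component mixture of one trait; this is illustrative only and not the authors' multi-trait joint likelihood:

```python
import numpy as np

def em_mixture(x, iters=200):
    """Fit a two-component normal mixture to trait values x by EM, returning
    component weights, means, and standard deviations."""
    mu = np.array([x.min(), x.max()], dtype=float)   # spread-out initial means
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior probability of each component for each observation
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (
            sigma * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: reweighted estimates of the mixture parameters
        w = r.mean(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))
    return w, mu, sigma
```

An LR test for a major gene would then compare the maximized mixture log-likelihood against that of a single normal distribution.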
Funding: Co-supported by the National Natural Science Foundation of China (Nos. 41804024, 41804026) and the Open Fund of the Shaanxi Key Laboratory of Integrated and Intelligent Navigation of China (No. SKLIIN-20190205).
Abstract: The Least Squares Residual (LSR) algorithm, one of the classical Receiver Autonomous Integrity Monitoring (RAIM) algorithms for Global Navigation Satellite Systems (GNSS), presents a high Missed Detection Risk (MDR) for a large-slope faulty satellite and a high False Alarm Risk (FAR) for a small-slope faulty satellite. From a theoretical analysis of the causes of the high MDR and FAR, the optimal slope is determined, and thereby the optimal test statistic for fault detection is conceived, which minimizes the FAR while keeping the MDR within its allowable value. To construct a test statistic approximating the optimal one, the Correlation-Weighted LSR (CW-LSR) algorithm is proposed. The CW-LSR test statistic remains the sum of squared pseudorange residuals, but the square for the most potentially faulty satellite, judged by correlation analysis between the pseudorange residual and the observation error, is weighted with an optimal-slope-based factor. It does not obey the same distribution as the optimal test statistic but has the same noncentrality parameter. The superior performance of the CW-LSR algorithm is verified via simulation: it reduces the FAR for a small-slope faulty satellite while keeping the MDR within its allowable value, and reduces the MDR for a large-slope faulty satellite at the expense of an increased FAR.
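The baseline LSR statistic named above is the sum of squared pseudorange residuals left after a least-squares position/clock solution. A minimal sketch of that statistic; the variable names are assumed, and the CW-LSR per-satellite weighting itself is omitted:

```python
import numpy as np

def lsr_statistic(G, rho):
    """Classic LSR RAIM test statistic: solve the linearized least-squares
    problem G x = rho (G: geometry matrix, rho: pseudorange residuals w.r.t.
    the linearization point) and return the squared norm of the residual."""
    x_hat, *_ = np.linalg.lstsq(G, rho, rcond=None)
    v = rho - G @ x_hat          # residual vector after the LS fit
    return float(v @ v)
```

In fault-free conditions the residual vector is small and the statistic stays below a chi-square threshold; a faulty pseudorange inflates it.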
Funding: Supported by the National Natural Science Foundation of China (40971275, 50811120111).
Abstract: Based on a spatio-temporal correlation analysis method, automatic identification techniques for anomalies in gas monitoring data from coal mining working faces are presented. The asynchronous correlation characteristics of gas migration along the working face airflow direction are qualitatively analyzed. A calculation method for the asynchronous-correlation delay step is given, together with prediction and inversion formulas for the gas concentration varying over time and space after a gas emission in the air return roadway. By analyzing one hundred and fifty groups of theoretically correlated gas sensor data series from a coal mine, the range of correlation coefficient values for eight kinds of data anomaly is obtained. A gas monitoring data anomaly identification algorithm based on spatio-temporal correlation analysis is then presented accordingly. To improve the efficiency of the analysis, gas sensor coding rules that express the spatial topological relations are suggested. The experiments indicate that the methods presented in this article can effectively compensate for the shortcomings of methods based on monitoring data from a single gas sensor.
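The asynchronous-correlation delay step can be illustrated as the lag at which a downstream sensor series best correlates with an upstream one. A sketch under that reading; the function and the two-sensor setup are assumptions, not the authors' exact formulas:

```python
import numpy as np

def best_delay(upstream, downstream, max_lag):
    """Return the lag (in samples) maximizing the correlation between an
    upstream gas series and a lagged downstream series, plus that correlation."""
    best, best_r = 0, -2.0
    for lag in range(max_lag + 1):
        a = upstream[: len(upstream) - lag]   # upstream values ...
        b = downstream[lag:]                  # ... aligned with later downstream values
        r = np.corrcoef(a, b)[0, 1]
        if r > best_r:
            best, best_r = lag, r
    return best, best_r
```

A measured series that breaks this expected delayed correlation with its neighbor is a candidate anomaly.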
Abstract: Improved picture quality is critical to the effectiveness of object recognition and tracking. The consistency of night-video imagery is degraded by the low contrast between high-profile objects and their surroundings under atmospheric conditions such as mist, fog, and dust; the pictures then shift in intensity, colour, polarity, and consistency. A general challenge for computer vision analysis lies in the poor appearance of night images under arbitrary illumination and ambient environments. In recent years, with the exponential growth of computing capability, target recognition techniques based on deep learning and machine learning have become standard algorithms for object detection. However, identifying objects at night poses further problems because of the cluttered backdrop and dim light. A correlation-aware LSTM-based YOLO (You Only Look Once) classifier for exact object recognition and property determination under night vision was the major inspiration for this work. To create virtual target sets similar to daytime environments, we employ night images as inputs and obtain enhanced images using histogram-based enhancement and an iterative Wiener filter to remove noise. Feature extraction and feature selection elect the potential features using adaptive internal linear embedding (AILE) and uplift linear discriminant analysis (ULDA). The region-of-interest mask is segmented using recurrent phase-level set segmentation. Finally, we use deep convolutional feature fusion and region-of-interest pooling to integrate a Long Short-Term Memory (LSTM) network with the YOLO method for object tracking. A range of experimental findings demonstrates that our technique achieves high average accuracy, with a precision of 99.7% for object detection on the SSAN dataset, considerably higher than that of other standard object detection mechanisms. Our approach may therefore satisfy the real demands of night-scene target detection applications, and we believe our method will help future research.
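The histogram-based enhancement step mentioned above is, in its plainest form, histogram equalization: remapping intensities through the normalized cumulative histogram to stretch contrast in dark frames. A minimal sketch for an 8-bit grayscale image; the paper does not specify its exact variant, so this is an assumption:

```python
import numpy as np

def equalize(img):
    """Histogram equalization for an 8-bit grayscale image (uint8 array).
    Builds a lookup table from the cumulative histogram and applies it."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                       # first occupied intensity level
    lut = np.round((cdf - cdf_min) / (img.size - cdf_min) * 255).astype(np.uint8)
    return lut[img]
```

A low-contrast night frame whose values cluster in a narrow band comes out spanning the full 0-255 range.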
Funding: Supported by the Science and Technology Project of China Southern Power Grid (GZHKJXM20210043-080041KK52210002).
Abstract: Traditional distribution network planning relies on the professional knowledge of planners, especially when analyzing the correlations between problems existing in the network and the crucial influencing factors. The inherent patterns reflected in the historical data of the distribution network are ignored, which affects the objectivity of the planning scheme. In this study, to improve the efficiency and accuracy of distribution network planning, the characteristics of distribution network data were extracted using a data-mining technique, and correlation knowledge about existing problems in the network was obtained. A data-mining model based on association rules was established. The inputs to the model were electrical characteristic indices screened using the gray correlation method. The Apriori algorithm was used to extract correlation knowledge from the operational data of the distribution network and obtain strong association rules. Lift (degree of promotion) and chi-square tests were used to verify the rationality of the strong association rules output by the model. The correlation between heavy-load or overload problems of distribution network feeders in different regions and the related characteristic indices was determined, and the confidence of the association rules was obtained. These results can provide an effective basis for formulating a distribution network planning scheme.
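The rule-quality measures used above, support, confidence, and lift (degree of promotion), can be computed directly from transaction data; a lift above 1 means the antecedent genuinely promotes the consequent. A minimal sketch with hypothetical feeder-condition item sets (the item names are illustrative, not the paper's indices):

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence, and lift of the rule antecedent -> consequent over
    a list of transactions, each a set of items."""
    n = len(transactions)
    a = sum(1 for t in transactions if antecedent <= t)            # antecedent count
    c = sum(1 for t in transactions if consequent <= t)            # consequent count
    both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = both / n
    confidence = both / a if a else 0.0
    lift = confidence / (c / n) if c else 0.0                      # degree of promotion
    return support, confidence, lift
```

Apriori would first mine the frequent item sets; these metrics then filter the candidate rules down to strong ones.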
Funding: A project of the Ninth Five-Year Plan of the Aeronautical Industrial Corporation.
Abstract: In most real operational conditions, only response data are measurable while the actual excitations are unknown, so modal parameters must be extracted from responses alone. This paper gives a theoretical formulation for the cross-correlation functions and cross-power spectra between outputs under the assumption of white-noise excitation. It widens the field of modal analysis under ambient excitation because many classical methods based on impulse response functions or frequency response functions can then be applied directly under unknown excitation. The Polyreference Complex Exponential method and the Eigensystem Realization Algorithm, using cross-correlation functions in the time domain, and the Orthogonal Polynomial method, using cross-power spectra in the frequency domain, are applied to a steel frame to extract modal parameters under operational conditions. The modal properties of the steel frame obtained from these three methods are compared with those from frequency response function analysis. The results show that the modal analysis methods using cross-correlation functions or cross-power spectra presented in this paper can extract modal parameters efficiently under unknown excitation.
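The central object above is the cross-correlation function between two response channels, which under white-noise excitation behaves like an impulse response and can therefore feed time-domain identification methods such as ERA. A sketch of an unbiased lag-domain estimate; this is illustrative, not the paper's formulation:

```python
import numpy as np

def cross_correlation(y_ref, y_i, n_lags):
    """Unbiased estimate of the cross-correlation function R[k] between a
    reference output y_ref and another output y_i, for lags 0..n_lags-1."""
    n = len(y_ref)
    return np.array([
        np.dot(y_ref[: n - k], y_i[k:]) / (n - k)   # average of lagged products
        for k in range(n_lags)
    ])
```

The cross-power spectrum used by the frequency-domain method is the Fourier transform of this same function.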
Abstract: In a Digital Particle Image Velocimetry (DPIV) system, the correlation of digital images is normally used to acquire the displacement information of particles and to estimate the flow field. The accuracy and robustness of the correlation algorithm directly affect the validity of the analysis result. In this article, an improved correlation-analysis algorithm is proposed that optimizes the selection of the correlation window, analysis area, and search path. This algorithm not only greatly reduces the amount of calculation but also effectively improves the accuracy and reliability of the correlation analysis. The algorithm was demonstrated to be accurate and efficient in measuring the velocity field in a flocculation pool.
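The core operation behind DPIV correlation analysis is locating the cross-correlation peak between two interrogation windows taken from successive frames. A minimal FFT-based sketch; window selection and the paper's optimized search path are omitted, and the function name is illustrative:

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Estimate the integer-pixel displacement of win_b relative to win_a by
    locating the peak of their FFT-based circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates into signed displacements
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))
```

Real DPIV codes refine this integer estimate to sub-pixel accuracy, e.g. with a Gaussian fit around the peak.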