Funding: Supported by the National Natural Science Foundation of China (11075013, 11375025).
Abstract: A strong and stable correlation in quantum information is of high quality for quantum information processing. We define two quantities, the selective average correlation and the ripple coefficient, to evaluate the quality of correlation in quantum information over a time interval. As a new class of communication channel, Heisenberg spin chains are widely investigated. We take a two-qubit Heisenberg XXZ spin chain with Dzyaloshinskii-Moriya interaction in an inhomogeneous magnetic field as an example, and use the two quantities to evaluate the quality of the correlation in quantum information under different measures. The results show that, if the time evolutions are similar, evaluating only one of the measures suffices to determine when the correlation is of high quality for quantum information processing.
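The abstract does not reproduce the definitions of the two quantities, but a natural reading is that the selective average correlation is a time average of a chosen correlation measure over the interval, and the ripple coefficient quantifies its relative fluctuation. The sketch below illustrates only this reading; the function correlation_quality and both formulas are assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np

def correlation_quality(C):
    """Evaluate a sampled correlation time series C on an interval.

    Hypothetical definitions: the selective average correlation is taken
    here as the time average of C, and the ripple coefficient as the
    relative size of its fluctuations around that average.
    """
    avg = C.mean()                                  # time-averaged correlation
    ripple = C.std() / avg if avg > 0 else np.inf   # relative fluctuation
    return avg, ripple

# Example: a concurrence-like signal oscillating around 0.8
t = np.linspace(0.0, 10.0, 1000)
C = 0.8 + 0.05 * np.sin(2.0 * np.pi * t)
avg, ripple = correlation_quality(C)
print(f"average correlation ~ {avg:.3f}, ripple coefficient ~ {ripple:.3f}")
```

Under this reading, a high average together with a small ripple marks a time interval in which the correlation is both strong and stable.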
Funding: Supported by the Fundamental Research Fund for the Central Universities (NS2013016).
Abstract: This paper presents a direct and fast acquisition algorithm for the civilian long-length (CL) codes in the L2 civil (L2C) signal. The proposed algorithm simultaneously reduces the number of fast Fourier transform (FFT) correlations through a hyper-code technique and the number of points in each FFT correlation by using an averaging correlation method. To validate the acquisition performance, the paper applies the algorithm to real L2C signals collected by the global positioning system (GPS) L2C intermediate frequency (IF) signal sampler SIS100L2C. The acquisition results show that the proposed modified algorithm can acquire the code phase accurately with less computation, and its acquisition performance is better than that of the single hyper-code method.
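As an illustration of the second reduction idea, the sketch below combines a standard FFT circular correlation with a pre-averaging step that shrinks each FFT from N points to N/m points. The hyper-code construction itself is not reproduced here; the segment length 4096, the averaging factor m = 4, and the random stand-in code are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

def fft_circular_correlation(x, c):
    """Circular correlation of received samples x with a local code c via FFT.

    A minimal sketch of the standard FFT correlation step.
    """
    X = np.fft.fft(x)
    Cc = np.fft.fft(c)
    return np.abs(np.fft.ifft(X * np.conj(Cc)))

def average_then_correlate(x, c, m):
    """Average m adjacent samples before correlating, shrinking each FFT
    from N points to N/m points (hypothetical parameter m)."""
    xa = x.reshape(-1, m).mean(axis=1)
    ca = c.reshape(-1, m).mean(axis=1)
    return fft_circular_correlation(xa, ca)

# Example: estimate the code phase of a delayed, noisy replica
rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=4096)   # stand-in for a CL code segment
rx = np.roll(code, 1500) + 0.5 * rng.standard_normal(4096)
m = 4
phase = int(np.argmax(average_then_correlate(rx, code, m))) * m
print("estimated code phase:", phase)       # true delay is 1500
```

The averaging trades some correlation-peak sharpness for a fourfold smaller FFT, which is the computational saving the abstract refers to.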
Funding: Supported by a project entitled Loess Plateau Region-Watershed-Slope Geological Hazard Multi-Scale Collaborative Intelligent Early Warning System of the National Key R&D Program of China (2022YFC3003404), a project of the Shaanxi Youth Science and Technology Star (2021KJXX-87), and public welfare geological survey projects of the Shaanxi Institute of Geologic Survey (20180301, 201918, 202103, and 202413).
Abstract: This study investigated the impacts of random negative training datasets (NTDs) on the uncertainty of machine learning models for geologic hazard susceptibility assessment of the Loess Plateau, northern Shaanxi Province, China. Based on 40 randomly generated NTDs, the study developed models for geologic hazard susceptibility assessment using the random forest algorithm and evaluated their performance using the area under the receiver operating characteristic curve (AUC). Specifically, the means and standard deviations of the AUC values from all models were used to assess the overall spatial correlation between the conditioning factors and the susceptibility assessment, as well as the uncertainty introduced by the NTDs. A risk-and-return methodology was then employed to quantify and mitigate the uncertainty, with log odds ratios used to characterize the susceptibility assessment levels. The risk and return values were calculated from the standard deviations and means, respectively, of the log odds ratios at various locations. After the mean log odds ratios were converted back into probability values, the final susceptibility map was plotted, which accounts for the uncertainty induced by random NTDs. The results indicate that the AUC values of the models ranged from 0.810 to 0.963, with an average of 0.852 and a standard deviation of 0.035, indicating encouraging predictive performance together with a degree of uncertainty. The risk-and-return analysis reveals that low-risk, high-return areas are those with lower standard deviations and higher means across the multiple model-derived assessments. Overall, this study introduces a new framework for quantifying the uncertainty of multiple training and evaluation models, aimed at improving their robustness and reliability. Additionally, by identifying low-risk, high-return areas, resource allocation for geologic hazard prevention and control can be optimized, ensuring that limited resources are directed toward the most effective prevention and control measures.
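A compact way to see the risk-and-return step described above: stack the susceptibility probabilities predicted by the models trained on different NTDs, move to log-odds space, and read the per-location mean as return and the per-location standard deviation as risk. The sketch below follows that description; the array shapes, the example data, and the sigmoid back-transform are illustrative choices, not code from the study.

```python
import numpy as np

def risk_return(P, eps=1e-6):
    """Risk-and-return summary of an ensemble of susceptibility maps.

    P has shape (n_models, n_locations); each row holds one model's
    predicted susceptibility probabilities. Assumed construction: the
    mean log odds ratio is the return, its standard deviation is the
    risk, and the mean log odds is mapped back to a probability.
    """
    P = np.clip(P, eps, 1.0 - eps)
    L = np.log(P / (1.0 - P))               # log odds ratios
    ret, risk = L.mean(axis=0), L.std(axis=0)
    p_final = 1.0 / (1.0 + np.exp(-ret))    # final susceptibility value
    return risk, ret, p_final

# Example: 40 models (one per random NTD) over 5 locations
rng = np.random.default_rng(0)
P = np.clip(rng.normal(0.6, 0.1, size=(40, 5)), 0.01, 0.99)
risk, ret, p_final = risk_return(P)
low_risk_high_return = (risk < np.median(risk)) & (ret > np.median(ret))
print("final probabilities:", np.round(p_final, 3))
print("low-risk, high-return locations:", np.flatnonzero(low_risk_high_return))
```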
Funding: Supported by the National Natural Science Foundation of China (No. 61871400) and the Natural Science Foundation of Jiangsu Province of China (No. BK20171401).
Abstract: In wireless sensor networks (WSNs), the performance of related applications is highly dependent on the quality of the collected data. Unfortunately, missing data are almost inevitable during data acquisition and transmission. Existing methods often rely on prior information, such as low-rank structure or spatiotemporal correlation, when recovering missing WSN data. However, in realistic application scenarios, it is very difficult to obtain such prior information from incomplete data sets. Therefore, we aim to recover missing WSN data effectively without relying on prior information. By designing a measurement matrix that captures the positions of the missing data, together with a sparse representation matrix, a compressive sensing (CS) based missing-data recovery model is established. Then, we design a comparison criterion to select the best sparse representation basis and introduce the average cross-correlation to examine the rationality of the established model. Furthermore, an improved fast matching pursuit algorithm is proposed to solve the model. Simulation results show that the proposed method can effectively recover the missing WSN data.
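To make the recovery pipeline concrete, here is a minimal sketch under stated assumptions: the measurement matrix is the identity with the rows of the missing samples deleted, the sparse basis is an orthonormal DCT, and plain orthogonal matching pursuit stands in for the paper's improved fast matching pursuit algorithm. The simulated signal, the sparsity level k = 8, and all sizes are illustrative.

```python
import numpy as np

def dct_basis(n):
    """Orthonormal inverse-DCT basis: x = Psi @ s."""
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    Psi = np.cos(np.pi * (2 * i + 1) * j / (2 * n))
    Psi[:, 0] /= np.sqrt(2.0)
    return Psi * np.sqrt(2.0 / n)

def omp(A, y, k):
    """Orthogonal matching pursuit: find a k-sparse s with y ~ A @ s."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    s = np.zeros(A.shape[1])
    s[support] = coef
    return s

# Example: recover a smooth sensor series from 96 of its 128 samples
n = 128
t = np.linspace(0.0, 1.0, n)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)
observed = np.sort(np.random.default_rng(1).choice(n, size=96, replace=False))
Phi = np.eye(n)[observed]        # measurement matrix: keeps observed samples
Psi = dct_basis(n)
s_hat = omp(Phi @ Psi, x[observed], k=8)
x_hat = Psi @ s_hat              # recovered series, missing entries filled in
print("recovery RMSE:", np.sqrt(np.mean((x - x_hat) ** 2)))
```

The key point the abstract makes is that Phi is built only from the known positions of the missing samples, so no low-rank or spatiotemporal prior about the data itself is needed.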
Funding: Supported by a grant from the Natural Science Foundation of China (31971029) and by a grant from the Department of Education (R305D210023).
Abstract: Most existing studies on the relationship between factor analysis (FA) and principal component analysis (PCA) focus on approximating the common factors by the first few components via the closeness between their loadings. Based on a setup in Bentler and de Leeuw (Psychometrika 76:461-470, 2011), this study examines the relationship between FA loadings and PCA loadings when specificities are treated as latent factors. In particular, we examine the closeness between the two types of loadings as the number of observed variables (p) increases. Parallel to the development in Schneeweiss (Multivar Behav Res 32:375-401, 1997), an average squared canonical correlation (ASCC) is used as the criterion for measuring the closeness. We show that the ASCC can be partitioned into two parts, the first of which is a function of the FA loadings and the inverse correlation matrix, and the second of which is a function of the unique variances and the inverse correlation matrix of the observed variables. We examine the behavior of these two parts as p approaches infinity. The study offers a different perspective on the relationship between PCA and FA, and the results add insights into the selection between the two types of methods in the analysis of high-dimensional data.
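The ASCC criterion itself is straightforward to compute: orthonormalize the two loading matrices and average the squared singular values of their cross-product, which are the squared canonical correlations between the two column spaces. The sketch below assumes this standard construction; the simulated loadings, the dimensions p = 50 and k = 3, and the covariance-like matrix are illustrative, not taken from the paper.

```python
import numpy as np

def ascc(A, B):
    """Average squared canonical correlation between the column spaces
    of two p x k loading matrices A and B."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    sv = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.mean(sv ** 2)   # squared canonical correlations, averaged

# Example: compare hypothetical FA loadings with the first k PCA loadings
# derived from the covariance structure those loadings imply.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 3))                   # FA loadings, p=50, k=3
R = L @ L.T + np.diag(rng.uniform(0.2, 1.0, 50))   # loadings plus unique variances
vals, vecs = np.linalg.eigh(R)                     # eigenvalues in ascending order
pca_load = vecs[:, -3:] * np.sqrt(vals[-3:])       # first three PCA loadings
print("ASCC:", ascc(L, pca_load))
```

An ASCC near 1 indicates that the PCA loadings nearly span the same space as the FA loadings, which is the sense of closeness the abstract studies as p grows.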