Journal Articles
5 articles found
1. Quality of correlation in quantum information
Authors: 崔景云, 曾天海, 娄乐生. Journal of Beijing Institute of Technology, EI CAS, 2016, No. 1, pp. 115-119 (5 pages)
A strong and stable correlation in quantum information is of high quality for quantum information processing. We define two quantities, the selective average correlation and the ripple coefficient, to evaluate the quality of correlation in quantum information over a time interval. As a new kind of communication channel, Heisenberg spin chains have been widely investigated. We select a two-qubit Heisenberg XXZ spin chain with Dzyaloshinskii-Moriya interaction in an inhomogeneous magnetic field as an example, and use the two quantities to evaluate the quality of the correlation in quantum information under different measures. The result shows that, if the time evolutions are similar, only one of the quantities needs to be evaluated to determine when the correlation has high quality for quantum information processing.
Keywords: correlation in quantum information; quantum information processing; selective average correlation; ripple coefficient
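The two quality measures above can be sketched numerically. This is an illustrative stand-in, assuming the selective average correlation is a plain time average and the ripple coefficient is the relative fluctuation (standard deviation over mean), by analogy with the electrical ripple factor; the paper's exact definitions may differ.

```python
import numpy as np

def correlation_quality(c):
    """Return (average, ripple) for a sampled correlation time series."""
    c = np.asarray(c, dtype=float)
    avg = float(np.mean(c))           # average correlation over the interval
    ripple = float(np.std(c) / avg)   # relative fluctuation around the average
    return avg, ripple

# Two hypothetical correlation time evolutions over the same interval
t = np.linspace(0.0, 10.0, 1001)
stable = 0.9 + 0.01 * np.sin(t)      # strong, nearly constant correlation
wobbly = 0.5 + 0.40 * np.sin(5 * t)  # weaker, strongly oscillating correlation

avg_s, rip_s = correlation_quality(stable)
avg_w, rip_w = correlation_quality(wobbly)
```

A high average with a low ripple marks the high-quality regime; the stable trace above scores better on both counts.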
2. Fast acquisition of L2C CL codes based on combination of hyper codes and averaging correlation (cited by 1)
Authors: Qingxi Zeng, Linlin Tang, Pengna Zhang, Ling Pei. Journal of Systems Engineering and Electronics, SCIE EI CSCD, 2016, No. 2, pp. 308-318 (11 pages)
This paper provides a direct and fast acquisition algorithm for civilian long length (CL) codes in the L2 civil (L2C) signal. The proposed algorithm simultaneously reduces the number of fast Fourier transform (FFT) correlations through the hyper code technique and the number of points in every FFT correlation by using an averaging correlation method. To validate the acquisition performance, the paper applies the algorithm to real L2C signals collected by the global positioning system (GPS) L2C intermediate frequency (IF) signal sampler SIS100L2C. The acquisition results show that the proposed modified algorithm can acquire the code phase accurately with less computation, and that its acquisition performance is better than that of the single hyper code method.
Keywords: L2C; civilian long length (CL) code; fast acquisition; hyper code; averaging correlation
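The averaging correlation idea, shrinking the FFT length by averaging blocks of consecutive samples before correlating, can be sketched as follows. This is a toy illustration with a random ±1 code and a noiseless shift; the hyper code stage, Doppler search, and the real L2C CL code structure are omitted.

```python
import numpy as np

def average_blocks(x, r):
    """Average every r consecutive samples into one, shrinking the FFT size r-fold."""
    n = (len(x) // r) * r
    return x[:n].reshape(-1, r).mean(axis=1)

def fft_correlate(a, b):
    """Circular cross-correlation via FFT (the search over all code phases at once)."""
    return np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real

# Toy example: find the phase of a shifted pseudo-random code.
rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=4096)
shift = 1200
received = np.roll(code, shift)

r = 4                                  # averaging factor: 4096-pt FFT -> 1024-pt FFT
corr = fft_correlate(average_blocks(received, r), average_blocks(code, r))
est = int(np.argmax(corr)) * r         # coarse code phase, resolved to within r samples
```

The averaged correlation still peaks at the correct block, so the code phase is recovered at a quarter of the FFT cost; a short fine search around `est` would then pin down the exact sample.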
3. Impacts of random negative training datasets on machine learning-based geologic hazard susceptibility assessment
Authors: Hao Cheng, Wei Hong, Zhen-kai Zhang, Zeng-lin Hong, Zi-yao Wang, Yu-xuan Dong. China Geology, 2025, No. 4, pp. 676-690 (15 pages)
This study investigated the impacts of random negative training datasets (NTDs) on the uncertainty of machine learning models for geologic hazard susceptibility assessment of the Loess Plateau, northern Shaanxi Province, China. Based on 40 randomly generated NTDs, the study developed models for geologic hazard susceptibility assessment using the random forest algorithm and evaluated their performance using the area under the receiver operating characteristic curve (AUC). Specifically, the means and standard deviations of the AUC values from all models were used to assess the overall spatial correlation between the conditioning factors and the susceptibility assessment, as well as the uncertainty introduced by the NTDs. A risk and return methodology was then employed to quantify and mitigate the uncertainty, with log odds ratios used to characterize the susceptibility assessment levels. The risk and return values were calculated from the standard deviations and means, respectively, of the log odds ratios at various locations. After the mean log odds ratios were converted into probability values, the final susceptibility map was plotted, which accounts for the uncertainty induced by the random NTDs. The results indicate that the AUC values of the models ranged from 0.810 to 0.963, with an average of 0.852 and a standard deviation of 0.035, indicating encouraging predictive performance along with a degree of uncertainty. The risk and return analysis reveals that low-risk, high-return areas are those with lower standard deviations and higher means across the multiple model-derived assessments. Overall, this study introduces a new framework for quantifying the uncertainty of multiple training and evaluation models, aimed at improving their robustness and reliability. Additionally, by identifying low-risk, high-return areas, resource allocation for geologic hazard prevention and control can be optimized, ensuring that limited resources are directed toward the most effective prevention and control measures.
Keywords: landslides; debris flows; collapses; ground fissures; geologic hazard prevention and control; engineering; geologic hazard susceptibility assessment; negative training dataset; average spatial correlation; random forest algorithm; risk and return analysis; geological survey engineering; Loess Plateau area
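The risk and return step, taking per-cell means and standard deviations of log odds ratios across models and converting the mean back to a probability, can be sketched as follows. The probability maps are hypothetical; the paper's random forest models and map cells are not reproduced.

```python
import numpy as np

def risk_return_map(prob_maps, eps=1e-6):
    """Combine susceptibility maps from models trained on different random NTDs.

    prob_maps: (n_models, n_cells) array of predicted probabilities.
    Returns per-cell risk (std of log odds), return (mean of log odds),
    and a final probability map derived from the mean log odds.
    """
    p = np.clip(np.asarray(prob_maps, dtype=float), eps, 1.0 - eps)
    log_odds = np.log(p / (1.0 - p))
    risk = log_odds.std(axis=0)                 # disagreement across models
    ret = log_odds.mean(axis=0)                 # consensus susceptibility level
    final_prob = 1.0 / (1.0 + np.exp(-ret))     # mean log odds back to probability
    return risk, ret, final_prob

# Three hypothetical models scoring four map cells
maps = np.array([
    [0.90, 0.10, 0.55, 0.50],
    [0.85, 0.15, 0.45, 0.50],
    [0.95, 0.05, 0.60, 0.50],
])
risk, ret, prob = risk_return_map(maps)
```

Cells where the models agree on a high probability come out as low-risk, high-return; cells where they disagree keep a high risk value even if the mean is moderate.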
4. A Practical Approach for Missing Wireless Sensor Networks Data Recovery
Authors: Song Xiaoxiang, Guo Yan, Li Ning, Ren Bing. China Communications, SCIE CSCD, 2024, No. 5, pp. 202-217 (16 pages)
In wireless sensor networks (WSNs), the performance of related applications is highly dependent on the quality of the collected data. Unfortunately, missing data are almost inevitable during data acquisition and transmission. Existing methods often rely on prior information, such as low-rank characteristics or spatiotemporal correlation, when recovering missing WSNs data. However, in realistic application scenarios, it is very difficult to obtain such prior information from incomplete data sets. Therefore, we aim to recover the missing WSNs data effectively without depending on prior information. By designing a measurement matrix that captures the positions of the missing data together with a sparse representation matrix, a compressive sensing (CS) based missing data recovery model is established. We then design a comparison standard for selecting the best sparse representation basis and introduce the average cross-correlation to examine the rationality of the established model. Furthermore, an improved fast matching pursuit algorithm is proposed to solve the model. Simulation results show that the proposed method can effectively recover the missing WSNs data.
Keywords: average cross-correlation; matching pursuit; missing data; wireless sensor networks
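The CS recovery setup, where the measurement matrix simply selects the observed readings and a sparse basis represents the trace, can be sketched with standard orthogonal matching pursuit as a stand-in for the paper's improved fast matching pursuit. The DCT basis and the toy sensor trace are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def dct_basis(n):
    """Orthonormal DCT-II matrix; its columns serve as the sparse representation basis."""
    k = np.arange(n)
    psi = np.sqrt(2.0 / n) * np.cos(np.pi * (k[:, None] + 0.5) * k[None, :] / n)
    psi[:, 0] /= np.sqrt(2.0)
    return psi

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: greedily build a sparse coefficient vector."""
    support, resid = [], y.copy()
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(A.T @ resid))))  # most correlated atom
        sub = A[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)       # refit on the support
        resid = y - sub @ coef
    s = np.zeros(A.shape[1])
    s[support] = coef
    return s

# Toy sensor trace: sparse in the DCT domain, 40% of readings missing.
n = 128
psi = dct_basis(n)
s_true = np.zeros(n)
s_true[[2, 7, 15]] = [5.0, -3.0, 2.0]
x = psi @ s_true

rng = np.random.default_rng(1)
observed = np.sort(rng.choice(n, size=int(0.6 * n), replace=False))
A = psi[observed, :]                   # measurement matrix: rows at observed positions
s_hat = omp(A, x[observed], sparsity=3)
x_hat = psi @ s_hat                    # recovered full trace, missing entries filled in
```

Because the measurement matrix is just a row selection of the basis, no prior low-rank or spatiotemporal model is needed; only sparsity in the chosen basis is assumed.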
5. On the Relationship Between Factor Loadings and Component Loadings When Latent Traits and Specificities are Treated as Latent Factors
Authors: Kentaro Hayashi, Ke-Hai Yuan, Peter M. Bentler. Fudan Journal of the Humanities and Social Sciences, 2025, No. 1, pp. 1-15 (15 pages)
Most existing studies on the relationship between factor analysis (FA) and principal component analysis (PCA) focus on approximating the common factors by the first few components via the closeness between their loadings. Based on a setup in Bentler and de Leeuw (Psychometrika 76:461-470, 2011), this study examines the relationship between FA loadings and PCA loadings when specificities are treated as latent factors. In particular, we examine the closeness between the two types of loadings as the number of observed variables (p) increases. Parallel to the development in Schneeweiss (Multivar Behav Res 32:375-401, 1997), an average squared canonical correlation (ASCC) is used as the criterion for measuring the closeness. We show that the ASCC can be partitioned into two parts, the first of which is a function of the FA loadings and the inverse correlation matrix, and the second of which is a function of the unique variances and the inverse correlation matrix of the observed variables. We examine the behavior of these two parts as p approaches infinity. The study offers a different perspective on the relationship between PCA and FA, and the results add insights into the choice between the two types of methods in the analysis of high-dimensional data.
Keywords: factor analysis; principal component analysis; average squared canonical correlation; Woodbury identity; Kaiser-Meyer-Olkin measure of sampling adequacy
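An ASCC between two loading matrices can be computed from the singular values of the product of their orthonormalized column bases. This minimal sketch illustrates the general closeness criterion only, not the paper's specific partition into the two parts.

```python
import numpy as np

def ascc(L1, L2):
    """Average squared canonical correlation between the column spaces of two
    loading matrices; 1.0 means the two sets of loadings span the same space."""
    q1, _ = np.linalg.qr(L1)                              # orthonormal basis of col(L1)
    q2, _ = np.linalg.qr(L2)                              # orthonormal basis of col(L2)
    svals = np.linalg.svd(q1.T @ q2, compute_uv=False)    # canonical correlations
    return float(np.mean(svals ** 2))

# Sanity checks: a rotated copy spans the same space (ASCC = 1),
# while orthogonal columns share nothing (ASCC = 0).
rng = np.random.default_rng(0)
L = rng.standard_normal((10, 2))
rot = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotation within the same span
same = ascc(L, L @ rot)

q, _ = np.linalg.qr(rng.standard_normal((10, 10)))
orth = ascc(q[:, :2], q[:, 2:4])            # disjoint orthonormal columns
```

Since the measure is rotation-invariant, it compares the loading subspaces themselves, which is exactly what matters when asking whether components approximate factors.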