Funding: Supported by the Natural Science Basic Research Program of Shaanxi (Program No. 2024JC-YBMS-026).
Abstract: When dealing with imbalanced datasets, the traditional support vector machine (SVM) tends to produce a classification hyperplane that is biased towards the majority class and exhibits poor robustness. This paper proposes a high-performance classification algorithm specifically designed for imbalanced datasets. The proposed method first uses a biased second-order cone programming support vector machine (B-SOCP-SVM) to identify the support vectors (SVs) and non-support vectors (NSVs) in the imbalanced data. Then, it applies the synthetic minority over-sampling technique to the support vectors of the minority class (SV-SMOTE) and applies random under-sampling multiple times to the non-support vectors of the majority class (NSV-RUS). Combining the resulting minority-class dataset with each of the multiple majority-class datasets yields multiple new balanced datasets. Finally, SOCP-SVM classifies each dataset, and the final result is obtained through an ensemble algorithm. Experimental results demonstrate that the proposed method performs excellently on imbalanced datasets.
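The resampling step described above can be sketched in plain NumPy. This is a minimal illustration, not the paper's method: it skips the B-SOCP-SVM step that selects support vectors (here SMOTE is applied to the whole minority set), and the function names, the target class size, and the number of subsets are illustrative choices.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Classic SMOTE interpolation: each synthetic point lies on the
    segment between a minority sample and one of its k nearest
    minority-class neighbours."""
    rng = rng or np.random.default_rng(0)
    n = len(X_min)
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self as neighbour
    nbrs = np.argsort(d, axis=1)[:, :k]
    out = []
    for _ in range(n_new):
        i = rng.integers(n)
        j = nbrs[i, rng.integers(min(k, n - 1))]
        lam = rng.random()
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

def balanced_subsets(X_maj, X_min, n_subsets=3, rng=None):
    """Oversample the minority class once, then pair it with several
    independent random under-samples of the majority class, giving
    n_subsets balanced datasets (assumes len(X_maj) >= 2*len(X_min))."""
    rng = rng or np.random.default_rng(0)
    m = 2 * len(X_min)                          # illustrative target size
    X_min_bal = np.vstack([X_min, smote_oversample(X_min, m - len(X_min), rng=rng)])
    return [(X_maj[rng.choice(len(X_maj), m, replace=False)], X_min_bal)
            for _ in range(n_subsets)]

rng = np.random.default_rng(1)
X_min = rng.random((10, 2))            # toy minority class
X_maj = rng.random((60, 2)) + 2.0      # toy majority class
subsets = balanced_subsets(X_maj, X_min, n_subsets=3)
```

In the paper's scheme, a base classifier would then be trained on each balanced subset and the predictions combined by the ensemble.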
Funding: Supported in part by the Research Fund of the Key Lab of Education Blockchain and Intelligent Technology, Ministry of Education (EBME25-F-08).
Abstract: Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed–Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
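ISTIRDA's own sampling analysis is not reproduced here, but the core arithmetic behind "light clients certify availability by sampling random codeword symbols" is common to DAS schemes and easy to sketch. If at least a fraction f of symbols is withheld, one uniform sample misses all withheld symbols with probability at most (1 - f), so k samples give confidence 1 - (1 - f)^k. The function name below is a hypothetical helper, not ISTIRDA's API.

```python
import math

def samples_for_confidence(withheld_frac, confidence):
    """Number of uniform random symbol samples k such that, whenever at
    least `withheld_frac` of the codeword symbols are missing, at least
    one sample hits a missing symbol with probability >= `confidence`
    (solves 1 - (1 - f)**k >= confidence for k)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - withheld_frac))

# With a rate-1/2 erasure code, data is unrecoverable only if more than
# half the symbols are withheld, so f = 0.5 is the adversary's cheapest
# unavailability attack. Six nines of confidence then needs:
k = samples_for_confidence(0.5, 0.999999)  # → 20 samples
```

This is why per-client query counts can stay tiny even for very large blocks: k depends on the code rate and target confidence, not on the block size.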
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 51676044 and 51327803), the Social Development Project of Jiangsu Province, China (Grant No. BE20187053), the Postgraduate Research and Practice Innovation Program of Jiangsu Province, China (Grant No. KYCX170081), and the China Scholarship Council.
Abstract: It is essential to investigate the light field camera parameters for accurate flame temperature measurement, because the sampling characteristics of the flame radiation vary with them. In this study, novel indices of the light field camera were proposed to investigate the directional and spatial sampling characteristics of the flame radiation. The effects of light field camera parameters, such as the focal length and magnification of the main lens and of the microlens, were investigated. It was observed that the sampling characteristics of the flame vary with the different parameters of the light field camera. Optimized parameters of the light field camera were then proposed for flame radiation sampling. A larger sampling angle (23 times larger) is achieved with the optimized parameters than with the commercial light field camera parameters. A non-negative least squares (NNLS) algorithm was used to reconstruct the flame temperature, and the reconstruction accuracy was evaluated for the optimized parameters. The results suggest that the optimized parameters provide higher reconstruction accuracy for axisymmetric and non-symmetric flame conditions in comparison to the commercial light field camera.
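The NNLS reconstruction step amounts to solving min ||Ax - b|| subject to x >= 0, where A maps the (nonnegative) flame emission field to recorded pixel intensities. As a minimal sketch, assuming a small randomly generated projection matrix in place of the camera's actual sampling matrix, a projected-gradient solver is shown below; any off-the-shelf NNLS routine (e.g. SciPy's) would do the same job.

```python
import numpy as np

def nnls_pg(A, b, iters=20000, tol=1e-12):
    """Projected-gradient solver for min ||A x - b||^2 with x >= 0:
    gradient descent with step 1/L, then clipping at zero."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)
        x_new = np.maximum(x - g / L, 0.0)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy check with a hypothetical 8x4 sampling matrix: recover a strictly
# positive "emission" vector from its noiseless projections.
rng = np.random.default_rng(0)
A = rng.random((8, 4))                 # stand-in for the camera's sampling matrix
x_true = rng.random(4) + 0.5           # ground-truth nonnegative unknowns
x_rec = nnls_pg(A, A @ x_true)
```

In the paper's setting the quality of A — how densely and from how many directions the flame is sampled — is exactly what the optimized camera parameters improve.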
Abstract: China's continental deposition basins are characterized by complex geological structures and various reservoir lithologies. Therefore, high precision exploration methods are needed. High density spatial sampling is a new technology to increase the accuracy of seismic exploration. We briefly discuss point source and receiver technology, analyze the in-situ high density spatial sampling method, introduce the symmetric sampling principles presented by Gijs J. O. Vermeer, and discuss high density spatial sampling technology from the point of view of wave field continuity. We emphasize the analysis of high density spatial sampling characteristics, including the advantages of high density first breaks for investigating near-surface structure and improving static correction precision, and the use of dense receiver spacing at short offsets to increase the effective coverage at shallow depth and the accuracy of reflection imaging. As a result, coherent noise is not aliased, and both noise analysis precision and noise suppression improve. High density spatial sampling enhances wave field continuity and the accuracy of various mathematical transforms, which benefits wave field separation. Finally, we point out that the difficult part of high density spatial sampling technology is the data processing. More research needs to be done on methods for analyzing and processing huge amounts of seismic data.
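The claim that dense receiver spacing keeps coherent noise unaliased follows from the standard spatial Nyquist relation: a linear event with apparent velocity v across receivers spaced dx apart is unaliased only up to f_max = v / (2·dx). A one-line sketch (the numeric values are illustrative, not from the paper):

```python
def max_unaliased_frequency(v_app_m_s, dx_m):
    """Highest temporal frequency (Hz) that a linear event with apparent
    velocity v_app (m/s) carries without spatial aliasing when receivers
    are dx metres apart: f_max = v_app / (2 * dx)."""
    return v_app_m_s / (2.0 * dx_m)

# Ground roll with a 1000 m/s apparent velocity: halving the receiver
# interval from 50 m to 25 m doubles the unaliased bandwidth.
f_coarse = max_unaliased_frequency(1000.0, 50.0)   # 10 Hz
f_dense = max_unaliased_frequency(1000.0, 25.0)    # 20 Hz
```

Slow, steeply dipping coherent noise is therefore the first thing to alias on coarse geometries, which is why denser sampling directly improves noise analysis and suppression.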
Funding: Supported by the National Natural Science Foundation of China (Nos. 91852108 and 11872230).
Abstract: In the field of supercritical wing design, various principles and rules have been summarized through theoretical and experimental analyses. Compared with black-box relationships between geometry parameters and performances, quantitative physical laws about pressure distributions and performances are clearer and more beneficial to designers. With the advancement of computational fluid dynamics and computational intelligence, discovering new rules through statistical analysis on computers has become increasingly attractive and affordable. This paper proposes a novel sampling method for the statistical study of pressure distribution features and performances, so that new physical laws can be revealed. It utilizes an adaptive sampling algorithm whose criteria are developed based on Kullback–Leibler divergence and Euclidean distance. In this paper, the proposed method is employed to generate airfoil samples to study the relationships between the supercritical pressure distribution features and the drag divergence Mach number as well as the drag creep characteristic. Compared with conventional sampling methods, the proposed method can efficiently distribute samples in the pressure distribution feature space rather than directly sampling airfoil geometry parameters. The corresponding geometry parameters are searched and found under constraints, so that supercritical airfoil samples that are well distributed in the pressure distribution space are obtained. These samples allow statistical studies to obtain more reliable and universal aerodynamic rules that can be applied to supercritical airfoil designs.
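The abstract names the two ingredients of the adaptive sampling criterion but not their exact combination, so the sketch below is hypothetical: a candidate point in the (normalized) feature space is scored by its Euclidean distance to the nearest existing sample (exploration) and by the KL divergence of the resulting sample histogram from a uniform target (space-filling coverage). The weighting and the per-dimension histogram target are assumptions, not the paper's criteria.

```python
import numpy as np

def kl_uniform(samples, bins=8, rng_range=(0.0, 1.0)):
    """KL divergence of the per-dimension sample histogram from a
    uniform target, summed over feature dimensions (smoothed pmf)."""
    kl = 0.0
    for d in range(samples.shape[1]):
        p, _ = np.histogram(samples[:, d], bins=bins, range=rng_range)
        p = (p + 1e-9) / (p.sum() + bins * 1e-9)
        kl += np.sum(p * np.log(p * bins))      # q = 1/bins per bin
    return kl

def score(candidate, samples, w=0.5):
    """Hypothetical adaptive-sampling score: prefer candidates far from
    existing samples whose addition keeps the distribution near uniform."""
    d_min = np.min(np.linalg.norm(samples - candidate, axis=1))
    kl_after = kl_uniform(np.vstack([samples, candidate]))
    return w * d_min - (1.0 - w) * kl_after

# Deterministic toy: existing samples cluster near 0.1, so a distant
# candidate that evens out the histogram should score higher.
samples = np.array([[0.10], [0.12], [0.15]])
s_near = score(np.array([0.11]), samples)
s_far = score(np.array([0.90]), samples)
```

At each iteration the highest-scoring candidate would be kept, and its geometry parameters then searched for under constraints, as the abstract describes.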