Thermal image, or thermogram, has become a new type of signal for machine condition monitoring and fault diagnosis due to its capability to display real-time temperature distribution and to indicate the machine's operating condition through its temperature. In this paper, an investigation of using the second-order statistical features of thermograms in association with minimum redundancy maximum relevance (mRMR) feature selection and simplified fuzzy ARTMAP (SFAM) classification is conducted for rotating machinery fault diagnosis. The thermograms of different machine conditions are first preprocessed to improve the image contrast, remove noise, and crop the regions of interest (ROIs). Then, an enhanced algorithm based on bi-dimensional empirical mode decomposition is implemented to further increase the quality of the ROIs before the second-order statistical features are extracted from their gray-level co-occurrence matrices (GLCMs). The features most relevant to the machine condition are selected from the total feature set by mRMR and fed into SFAM to accomplish the fault diagnosis. To verify this investigation, thermograms acquired from different conditions of a fault simulator, including normal, misalignment, faulty bearing, and mass unbalance, are used. This investigation also provides a comparative study of SFAM and traditional methods such as back-propagation and probabilistic neural networks. The results show that the second-order statistical features used in this framework provide plausible accuracy in fault diagnosis of rotating machinery.
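The second-order features above come from the gray-level co-occurrence matrix. As a rough illustration of that step only (not the authors' pipeline: the tiny ROI, single offset, and the particular Haralick-style statistics below are simplified assumptions), a normalized GLCM and a few features can be computed as:

```python
import numpy as np

def glcm(image, levels=8, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one (dx, dy) offset, normalized to probabilities."""
    h, w = image.shape
    m = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def second_order_features(p):
    """Haralick-style statistics computed from a normalized GLCM."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    energy = np.sum(p ** 2)                         # angular second moment
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
    return {"contrast": contrast, "energy": energy,
            "homogeneity": homogeneity, "entropy": entropy}

# Illustrative 4-level "ROI"; a real thermogram ROI would be quantized first.
roi = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
features = second_order_features(glcm(roi, levels=4))
```

In practice several offsets and angles are averaged; the resulting feature vectors would then go through mRMR selection before classification.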
Discriminative region localization and efficient feature encoding are crucial for fine-grained object recognition. However, existing data augmentation methods struggle to accurately locate discriminative regions in complex backgrounds, small target objects, and limited training data, leading to poor recognition. Fine-grained images exhibit "small inter-class differences," and while second-order feature encoding enhances discrimination, it often requires dual Convolutional Neural Networks (CNN), increasing training time and complexity. This study proposes a model integrating discriminative region localization and efficient second-order feature encoding. By ranking feature map channels via a fully connected layer, it selects high-importance channels to generate an enhanced map, accurately locating discriminative regions. Cropping and erasing augmentations further refine recognition. To improve efficiency, a novel second-order feature encoding module generates an attention map from the fourth convolutional group of Residual Network 50 layers (ResNet-50) and multiplies it with features from the fifth group, producing second-order features while reducing dimensionality and training time. Experiments on the Caltech-University of California, San Diego Birds-200-2011 (CUB-200-2011), Stanford Car, and Fine-Grained Visual Classification of Aircraft (FGVC Aircraft) datasets show state-of-the-art accuracy of 88.9%, 94.7%, and 93.3%, respectively.
Based on the second-order random wave solutions of the water wave equations in finite water depth, statistical distributions of the depth-integrated local horizontal momentum components are derived by use of the characteristic function expansion method. The parameters involved in the distributions can all be determined by the water depth and the wave number spectrum of ocean waves. As an illustrative example, a fully developed wind-generated sea is considered and the parameters are calculated for typical wind speeds and water depths by means of the Donelan and Pierson spectrum. The effects of nonlinearity and water depth on the distributions are also investigated.
In recent years, the interest in damage identification of structural components through innovative techniques has grown significantly. Damage identification has always been a crucial concern in quality assessment and load capacity rating of infrastructure. In this regard, researchers focus on proposing efficient tools to identify damage at early stages to prevent sudden failure in structural components, ensuring public safety and reducing asset management costs. Sensing technologies, along with data analysis through various techniques and machine learning approaches, have been the area of interest for these innovative techniques. The purpose of this research is to develop a robust method for automatic condition assessment of real-life concrete structures for the detection of relatively small cracks at early stages. A damage identification algorithm is proposed using hybrid approaches to analyze the sensor data. The data are obtained from transducers mounted on concrete beams under static loading in the laboratory and are used as the input parameters. The method relies only on the measured time responses. After filtering and normalization of the data, damage-sensitive statistical features are extracted from the signals and used as the inputs of a Self-Advising Support Vector Machine (SA-SVM) for classification in the civil engineering area. Finally, the results are compared with traditional methods to investigate the feasibility of the proposed hybrid algorithm. It is demonstrated that the presented method can reliably detect cracks in the structure and thereby enable real-time infrastructure health monitoring.
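A minimal sketch of the "damage-sensitive statistical features" step described above. The SA-SVM classifier itself is the authors' method and is not reproduced here; the particular features (RMS, skewness- and kurtosis-like moments, crest factor) and the synthetic spike-contaminated signal are illustrative assumptions:

```python
import numpy as np

def damage_features(signal):
    """Statistical features from one normalized measured time response."""
    x = np.asarray(signal, dtype=float)
    x = (x - x.mean()) / x.std()          # normalization step from the pipeline
    rms = np.sqrt(np.mean(x ** 2))        # equals 1 after standardization
    skew = np.mean(x ** 3)                # third standardized moment
    kurt = np.mean(x ** 4)                # fourth standardized moment
    crest = np.max(np.abs(x)) / rms
    return np.array([rms, skew, kurt, crest])

# Synthetic example: damage modeled as sparse transient spikes on the response.
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, 4096)
damaged = healthy + 0.5 * (rng.random(4096) < 0.01) * rng.normal(0, 8, 4096)
f_h, f_d = damage_features(healthy), damage_features(damaged)
```

Feature vectors like `f_h` and `f_d` would then be labeled by condition and fed to the classifier.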
On the basis of the arctic monthly mean sea ice extent data set during 1953-1984, the arctic region is divided into eight subregions, and analyses of empirical orthogonal functions, power spectrum and maximum entropy spectrum are made to identify the major spatial and temporal features of the sea ice fluctuations within the 32-year period. A brief physical explanation is then tentatively suggested. The results show that both seasonal and non-seasonal variations of the sea ice extent are remarkable, and its mean annual peripheral positions as well as their interannual shifting amplitudes are quite different among the subregions. These features are primarily affected by solar radiation, ocean circulation, sea surface temperature and maritime-continental contrast, while the non-seasonal variations are most possibly affected by cosmic-geophysical factors such as earth pole shift, earth rotation oscillation and solar activity.
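Empirical orthogonal function (EOF) analysis, the first tool named above, is equivalent to a principal component analysis of the space-time anomaly field. A generic numpy sketch (the 8-subregion layout and the synthetic seasonal field below are stand-in assumptions, not the paper's data):

```python
import numpy as np

def eof_analysis(field):
    """EOFs of an (n_times, n_points) field via SVD of the anomalies."""
    anomalies = field - field.mean(axis=0)       # remove the time mean at each point
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_fraction = s ** 2 / np.sum(s ** 2)  # variance explained per mode
    return vt, u * s, variance_fraction          # spatial patterns, PCs, fractions

# Synthetic stand-in: 32 years of monthly values over 8 "subregions"
# dominated by one annual-cycle mode plus noise.
rng = np.random.default_rng(1)
t = np.arange(384)
pattern = np.sin(np.linspace(0, np.pi, 8))
field = np.outer(np.cos(2 * np.pi * t / 12), pattern) + 0.1 * rng.normal(size=(384, 8))
eofs, pcs, frac = eof_analysis(field)
```

The leading mode's principal component could then be passed to the power-spectrum and maximum-entropy-spectrum analyses.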
With the increasing popularity of high-resolution remote sensing images, remote sensing image retrieval (RSIR) has become a major research topic. A combined representation of global non-subsampled shearlet transform (NSST)-domain statistical features (NSSTds) and local three-dimensional local ternary pattern (3D-LTP) features is proposed for high-resolution remote sensing images. We model the NSST image coefficients of the detail subbands using a 2-state Laplacian mixture (LM) distribution and estimate its three parameters using the Expectation-Maximization (EM) algorithm. We also calculate statistical parameters such as subband kurtosis and skewness from the detail subbands, along with the mean and standard deviation calculated from the approximation subband, and concatenate all of them with the 2-state LM parameters to describe the global features of the image. The various properties of NSST, such as multiscale analysis, localization and flexible directional sensitivity, make it a suitable choice to provide an effective approximation of an image. In order to extract dense local features, a new 3D-LTP is proposed where dimension reduction is performed via selection of 'uniform' patterns. The 3D-LTP is calculated from the spatial RGB planes of the input image. The proposed inter-channel 3D-LTP not only exploits the local texture information but also captures the color information. Finally, a fused feature representation (NSSTds-3DLTP) is proposed using the new global (NSSTds) and local (3D-LTP) features to enhance the discriminativeness of features. The retrieval performance of the proposed NSSTds-3DLTP features is tested on three challenging remote sensing image datasets, WHU-RS19, Aerial Image Dataset (AID) and PatternNet, in terms of mean average precision (MAP), average normalized modified retrieval rank (ANMRR) and precision-recall (P-R) graphs. The experimental results are encouraging: the NSSTds-3DLTP features lead to superior retrieval performance compared to many well-known existing descriptors such as Gabor RGB, Granulometry, local binary pattern (LBP), Fisher vector (FV), vector of locally aggregated descriptors (VLAD) and median robust extended local binary pattern (MRELBP). For the WHU-RS19 dataset, in terms of {MAP, ANMRR}, the NSSTds-3DLTP improves upon the Gabor RGB, Granulometry, LBP, FV, VLAD and MRELBP descriptors by {41.93%, 20.87%}, {92.30%, 32.68%}, {86.14%, 31.97%}, {18.18%, 15.22%}, {8.96%, 19.60%} and {15.60%, 13.26%}, respectively. For AID, the corresponding improvements are {152.60%, 22.06%}, {226.65%, 25.08%}, {185.03%, 23.33%}, {80.06%, 12.16%}, {50.58%, 10.49%} and {62.34%, 3.24%}. For PatternNet, they are {32.79%, 10.34%}, {141.30%, 24.72%}, {17.47%, 10.34%}, {83.20%, 19.07%}, {21.56%, 3.60%} and {19.30%, 0.48%}. The moderate dimensionality of the simple NSSTds-3DLTP allows the system to run in real time.
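The dimension reduction via 'uniform' patterns referenced above is the classic LBP trick: keep only codes with at most two circular 0/1 transitions and pool the rest into one bin. The 3D-LTP itself is the paper's contribution; the sketch below shows the idea on plain single-channel 8-neighbor LBP as an illustrative assumption:

```python
import numpy as np

def uniform_patterns(bits=8):
    """8-bit codes with at most two circular 0/1 transitions ('uniform' patterns)."""
    codes = []
    for c in range(2 ** bits):
        b = [(c >> i) & 1 for i in range(bits)]
        transitions = sum(b[i] != b[(i + 1) % bits] for i in range(bits))
        if transitions <= 2:
            codes.append(c)
    return codes

def lbp_histogram(image):
    """LBP codes over the 8-neighborhood, binned into uniform patterns + one 'other' bin."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    center = image[1:-1, 1:-1]
    code = np.zeros_like(center, dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = image[1 + dy:image.shape[0] - 1 + dy, 1 + dx:image.shape[1] - 1 + dx]
        code |= (neighbor >= center).astype(int) << bit
    uniform = uniform_patterns()
    lookup = {c: i for i, c in enumerate(uniform)}  # non-uniform codes share one bin
    hist = np.zeros(len(uniform) + 1)
    for c in code.ravel():
        hist[lookup.get(int(c), len(uniform))] += 1
    return hist / hist.sum()
```

For 8 neighbors this shrinks the descriptor from 256 bins to 59, which is why the paper's inter-channel variant stays at a "moderate dimensionality".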
This paper discusses a statistical second-order two-scale (SSOTS) analysis and computation for a heat conduction problem with a radiation boundary condition in random porous materials. Firstly, the microscopic configuration for the structure with random distribution is briefly characterized. Secondly, the SSOTS formulae for computing the heat transfer problem are derived successively by means of the construction for each cell. Then, the statistical prediction algorithm based on the proposed two-scale model is described in detail. Finally, some numerical experiments are presented, which show that the SSOTS method developed in this paper is effective for predicting the heat transfer performance of porous materials, demonstrating its significant applications in actual engineering computation.
In the field of supercritical wing design, various principles and rules have been summarized through theoretical and experimental analyses. Compared with black-box relationships between geometry parameters and performances, quantitative physical laws about pressure distributions and performances are clearer and more beneficial to designers. With the advancement of computational fluid dynamics and computational intelligence, discovering new rules through statistical analysis on computers has become increasingly attractive and affordable. This paper proposes a novel sampling method for the statistical study of pressure distribution features and performances, so that new physical laws can be revealed. It utilizes an adaptive sampling algorithm whose criteria are developed based on the Kullback-Leibler divergence and the Euclidean distance. In this paper, the proposed method is employed to generate airfoil samples to study the relationships between the supercritical pressure distribution features and the drag divergence Mach number as well as the drag creep characteristic. Compared with conventional sampling methods, the proposed method can efficiently distribute samples in the pressure distribution feature space rather than directly sampling airfoil geometry parameters. The corresponding geometry parameters are searched for and found under constraints, so that supercritical airfoil samples that are well distributed in the pressure distribution space are obtained. These samples allow statistical studies to obtain more reliable and universal aerodynamic rules that can be applied to supercritical airfoil designs.
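A toy sketch of an adaptive sampling criterion combining the two ingredients named above: a Euclidean spread term (favor candidates far from accepted samples) and a Kullback-Leibler term (favor candidates that keep the empirical feature histogram close to a target, here uniform). The paper's actual criteria and weighting are not reproduced; everything below, including the equal weighting of the two terms, is an illustrative assumption on a one-dimensional feature:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence between two normalized histograms."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def pick_next(candidates, accepted, edges):
    """Score each candidate feature value: reward distance to accepted samples,
    penalize divergence of the resulting histogram from a uniform target."""
    target = np.ones(len(edges) - 1)
    best, best_score = None, -np.inf
    for c in candidates:
        hist, _ = np.histogram(np.append(accepted, c), bins=edges)
        score = np.min(np.abs(accepted - c)) - kl_divergence(hist + 1e-9, target)
        if score > best_score:
            best, best_score = c, score
    return best

accepted = np.array([0.1, 0.15, 0.2])   # feature values already sampled
candidates = np.array([0.12, 0.5, 0.9])
edges = np.linspace(0.0, 1.0, 6)
chosen = pick_next(candidates, accepted, edges)
```

The candidate far from the existing cluster wins, which mimics how the method spreads samples over the pressure-distribution feature space instead of geometry space.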
Web data extraction has become a key technology for extracting valuable data from websites. At present, most extraction methods based on rule learning, visual patterns or tree matching have limited performance on complex web pages. Through analyzing various statistical characteristics of HTML elements in web documents, this paper proposes an unsupervised web data extraction method based on statistical features: traverse the HTML DOM parse tree first, calculate and generate the statistical matrix of the elements, and then locate data records by a clustering method and heuristic rules that reveal inherent links between the visual characteristics of the data record areas and the statistical characteristics of the HTML nodes. The method is suitable for data record extraction from both single pages and multiple pages, has strong generality, and needs no training. The experiments show that the accuracy and efficiency of this method are both better than those of current data extraction methods.
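The "statistical matrix of the elements" step can be illustrated with the standard library alone: walk the HTML stream and record per-element statistics such as tag name, depth, and directly contained text length. The particular statistics and the clustering/heuristic stage of the paper are not reproduced; this is a minimal assumed sketch:

```python
from html.parser import HTMLParser

class StatsCollector(HTMLParser):
    """Collects simple per-element statistics while traversing the HTML stream:
    tag name, nesting depth, and length of directly contained text."""
    def __init__(self):
        super().__init__()
        self.stack, self.rows = [], []

    def handle_starttag(self, tag, attrs):
        self.stack.append({"tag": tag, "depth": len(self.stack), "text_len": 0})

    def handle_data(self, data):
        if self.stack:
            self.stack[-1]["text_len"] += len(data.strip())

    def handle_endtag(self, tag):
        if self.stack:
            self.rows.append(self.stack.pop())

def element_stats(html):
    p = StatsCollector()
    p.feed(html)
    return p.rows

page = "<div><ul><li>alpha</li><li>beta</li><li>gamma</li></ul><p>footer</p></div>"
rows = element_stats(page)
```

Repeated sibling elements with near-identical statistics, like the `li` rows here, are exactly the kind of group a clustering step would flag as a data-record region.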
Traditional meteorological downscaling methods face limitations due to the complex distribution of meteorological variables, which can lead to unstable forecasting results, especially in extreme scenarios. To overcome this issue, we propose a convolutional graph neural network (CGNN) model, which we enhance with multilayer feature fusion and a squeeze-and-excitation block. Additionally, we introduce a spatially balanced mean squared error (SBMSE) loss function to address the imbalanced distribution and spatial variability of meteorological variables. The CGNN is capable of extracting essential spatial features and aggregating them from a global perspective, thereby improving the accuracy of prediction and enhancing the model's generalization ability. Based on the experimental results, CGNN has certain advantages in terms of bias distribution, exhibiting a smaller variance. When it comes to precipitation, both UNet and AE also demonstrate relatively small biases. As for temperature, AE and CNNdense perform outstandingly during the winter. The time correlation coefficients show an improvement of at least 10% at daily and monthly scales for both temperature and precipitation. Furthermore, the SBMSE loss function displays an advantage over existing loss functions in predicting the 98th percentile and identifying areas where extreme events occur. However, the SBMSE tends to overestimate the distribution of extreme precipitation, which may be due to the theoretical assumptions about the posterior distribution of the data that partially limit the effectiveness of the loss function. In future work, we will further optimize the SBMSE to improve prediction accuracy.
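The exact SBMSE formulation belongs to the paper; a common way to realize the same goal of balancing an imbalanced target distribution is an inverse-frequency-weighted MSE, sketched below as an assumed stand-in (the binning scheme and weights are illustrative, not the authors'):

```python
import numpy as np

def balanced_mse(pred, target, n_bins=10):
    """Weighted MSE: each cell's squared error is weighted by the inverse
    frequency of its target-value bin, so rare (extreme) values count more."""
    edges = np.quantile(target, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, target, side="right") - 1, 0, n_bins - 1)
    counts = np.bincount(idx.ravel(), minlength=n_bins).astype(float)
    weights = counts.sum() / (n_bins * counts[idx])   # inverse-frequency weights
    return float(np.mean(weights * (pred - target) ** 2))
```

When the target is uniformly distributed over the bins, all weights equal one and the loss reduces to the plain MSE; skewed targets upweight the sparse extreme bins, which is the behavior the abstract attributes to SBMSE.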
In the literature, features based on first- and second-order statistics that characterize textures are used for classification of images. Features based on texture statistics provide far fewer, yet relevant and distinguishable, features in comparison to existing methods based on wavelet transformation. In this paper, we investigated the performance of texture-based features in comparison to wavelet-based features with commonly used classifiers for the classification of Alzheimer's disease based on T2-weighted MRI brain images. The performance is evaluated in terms of sensitivity, specificity, accuracy, and training and testing time. Experiments are performed on publicly available medical brain images. Experimental results show that the performance with first- and second-order statistics based features is significantly better in comparison to existing methods based on wavelet transformation in terms of all performance measures for all classifiers.
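First-order statistics, in contrast to the GLCM-based second-order kind, are computed from the gray-level histogram alone. A minimal numpy sketch of a typical first-order feature set (the choice of mean, variance, skewness and entropy here is an illustrative assumption, not the paper's exact list):

```python
import numpy as np

def first_order_features(image, levels=256):
    """First-order (histogram-based) texture statistics of a gray-level image."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    g = np.arange(levels)
    mean = np.sum(g * p)
    variance = np.sum((g - mean) ** 2 * p)
    skewness = np.sum((g - mean) ** 3 * p) / variance ** 1.5
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return {"mean": mean, "variance": variance,
            "skewness": skewness, "entropy": entropy}

# Tiny synthetic "image" with two equally frequent gray levels.
img = np.array([[0, 255], [255, 0]])
f = first_order_features(img)
```

Because the whole feature set is just a handful of scalars per image, it is far smaller than a multi-subband wavelet descriptor, which is the trade-off the abstract highlights.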
The development of defect prediction plays a significant role in improving software quality. Such predictions are used to identify defective modules before testing and to minimize time and cost. Software with defects negatively impacts operational costs and ultimately affects customer satisfaction. Numerous approaches exist to predict software defects; however, timely and accurate prediction of software bugs remains a major challenge. To improve timely and accurate software defect prediction, a novel technique called Nonparametric Statistical feature scaled QuAdratic regressive convolution Deep nEural Network (SQADEN) is introduced. The proposed SQADEN technique mainly includes two major processes, namely metric (feature) selection and classification. First, SQADEN uses the nonparametric statistical Torgerson-Gower scaling technique to identify the relevant software metrics by measuring their similarity with the dice coefficient. The feature selection process is used to minimize the time complexity of software fault prediction. With the selected metrics, software fault prediction is performed with the help of quadratic censored regressive convolution deep neural network-based classification. The deep learning classifier analyzes the training and testing samples using the contingency correlation coefficient. The softstep activation function is used to provide the final fault prediction results. To minimize the error, the Nelder-Mead method is applied to solve non-linear least-squares problems. Finally, accurate classification results with a minimum error are obtained at the output layer. Experimental evaluation is carried out with different quantitative metrics such as accuracy, precision, recall, F-measure, and time complexity. The analyzed results demonstrate the superior performance of the proposed SQADEN technique, with maximum accuracy, sensitivity and specificity improved by 3%, 3%, 2% and 3%, and minimum time and space reduced by 13% and 15%, when compared with two state-of-the-art methods.
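The dice coefficient used in the metric-selection step above is a standard set-overlap similarity. A minimal sketch of selecting metrics whose (here, median-binarized) profile is Dice-similar to the defect labels; the binarization, the 0.6 threshold and the helper names are illustrative assumptions, not the SQADEN procedure:

```python
import numpy as np

def dice_coefficient(a, b):
    """Dice similarity 2|A∩B|/(|A|+|B|) between two binary indicator vectors."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def select_metrics(X, y, threshold=0.6):
    """Keep metric columns whose binarized profile is Dice-similar to the labels."""
    keep = []
    for j in range(X.shape[1]):
        binary = X[:, j] > np.median(X[:, j])   # assumed discretization
        if dice_coefficient(binary, y) >= threshold:
            keep.append(j)
    return keep

# Column 0 tracks the defect labels; column 1 is anti-correlated.
X = np.array([[1.0, 12.0], [2, 11], [3, 10], [10, 3], [11, 2], [12, 1]])
y = np.array([0, 0, 0, 1, 1, 1], dtype=bool)
selected = select_metrics(X, y)
```

Only the label-aligned metric survives, which is the intended effect of similarity-based feature selection before classification.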
A tyre pressure monitoring system (TPMS) is compulsory in most jurisdictions, such as the United States and the European Union. Existing systems depend on pressure sensors strapped onto the tyre or on wheel speed sensor data; a difference in wheel speed triggers an alarm based on the implemented algorithm. In this paper, a machine learning approach is proposed as a new method to monitor tyre pressure by extracting the vertical vibrations from the wheel hub of a moving vehicle using an accelerometer. The obtained signals are used to compute statistical features and histogram features in the feature extraction process. The LMT (Logistic Model Tree) was used as the classifier and attained a classification accuracy of 92.5% with 10-fold cross-validation for statistical features and 90.5% with 10-fold cross-validation for histogram features. The proposed model can be used to successfully monitor automobile tyre pressure.
Forecasting the movement of the stock market has long been an attractive topic. This paper implements different statistical learning models to predict the movement of the S&P 500 index, which is influenced by other important financial indexes across the world, such as commodity prices and financial technical indicators. This paper systematically investigates four supervised learning models, namely Logistic Regression, Gaussian Discriminant Analysis (GDA), Naive Bayes and Support Vector Machine (SVM), for forecasting the S&P 500 index. After several optimization experiments on features and models, especially SVM kernel selection and per-model feature selection, this paper concludes that an SVM model with a Radial Basis Function (RBF) kernel can achieve an accuracy rate of 62.51% for the future market trend of the S&P 500 index.
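The simplest of the four models above, logistic regression for an up/down movement label, can be written in a few lines of numpy. The synthetic "indicator" features, the coefficients and the training schedule below are illustrative assumptions, not the paper's data or results:

```python
import numpy as np

def train_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression for binary movement labels."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # sigmoid probabilities
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient of the log loss
    return w

def predict(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)

# Synthetic stand-ins for technical indicators and the resulting up/down label.
rng = np.random.default_rng(7)
signal = rng.normal(size=(500, 3))
y = (signal @ np.array([1.0, -0.5, 0.2]) + 0.3 * rng.normal(size=500) > 0).astype(int)
w = train_logistic(signal, y)
accuracy = np.mean(predict(w, signal) == y)
```

On real market data the achievable accuracy is naturally far lower (the paper reports 62.51% with an RBF-kernel SVM), since the label noise dwarfs the signal.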
Statistical Signal Transmission (SST) is a technique based on orthogonal frequency-division multiplexing (OFDM) that adopts cyclostationary features, allowing extra information to be transmitted without additional bandwidth. However, the more complicated environments in 5G communication systems, especially fast time-varying scenarios, dramatically degrade the performance of SST. In this paper, we propose a fragmental weight-conservation combining (FWCC) scheme for SST to overcome its performance degradation under fast time-varying channels. The proposed FWCC scheme consists of four phases: 1) slice the received OFDM stream into pieces; 2) assign different weights to clean and contaminated pieces; 3) combine the cyclic autocorrelation function energies of all the pieces; and 4) compute the final feature and demodulate the SST data. Through these procedures, the detection accuracy of SST is theoretically refined under fast time-varying channels. This inference is confirmed through the numerical results in this paper, which demonstrate that the BER performance of the proposed scheme outperforms that of the original scheme under both ideal and imperfect channel estimation conditions. In addition, we also find an empirically optimal weight distribution strategy for the proposed FWCC scheme, which facilitates practical applications.
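The cyclic autocorrelation function combined in phase 3 is the standard cyclostationary statistic: the Fourier coefficient of the lag-product of the signal at a cycle frequency alpha. A minimal numpy sketch (the test signal and parameters are illustrative, not the paper's OFDM setup):

```python
import numpy as np

def cyclic_autocorrelation(x, alpha, lag):
    """Cyclic autocorrelation R_x(alpha, lag) of a discrete-time signal:
    Fourier coefficient of x[n]·conj(x[n+lag]) at cycle frequency alpha
    (alpha in cycles per sample)."""
    n = np.arange(len(x) - lag)
    prod = x[:len(x) - lag] * np.conj(x[lag:])
    return np.sum(prod * np.exp(-2j * np.pi * alpha * n)) / len(x)

# A cosine carrier: its squared envelope is periodic at twice the carrier
# frequency, so the CAF peaks at that cycle frequency and vanishes elsewhere.
fs = 1000
t = np.arange(4096) / fs
carrier = np.cos(2 * np.pi * 50 * t)
caf_peak = abs(cyclic_autocorrelation(carrier, alpha=100 / fs, lag=0))
caf_off = abs(cyclic_autocorrelation(carrier, alpha=77 / fs, lag=0))
```

The FWCC scheme would compute such energies piecewise over the sliced stream and sum them with the per-piece weights before thresholding the final feature.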
Rock failure can cause serious geological disasters, and the non-extensive statistical features of the electric potential (EP) are expected to provide valuable information for disaster prediction. In this paper, uniaxial compression experiments with EP monitoring were carried out on fine sandstone, marble and granite samples under four displacement rates. The Tsallis entropy q value of the EPs is used to analyze the self-organized evolution of rock failure. Then the influences of displacement rate and rock type on the q value are explored through mineral structure and fracture modes, and a self-organized-criticality prediction method based on the q value is proposed. The results show that the probability density function (PDF) of the EPs follows the q-Gaussian distribution. The displacement rate is positively correlated with the q value. As the displacement rate increases, the fracture mode changes, the damage degree intensifies, and the microcrack network becomes denser. The influence of rock type on the q value is related to the burst intensity of energy release and the crack fracture mode. The q value of the EPs can be used as an effective prediction index for rock failure, like the b value of acoustic emission (AE). The results provide a useful reference and method for the monitoring and early warning of geological disasters.
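The two non-extensive quantities named above have compact closed forms: the Tsallis entropy of a discrete distribution, and the (here unnormalized) q-Gaussian that generalizes the Gaussian with power-law tails for q > 1. A minimal sketch, with parameter values chosen purely for illustration:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q of a discrete distribution; reduces to the
    Shannon entropy in the limit q -> 1."""
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def q_gaussian(x, q, beta):
    """Unnormalized q-Gaussian [1 - (1-q)·beta·x^2]^(1/(1-q)):
    recovers exp(-beta·x^2) as q -> 1 and gains power-law tails for q > 1."""
    base = 1.0 - (1.0 - q) * beta * x ** 2
    return np.where(base > 0, base, 0.0) ** (1.0 / (1.0 - q))
```

Fitting `q_gaussian` to the empirical PDF of EP fluctuations is what yields the q value the abstract uses as a precursor index.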
Emanating from the idea of reinvestigating Ayurveda, the ancient medical system of Traditional Indian Medicine (TIM), our recent study showed significant applications of arterial pulse waveform analysis for non-invasive diagnosis of cardiovascular function. Here we present results of further investigations analyzing the relation of pulse characteristics to some clinical and pathological parameters and other features that are of diagnostic importance in Ayurveda.
The motivation for this article is to propose new damage classifiers based on a supervised learning problem for locating and quantifying damage. A new feature extraction approach using time series analysis is introduced to extract damage-sensitive features from auto-regressive (AR) models. This approach sets out to improve current feature extraction techniques in the context of time series modeling. The coefficients and residuals of the AR model obtained from the proposed approach are selected as the main features and are applied to the proposed supervised learning classifiers, which are categorized as coefficient-based and residual-based classifiers. These classifiers compute the relative errors in the extracted features between the undamaged and damaged states. Eventually, the abilities of the proposed methods to localize and quantify single and multiple damage scenarios are verified by applying experimental data from a laboratory frame and a four-story steel structure. Comparative analyses are performed to validate the superiority of the proposed methods over some existing techniques. Results show that the proposed classifiers, with the aid of the features extracted by the proposed approach, are able to locate and quantify damage; however, the residual-based classifiers yield better results than the coefficient-based ones. Moreover, these methods are superior to some classical techniques.
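The AR coefficients and residuals that serve as features above can be obtained with a plain least-squares fit. A minimal sketch (the paper's model-order selection and classifier stages are not reproduced; the AR(1) test signal is an illustrative assumption):

```python
import numpy as np

def ar_features(signal, order=4):
    """Fit an AR(p) model x[t] = a1·x[t-1] + ... + ap·x[t-p] + e[t] by least
    squares; return the coefficients and residuals used as damage features."""
    x = np.asarray(signal, dtype=float)
    # Row t holds the p lagged values [x[t-1], ..., x[t-p]].
    rows = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    targets = x[order:]
    coeffs, *_ = np.linalg.lstsq(rows, targets, rcond=None)
    residuals = targets - rows @ coeffs
    return coeffs, residuals

# Known AR(1) process with coefficient 0.8: the fit should recover it.
rng = np.random.default_rng(3)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.8 * x[t - 1] + rng.normal()
coef, resid = ar_features(x, order=1)
```

A coefficient-based classifier would compare `coef` between the undamaged and damaged states; a residual-based one would compare statistics of `resid`, which the paper finds more sensitive.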
This paper focuses on the dynamic thermo-mechanical coupled response of random particulate composite materials. Both the inertia term and the coupling term are considered in the dynamic coupled problem. The formulation of the problem by a statistical second-order two-scale (SSOTS) analysis method and the algorithm procedure based on the finite-element difference method are presented. Numerical results of coupled cases are compared with those of uncoupled cases, showing that the coupling effects on temperature, thermal flux, displacement, and stresses are very distinct, and that the micro-characteristics of the particles affect the coupling effect of the random composites. Furthermore, the coupling effect causes a lag in the variations of temperature, thermal flux, displacement, and stresses.
文摘Discriminative region localization and efficient feature encoding are crucial for fine-grained object recognition.However,existing data augmentation methods struggle to accurately locate discriminative regions in complex backgrounds,small target objects,and limited training data,leading to poor recognition.Fine-grained images exhibit“small inter-class differences,”and while second-order feature encoding enhances discrimination,it often requires dual Convolutional Neural Networks(CNN),increasing training time and complexity.This study proposes a model integrating discriminative region localization and efficient second-order feature encoding.By ranking feature map channels via a fully connected layer,it selects high-importance channels to generate an enhanced map,accurately locating discriminative regions.Cropping and erasing augmentations further refine recognition.To improve efficiency,a novel second-order feature encoding module generates an attention map from the fourth convolutional group of Residual Network 50 layers(ResNet-50)and multiplies it with features from the fifth group,producing second-order features while reducing dimensionality and training time.Experiments on Caltech-University of California,San Diego Birds-200-2011(CUB-200-2011),Stanford Car,and Fine-Grained Visual Classification of Aircraft(FGVC Aircraft)datasets show state-of-the-art accuracy of 88.9%,94.7%,and 93.3%,respectively.
文摘Based on the second order random wave solutions of water wave equations in finite water depth, statistical distributions of the depth integrated local horizontal momentum components are derived by use of the characteristic function expansion method. The parameters involved in the distributions can be all determined by the water depth and the wave number spectrum of ocean waves. As an illustrative example, a fully developed wind generated sea is considered and the parameters are calculated for typical wind speeds and water depths by means of the Donelan and Pierson spectrum. The effects of nonlinearity and water depth on the distributions are also investigated.
Abstract: In recent years, interest in damage identification of structural components through innovative techniques has grown significantly. Damage identification has always been a crucial concern in quality assessment and load-capacity rating of infrastructure. In this regard, researchers focus on proposing efficient tools to identify damage at early stages, preventing sudden failure of structural components, ensuring public safety, and reducing asset management costs. Sensing technologies, together with data analysis through various techniques and machine learning approaches, have been the area of interest for these innovative techniques. The purpose of this research is to develop a robust method for automatic condition assessment of real-life concrete structures that detects relatively small cracks at early stages. A damage identification algorithm is proposed that uses hybrid approaches to analyze the sensor data. The data were obtained from transducers mounted on concrete beams under static loading in the laboratory and are used as the input parameters. The method relies only on the measured time responses. After filtering and normalization of the data, damage-sensitive statistical features are extracted from the signals and used as inputs to a Self-Advising Support Vector Machine (SA-SVM) for classification. Finally, the results are compared with traditional methods to investigate the feasibility of the proposed hybrid algorithm. It is demonstrated that the presented method can reliably detect cracks in the structure and thereby enable real-time infrastructure health monitoring.
Abstract: On the basis of the Arctic monthly mean sea-ice extent data set for 1953-1984, the Arctic region is divided into eight subregions, and analyses of empirical orthogonal functions, power spectrum, and maximum entropy spectrum are performed to identify the major spatial and temporal features of the sea-ice fluctuations within the 32-year period. A brief physical explanation is then tentatively suggested. The results show that both seasonal and non-seasonal variations of the sea-ice extent are remarkable, and its mean annual peripheral positions as well as their interannual shifting amplitudes differ considerably among the subregions. These features are primarily affected by solar radiation, ocean circulation, sea surface temperature, and the maritime-continental contrast, while the non-seasonal variations are most likely affected by cosmic-geophysical factors such as Earth pole shift, Earth rotation oscillation, and solar activity.
Abstract: With the increasing popularity of high-resolution remote sensing images, remote sensing image retrieval (RSIR) has become a topic of major interest. A combination of global non-subsampled shearlet transform (NSST)-domain statistical features (NSSTds) and local three-dimensional local ternary pattern (3D-LTP) features is proposed for high-resolution remote sensing images. We model the NSST image coefficients of the detail subbands using a 2-state Laplacian mixture (LM) distribution, whose three parameters are estimated using the Expectation-Maximization (EM) algorithm. We also calculate statistical parameters such as subband kurtosis and skewness from the detail subbands, along with the mean and standard deviation calculated from the approximation subband, and concatenate all of them with the 2-state LM parameters to describe the global features of the image. The multiscale, localization, and flexible directional sensitivity properties of the NSST make it a suitable choice for an effective approximation of an image. To extract dense local features, a new 3D-LTP is proposed, in which dimension reduction is performed via selection of "uniform" patterns. The 3D-LTP is calculated from the spatial RGB planes of the input image; the proposed inter-channel 3D-LTP exploits not only the local texture information but also the color information. Finally, a fused feature representation (NSSTds-3DLTP) is proposed that uses the new global (NSSTds) and local (3D-LTP) features to enhance the discriminativeness of the features. The retrieval performance of the proposed NSSTds-3DLTP features is tested on three challenging remote sensing image datasets, WHU-RS19, the Aerial Image Dataset (AID), and PatternNet, in terms of mean average precision (MAP), average normalized modified retrieval rank (ANMRR), and precision-recall (P-R) graphs. The experimental results are encouraging, and the NSSTds-3DLTP features lead to superior retrieval performance compared with many well-known existing descriptors such as Gabor RGB, Granulometry, local binary pattern (LBP), Fisher vector (FV), vector of locally aggregated descriptors (VLAD), and median robust extended local binary pattern (MRELBP). For the WHU-RS19 dataset, in terms of {MAP, ANMRR}, the NSSTds-3DLTP improves upon the Gabor RGB, Granulometry, LBP, FV, VLAD, and MRELBP descriptors by {41.93%, 20.87%}, {92.30%, 32.68%}, {86.14%, 31.97%}, {18.18%, 15.22%}, {8.96%, 19.60%}, and {15.60%, 13.26%}, respectively. For AID, the corresponding improvements are {152.60%, 22.06%}, {226.65%, 25.08%}, {185.03%, 23.33%}, {80.06%, 12.16%}, {50.58%, 10.49%}, and {62.34%, 3.24%}. For PatternNet, they are {32.79%, 10.34%}, {141.30%, 24.72%}, {17.47%, 10.34%}, {83.20%, 19.07%}, {21.56%, 3.60%}, and {19.30%, 0.48%}. The moderate dimensionality of the simple NSSTds-3DLTP allows the system to run in real time.
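The subband statistics named above (kurtosis and skewness, alongside mean and standard deviation) are standard moment formulas; the sketch below is a generic formulation of those moments for one subband's coefficient list, not the NSSTds pipeline itself.

```python
import math

def subband_stats(coeffs):
    """Mean, standard deviation, skewness, and kurtosis of one
    subband's coefficients - the kind of global statistics that get
    concatenated into a feature vector."""
    n = len(coeffs)
    mean = sum(coeffs) / n
    var = sum((x - mean) ** 2 for x in coeffs) / n
    std = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in coeffs) / (n * std ** 3)
    kurt = sum((x - mean) ** 4 for x in coeffs) / (n * var ** 2)
    return mean, std, skew, kurt
```

Symmetric coefficient distributions give zero skewness; heavy-tailed detail subbands show up as high kurtosis.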
Funding: Project supported by the China Postdoctoral Science Foundation (Grant Nos. 2015M580256 and 2016T90276).
Abstract: This paper discusses a statistical second-order two-scale (SSOTS) analysis and computation for a heat conduction problem with a radiation boundary condition in random porous materials. Firstly, the microscopic configuration of the structure with random distribution is briefly characterized. Secondly, the SSOTS formulae for computing the heat transfer problem are derived successively by means of the construction procedure for each cell. Then, the statistical prediction algorithm based on the proposed two-scale model is described in detail. Finally, some numerical experiments are presented, which show that the SSOTS method developed in this paper is effective for predicting the heat transfer performance of porous materials and demonstrate its significant applications in practical engineering computation.
Funding: Supported by the National Natural Science Foundation of China (Nos. 91852108 and 11872230).
Abstract: In the field of supercritical wing design, various principles and rules have been summarized through theoretical and experimental analyses. Compared with black-box relationships between geometry parameters and performance, quantitative physical laws about pressure distributions and performance are clearer and more beneficial to designers. With the advancement of computational fluid dynamics and computational intelligence, discovering new rules through statistical analysis on computers has become increasingly attractive and affordable. This paper proposes a novel sampling method for the statistical study of pressure distribution features and performance, so that new physical laws can be revealed. It utilizes an adaptive sampling algorithm whose criteria are developed based on Kullback-Leibler divergence and Euclidean distance. In this paper, the proposed method is employed to generate airfoil samples for studying the relationships between supercritical pressure distribution features and the drag divergence Mach number as well as the drag creep characteristic. Compared with conventional sampling methods, the proposed method can efficiently distribute samples in the pressure distribution feature space rather than directly sampling airfoil geometry parameters. The corresponding geometry parameters are searched for and found under constraints, so that supercritical airfoil samples well distributed in the pressure distribution space are obtained. These samples allow statistical studies to obtain more reliable and universal aerodynamic rules that can be applied to supercritical airfoil designs.
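The two sampling criteria named above can be illustrated generically. The discrete Kullback-Leibler divergence and nearest-sample Euclidean distance below are textbook formulas, not the paper's exact adaptive criteria; how they are combined into a sampling score is the authors' contribution and is not reproduced here.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) between two normalized
    histograms, e.g. of feature-space coverage; eps guards log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))

def min_distance(candidate, samples):
    """Euclidean distance from a candidate point to its nearest
    existing sample - large values flag under-sampled regions."""
    return min(math.dist(candidate, s) for s in samples)
```

A candidate whose neighborhood histogram diverges from the target coverage, or that is far from all existing samples, would be a natural next sample.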
Abstract: Web data extraction has become a key technology for extracting valuable data from websites. At present, most extraction methods based on rule learning, visual patterns, or tree matching have limited performance on complex web pages. Through analyzing various statistical characteristics of HTML elements in web documents, this paper proposes an unsupervised web data extraction method based on statistical features: the HTML DOM parse tree is first traversed and the statistical matrix of the elements is calculated and generated, and then data records are located by a clustering method and heuristic rules that reveal inherent links between the visual characteristics of the data-record areas and the statistical characteristics of the HTML nodes. The method is suitable for data-record extraction from both single pages and multiple pages, has strong generality, and needs no training. The experiments show that the accuracy and efficiency of this method are both better than current data extraction methods.
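A minimal sketch of the element-statistics idea, assuming a toy DOM represented as nested dicts (the fields `tag`, `children`, and `text` are illustrative, not the paper's data model). Repeated data records tend to share a structural signature, which a simple grouping heuristic can surface; the paper's actual clustering and heuristics are richer than this.

```python
from collections import Counter

def subtree_size(node):
    """Number of elements in the subtree rooted at `node`."""
    return 1 + sum(subtree_size(c) for c in node.get("children", []))

def stats_matrix(node, depth=0, rows=None):
    """Traverse the DOM tree, collecting one statistics row per
    element: (tag, depth, child count, subtree size, text length)."""
    if rows is None:
        rows = []
    kids = node.get("children", [])
    rows.append((node["tag"], depth, len(kids), subtree_size(node),
                 len(node.get("text", ""))))
    for c in kids:
        stats_matrix(c, depth + 1, rows)
    return rows

def record_candidates(rows):
    """Toy heuristic: same-depth groups of two or more elements
    sharing a tag and subtree size look like repeated data records."""
    sig = Counter((tag, depth, size) for tag, depth, _, size, _ in rows)
    return [k for k, n in sig.items() if n >= 2]
```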
Funding: Partially funded by the National Natural Science Foundation of China (U2142205), the Guangdong Major Project of Basic and Applied Basic Research (2020B0301030004), the Special Fund for Forecasters of the China Meteorological Administration (CMAYBY2020-094), and the Graduate Student Research and Innovation Program of Central South University (2023ZZTS0347).
Abstract: Traditional meteorological downscaling methods face limitations due to the complex distribution of meteorological variables, which can lead to unstable forecasting results, especially in extreme scenarios. To overcome this issue, we propose a convolutional graph neural network (CGNN) model, which we enhance with multilayer feature fusion and a squeeze-and-excitation block. Additionally, we introduce a spatially balanced mean squared error (SBMSE) loss function to address the imbalanced distribution and spatial variability of meteorological variables. The CGNN is capable of extracting essential spatial features and aggregating them from a global perspective, thereby improving prediction accuracy and enhancing the model's generalization ability. Based on the experimental results, the CGNN has certain advantages in terms of bias distribution, exhibiting a smaller variance. For precipitation, both UNet and AE also show relatively small biases; for temperature, AE and CNNdense perform outstandingly during winter. The time correlation coefficients show an improvement of at least 10% at daily and monthly scales for both temperature and precipitation. Furthermore, the SBMSE loss function displays an advantage over existing loss functions in predicting the 98th percentile and identifying areas where extreme events occur. However, the SBMSE tends to overestimate the distribution of extreme precipitation, which may be because the theoretical assumptions about the posterior distribution of the data partially limit the effectiveness of the loss function. In future work, we will further optimize the SBMSE to improve prediction accuracy.
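One plausible reading of a spatially balanced MSE is sketched below, under the assumption that each squared error is reweighted by the inverse frequency of its target's value bin so that rare (extreme) values are not drowned out by common ones. The actual SBMSE formulation in the paper may differ; this is purely an illustration of the balancing idea.

```python
from collections import Counter

def sbmse(pred, target, n_bins=4):
    """Assumed balanced MSE: weight each squared error by the inverse
    frequency of the target-value bin it falls in. Weights are scaled
    so that a perfectly uniform bin occupancy recovers plain MSE."""
    lo, hi = min(target), max(target)
    width = (hi - lo) / n_bins or 1.0
    bins = [min(int((t - lo) / width), n_bins - 1) for t in target]
    freq = Counter(bins)
    n = len(target)
    return sum((p - t) ** 2 * (n / (n_bins * freq[b]))
               for p, t, b in zip(pred, target, bins)) / n
```

With uniformly populated bins every weight is 1 and the loss equals the ordinary MSE; errors on rare extreme values are amplified otherwise.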
Abstract: In the literature, features based on first- and second-order statistics that characterize textures are used for the classification of images. Features based on texture statistics provide a far smaller number of relevant and distinguishable features than existing methods based on wavelet transformation. In this paper, we investigate the performance of texture-based features in comparison with wavelet-based features, using commonly used classifiers, for the classification of Alzheimer's disease based on T2-weighted MRI brain images. The performance is evaluated in terms of sensitivity, specificity, accuracy, and training and testing time. Experiments are performed on publicly available medical brain images. Experimental results show that the performance with first- and second-order statistical features is significantly better than that of existing wavelet-based methods in terms of all performance measures for all classifiers.
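Second-order (co-occurrence) texture statistics of the kind referred to above can be sketched as follows. This is a generic gray-level co-occurrence matrix (GLCM) computation for a single pixel offset with standard derived features; the paper's exact offsets, quantization levels, and feature list are not reproduced here.

```python
def glcm(img, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one offset:
    counts how often gray level i is followed by gray level j."""
    m = [[0.0] * levels for _ in range(levels)]
    count = 0
    for i in range(len(img) - dy):
        for j in range(len(img[0]) - dx):
            m[img[i][j]][img[i + dy][j + dx]] += 1
            count += 1
    return [[v / count for v in row] for row in m]

def glcm_features(p):
    """Classic second-order statistics from a normalized GLCM:
    energy, contrast, homogeneity."""
    n = len(p)
    energy = sum(v * v for row in p for v in row)
    contrast = sum(p[i][j] * (i - j) ** 2
                   for i in range(n) for j in range(n))
    homogeneity = sum(p[i][j] / (1 + abs(i - j))
                      for i in range(n) for j in range(n))
    return energy, contrast, homogeneity
```

A perfectly uniform patch concentrates all co-occurrence mass on the diagonal, giving zero contrast and maximal energy and homogeneity.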
Abstract: The development of defect prediction plays a significant role in improving software quality. Such predictions are used to identify defective modules before testing and to minimize time and cost. Software with defects negatively impacts operational costs and ultimately affects customer satisfaction. Numerous approaches exist to predict software defects, but timely and accurate prediction remains a major challenge. To improve timely and accurate software defect prediction, a novel technique called the Nonparametric Statistical feature scaled QuAdratic regressive convolution Deep nEural Network (SQADEN) is introduced. The proposed SQADEN technique mainly comprises two major processes, namely metric (feature) selection and classification. First, SQADEN uses the nonparametric statistical Torgerson-Gower scaling technique to identify the relevant software metrics, measuring similarity with the Dice coefficient. The feature selection process is used to minimize the time complexity of software fault prediction. With the selected metrics, software fault prediction is then performed using quadratic censored regressive convolution deep neural network-based classification. The deep learning classifier analyzes the training and testing samples using the contingency correlation coefficient, and the softstep activation function provides the final fault prediction results. To minimize the error, the Nelder-Mead method is applied to solve the non-linear least-squares problems. Finally, accurate classification results with minimum error are obtained at the output layer. Experimental evaluation is carried out with different quantitative metrics such as accuracy, precision, recall, F-measure, and time complexity. The analyzed results demonstrate the superior performance of the proposed SQADEN technique, with accuracy, sensitivity, and specificity higher by 3%, 3%, 2%, and 3%, and time and space costs lower by 13% and 15%, compared with two state-of-the-art methods.
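The Dice-coefficient similarity used in the metric-selection step above can be illustrated generically for binary feature-presence vectors; this one-liner is the standard definition, not the paper's full Torgerson-Gower pipeline.

```python
def dice(a, b):
    """Dice coefficient between two binary vectors: twice the overlap
    divided by the total number of set entries. Ranges from 0
    (disjoint) to 1 (identical)."""
    inter = sum(x and y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))
```

Metrics whose presence patterns score high against an already-selected metric are redundant and can be dropped.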
Funding: This work is supported by the National Natural Science Foundation of China (Nos. 60832010, 60671064, 60703011), the Chinese National 863 Program (No. 2007AA0IZ-458), and the Research Fund for the Doctoral Program of Higher Education (No. RFDP20070213047).
Abstract: A tyre pressure monitoring system (TPMS) is compulsory in most jurisdictions, such as the United States and the European Union. Existing systems depend on pressure sensors strapped on the tyre or on wheel speed sensor data, where a difference in wheel speed triggers an alarm according to the implemented algorithm. In this paper, a machine learning approach is proposed as a new method to monitor tyre pressure by extracting the vertical vibrations from the wheel hub of a moving vehicle using an accelerometer. The obtained signals are used to compute statistical features and histogram features in the feature extraction process. The logistic model tree (LMT) was used as the classifier and attained a classification accuracy of 92.5% with 10-fold cross-validation for the statistical features and 90.5% with 10-fold cross-validation for the histogram features. The proposed model can be used to monitor automobile tyre pressure successfully.
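The two feature families above are standard time-domain summaries of a vibration signal. The sketch below is a generic illustration (the paper's exact feature lists and bin counts are not specified here): a handful of statistical moments plus normalized amplitude-histogram bins.

```python
import math

def statistical_features(signal):
    """Typical time-domain statistics for one vibration window:
    mean, standard deviation, RMS, min, max."""
    n = len(signal)
    mean = sum(signal) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    return [mean, std, rms, min(signal), max(signal)]

def histogram_features(signal, n_bins=8):
    """Normalized bin counts over the signal's amplitude range."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for x in signal:
        counts[min(int((x - lo) / width), n_bins - 1)] += 1
    return [c / len(signal) for c in counts]
```

Either vector can then be fed to a classifier such as the LMT mentioned above.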
Abstract: Forecasting the movement of the stock market is a long-standing topic of interest. This paper implements different statistical learning models to predict the movement of the S&P 500 index, which is influenced by other important financial indicators across the world, such as commodity prices and financial technical indicators. Four supervised learning models are systematically investigated: Logistic Regression, Gaussian Discriminant Analysis (GDA), Naive Bayes, and Support Vector Machine (SVM). After several optimization experiments on features and models, especially SVM kernel selection and per-model feature selection, this paper concludes that an SVM model with a Radial Basis Function (RBF) kernel can achieve an accuracy of 62.51% for the future market trend of the S&P 500 index.
Funding: Supported by the National Natural Science Foundation of China (Nos. 61801461, 61801460), the Strategic Leadership Project of the Chinese Academy of Sciences (Grant No. XDC02070800), and the Shanghai Municipality Science and Technology Commission Projects (Nos. 18XD1404100, 17QA1403800).
Abstract: Statistical Signal Transmission (SST) is a technique based on orthogonal frequency-division multiplexing (OFDM) that adopts cyclostationary features and can transmit extra information without additional bandwidth. However, the more complicated environments of 5G communication systems, especially fast time-varying scenarios, dramatically degrade the performance of SST. In this paper, we propose a fragmental weight-conservation combining (FWCC) scheme for SST to overcome its performance degradation under fast time-varying channels. The proposed FWCC scheme consists of four phases: (1) slice the received OFDM stream into pieces; (2) assign different weights to clean and contaminated pieces, respectively; (3) combine the cyclic autocorrelation function energies of all the pieces; and (4) compute the final feature and demodulate the SST data. Through these procedures, the detection accuracy of SST is theoretically refined under fast time-varying channels. This inference is confirmed by the numerical results in this paper, which demonstrate that the BER performance of the proposed scheme outperforms that of the original scheme under both ideal and imperfect channel estimation. In addition, we identify an empirical optimal weight-distribution strategy for the proposed FWCC scheme, which facilitates practical applications.
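Phases (2) and (3) can be caricatured in a few lines. The weighting model below, `1 / (1 + alpha * variation)`, is purely an assumption for illustration; the paper's actual FWCC weight rule and the cyclic-autocorrelation computation itself are not reproduced here.

```python
def piece_weights(channel_variation, alpha=1.0):
    """Assumed model: down-weight pieces whose channel varies fast
    (weight = 1 / (1 + alpha * variation)); clean pieces keep weight 1."""
    return [1.0 / (1.0 + alpha * v) for v in channel_variation]

def combined_feature(segment_energies, weights):
    """Weighted combination of per-piece cyclic-autocorrelation
    energies into one detection statistic."""
    return (sum(e * w for e, w in zip(segment_energies, weights))
            / sum(weights))
```

Heavily contaminated pieces thus contribute little to the final statistic, which is the intuition behind weight conservation.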
Funding: Supported by the National Key R&D Program of China (2022YFC3004705), the National Natural Science Foundation of China (Nos. 52074280, 52227901, and 52204249), the Postgraduate Research & Practice Innovation Program of Jiangsu Province (No. KYCX24_2913), and the Graduate Innovation Program of China University of Mining and Technology (No. 2024WLKXJ139).
Abstract: Rock failure can cause serious geological disasters, and the non-extensive statistical features of the electric potential (EP) are expected to provide valuable information for disaster prediction. In this paper, uniaxial compression experiments with EP monitoring were carried out on fine sandstone, marble, and granite samples under four displacement rates. The Tsallis entropy q value of the EPs is used to analyze the self-organization evolution of rock failure, and the influences of displacement rate and rock type on the q value are explored through mineral structure and fracture modes. A self-organized critical prediction method based on the q value is proposed. The results show that the probability density function (PDF) of the EPs follows the q-Gaussian distribution. The displacement rate is positively correlated with the q value: as the displacement rate increases, the fracture mode changes, the damage degree intensifies, and the microcrack network becomes denser. The influence of rock type on the q value is related to the burst intensity of energy release and the crack fracture mode. The q value of the EPs can be used as an effective prediction index for rock failure, much like the b value of acoustic emission (AE). The results provide a useful reference and method for the monitoring and early warning of geological disasters.
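The q-Gaussian model mentioned above follows directly from the Tsallis q-exponential. The density below is unnormalized and the parameter names are generic; it reduces to an ordinary Gaussian shape as q approaches 1 and develops heavier tails for q > 1, which is what makes the fitted q a sensitive index.

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q) x]_+ ** (1/(1-q)),
    reducing to exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian(x, q, beta=1.0):
    """Unnormalized q-Gaussian density exp_q(-beta * x**2); q > 1
    gives heavier tails than the ordinary Gaussian."""
    return q_exponential(-beta * x * x, q)
```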
Abstract: Emanating from the idea of reinvestigating the ancient medical system of Ayurveda, the Traditional Indian Medicine (TIM), our recent study showed significant applications of the analysis of arterial pulse waveforms for non-invasive diagnosis of cardiovascular function. Here we present the results of further investigations analyzing the relation of pulse characteristics to some clinical and pathological parameters and to other features of diagnostic importance in Ayurveda.
Abstract: The motivation for this article is to propose new damage classifiers based on a supervised learning problem for locating and quantifying damage. A new feature extraction approach using time series analysis is introduced to extract damage-sensitive features from auto-regressive (AR) models, aiming to improve current feature extraction techniques in the context of time series modeling. The coefficients and residuals of the AR model obtained from the proposed approach are selected as the main features and are applied to the proposed supervised learning classifiers, which are categorized as coefficient-based and residual-based classifiers. These classifiers compute the relative errors in the extracted features between the undamaged and damaged states. Eventually, the abilities of the proposed methods to localize and quantify single and multiple damage scenarios are verified by applying experimental data from a laboratory frame and a four-story steel structure. Comparative analyses are performed to validate the superiority of the proposed methods over some existing techniques. Results show that the proposed classifiers, with the aid of the features extracted by the proposed approach, are able to locate and quantify damage; the residual-based classifiers yield better results than the coefficient-based ones, and both are superior to some classical techniques.
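The coefficient- and residual-based features can be illustrated with the simplest case, a least-squares AR(1) fit; the paper's AR orders and classifier details are of course richer than this sketch, and `relative_error` here is only the generic undamaged-vs-damaged index described above.

```python
def fit_ar1(x):
    """Least-squares AR(1) fit x[t] ~ a * x[t-1]; returns the
    coefficient and the one-step residuals."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    a = num / den
    residuals = [x[t] - a * x[t - 1] for t in range(1, len(x))]
    return a, residuals

def relative_error(ref, cur):
    """Coefficient-based damage index: relative change of a feature
    from its undamaged reference value."""
    return abs(cur - ref) / abs(ref)
```

A damaged state shifts the fitted coefficients and inflates the residual energy, so both indices grow with damage severity.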
Funding: Supported by the Special Funds for the National Basic Research Program of China (Grant No. 2012CB025904) and the National Natural Science Foundation of China (Grant Nos. 90916027 and 11302052).
Abstract: This paper focuses on the dynamic thermo-mechanical coupled response of random particulate composite materials. Both the inertia term and the coupling term are considered in the dynamic coupled problem. The formulation of the problem by a statistical second-order two-scale (SSOTS) analysis method and the algorithm procedure based on the finite-element difference method are presented. Numerical results of coupled cases are compared with those of uncoupled cases. They show that the coupling effects on temperature, thermal flux, displacement, and stresses are very distinct, and that the micro-characteristics of the particles affect the coupling effect of the random composites. Furthermore, the coupling effect causes a lag in the variations of temperature, thermal flux, displacement, and stresses.