Traditional distributed denial of service (DDoS) detection methods require substantial computing resources, and many of those based on a single element suffer from high missing rates and false alarm rates. To solve these problems, this paper proposes a DDoS attack information fusion method based on a CNN for multi-element data. First, according to the distribution, concentration, and high traffic abruptness of DDoS attacks, this paper defines six features, obtained respectively from the source IP address, destination IP address, source port, destination port, packet size, and number of IP packets. Then, we propose a feature weight calculation algorithm based on principal component analysis to measure the importance of different features in different network environments. The proposed weighted multi-element feature fusion algorithm fuses these features to obtain a multi-element fusion feature (MEFF) value. Finally, the DDoS attack information fusion classification model is established using a convolutional neural network and a support vector machine, respectively, based on the MEFF time series. Experimental results show that the proposed information fusion method can effectively fuse multi-element data; reduce the missing rate, total error rate, memory consumption, and running time; and improve the detection rate.
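The PCA-based feature-weighting step can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes the six per-window features are already collected into a matrix, and it weights each feature by its absolute component loadings scaled by the explained-variance ratios (function and variable names are hypothetical):

```python
import numpy as np

def pca_feature_weights(X):
    """Weight each feature by its loadings on the principal components,
    scaled by the variance each component explains (one common PCA-based
    weighting scheme; the paper's exact formula may differ)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    # eigh returns ascending eigenvalues; reorder to descending
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()
    # weight_j = sum_k |loading_jk| * explained_variance_ratio_k
    weights = np.abs(eigvecs) @ explained
    return weights / weights.sum()

def fused_feature(X, weights):
    """Weighted fusion of the per-window features into one MEFF value."""
    return X @ weights
```

The fused scalar per time window would then form the MEFF time series fed to the classifier.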
A multivariate statistical analysis was performed on multi-element soil geochemical data from the Koda Hill-Bulenga gold prospects in the Wa-Lawra gold belt, northwest Ghana. The objective of the study was to define gold's relationships with other trace elements and determine possible pathfinder elements for gold from the soil geochemical data. The study focused on seven elements: Au, Fe, Pb, Mn, Ag, As, and Cu. Factor analysis and hierarchical cluster analysis were performed on the analyzed samples. Factor analysis explained 79.093% of the total variance of the data through three factors, with gold loading on factor 3, which carried associations with copper, iron, lead, and manganese and accounted for 20.903% of the total variance. Hierarchical clustering likewise showed gold clustering with lead, copper, arsenic, and silver. Gold concentrations were lower than those of its associated elements. The results suggest that the occurrence of gold and its associated elements can be linked to both primary dispersion from underlying rocks and secondary processes such as lateritization. The data show that Fe and Mn are strongly associated with gold and, alongside Pb, Ag, As, and Cu, can be used as pathfinders for gold in the area, with ferruginous zones as targets.
A factor analysis was applied to soil geochemical data to define anomalies related to buried Pb-Zn mineralization. A favorable main factor with a strong association of the elements Zn, Cu, and Pb, related to mineralization, was selected for interpretation. The median + 2 MAD (median absolute deviation) method of exploratory data analysis (EDA) and C-A (concentration-area) fractal modeling were then applied to the Mahalanobis distance defined by Zn, Cu, and Pb from the factor analysis to set thresholds for multi-element anomalies. The median + 2 MAD method identified the Pb-Zn mineralization more successfully than the C-A fractal model. The soil anomaly identified by the median + 2 MAD method on Mahalanobis distances defined by the three principal elements (Zn, Cu, and Pb), rather than thirteen elements (Co, Zn, Cu, V, Mo, Ni, Cr, Mn, Pb, Ba, Sr, Zr, and Ti), was the more favorable reflection of the ore body. The identified soil geochemical anomalies were validated against the in situ economic Pb-Zn ore bodies. The results showed that the median + 2 MAD approach can map both strong and weak geochemical anomalies related to buried Pb-Zn mineralization and is therefore useful at the reconnaissance drilling stage.
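The median + 2 MAD threshold itself is simple to state; a minimal sketch applied to a one-dimensional score such as the Mahalanobis distance (variable names are illustrative):

```python
import numpy as np

def mad(x):
    """Median absolute deviation (unscaled)."""
    med = np.median(x)
    return np.median(np.abs(x - med))

def median_2mad_threshold(scores):
    """EDA anomaly threshold: median + 2 * MAD."""
    return np.median(scores) + 2.0 * mad(scores)

def flag_anomalies(scores):
    """Mark samples whose score exceeds the threshold as anomalous."""
    return scores > median_2mad_threshold(scores)
```

Because both the median and the MAD are robust statistics, a few extreme samples barely move the threshold, which is why the method can separate weak anomalies from background.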
In this study, to meet the development and application requirements for high-strength, high-toughness energetic structural materials, a representative volume element of a TA15 matrix embedded with a TaZrNb sphere was designed and fabricated via diffusion bonding. The mechanisms of microstructural evolution at the TaZrNb/TA15 interface were investigated via SEM, EBSD, EDS, and XRD. Interface mechanical property tests and in-situ tensile tests were conducted on the sphere-containing structure, and an equivalent tensile-strength model was established for it. The results revealed that the TA15 titanium alloy and the joint had high density with no pores or cracks. The planar joint was approximately 50-60 μm thick; the average tensile and shear strengths were 767 MPa and 608 MPa, respectively. The spherical joint was approximately 60 μm thick. The Zr and Nb in the joint diffused uniformly and formed strong bonds with Ti without forming intermetallic compounds. The interface exhibited submicron grain refinement and a concave-convex interlocking structure. The tensile fracture surface primarily exhibited intergranular fracture combined with some transgranular fracture, constituting a quasi-brittle fracture mode. The shear fracture surface exhibited brittle fracture with regularly arranged furrows. Internal fracture occurred along the spherical interface, as revealed by advanced in-situ X-ray microcomputed tomography. The experimental results agreed well with the theoretical predictions, indicating that the high-strength interface contributes to the overall strength and toughness of the sphere-containing structure.
To obtain more stable spectral data for accurate quantitative multi-element analysis, especially for large-area in-situ element detection in soils, we propose a method for multi-element quantitative analysis of soils using calibration-free laser-induced breakdown spectroscopy (CF-LIBS) based on data filtering. In this study, we analyze a standard soil sample doped with two heavy metal elements, Cu and Cd, using the Cu I 324.75 nm line to filter the experimental data of multiple sample sets. After data filtering, the relative standard deviation for Cu decreased from 30% to 10%, and the limits of detection (LOD) for Cu and Cd decreased by 5% and 4%, respectively. Through CF-LIBS, a quantitative analysis was conducted to determine the relative content of elements in the soils. Using Cu as a reference, the concentration of Cd was accurately calculated. The results show that after data filtering, the average relative error for Cd decreases from 11% to 5%, indicating the effectiveness of data filtering in improving the accuracy of quantitative analysis. The content of Si, Fe, and other elements can also be accurately calculated with this method, and the Cd results were used to further correct the calculation. This approach is of great importance for large-area in-situ detection of heavy metals and trace elements in soil, as well as for rapid and accurate quantitative analysis.
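The data-filtering idea, discarding shots whose reference-line intensity deviates too far from the ensemble, can be sketched generically as follows. The paper's exact filtering criterion is not specified here, so the mean ± k·σ rule and the threshold k are assumptions for illustration:

```python
import numpy as np

def rsd(x):
    """Relative standard deviation (%), the stability metric quoted above."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

def filter_by_reference_line(intensities, k=1.0):
    """Keep spectra whose reference-line intensity (e.g. Cu I 324.75 nm)
    lies within k sample standard deviations of the mean.
    Returns a boolean mask over the shots."""
    mu = np.mean(intensities)
    sd = np.std(intensities, ddof=1)
    return np.abs(intensities - mu) <= k * sd
```

Applying the mask to the full spectra of the retained shots would then lower the RSD of the reference line, mirroring the 30% to 10% improvement reported for Cu.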
To better deliver meteorological services and improve service efficiency, a multi-factor data integration platform was built on the Tianqing big data cloud platform, using its intensive data sources. Daily and hourly precipitation, temperature, and wind data are acquired, and C# is used to implement data analysis, text template nesting, meteorological material generation, and archive management. Application shows that the platform can complete the production of rainfall reports within 2 min with zero errors in material filing management, outperforming traditional methods in framework design, data processing, and material management.
Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where it often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms (Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship) under systematically varied feature counts, sample sizes, and missingness ratios from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that the proposed model consistently outperforms baseline methods, including traditional and deep learning-based techniques. An ablation study reveals the additive value of each component of the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area under the curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce imputations that are not only numerically accurate but also semantically useful, making it a promising solution for robust data recovery in clinical applications.
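The three-part composite loss can be illustrated with a small numeric sketch. Only the masked MSE on missing entries follows directly from the description; the exact forms of the noise-aware and variance terms, and the term weights, are assumptions for illustration:

```python
import numpy as np

def composite_loss(x_true, x_hat, miss_mask, lam_noise=0.1, lam_var=0.01):
    """Illustrative composite imputation loss (weights are hypothetical):
    masked MSE on originally-missing entries, plus a reconstruction term
    on observed entries standing in for noise-aware regularization, plus
    a penalty discouraging collapsed (low-variance) reconstructions."""
    miss = miss_mask.astype(bool)
    obs = ~miss
    masked_mse = np.mean((x_true[miss] - x_hat[miss]) ** 2) if miss.any() else 0.0
    noise_term = np.mean((x_true[obs] - x_hat[obs]) ** 2) if obs.any() else 0.0
    var_penalty = np.mean((x_hat.var(axis=0) - x_true.var(axis=0)) ** 2)
    return masked_mse + lam_noise * noise_term + lam_var * var_penalty
```

In training, `x_true[miss]` would hold the held-out ground truth used for guidance, which is only available when missingness is simulated on complete data.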
Modern intrusion detection systems face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class intrusion detection using the KDD99 and related IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with data preprocessing that incorporates both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized, model-ready inputs. Dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies, which identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining XGBoost, SVM, and RF as base learners; this layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness, and after balancing the model showed a clear improvement in detecting attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions, making the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
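The stacked architecture described above can be sketched with scikit-learn. GradientBoosting stands in for XGBoost here to keep the example dependency-free, and all hyperparameters are placeholders rather than the study's settings:

```python
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def build_stack():
    """Stacked ensemble in the spirit of the framework: a boosted model,
    SVM, and RF as base learners, with a logistic-regression meta-learner
    trained on their out-of-fold predictions."""
    base = [
        ("gb", GradientBoostingClassifier(random_state=0)),   # XGBoost stand-in
        ("svm", SVC(probability=True, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ]
    return StackingClassifier(
        estimators=base,
        final_estimator=LogisticRegression(max_iter=1000),
    )
```

In the full pipeline this classifier would be fitted on the HHO-selected, SMOTE-balanced training features.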
Reversible data hiding (RDH) enables secret data embedding while preserving complete cover image recovery, making it crucial for applications requiring image integrity. The pixel value ordering (PVO) technique used with multiple stego images provides good image quality but often low embedding capacity. To address these challenges, this paper proposes a high-capacity RDH scheme based on PVO that generates three stego images from a single cover image. The cover image is partitioned into non-overlapping blocks whose pixels are sorted in ascending order. Four secret bits are embedded into each block's maximum pixel value, and three additional bits are embedded into the second-largest value when the pixel difference exceeds a predefined threshold. A similar embedding strategy is applied on the minimum side of the block, including the second-smallest pixel value. This design enables each block to embed up to 14 bits of secret data. Experimental results demonstrate that the proposed method achieves significantly higher embedding capacity and improved visual quality compared with existing triple-stego RDH approaches, advancing the field of reversible steganography.
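The max-side embedding rule can be illustrated with the classic single-bit PVO step that such schemes build on. This sketch embeds at most one bit per block and covers one stego image only; the paper's 14-bit, triple-stego scheme extends the idea with multi-bit embedding and a threshold test:

```python
def pvo_embed_max(block, bit):
    """Classic max-side PVO on one block: expand the largest pixel by the
    secret bit when the max/second-max difference is 1, otherwise shift
    it so extraction stays unambiguous. Blocks with equal top pixels are
    skipped (no bit consumed in this simplified sketch)."""
    b = sorted(block)
    e = b[-1] - b[-2]
    if e == 1:
        b[-1] += bit      # embeddable: difference becomes 1 (bit 0) or 2 (bit 1)
    elif e > 1:
        b[-1] += 1        # not embeddable: shift; difference becomes > 2
    return b

def pvo_extract_max(stego_block):
    """Recover the bit (if any) and restore the original block maximum."""
    b = sorted(stego_block)
    e = b[-1] - b[-2]
    if e == 1:
        return 0, b       # bit 0, max unchanged
    if e == 2:
        b[-1] -= 1
        return 1, b       # bit 1, undo the expansion
    if e > 2:
        b[-1] -= 1
        return None, b    # no bit carried, undo the shift
    return None, b        # e == 0: block was skipped
```

Because each case of the embedded difference (1, 2, or >2) maps back to exactly one embedding action, the cover pixels are recovered exactly, which is the reversibility property RDH requires.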
With the increasing emphasis on personal information protection, encryption through security protocols has become a critical requirement in data transmission and reception. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, spanning the range from non-encrypted to fully encrypted devices. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted-traffic metadata, eliminating the need for decryption and minimizing system performance degradation on these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized by security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate class imbalance, eight data sampling techniques were applied, and their effectiveness was comparatively analyzed using two ensemble models and three deep learning (DL) models from various perspectives. The experiments confirmed that metadata-based attack detection is feasible using only encrypted traffic. On the UNSW-NB15 dataset, the F1-score for encrypted traffic was approximately 0.98, about 4.3% higher than that for unencrypted traffic (approximately 0.94). Analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a much lower F1-score of roughly 0.43, indicating that dataset quality and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, recall on the UNSW-NB15 (encrypted) dataset improved by up to 23.0%, and on the CICIoT-2023 (encrypted) dataset by 20.26%, a similar level of improvement. Notably, on CICIoT-2023, the F1-score and the receiver operating characteristic area under the curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can have a positive effect even in encrypted environments, although the extent of improvement may vary with data quality, model architecture, and sampling strategy.
Automated essay scoring (AES) systems have gained significant importance in educational settings, offering a scalable, efficient, and objective method for evaluating student essays. However, developing AES systems for Arabic poses distinct challenges due to the language's complex morphology, diglossia, and the scarcity of annotated datasets. This paper presents a hybrid approach to Arabic AES, combining text-based, vector-based, and embedding-based similarity measures to improve scoring accuracy while minimizing the training data required. Using a large Arabic essay dataset categorized into thematic groups, the study conducted four experiments to evaluate the impact of feature selection, data size, and model performance. Experiment 1 established a baseline using a non-machine-learning approach that selects the top-N correlated features to predict essay scores. The subsequent experiments employed 5-fold cross-validation. Experiment 2 showed that combining embedding-based, text-based, and vector-based features in a Random Forest (RF) model achieved an R2 of 88.92% and an accuracy of 83.3% within a 0.5-point tolerance. Experiment 3 refined the feature selection process, demonstrating that 19 correlated features yielded optimal results, improving R2 to 88.95%. Experiment 4 introduced a data-efficient training approach in which the training portion was increased from 5% to 50%; using just 10% of the data achieved near-peak performance, with an R2 of 85.49%, an effective trade-off between performance and computational cost. These findings highlight the potential of the hybrid approach for developing scalable Arabic AES systems, especially in low-resource environments, addressing linguistic challenges while ensuring efficient data usage.
Objective expertise evaluation of individuals, as a prerequisite for team formation, has long been a desideratum in large software development companies. With rapid advances in machine learning methods and reliable data stored in project management tools, automating this evaluation process is a natural next step. In this context, our approach quantifies software developer expertise using metadata from task-tracking systems. We mathematically formalize two categories of expertise: technology-specific expertise, denoting the skills required for a particular technology, and general expertise, encapsulating overall knowledge of the software industry. We then automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like models to handle the particular characteristics of project tool datasets. Finally, our method evaluates the proficiency of each software specialist across completed projects from both technology-specific and general perspectives. The method was experimentally validated, yielding promising results.
Parkinson's disease (PD) is a debilitating neurological disorder affecting over 10 million people worldwide. PD classification models that take voice signals as input are common in the literature. Deep learning algorithms are believed to further enhance performance; nevertheless, this is challenging given the small-scale, imbalanced nature of PD datasets. This paper proposes a convolutional neural network-based deep support vector machine (CNN-DSVM) that automates feature extraction with a CNN and extends the conventional SVM to a DSVM for better classification performance on small-scale PD datasets. A customized kernel function reduces biased classification towards the majority class (healthy candidates in our setting). An improved generative adversarial network (IGAN) was designed to generate additional training data to enhance the model's performance. The proposed algorithm achieves a sensitivity of 97.6% and a specificity of 97.3%. Performance is compared from five perspectives, including comparisons with different data generation algorithms, feature extraction techniques, kernel functions, and existing works. The results show the effectiveness of the IGAN algorithm, which improves sensitivity and specificity by 4.05%-4.72% and 4.96%-5.86%, respectively, and of the CNN-DSVM algorithm, which improves sensitivity by 1.24%-57.4% and specificity by 1.04%-163% while reducing biased detection towards the majority class. Ablation experiments confirm the effectiveness of the individual components, and two future research directions are suggested.
For the complex flow about multi-element airfoils, a mixed grid method is set up. C-type grids are generated around each element's body and in its wake, O-type grids in the outermost area, and H-type grids in the intermediate areas. An algebraic method produces the initial grids in each area, which are then optimized by an elliptic differential equation method. C-O-H zonal patched grids around multi-element airfoils are thus produced automatically and efficiently. A time-accurate finite-volume integration method is used to solve the compressible laminar and turbulent Navier-Stokes (N-S) equations on these grids. Computational results prove the method effective.
Almost half of all flight accidents caused by in-flight icing occur during the approach and landing phases, when high-lift devices are deployed. The present study focuses on the optimization of an ice-tolerant multi-element airfoil. A dual-objective optimization is carried out with critical horn-shaped ice accumulated during the holding phase. The results show that the optimization method significantly enhances both iced-state and clean-state performance. The optimal multi-element airfoil has a larger deflection angle and a wider gap at the slat and the flap compared with the baseline configuration. A sensitivity analysis of each design parameter verifies the robustness of the design. The design is further assessed with ice accreted during the approach and landing phases, where it again shows performance improvement.
To study the gamma-reflection behavior of multi-element materials, gamma-ray transport models of single-element materials (iron and lead) and multi-element materials (polyethylene and ordinary concrete) were established. Relationships among the albedo factors of the gamma photons and energies, and the average energy of the reflected gamma rays, as functions of material type, material thickness, incident gamma energy, and incidence angle, were obtained by Monte Carlo simulation. The results show that the albedo factors of both single-element and multi-element materials increase rapidly with material thickness; beyond a certain thickness they no longer increase but tend toward a saturation value. The saturation values of the photon and energy albedo factors, and the corresponding reflection thickness, depend not only on the type of material but also on the incident gamma energy and the incidence angle. For a given incident gamma energy between 0.2 and 2.5 MeV, the smaller the effective atomic number of the multi-element material, the higher the saturation values of the albedo factors. The larger the incidence angle, the greater the saturation value of the gamma albedo factor, the saturation reflection thickness, and the average saturation energy of the reflected gamma photons.
This paper integrates aerodynamic and aeroacoustic optimization design of high-lift devices, especially two-element airfoils with a slat. Aerodynamic analysis of the flow field uses a high-order, high-resolution spatial differencing method for large eddy simulation (LES), which guarantees accuracy and efficiency. The aeroacoustic noise level is calculated with the Ffowcs Williams-Hawkings (FW-H) integral formula. The fidelity of the calculation is verified against standard models. The method of streamline-based Euler simulation (MSES) is used to obtain the aerodynamic characteristics. On this basis, detailed research is conducted on the leading-edge slat of multi-element airfoils, analyzing the influence of various slot parameters on noise. The resulting optimized slot parameters can be used in multi-element airfoil design.
Multi-element analysis at historical sites is a major issue in archaeological studies; however, the approach is almost unknown among Iranian scholars. Geochemical multi-element analysis of soil is very important for evaluating anthropogenic activities. The aim of this study is to assess the potential usefulness of multi-element soil analysis, obtained with an Analytik Jena atomic absorption spectrophotometer (AAS) and ICP-MS, for recognizing ancient anthropogenic features on the territory of Tappe Rivi (North Khorasan, Iran). For this purpose, a total of 80 ancient soil samples were taken from each soil horizon and cultural layer. The research involved Fe, Al, Cd, Cu, Ni, Co, Cr, Pb, and P; trace element samples were extracted according to International Standard ISO 11466 and phosphorus by the Olsen method. In addition, soil contamination was assessed using enrichment factors (EFs) with Fe as the reference element. This geochemical/archaeological approach shows that the contents of most elements in the Parthian and Sassanid ages were significantly higher than in other zones, indicating that element contents increased as the eras progressed. The accumulation of metals at the Rivi site was also significantly higher than in the control area. Among the sampled zones, the enrichment factors indicated that Cu and phosphate enrichment was highest in the Parthian and Sassanid layers. This result is important because it shows that metal contents and human activities are directly related across different ages.
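The enrichment factor with Fe as the reference element follows the standard formula EF = (M/Fe)_sample / (M/Fe)_background; a one-function sketch (the concentrations here are illustrative, not the Rivi measurements):

```python
def enrichment_factor(metal_sample, fe_sample, metal_background, fe_background):
    """Enrichment factor of a metal M relative to a reference element (Fe):
    EF = (M/Fe)_sample / (M/Fe)_background.
    EF near 1 suggests natural background; EF well above 1 suggests
    anthropogenic (or other secondary) enrichment."""
    return (metal_sample / fe_sample) / (metal_background / fe_background)
```

Normalizing by Fe compensates for variations in the natural soil matrix between sampling zones, so that only the excess of the metal over its geogenic proportion is counted as enrichment.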
The effects of Al and Sc on the mechanical properties of FeCoNi multi-element alloys (MEAs) were investigated by compressive tests. The microstructures of FeCoNi MEAs with different contents of Al and Sc were characterized and the strengthening mechanisms discussed. The results show that FeCoNi MEA with a low Al content has a face-centered cubic (FCC) structure. The yield strength increases linearly with Al content, largely due to solid solution hardening. Further addition of Sc promotes the formation of a new phase in (FeCoNi)1-xAlx MEAs. A minor addition of Sc significantly increases the yield strength of (FeCoNi)1-xAlx MEAs with a low Al content and improves the compressive plasticity of those with a high Al content.
Winter jujube (Ziziphus jujuba 'Dongzao') is greatly appreciated by consumers for its excellent quality, but brand infringement frequently occurs in the market. Here, we determined a total of 38 elements in 167 winter jujube samples from the main producing areas of China by inductively coupled plasma mass spectrometry (ICP-MS). Sixteen elements (Mg, K, Mn, Cu, Zn, Mo, Ba, Be, As, Se, Cd, Sb, Ce, Er, Tl, and Pb) exhibited significant differences among samples from different producing areas. Supervised linear discriminant analysis (LDA) and orthogonal projection to latent structures discriminant analysis (OPLS-DA) identified the origin of samples better than unsupervised principal component analysis (PCA), with mean identification accuracies of 87.84% and 94.64% on the testing set, respectively. Using a multilayer perceptron (MLP) and C5.0, the prediction accuracies reached 96.36% and 91.06%, respectively. Based on these four chemometric methods, Cd, Tl, Mo, and Se were selected as the main variables and principal markers for origin identification of winter jujube. Overall, this study demonstrates that identifying the origin of winter jujube through multi-element fingerprint analysis with chemometrics is practical and precise, and it may also serve as a reference for establishing origin traceability systems for other fruits.
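The supervised origin-classification step with LDA can be sketched as follows. The data here are synthetic stand-ins for an element-content matrix (rows: fruit samples, columns: element concentrations); the study's preprocessing and exact validation splits are not reproduced:

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def origin_lda_accuracy(X, y, folds=5):
    """Mean cross-validated accuracy of LDA origin classification on an
    element-fingerprint matrix, one row per sample, one label per origin."""
    lda = LinearDiscriminantAnalysis()
    return cross_val_score(lda, X, y, cv=folds).mean()
```

Restricting the columns of `X` to the selected markers (Cd, Tl, Mo, Se) would mirror the variable-selection step the chemometric comparison converged on.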
Funding: This work was supported by the Hainan Provincial Natural Science Foundation of China [2018CXTD333, 617048], the National Natural Science Foundation of China [61762033, 61702539], the Hainan University Doctor Start Fund Project [kyqd1328], and the Hainan University Youth Fund Project [qnjj1444].
Abstract: Traditional distributed denial-of-service (DDoS) detection methods consume substantial computing resources, and many that rely on a single element suffer from high miss and false-alarm rates. To address these problems, this paper proposes a CNN-based DDoS attack information fusion method for multi-element data. First, based on the distribution, concentration, and high traffic abruptness of DDoS attacks, six features are defined, derived respectively from the source IP address, destination IP address, source port, destination port, packet size, and number of IP packets. Second, a feature-weight calculation algorithm based on principal component analysis is proposed to measure the importance of different features in different network environments. The proposed weighted multi-element feature fusion algorithm then fuses these features into a multi-element fusion feature (MEFF) value. Finally, DDoS attack classification models are built on the MEFF time series using a convolutional neural network and a support vector machine, respectively. Experimental results show that the proposed method effectively fuses multi-element data, reduces the miss rate, total error rate, memory consumption, and running time, and improves the detection rate.
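The abstract does not spell out the PCA-based weight calculation. One common scheme, weighting each feature by the explained-variance-weighted magnitude of its PCA loadings, can be sketched as follows; this is an illustrative assumption, not the paper's exact algorithm, and `fuse` likewise stands in for the weighted MEFF fusion step:

```python
import numpy as np

def pca_feature_weights(X):
    """Weight each feature by the variance-weighted magnitude of its
    PCA loadings. X: (n_samples, n_features) matrix of per-window
    feature values. Returns non-negative weights summing to 1."""
    Xc = X - X.mean(axis=0)
    # Principal axes via SVD of the centered data
    _, s, vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)   # variance ratio per component
    loadings = np.abs(vt)             # |contribution| of each feature
    raw = explained @ loadings        # importance-weighted feature scores
    return raw / raw.sum()

def fuse(features, weights):
    """Weighted fusion of one window's features into a single value."""
    return float(np.dot(features, weights))
```

Applied per time window, `fuse` would yield the MEFF time series fed to the downstream classifier.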
Abstract: A multivariate statistical analysis was performed on multi-element soil geochemical data from the Koda Hill-Bulenga gold prospects in the Wa-Lawra gold belt, northwest Ghana. The objective was to define gold's relationships with other trace elements and identify possible pathfinder elements for gold in the soil geochemical data. The study focused on seven elements: Au, Fe, Pb, Mn, Ag, As and Cu. Factor analysis and hierarchical cluster analysis were performed on the analyzed samples. Factor analysis explained 79.093% of the total variance of the data through three factors; gold loaded on factor 3, in association with copper, iron, lead and manganese, accounting for 20.903% of the total variance. Hierarchical clustering likewise grouped gold with lead, copper, arsenic and silver. Gold concentrations were lower than those of its associated elements. The results suggest that the occurrence of gold and its associated elements can be linked both to primary dispersion from underlying rocks and to secondary processes such as lateritization. The data show that Fe and Mn associate strongly with gold and, alongside Pb, Ag, As and Cu, can serve as pathfinders for gold in the area, with ferruginous zones as targets.
Abstract: A factor analysis was applied to soil geochemical data to define anomalies related to buried Pb-Zn mineralization. A favorable main factor with a strong association of the elements Zn, Cu and Pb, related to mineralization, was selected for interpretation. The median + 2 MAD (median absolute deviation) method of exploratory data analysis (EDA) and C-A (concentration-area) fractal modeling were then applied to the Mahalanobis distance, as defined by Zn, Cu and Pb from the factor analysis, to set thresholds for defining multi-element anomalies. The median + 2 MAD method identified the Pb-Zn mineralization more successfully than the C-A fractal model. The soil anomaly identified by the median + 2 MAD method on Mahalanobis distances defined by the three principal elements (Zn, Cu and Pb), rather than all thirteen elements (Co, Zn, Cu, V, Mo, Ni, Cr, Mn, Pb, Ba, Sr, Zr and Ti), reflected the ore body more favorably. The identified soil geochemical anomalies were validated against the in situ economic Pb-Zn ore bodies. The results showed that the median + 2 MAD approach can map both strong and weak geochemical anomalies related to buried Pb-Zn mineralization and is therefore useful at the reconnaissance drilling stage.
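The median + 2·MAD threshold itself is simple enough to write down. A minimal sketch follows; whether the paper applies the 1.4826 normal-consistency constant to the MAD is an assumption here:

```python
import numpy as np

def mad_threshold(values, k=2.0, scale=1.4826):
    """Anomaly threshold = median + k * MAD.

    scale=1.4826 makes the MAD comparable to a standard deviation
    under normality; using it is a common EDA convention, assumed here."""
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    mad = scale * np.median(np.abs(values - med))
    return med + k * mad

def flag_anomalies(values, **kw):
    """Return a boolean anomaly mask and the threshold used."""
    t = mad_threshold(values, **kw)
    return np.asarray(values) > t, t
```

In the paper's setting the input would be the Mahalanobis distances computed from the Zn-Cu-Pb factor.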
Funding: Supported by the National Natural Science Foundation of China (Grant No. 12372351).
Abstract: In this study, to meet the development and application requirements for high-strength, high-toughness energetic structural materials, a representative volume element of a TA15 matrix embedded with a TaZrNb sphere was designed and fabricated via diffusion bonding. The mechanisms of the microstructural evolution of the TaZrNb/TA15 interface were investigated via SEM, EBSD, EDS, and XRD. Interface mechanical property tests and in-situ tensile tests were conducted on the sphere-containing structure, and an equivalent tensile-strength model was established for the structure. The results revealed that the TA15 titanium alloy and joint had high density and no pores or cracks. The thickness of the planar joint was approximately 50-60 μm, with average tensile and shear strengths of 767 MPa and 608 MPa, respectively. The thickness of the spherical joint was approximately 60 μm. The Zr and Nb elements in the joint diffused uniformly and formed strong bonds with Ti without forming intermetallic compounds. The interface exhibited submicron grain refinement and a concave-convex interlocking structure. The tensile fracture surface primarily exhibited intergranular fracture combined with some transgranular fracture, constituting a quasi-brittle fracture mode. The shear fracture surface exhibited brittle fracture with regular arrangements of furrows. Internal fracture occurred along the spherical interface, as revealed by in-situ X-ray microcomputed tomography. The experimental results agreed well with the theoretical predictions, indicating that the high-strength interface contributes to the overall strength and toughness of the sphere-containing structure.
Funding: Supported by the Major Science and Technology Project of Gansu Province (No. 22ZD6FA021-5), the Industrial Support Project of Gansu Province (Nos. 2023CYZC-19 and 2021CYZC-22), and the Science and Technology Project of Gansu Province (Nos. 23YFFA0074, 22JR5RA137 and 22JR5RA151).
Abstract: To obtain more stable spectral data for accurate quantitative multi-element analysis, especially for large-area in-situ element detection in soils, we propose a method for multi-element quantitative analysis of soils using calibration-free laser-induced breakdown spectroscopy (CF-LIBS) based on data filtering. We analyze a standard soil sample doped with two heavy metal elements, Cu and Cd, with a specific focus on the Cu I 324.75 nm line for filtering the experimental data of multiple sample sets. After data filtering, the relative standard deviation for Cu decreased from 30% to 10%, and the limits of detection (LOD) for Cu and Cd decreased by 5% and 4%, respectively. Through CF-LIBS, a quantitative analysis was conducted to determine the relative content of elements in soils. Using Cu as a reference, the concentration of Cd was accurately calculated. The results show that after data filtering, the average relative error for Cd decreases from 11% to 5%, demonstrating the effectiveness of data filtering in improving the accuracy of quantitative analysis. Moreover, the contents of Si, Fe and other elements can be accurately calculated with this method, and the Cd results were used to further refine the calculation. This approach is of great importance for large-area in-situ detection of heavy metals and trace elements in soil, as well as for rapid and accurate quantitative analysis.
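The abstract does not state the exact filtering criterion applied to the Cu I 324.75 nm line. A plausible stand-in, keeping only shots whose reference-line intensity lies within a fixed window around the mean, could look like this (the acceptance window is an assumption):

```python
import numpy as np

def rsd(x):
    """Relative standard deviation in percent."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

def filter_by_line(intensities, n_sigma=1.0):
    """Keep shots whose reference-line intensity (e.g. Cu I 324.75 nm)
    lies within n_sigma standard deviations of the mean intensity.
    Returns the kept intensities and the boolean keep-mask."""
    x = np.asarray(intensities, dtype=float)
    mu, sd = x.mean(), x.std(ddof=1)
    mask = np.abs(x - mu) <= n_sigma * sd
    return x[mask], mask
```

Dropping unstable shots this way is what drives an RSD reduction of the kind reported (30% to 10% for Cu).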
Funding: Supported by the Science and Technology Development Foundation for the Nanchong Key Laboratory of Severe Weather Research in Northeast Sichuan (NCQXKJ202102).
Abstract: To better deliver meteorological services and improve service efficiency, a multi-factor data integration platform was built on the Tianqing big-data cloud platform. Using Tianqing's intensive data sources, daily and hourly precipitation, temperature and wind data can be acquired, and C# is used to implement data analysis, text-template nesting, meteorological material generation and archive management. In application, the platform completes the production of rainfall reports within 2 min with zero errors in material filing management, outperforming traditional methods in framework design, data processing and material management.
Abstract: Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where it often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms (Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship) under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that our model consistently outperforms baseline methods, including traditional and deep learning-based techniques. An ablation study reveals the additive value of each component of the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area-under-the-curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce imputations that are not only numerically accurate but also semantically useful, making it a promising solution for robust data recovery in clinical applications.
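The three loss components named in the abstract can be sketched without the autoencoder itself. The formulation below is an illustration under stated assumptions: the weights `lam_noise` and `lam_var`, the Gaussian input-noise model, and the one-sided variance penalty are all choices of this sketch, not the paper's definitions:

```python
import numpy as np

def composite_loss(model, x_obs, x_true, observed_mask,
                   lam_noise=0.1, lam_var=0.01, noise_std=0.05, rng=None):
    """Sketch of a three-part imputation loss.

    model: callable mapping an input matrix to its reconstruction."""
    rng = rng or np.random.default_rng(0)
    x_hat = model(x_obs)
    miss = ~observed_mask
    # (i) guided masked MSE: error measured only on the missing entries
    guided = float(np.mean((x_true[miss] - x_hat[miss]) ** 2)) if miss.any() else 0.0
    # (ii) noise-aware term: reconstruction should be stable when the
    #      observed inputs are perturbed by Gaussian noise
    x_noisy = x_obs + rng.normal(0.0, noise_std, x_obs.shape) * observed_mask
    stab = float(np.mean((model(x_noisy) - x_hat) ** 2))
    # (iii) variance penalty: discourage collapsed (constant) reconstructions
    var_pen = float(np.mean(np.maximum(0.0, x_true.var(axis=0) - x_hat.var(axis=0))))
    return guided + lam_noise * stab + lam_var * var_pen
```

In training, `model` would be the autoencoder's forward pass and the loss would be differentiated in a framework such as PyTorch; numpy is used here only to make the terms concrete.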
Funding: Funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R104), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class intrusion detection using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized, model-ready inputs. Dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model showed a clear improvement in detecting attacks. We tested the model on four datasets to demonstrate the effectiveness of the proposed approach, performed an ablation study to check the effect of each parameter, and found the model computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
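SMOTE's core step, interpolating between a minority sample and one of its k nearest minority neighbors, can be sketched in a few lines. This is a minimal illustration; the paper presumably uses a standard SMOTE implementation such as imbalanced-learn's:

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE: for each synthetic sample, pick a random minority
    point, pick one of its k nearest minority neighbors, and interpolate
    at a random position on the segment between them."""
    rng = rng or np.random.default_rng(0)
    X_min = np.asarray(X_min, dtype=float)
    n = len(X_min)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-neighbors
    nn = np.argsort(d, axis=1)[:, :min(k, n - 1)]
    out = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        a = rng.integers(n)                      # random minority sample
        b = nn[a, rng.integers(nn.shape[1])]     # one of its neighbors
        t = rng.random()                         # interpolation position
        out[i] = X_min[a] + t * (X_min[b] - X_min[a])
    return out
```

Because every synthetic point lies on a segment between two real minority points, the augmented class stays inside the minority class's convex hull.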
Funding: Funded by the University of Transport and Communications (UTC) under grant number T2025-CN-004.
Abstract: Reversible data hiding (RDH) enables secret data embedding while preserving complete cover image recovery, making it crucial for applications requiring image integrity. The pixel value ordering (PVO) technique used in multi-stego images provides good image quality but often results in low embedding capacity. To address these challenges, this paper proposes a high-capacity RDH scheme based on PVO that generates three stego images from a single cover image. The cover image is partitioned into non-overlapping blocks, with pixels sorted in ascending order. Four secret bits are embedded into each block's maximum pixel value, while three additional bits are embedded into the second-largest value when the pixel difference exceeds a predefined threshold. A similar embedding strategy is applied to the minimum side of the block, including the second-smallest pixel value. This design enables each block to embed up to 14 bits of secret data. Experimental results demonstrate that the proposed method achieves significantly higher embedding capacity and improved visual quality compared with existing triple-stego RDH approaches, advancing the field of reversible steganography.
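The core PVO operation, embedding into a block's maximum via its gap to the second-largest pixel, can be sketched in its classic one-bit-per-maximum form. This is a deliberate simplification: the paper's scheme embeds several bits per block across three stego images, which is not reproduced here.

```python
import numpy as np

def pvo_embed_max(block, bits):
    """Classic PVO on the maximum side: if max - second_max == 1,
    embed one data bit; if the gap is larger, shift the max by 1 to
    keep extraction unambiguous. Returns (stego_block, bits_consumed)."""
    b = np.asarray(block, dtype=int).copy()
    order = np.argsort(b, kind="stable")
    i_max, i_2nd = order[-1], order[-2]
    e = b[i_max] - b[i_2nd]
    used = 0
    if e == 1 and bits:          # embeddable: max carries one data bit
        b[i_max] += bits[0]
        used = 1
    elif e > 1:                  # not embeddable: shift for reversibility
        b[i_max] += 1
    return b, used

def pvo_extract_max(stego_block):
    """Recover the embedded bit (if any) and the original block."""
    b = np.asarray(stego_block, dtype=int).copy()
    order = np.argsort(b, kind="stable")
    i_max, i_2nd = order[-1], order[-2]
    e = b[i_max] - b[i_2nd]
    bit = None
    if e == 1:                   # a 0-bit was embedded
        bit = 0
    elif e == 2:                 # a 1-bit was embedded
        bit = 1
        b[i_max] -= 1
    elif e > 2:                  # shifted block, no data bit
        b[i_max] -= 1
    return b, bit
```

Extraction is unambiguous because an embedded 1-bit produces a gap of exactly 2, while a shifted block always produces a gap of 3 or more.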
Funding: Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2023-00235509, Development of security monitoring technology based on network behavior against encrypted cyber threats in ICT convergence environment).
Abstract: With the increasing emphasis on personal information protection, encryption through security protocols has emerged as a critical requirement in data transmission and reception. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, ranging from non-encrypted to fully encrypted ones. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted-traffic metadata, eliminating the need for decryption and minimizing system performance degradation, especially in light of these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized according to security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate class imbalance, eight data sampling techniques were applied, and their effectiveness was comparatively analyzed using two ensemble models and three deep learning (DL) models from various perspectives. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. In the UNSW-NB15 dataset, the F1-score for encrypted traffic was approximately 0.98, about 4.3% higher than that of unencrypted traffic (approximately 0.94). Analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that dataset quality and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, recall in the UNSW-NB15 (encrypted) dataset improved by up to 23.0%, and in the CICIoT-2023 (encrypted) dataset by 20.26%, a similar level of improvement. Notably, in CICIoT-2023, the F1-score and Receiver Operating Characteristic-Area Under the Curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can be beneficial even in encrypted environments, although the extent of the improvement may vary with data quality, model architecture, and sampling strategy.
Funding: Funded by the Deanship of Graduate Studies and Scientific Research at Jouf University under grant No. DGSSR-2024-02-01264.
Abstract: Automated essay scoring (AES) systems have gained significant importance in educational settings, offering a scalable, efficient, and objective method for evaluating student essays. However, developing AES systems for Arabic poses distinct challenges due to the language's complex morphology, diglossia, and the scarcity of annotated datasets. This paper presents a hybrid approach to Arabic AES that combines text-based, vector-based, and embedding-based similarity measures to improve scoring accuracy while minimizing the training data required. Using a large Arabic essay dataset categorized into thematic groups, the study conducted four experiments to evaluate the impact of feature selection, data size, and model performance. Experiment 1 established a baseline using a non-machine-learning approach, selecting the top-N correlated features to predict essay scores. The subsequent experiments employed 5-fold cross-validation. Experiment 2 showed that combining embedding-based, text-based, and vector-based features in a Random Forest (RF) model achieved an R2 of 88.92% and an accuracy of 83.3% within a 0.5-point tolerance. Experiment 3 further refined the feature selection process, demonstrating that 19 correlated features yielded optimal results, improving R2 to 88.95%. In Experiment 4, an optimal data-efficiency training approach was introduced, increasing the training-data portion from 5% to 50%. The study found that using just 10% of the data achieved near-peak performance, with an R2 of 85.49%, emphasizing an effective trade-off between performance and computational cost. These findings highlight the potential of the hybrid approach for developing scalable Arabic AES systems, especially in low-resource environments, addressing linguistic challenges while ensuring efficient data usage.
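Two of the three similarity families named in the abstract are easy to illustrate: text-based (token-set overlap) and vector-based (cosine over term-frequency vectors). The embedding-based family requires a sentence-embedding model and is omitted; the function names below are illustrative, not the paper's.

```python
import math
from collections import Counter

def jaccard(a_tokens, b_tokens):
    """Text-based similarity: token-set overlap."""
    A, B = set(a_tokens), set(b_tokens)
    return len(A & B) / len(A | B) if A | B else 0.0

def cosine(a_tokens, b_tokens):
    """Vector-based similarity: cosine over term-frequency vectors."""
    ca, cb = Counter(a_tokens), Counter(b_tokens)
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def similarity_features(student, reference):
    """Similarity features between a student essay and a reference answer."""
    s, r = student.split(), reference.split()
    return {"jaccard": jaccard(s, r), "cosine_tf": cosine(s, r)}
```

In the hybrid approach, such features would be computed against reference essays and fed, together with embedding similarities, into the RF scorer.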
Funding: Supported by the project "Romanian Hub for Artificial Intelligence - HRIA", Smart Growth, Digitization and Financial Instruments Program, 2021-2027, MySMIS No. 334906.
Abstract: Objective expertise evaluation of individuals, as a prerequisite for team formation, has been a long-standing desideratum in large software development companies. With rapid advancements in machine learning methods and reliable existing data stored in project management tools' datasets, automating this evaluation process becomes a natural step forward. In this context, our approach focuses on quantifying software developer expertise using metadata from task-tracking systems. We mathematically formalize two categories of expertise: technology-specific expertise, which denotes the skills required for a particular technology, and general expertise, which encapsulates overall knowledge of the software industry. We then automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like transformers to handle the unique characteristics of project-tool datasets effectively. Finally, our method evaluates the proficiency of each software specialist across completed projects from both technology-specific and general perspectives. The method was experimentally validated, yielding promising results.
Funding: The work described in this paper was fully supported by a grant from Hong Kong Metropolitan University (RIF/2021/05).
Abstract: Parkinson's disease (PD) is a debilitating neurological disorder affecting over 10 million people worldwide. PD classification models using voice signals as input are common in the literature, and deep learning algorithms are believed to further enhance performance; nevertheless, this is challenging given the small-scale, imbalanced nature of PD datasets. This paper proposes a convolutional neural network-based deep support vector machine (CNN-DSVM) that automates feature extraction with a CNN and extends the conventional SVM to a DSVM for better classification performance on small-scale PD datasets. A customized kernel function reduces the impact of classification biased towards the majority class (healthy candidates in our setting). An improved generative adversarial network (IGAN) was designed to generate additional training data to enhance the model's performance. In performance evaluation, the proposed algorithm achieves a sensitivity of 97.6% and a specificity of 97.3%. The comparison is evaluated from five perspectives, including comparisons with different data generation algorithms, feature extraction techniques, kernel functions, and existing works. Results reveal the effectiveness of the IGAN algorithm, which improves sensitivity and specificity by 4.05%-4.72% and 4.96%-5.86%, respectively, and of the CNN-DSVM algorithm, which improves sensitivity by 1.24%-57.4% and specificity by 1.04%-163% while reducing biased detection towards the majority class. Ablation experiments confirm the effectiveness of the individual components. Two future research directions are also suggested.
Abstract: A mixed grid method is set up for complex flow about multi-element airfoils. C-type grids are first produced on each element's body and in its wake, O-type grids are used in the outermost area, and H-type grids are used in the intermediate areas. An algebraic method produces the initial grids in each area, and the grids are then optimized by an elliptic differential equation method, so that C-O-H zonal patched grids around multi-element airfoils are produced automatically and efficiently. A time-accurate finite-volume integration method is used to solve the compressible laminar and turbulent Navier-Stokes (N-S) equations on the grids. Computational results prove the method effective.
Funding: Supported by the National Key Project of China (No. GJXM92579) and the National Natural Science Foundation of China (Nos. 92052203, 11872230 and 91852108).
Abstract: Almost half of all flight accidents caused by in-flight icing occur during the approach and landing phases, when high-lift devices are deployed. The present study focuses on the optimization of an ice-tolerant multi-element airfoil. Dual-objective optimization is carried out with critical horn-shaped ice accumulated during the holding phase. The optimization results show that the present method significantly enhances both iced-state and clean-state performance. The optimal multi-element airfoil has a larger deflection angle and a wider gap at the slat and the flap compared with the baseline configuration. The sensitivity of each design parameter is analyzed, which verifies the robustness of the design. The design is further assessed with ice accreted during the approach and landing phases, where it also shows performance improvement.
Funding: This work was supported by the State Key Lab of Intense Pulsed Radiation Simulation and Effect Basic Research Foundation (No. SKLIPR1504).
Abstract: To study the gamma reflection of multi-element materials, gamma-ray transport models of single-element materials, such as iron and lead, and multi-element materials, such as polyethylene and ordinary concrete, were established. Relationships between the albedo factors of gamma photons and energies and the average energy of the reflected gamma rays on one hand, and the material type, material thickness, incident gamma energy, and incidence angle on the other, were obtained by Monte Carlo simulation. The results show that the albedo factors of single-element and multi-element materials increase rapidly with material thickness; beyond a certain thickness, the albedo factors no longer increase but tend to a saturation value. The saturation values of the photon and energy albedo factors and the corresponding reflection thickness depend not only on the material type but also on the incident gamma energy and incidence angle. For a given incident gamma energy between 0.2 and 2.5 MeV, the smaller the effective atomic number of the multi-element material, the higher the saturation values of the albedo factors. The larger the incidence angle of the gamma rays, the greater the saturation value of the gamma albedo factor, the saturation reflection thickness, and the average saturation energy of the reflected gamma photons.
Abstract: This paper integrates aerodynamic and aeroacoustic optimization design of high-lift devices, in particular two-element airfoils with a slat. Aerodynamic analysis of the flow field uses a high-order, high-resolution spatial differencing method for large eddy simulation (LES), which guarantees accuracy and efficiency. The aeroacoustic analysis of noise level is calculated with the Ffowcs Williams-Hawkings (FW-H) integral formula. The fidelity of the calculation is verified on standard models. The method of streamline-based Euler simulation (MSES) is used to obtain the aerodynamic characteristics. Building on the validated numerical methods, detailed research is conducted on the leading-edge slat of multi-element airfoils, and the influence of various slot parameters on noise is analyzed. The resulting slot optimization parameters can be used in multi-element airfoil design.
Abstract: Multi-element analysis at historical sites is a major issue in archaeological studies; however, the approach remains almost unknown among Iranian scholars. Geochemical multi-element analysis of soil is very important for evaluating anthropogenic activities. The aim of this study is to assess the potential usefulness of multi-elemental soil analysis, obtained by an Analytik Jena atomic absorption spectrophotometer (AAS) and ICP-MS, for recognizing ancient anthropogenic features on the territory of Tappe Rivi (North Khorasan, Iran). For that purpose, a total of 80 ancient soil samples were taken from each soil horizon and cultural layer. The research involved Fe, Al, Cd, Cu, Ni, Co, Cr, Pb, and P; trace-element samples were extracted according to International Standard ISO 11466 and phosphorus by the Olsen method. The contamination of the soils was assessed via enrichment factors (EFs), using Fe as the reference element. This geochemical/archaeological approach highlights that the contents of most elements in the Parthian and Sassanid ages were significantly higher than in other zones, showing that element contents increased as the eras progressed. The accumulation of metals at the Rivi site was also significantly higher than in the control area. Among the sampled zones, the enrichment factor (EF) indicated that Cu and phosphate were most enriched in the Parthian and Sassanid layers. This result is important because it shows that metal contents and human activities are directly related across different ages.
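The enrichment factor with Fe as the reference element is a one-line formula: EF = (Cx/CFe)_sample / (Cx/CFe)_background. A sketch with hypothetical concentrations:

```python
def enrichment_factor(c_x_sample, c_fe_sample, c_x_background, c_fe_background):
    """EF of element x with Fe as the reference element:
    EF = (Cx/CFe)_sample / (Cx/CFe)_background.
    EF well above 1 suggests anthropogenic enrichment of x."""
    return (c_x_sample / c_fe_sample) / (c_x_background / c_fe_background)
```

For example, a sample with 40 ppm Cu against a background of 10 ppm, at equal Fe concentrations, gives EF = 4 (the concentration values here are hypothetical, not the paper's data).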
Funding: Projects (51671217, 51604112) supported by the National Natural Science Foundation of China; Project (2017JJ3089) supported by the Natural Science Foundation of Hunan Province, China.
Abstract: The effects of Al and Sc on the mechanical properties of FeCoNi multi-element alloys (MEAs) were investigated by compressive tests. The microstructures of FeCoNi MEAs with different contents of Al and Sc were characterized, and the strengthening mechanisms were discussed. The results show that FeCoNi MEA with a low Al content has a face-centered cubic (FCC) structure. The yield strength increases linearly with Al content, which is largely caused by solid-solution hardening. Further addition of Sc promotes the formation of a new phase in (FeCoNi)1-xAlx MEAs. A minor addition of Sc can significantly increase the yield strength of (FeCoNi)1-xAlx MEAs with a low Al content and improve the compressive plasticity of (FeCoNi)1-xAlx MEAs with a high Al content.
Funding: This work was supported by the Scientific Research Foundation for High-Level Talents of Qingdao Agricultural University, China (665-1120015), the National Program for Quality and Safety Risk Assessment of Agricultural Products of China (GJFP2019011), and the National Natural Science Foundation of China (42207017).
Abstract: Winter jujube (Ziziphus jujuba 'Dongzao') is greatly appreciated by consumers for its excellent quality, but brand infringement frequently occurs in the market. Here, we determined a total of 38 elements in 167 winter jujube samples from the main producing areas of China by inductively coupled plasma mass spectrometry (ICP-MS). Sixteen elements (Mg, K, Mn, Cu, Zn, Mo, Ba, Be, As, Se, Cd, Sb, Ce, Er, Tl, and Pb) exhibited significant differences among samples from different producing areas. Supervised linear discriminant analysis (LDA) and orthogonal projection to latent structures discriminant analysis (OPLS-DA) identified the origin of samples better than unsupervised principal component analysis (PCA): LDA and OPLS-DA had mean identification accuracies of 87.84% and 94.64% on the testing set, respectively. Using a multilayer perceptron (MLP) and C5.0, the prediction accuracies reached 96.36% and 91.06%, respectively. Based on these four chemometric methods, Cd, Tl, Mo and Se were selected as the main variables and principal markers for origin identification of winter jujube. Overall, this study demonstrates that identifying the origin of winter jujube through multi-element fingerprint analysis with chemometrics is practical and precise, and it may also provide a reference for establishing origin traceability systems for other fruits.
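The advantage of supervised LDA over unsupervised PCA for origin identification can be illustrated with a minimal two-class Fisher discriminant on synthetic "element" features; this is a sketch of the principle, not the paper's multi-class LDA pipeline:

```python
import numpy as np

def fisher_lda_direction(X, y):
    """Two-class Fisher LDA: direction w = Sw^{-1}(mu1 - mu0) that best
    separates the two classes. Unlike a first principal component, it
    uses the class labels, which is why LDA can beat PCA at origin
    identification."""
    X = np.asarray(X, float); y = np.asarray(y)
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(0), X1.mean(0)
    # within-class scatter, with a small ridge for numerical stability
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), m1 - m0)
    return w / np.linalg.norm(w)

def lda_predict(X, y, w=None):
    """Classify by projecting onto w and thresholding at the midpoint
    between the two projected class means."""
    w = fisher_lda_direction(X, y) if w is None else w
    proj = np.asarray(X, float) @ w
    y = np.asarray(y)
    thr = 0.5 * (proj[y == 0].mean() + proj[y == 1].mean())
    return (proj > thr).astype(int)
```

With two simulated "producing areas" whose mean element concentrations differ, the discriminant separates them almost perfectly even though no single feature does.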