Among various polymer architectures, end-group-free rings have attracted growing interest due to their distinct physicochemical properties relative to their linear counterparts, exemplified by reduced hydrodynamic size and slower degradation. Developing facile methods for the large-scale synthesis of polymer rings with tunable compositions and microstructures is therefore key. Recent progress in the large-scale synthesis of polymer rings based on single-chain dynamic nanoparticles, together with example applications in simultaneously enhancing the toughness and strength of polymer nanocomposites, is summarized. Once breakthroughs are made in the rational design and effective large-scale synthesis of polymer rings and their functional derivatives, a family of cyclic functional hybrids would become available, providing a new paradigm for polymer science and engineering.
Parkinson’s disease (PD) is a debilitating neurological disorder affecting over 10 million people worldwide. PD classification models that take voice signals as input are common in the literature. Deep learning algorithms are believed to further enhance performance; nevertheless, applying them is challenging given the small-scale and imbalanced nature of PD datasets. This paper proposes a convolutional neural network-based deep support vector machine (CNN-DSVM) that automates feature extraction with a CNN and extends the conventional SVM to a DSVM for better classification performance on small-scale PD datasets. A customized kernel function reduces biased classification towards the majority class (healthy candidates in our setting). An improved generative adversarial network (IGAN) was designed to generate additional training data and enhance the model’s performance. In the performance evaluation, the proposed algorithm achieves a sensitivity of 97.6% and a specificity of 97.3%. The comparison covers five perspectives, including different data generation algorithms, feature extraction techniques, kernel functions, and existing works. Results reveal the effectiveness of the IGAN algorithm, which improves sensitivity and specificity by 4.05%–4.72% and 4.96%–5.86%, respectively, and of the CNN-DSVM algorithm, which improves sensitivity by 1.24%–57.4% and specificity by 1.04%–163% while reducing biased detection towards the majority class. Ablation experiments confirm the effectiveness of the individual components. Two future research directions are also suggested.
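The abstract above does not specify the customized kernel; a minimal sketch of the same goal (countering bias towards the majority "healthy" class) using a class-weighted RBF SVM on synthetic stand-in data, where the data, class sizes, and the `class_weight` mechanism are illustrative assumptions rather than the paper's method:

```python
# Sketch: countering majority-class bias in an SVM. sklearn's per-class
# penalty re-weighting stands in for the paper's customized kernel.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
# Imbalanced toy data: 180 "healthy" (majority) vs 20 "PD" (minority).
X_maj = rng.normal(0.0, 1.0, size=(180, 4))
X_min = rng.normal(1.5, 1.0, size=(20, 4))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 180 + [1] * 20)

plain = SVC(kernel="rbf").fit(X, y)
weighted = SVC(kernel="rbf", class_weight="balanced").fit(X, y)

# Sensitivity = recall on the minority (PD) class; up-weighting the
# minority penalty keeps it from collapsing.
sens_plain = recall_score(y, plain.predict(X))
sens_weighted = recall_score(y, weighted.predict(X))
print(sens_plain, sens_weighted)
```

With `class_weight="balanced"`, the misclassification penalty for each class is scaled inversely to its frequency, which shifts the hyperplane away from the minority class.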
Most predictive maintenance studies have emphasized accuracy while giving little attention to interpretability or deployment readiness. This study improves on prior methods by developing a compact yet robust system that predicts when turbofan engines will fail. It uses the NASA CMAPSS dataset, which contains over 200,000 engine cycles from 260 engines. The process begins with systematic preprocessing, including imputation, outlier removal, scaling, and labelling of the remaining useful life. Dimensionality is reduced with a hybrid selection method that combines variance filtering, recursive elimination, and gradient-boosted importance scores, yielding a stable set of 10 informative sensors. To mitigate class imbalance, minority cases are oversampled and class-weighted losses are applied during training. Benchmarking covers logistic regression, gradient boosting, and a recurrent design that integrates gated recurrent units with long short-term memory networks. The Long Short-Term Memory–Gated Recurrent Unit (LSTM–GRU) hybrid achieved the strongest performance, with an F1 score of 0.92, precision of 0.93, recall of 0.91, Receiver Operating Characteristic–Area Under the Curve (ROC-AUC) of 0.97, and minority recall of 0.75. Interpretability testing using permutation importance and Shapley values indicates that sensors 13, 15, and 11 are the most important indicators of engine wear. The proposed system combines imbalance handling, feature reduction, and interpretability into a practical design suitable for real industrial settings.
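The three-stage hybrid selection described above can be sketched with standard scikit-learn components; the synthetic dataset, thresholds, estimator choices, and final sensor count used here are illustrative assumptions, not the paper's CMAPSS configuration:

```python
# Sketch of a hybrid feature-selection pipeline: variance filtering,
# then recursive feature elimination, then gradient-boosted importances.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import VarianceThreshold, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)

# Stage 1: drop near-constant "sensors".
keep_var = VarianceThreshold(threshold=0.0).fit(X).get_support(indices=True)

# Stage 2: recursive elimination down to 10 candidates.
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=10)
rfe.fit(X[:, keep_var], y)
keep_rfe = keep_var[rfe.get_support(indices=True)]

# Stage 3: rank the survivors by gradient-boosted importance and keep
# the top 5 (an arbitrary cut for this demo).
gb = GradientBoostingClassifier(random_state=0).fit(X[:, keep_rfe], y)
order = np.argsort(gb.feature_importances_)[::-1]
selected = keep_rfe[order[:5]]
print(sorted(selected.tolist()))
```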
Accurate purchase prediction in e-commerce critically depends on the quality of behavioral features. This paper proposes a layered and interpretable feature engineering framework that organizes user signals into three layers: Basic, Conversion & Stability (efficiency and volatility across actions), and Advanced Interactions & Activity (cross-behavior synergies and intensity). Using real Taobao (Alibaba’s primary e-commerce platform) logs (57,976 records for 10,203 users; 25 November–3 December 2017), we conducted a hierarchical, layer-wise evaluation that holds data splits and hyperparameters fixed while varying only the feature set, quantifying each layer’s marginal contribution. Across logistic regression (LR), decision tree, random forest, XGBoost, and CatBoost models with stratified 5-fold cross-validation, performance improved monotonically from Basic to Conversion & Stability to Advanced features. With LR, F1 increased from 0.613 (Basic) to 0.962 (Advanced); boosted models achieved high discrimination (0.995 AUC) and an F1 score of up to 0.983. Calibration and precision–recall analyses indicated strong ranking quality, while potential dataset and period biases are acknowledged given the short (9-day) window. By making feature contributions measurable and reproducible, the framework complements model-centric advances and offers a transparent blueprint for production-grade behavioral modeling. The code and processed artifacts are publicly available, and future work will extend the validation to longer, seasonal datasets and to hybrid approaches that combine automated feature learning with domain-driven design.
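The Conversion & Stability layer above (efficiency and volatility across actions) can be sketched in a few lines of pandas; the column names (`user_id`, `behavior`, `day`) and the tiny toy log are illustrative assumptions, not the actual Taobao schema:

```python
# Sketch of conversion + stability features from raw behavior logs.
import pandas as pd

logs = pd.DataFrame({
    "user_id":  [1, 1, 1, 1, 2, 2, 2],
    "behavior": ["pv", "cart", "buy", "pv", "pv", "pv", "buy"],
    "day":      [1, 1, 2, 3, 1, 2, 2],
})

# Conversion: purchases per page view for each user.
agg = logs.pivot_table(index="user_id", columns="behavior",
                       aggfunc="size", fill_value=0)
agg["buy_rate"] = agg["buy"] / agg["pv"].clip(lower=1)

# Stability: standard deviation of daily action counts (volatility).
daily = logs.groupby(["user_id", "day"]).size().unstack(fill_value=0)
agg["volatility"] = daily.std(axis=1)
print(agg[["buy_rate", "volatility"]])
```

Both features are per-user scalars, so they concatenate directly with the Basic-layer counts before model fitting.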
Lithology identification is a critical aspect of geoenergy exploration, including geothermal energy development, gas hydrate extraction, and gas storage. In recent years, artificial intelligence techniques based on drill core images have made significant strides in lithology identification, achieving high accuracy. However, the demand for advanced lithology identification models remains unmet due to the lack of high-quality drill core image datasets. This study constructs and publicly releases the first open-source Drill Core Image Dataset (DCID), addressing the need for large-scale, high-quality datasets in lithology characterization tasks within geological engineering and establishing a standard dataset for model evaluation. DCID comprises 35 lithology categories and 98,000 high-resolution images (512 × 512 pixels), making it the most comprehensive drill core image dataset in terms of lithology categories, image quantity, and resolution. The study also provides lithology identification accuracy benchmarks on DCID for popular convolutional neural networks (CNNs) such as VGG, ResNet, DenseNet, and MobileNet, as well as for the Vision Transformer (ViT) and MLP-Mixer. Additionally, the sensitivity of model performance to various parameters and to image resolution is evaluated. In response to real-world challenges, we propose a real-world data augmentation (RWDA) method that leverages slightly defective images from DCID to enhance model robustness. The study also explores the impact of real-world lighting conditions on the performance of lithology identification models. Finally, we demonstrate how to rapidly evaluate model performance across multiple dimensions using low-resolution datasets, advancing the application and development of new lithology identification models for geoenergy exploration.
1. Introduction Climate change mitigation pathways aimed at limiting global anthropogenic carbon dioxide (CO₂) emissions while striving to constrain the global temperature increase to below 2 °C, as outlined by the Intergovernmental Panel on Climate Change (IPCC), consistently predict the widespread implementation of CO₂ geological storage on a global scale.
In recent years, artificial intelligence technology has exhibited great potential in seismic signal recognition, setting off a new wave of research. Vast amounts of high-quality labeled data are required to develop and apply artificial intelligence in seismology. In this study, based on the 2013–2020 seismic cataloging reports of the China Earthquake Networks Center, we constructed an artificial intelligence seismological training dataset (“DiTing”) with the largest known total time length. Data were recorded using broadband and short-period seismometers. The dataset includes 2,734,748 three-component waveform traces from 787,010 regional seismic events, the corresponding P- and S-phase arrival time labels, and 641,025 P-wave first-motion polarity labels. All waveforms were sampled at 50 Hz and cut to a length of 180 s starting a random number of seconds before the occurrence of an earthquake. Each three-component waveform carries descriptive information such as the epicentral distance, back azimuth, and signal-to-noise ratios. The magnitudes of seismic events, epicentral distance, signal-to-noise ratio of P-wave data, and signal-to-noise ratio of S-wave data range from 0 to 7.7, 0 to 330 km, −0.05 to 5.31 dB, and −0.05 to 4.73 dB, respectively. The dataset can serve as a high-quality benchmark for machine learning model development and data-driven seismological research on earthquake detection, seismic phase picking, first-motion polarity determination, earthquake magnitude prediction, early warning systems, and strong ground-motion prediction. Such research will further promote the development and application of artificial intelligence in seismology.
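The windowing rule above (180 s at 50 Hz, starting a random number of seconds before the event) can be sketched as follows; the continuous record is synthetic, and the 1–30 s offset range is an assumed illustration, since the abstract does not state the exact bounds:

```python
# Sketch of DiTing-style windowing: cut a 180 s, 50 Hz, three-component
# trace starting a random number of seconds before the phase arrival.
import numpy as np

FS = 50       # sampling rate, Hz
WIN_S = 180   # window length, s

def cut_window(trace, p_idx, rng):
    """Cut FS*WIN_S samples starting 1-30 s (random) before p_idx."""
    offset = rng.integers(1, 31) * FS          # pre-event samples
    start = max(p_idx - offset, 0)
    return trace[:, start:start + FS * WIN_S]

rng = np.random.default_rng(42)
# Synthetic 3-component, 400 s continuous record; "P arrival" at 200 s.
continuous = rng.normal(size=(3, 400 * FS))
window = cut_window(continuous, p_idx=200 * FS, rng=rng)
print(window.shape)  # (3, 9000): three components, 180 s at 50 Hz
```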
The recent upsurge in metro construction emphasizes the necessity of understanding the mechanical performance of metro shield tunnels subjected to the influence of ground fissures. In this study, a large-scale experiment, in combination with numerical simulation, was conducted to investigate the influence of ground fissures on a metro shield tunnel. The results indicate that the lining contact pressure at the vault increases in the hanging wall and decreases in the footwall, resulting in a two-dimensional stress state of vertical shear and axial tension-compression, with simultaneous vertical dislocation and axial tilt of the segments around the ground fissure. In addition, the damage to curved bolts includes tensile yield, flexural yield, and shear twist, leading to obvious concrete lining damage, particularly at the vault, arch bottom, and hance, indicating that the joints at these positions are weak areas. A shield tunnel orthogonal to a ground fissure ultimately experiences shear failure, suggesting that the maximum ground-fissure dislocation the structure can withstand is approximately 20 cm, and that five segment rings in the hanging wall and six segment rings in the footwall need to be reinforced. This study provides a reference for metro design at ground fissure sites.
Based on an analysis of typical lacustrine shale oil zones in China and their geological characteristics, this study elucidates the fundamental differences between the enrichment patterns of shale oil sweet spots and those of conventional oil and gas, and proposes key parameters and evaluation methods for assessing the large-scale production potential of lacustrine shale oil. The results show that shale oil is a petroleum resource residing in organic-rich shale formations; in other words, it is preserved in its source bed and follows a generation-accumulation-enrichment process different from that of conventional oil and gas. Thus, the concept of a “reservoir” seems inapplicable to shale oil. In China, lacustrine shale oil is widely distributed, but the geological characteristics and sweet-spot enrichment patterns vary significantly among lacustrine basins with distinct water environments, tectonic evolution, and diagenetic transformation frameworks. The core of lacustrine shale oil evaluation is the “sweet spot volume”. The key factors for evaluating large-scale production of continental shale oil are the oil storage capacity, oil-bearing capacity, and oil-producing capacity, and the key parameters for evaluating these capacities are total porosity, oil content, and free oil content, respectively. It is recommended to determine the total porosity of shale by combining helium porosity measurement with the nuclear magnetic resonance (NMR) method; the oil content of key layers by organic solvent extraction, NMR, and high-pressure mercury intrusion methods; and the free oil content by NMR fluid-distribution secondary spectral stripping decomposition and logging. The results contribute supplemental insights on continental shale oil deliverability in China and provide a scientific basis for the rapid exploration and large-scale production of lacustrine shale oil.
The titanium alloy strut serves as a key load-bearing component of aircraft landing gear and is typically manufactured by forging. The friction condition has an important influence on material flow and cavity filling during the forging process. Using the previously optimized shape and initial position of the preform, the influence of the friction condition (friction factor m = 0.1–0.3) on material flow and cavity filling was studied numerically with a shear friction model. A novel filling index was defined to reflect material flow into the left and right flashes and to magnify friction-induced effects. The results indicate that the workpiece moves rigidly to the right, with the displacement decreasing as m increases. When m < 0.18, an underfilling defect occurs on the left side of the strut forging, while overflow occurs in the right forging die cavity. By combining the filling index with analyses of material flow and filling status, a reasonable friction factor interval of m = 0.21–0.24 can be determined. Within this interval, the cavity filling behavior is robust, with friction fluctuations exerting minimal influence.
Based on questionnaire surveys and field interviews conducted with various types of agricultural production organizations across five districts and four counties in Daqing City, this study combines relevant theoretical frameworks to systematically examine the evolution, performance, and influencing factors of governance mechanisms within these organizations. Using both quantitative and inductive analytical methods, the paper proposes innovative designs and supporting measures for improving governance mechanisms. The findings reveal that, amid large-scale farmland circulation, the governance mechanisms of agricultural production organizations in Daqing City are evolving from traditional to modern structures. However, challenges remain in areas such as decision-making efficiency, benefit distribution, and supervision mechanisms. In response, this study proposes innovative governance designs focusing on decision-making processes, profit-sharing mechanisms, and risk prevention. Corresponding policy recommendations are also provided to support the sustainable development of agricultural modernization in China.
Standardized datasets are foundational to healthcare informatization, enhancing data quality and unleashing the value of data elements. Using bibliometrics and content analysis, this study examines China's healthcare dataset standards from 2011 to 2025. It analyzes their evolution across types, applications, institutions, and themes, highlighting key achievements including substantial growth in quantity, an optimized typology, expansion into innovative application scenarios such as health decision support, and broadened institutional involvement. The study also identifies critical challenges, including imbalanced development, insufficient quality control, and a lack of essential metadata (such as authoritative data element mappings and privacy annotations) that hampers the delivery of intelligent services. To address these challenges, the study proposes a multi-faceted strategy focused on optimizing the standard system's architecture, enhancing quality and implementation, and advancing both data governance (through authoritative tracing and privacy protection) and intelligent service provision. These strategies aim to promote the application of dataset standards, thereby fostering and securing the development of new productive forces in healthcare.
When dealing with imbalanced datasets, the traditional support vector machine (SVM) tends to produce a classification hyperplane biased towards the majority class and exhibits poor robustness. This paper proposes a high-performance classification algorithm specifically designed for imbalanced datasets. The proposed method first uses a biased second-order cone programming support vector machine (B-SOCP-SVM) to identify the support vectors (SVs) and non-support vectors (NSVs) in the imbalanced data. It then applies the synthetic minority over-sampling technique (SV-SMOTE) to oversample the support vectors of the minority class and uses the random under-sampling technique (NSV-RUS) multiple times to undersample the non-support vectors of the majority class. Combining the resulting minority class dataset with the multiple majority class datasets yields multiple new balanced datasets. Finally, SOCP-SVM classifies each dataset, and the final result is obtained through an ensemble algorithm. Experimental results demonstrate that the proposed method performs excellently on imbalanced datasets.
Formalizing the complex processes and phenomena of a real-world problem may require a large number of variables and constraints, resulting in what is termed a large-scale optimization problem. Nowadays, such large-scale optimization problems are solved on computing machines, requiring enormous computational time that may delay the derivation of timely solutions. Decomposition methods, which partition a large-scale optimization problem into lower-dimensional subproblems, represent a key approach to addressing such time-efficiency issues. There has been significant progress on this front in both applied mathematics and emerging artificial intelligence approaches. This work provides an overview of decomposition methods from both the mathematics and computer science points of view. We also remark on state-of-the-art developments and recent applications of decomposition methods, and discuss future research and development perspectives.
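A minimal concrete instance of the decomposition idea above is block coordinate descent: the full variable vector is partitioned into low-dimensional blocks and each block's subproblem is solved exactly in turn. The least-squares objective, block count, and iteration budget below are illustrative assumptions, not taken from the survey:

```python
# Sketch: minimize ||Ax - b||^2 by cycling over blocks of variables,
# solving an exact least-squares subproblem for each block.
import numpy as np

def block_coordinate_descent(A, b, n_blocks, iters=100):
    n = A.shape[1]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    for _ in range(iters):
        for idx in blocks:
            # Residual with this block's contribution removed.
            r = b - A @ x + A[:, idx] @ x[idx]
            # Exact solve of the low-dimensional subproblem.
            x[idx], *_ = np.linalg.lstsq(A[:, idx], r, rcond=None)
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 12))
b = rng.normal(size=30)
x_bcd = block_coordinate_descent(A, b, n_blocks=4)   # four 3-dim subproblems
x_full = np.linalg.lstsq(A, b, rcond=None)[0]        # monolithic solve
print(np.allclose(x_bcd, x_full, atol=1e-5))
```

Each inner solve touches only a 30 × 3 system, which is the time-efficiency argument for decomposition; coupled-constraint settings need richer schemes (e.g., Benders decomposition or ADMM).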
This article focuses on the management of large-scale machinery and equipment in highway construction, with the research objective of identifying issues at the management level and exploring more effective management measures. Through practical observation and logical analysis, the article elaborates on what managing large-scale machinery and equipment in highway construction entails, affirming its value from different perspectives. On this basis, it analyzes the existing problems in detail, interpreting issues such as the weak foundation of the equipment management system and the disconnect between equipment selection and configuration and actual conditions. Based on how these problems manifest, the article proposes strategies such as strengthening the institutional foundation of equipment management and selecting and configuring equipment according to actual conditions, aiming to provide a reference for relevant enterprises in managing large-scale machinery and equipment.
It has been argued that the human brain, as an information-processing machine, operates near a phase transition point in a non-equilibrium state, where it violates detailed balance and produces entropy. The assessment of irreversibility in brain networks can therefore provide valuable insights into their non-equilibrium properties. In this study, we utilized an open-source whole-brain functional magnetic resonance imaging (fMRI) dataset from both resting and task states to evaluate the irreversibility of large-scale human brain networks. Our analysis revealed that the brain networks exhibited significant irreversibility, violating detailed balance and generating entropy. Notably, both physical and cognitive tasks increased the extent of this violation compared to the resting state. Regardless of the state (rest or task), interactions between pairs of brain regions were the primary contributors to this irreversibility. Moreover, we observed that irreversibility increased with global synchrony within brain networks. The first derivative of irreversibility with respect to synchronization peaked near the phase transition point, characterized by moderate mean synchronization and maximized synchronization entropy of blood oxygenation level-dependent (BOLD) signals. These findings deepen our understanding of the non-equilibrium dynamics of large-scale brain networks, particularly their phase transition behaviors, and may have potential clinical applications for brain disorders.
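One common irreversibility proxy in the time-series literature, not necessarily the exact measure used in the study above, compares forward and time-reversed transition statistics of a discretized signal: the Kullback-Leibler divergence between them vanishes if and only if detailed balance holds. A sketch on synthetic signals, where the binning, smoothing constant, and test signals are all illustrative assumptions:

```python
# Sketch: entropy-production estimate from pairwise transition asymmetry.
import numpy as np

def entropy_production(x, n_bins=4):
    """sum_ij p_ij * log(p_ij / p_ji) over a quantile-binned signal."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    s = np.digitize(x, edges)
    C = np.zeros((n_bins, n_bins))
    for a, b in zip(s[:-1], s[1:]):
        C[a, b] += 1
    P = (C + 1e-9) / C.sum()      # smoothed transition probabilities
    return float(np.sum(P * np.log(P / P.T)))

rng = np.random.default_rng(0)
reversible = rng.normal(size=5000)  # iid noise: detailed balance holds
# Sawtooth: slow rise, instantaneous drop -> strongly time-asymmetric.
sawtooth = np.tile(np.r_[np.linspace(0.0, 1.0, 50), 0.0], 100)
print(entropy_production(reversible), entropy_production(sawtooth))
```

For the iid signal the estimate stays near zero (sampling noise only), while the sawtooth's one-way cycling through the bins yields a clearly positive value.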
Offshore wind power plays a crucial role in energy strategies. The results of traditional small-scale physical models may be unreliable when extrapolated to large field scales. This study addressed this limitation by conducting large-scale (1:13) experiments to investigate the scour hole pattern and equilibrium scour depth around both slender and large monopiles under irregular waves. The experiments adopted Keulegan–Carpenter number (NKC) values from 1.01 to 8.89 and diffraction parameter (D/L, where D is the diameter of the monopile and L is the wavelength) values from 0.016 to 0.056. The results showed that changes in the maximum scour location and scour hole shape around a slender monopile were associated with NKC, with differences observed between irregular and regular waves. Improving the calculation of NKC enhanced the accuracy of existing scour formulae under irregular waves. The maximum scour locations around a large monopile were consistently found on both sides, regardless of NKC and D/L, but the scour hole topography was influenced by both parameters. Notably, the scour range around a large monopile was at least as large as the monopile diameter.
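The Keulegan–Carpenter number used above is the standard wave-scour similarity parameter KC = U_m·T/D, with U_m the peak orbital velocity at the bed, T the wave period, and D the pile diameter; the numerical values below are illustrative, not from the experiments:

```python
# Sketch of the Keulegan-Carpenter number controlling the scour regime.
def keulegan_carpenter(u_m, period, diameter):
    """KC = U_m * T / D (dimensionless)."""
    return u_m * period / diameter

# A small diameter gives a large KC (vortex-shedding-dominated scour);
# a large monopile pushes KC down toward the diffraction regime.
kc_slender = keulegan_carpenter(u_m=0.8, period=8.0, diameter=1.0)
kc_large = keulegan_carpenter(u_m=0.8, period=8.0, diameter=6.0)
print(kc_slender, round(kc_large, 3))  # 6.4 1.067
```

Both illustrative values fall inside the 1.01–8.89 range tested in the study.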
In 2022, South China (SC) experienced record-breaking rainfall during its first rainy season, causing severe socioeconomic losses. This study examines the large-scale circulation anomalies responsible for this extreme event. Analysis reveals that the lower-tropospheric cyclonic anomaly over SC plays a crucial role. This cyclonic anomaly consists of extratropical northeasterly anomalies to the north of SC and tropical southwesterly anomalies to the south. Both components were particularly intense during the 2022 first rainy season, contributing to the heavy rainfall in SC. Moreover, the lower-tropospheric cyclonic anomaly is enhanced by its counterpart in the upper troposphere, which is associated with a wave train propagating from the North Atlantic to East Asia across the mid-high latitudes of the Eurasian continent. Further analysis indicates that the extratropical wave train correlates with sea surface temperature anomalies (SSTAs) in the North Atlantic. Additionally, the SSTAs over the North Indian Ocean also play a role in enhancing the tropical southwesterlies in the lower troposphere. This study highlights the combined influence of tropical and extratropical circulation anomalies, offering a comprehensive understanding of the record-breaking rainfall.
Detecting faces under occlusion remains a significant challenge in computer vision due to variations caused by masks, sunglasses, and other obstructions. Addressing this issue is crucial for applications such as surveillance, biometric authentication, and human-computer interaction. This paper provides a comprehensive review of face detection techniques developed to handle occluded faces. Studies are categorized into four main approaches: feature-based, machine learning-based, deep learning-based, and hybrid methods. We analyzed state-of-the-art studies within each category, examining their methodologies, strengths, and limitations on widely used benchmark datasets and highlighting their adaptability to partial and severe occlusions. The review also identifies key challenges, including dataset diversity, model generalization, and computational efficiency. Our findings reveal that deep learning methods dominate recent studies, benefiting from their ability to extract hierarchical features and handle complex occlusion patterns. More recently, researchers have increasingly explored Transformer-based architectures, such as the Vision Transformer (ViT) and Swin Transformer, to further improve detection robustness under challenging occlusion scenarios. In addition, hybrid approaches, which combine traditional and modern techniques, are emerging as a promising direction for improving robustness. This review provides valuable insights for researchers aiming to develop more robust face detection systems and for practitioners seeking to deploy reliable solutions in real-world, occlusion-prone environments. Further improvements and broader datasets are required to develop more scalable, robust, and efficient models that can handle complex occlusions in real-world scenarios.
This study employs deformation monitoring data acquired during the construction of the Haoji railway large-scale bridge to investigate the displacement behavior of the subgrades, catenary columns, and tracks. Emphasis is placed on data acquisition and processing methods using total stations and automated monitoring systems. Through a comprehensive analysis of lateral, longitudinal, and vertical displacement data from 26 subgrade monitoring points, catenary columns, and track sections, this research evaluates how construction activities influence railway structures. The results show that displacement variations in the subgrades, catenary columns, and tracks remained within the established alert thresholds, exhibiting stable deformation trends and indicating that any adverse environmental impact was effectively contained. Furthermore, this paper proposes an early warning mechanism based on an automated monitoring system, which can promptly detect abnormal deformations and initiate emergency response procedures, thereby ensuring the safe operation of the railway. The integration of big data analysis and deformation prediction models offers a practical foundation for future safety management in railway construction.
Funding: Supported by the National Natural Science Foundation of China (Nos. 52293472, 22473096 and 22471164).
Funding: The work described in this paper was fully supported by a grant from Hong Kong Metropolitan University (RIF/2021/05).
Funding: Supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia, Grant No. KFU253765.
Abstract: Most predictive maintenance studies have emphasized accuracy while giving very little attention to interpretability or deployment readiness. This study improves on prior methods by developing a compact yet robust system that predicts when turbofan engines will fail. It uses the NASA CMAPSS dataset, which contains over 200,000 engine cycles from 260 engines. The process begins with systematic preprocessing, including imputation, outlier removal, scaling, and labelling of the remaining useful life. Dimensionality is reduced using a hybrid selection method that combines variance filtering, recursive elimination, and gradient-boosted importance scores, yielding a stable set of 10 informative sensors. To mitigate class imbalance, minority cases are oversampled and class-weighted losses are applied during training. Benchmarking is carried out with logistic regression, gradient boosting, and a recurrent design that integrates gated recurrent units with long short-term memory networks. The Long Short-Term Memory–Gated Recurrent Unit (LSTM–GRU) hybrid achieved the strongest performance, with an F1 score of 0.92, precision of 0.93, recall of 0.91, Receiver Operating Characteristic–Area Under the Curve (ROC-AUC) of 0.97, and minority recall of 0.75. Interpretability testing using permutation importance and Shapley values indicates that sensors 13, 15, and 11 are the most important indicators of engine wear. The proposed system combines imbalance handling, feature reduction, and interpretability into a practical design suitable for real industrial settings.
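Permutation importance, one of the interpretability tools mentioned above, can be sketched in a few lines: shuffle one feature column and measure the resulting score drop. The model, data layout, and `accuracy` metric below are illustrative placeholders, not the study's pipeline:

```python
import random

def accuracy(preds, labels):
    """Fraction of correct predictions."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def permutation_importance(model, X, y, feature_idx, metric, seed=0):
    """Single-pass sketch: score drop after shuffling one feature column.
    A feature the model ignores should show zero importance."""
    base = metric(model(X), y)
    col = [row[feature_idx] for row in X]
    random.Random(seed).shuffle(col)
    X_perm = [row[:feature_idx] + [v] + row[feature_idx + 1:]
              for row, v in zip(X, col)]
    return base - metric(model(X_perm), y)
```

Real implementations average over many shuffles; a single pass is shown only to keep the mechanism visible.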
Funding: Supported by the research fund of Hanyang University (HY-202500000001616).
Abstract: Accurate purchase prediction in e-commerce critically depends on the quality of behavioral features. This paper proposes a layered and interpretable feature engineering framework that organizes user signals into three layers: Basic; Conversion & Stability (efficiency and volatility across actions); and Advanced Interactions & Activity (cross-behavior synergies and intensity). Using real Taobao (Alibaba's primary e-commerce platform) logs (57,976 records for 10,203 users; 25 November–3 December 2017), we conducted a hierarchical, layer-wise evaluation that holds data splits and hyperparameters fixed while varying only the feature set, quantifying each layer's marginal contribution. Across logistic regression (LR), decision tree, random forest, XGBoost, and CatBoost models with stratified 5-fold cross-validation, performance improved monotonically from Basic to Conversion & Stability to Advanced features. With LR, F1 increased from 0.613 (Basic) to 0.962 (Advanced); boosted models achieved high discrimination (0.995 AUC) and an F1 score of up to 0.983. Calibration and precision–recall analyses indicated strong ranking quality while acknowledging potential dataset and period biases given the short (9-day) window. By making feature contributions measurable and reproducible, the framework complements model-centric advances and offers a transparent blueprint for production-grade behavioral modeling. The code and processed artifacts are publicly available, and future work will extend the validation to longer, seasonal datasets and to hybrid approaches that combine automated feature learning with domain-driven design.
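The layer-wise protocol (fixed splits and hyperparameters, cumulative feature layers) can be sketched as follows; `fit_and_score` stands in for the actual training pipeline and is an assumed interface, not the paper's code:

```python
def layerwise_eval(layers, fit_and_score):
    """Evaluate cumulative feature layers under a fixed split/config.
    layers: ordered list of (layer_name, feature_list).
    fit_and_score: maps a feature list to a score using the SAME data
    split and hyperparameters, so only the feature set varies."""
    results, used = {}, []
    for name, feats in layers:
        used = used + feats  # each layer adds to the previous ones
        results[name] = fit_and_score(used)
    return results
```

Holding everything but the feature set constant is what makes the per-layer score differences interpretable as marginal contributions.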
Funding: Supported by the National Natural Science Foundation of China (Nos. U24B2034 and U2139204), the China Petroleum Science and Technology Innovation Fund (2021DQ02-0501), and the Science and Technology Support Project of Langfang (2024011073).
Abstract: Lithology identification is a critical aspect of geoenergy exploration, including geothermal energy development, gas hydrate extraction, and gas storage. In recent years, artificial intelligence techniques based on drill core images have made significant strides in lithology identification, achieving high accuracy. However, the current demand for advanced lithology identification models remains unmet due to the lack of high-quality drill core image datasets. This study constructs and publicly releases the first open-source Drill Core Image Dataset (DCID), addressing the need for large-scale, high-quality datasets in lithology characterization tasks within geological engineering and establishing a standard dataset for model evaluation. DCID consists of 35 lithology categories and a total of 98,000 high-resolution images (512 × 512 pixels), making it the most comprehensive drill core image dataset in terms of lithology categories, image quantity, and resolution. This study also provides lithology identification accuracy benchmarks on DCID for popular convolutional neural networks (CNNs) such as VGG, ResNet, DenseNet, and MobileNet, as well as for the Vision Transformer (ViT) and MLP-Mixer. Additionally, the sensitivity of model performance to various parameters and image resolutions is evaluated. In response to real-world challenges, we propose a real-world data augmentation (RWDA) method that leverages slightly defective images from DCID to enhance model robustness. The study also explores the impact of real-world lighting conditions on the performance of lithology identification models. Finally, we demonstrate how to rapidly evaluate model performance across multiple dimensions using low-resolution datasets, advancing the application and development of new lithology identification models for geoenergy exploration.
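A lighting-condition augmentation of the kind the study examines can be sketched as a simple brightness jitter on a normalized image; this is a generic illustrative stand-in, not the paper's RWDA method:

```python
import random

def brightness_jitter(image, max_delta=0.2, seed=None):
    """Shift all pixel intensities of a [0, 1]-normalized image by one
    random delta, clamped back into range. Simulates a global change in
    illumination, one facet of real-world lighting variability."""
    delta = random.Random(seed).uniform(-max_delta, max_delta)
    return [[min(1.0, max(0.0, p + delta)) for p in row] for row in image]
```

Applied at training time, such perturbations expose the model to lighting conditions absent from cleanly captured core images.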
Funding: Supported by the National Key Research and Development Program of China (2022YFE0206700).
Abstract: 1. Introduction. Climate change mitigation pathways aimed at limiting global anthropogenic carbon dioxide (CO₂) emissions while striving to constrain the global temperature increase to below 2 °C, as outlined by the Intergovernmental Panel on Climate Change (IPCC), consistently predict the widespread implementation of CO₂ geological storage on a global scale.
Funding: Supported by the National Natural Science Foundation of China (Nos. 41804047 and 42111540260), the Fundamental Research Funds of the Institute of Geophysics, China Earthquake Administration (No. DQJB19A0114), and the Key Research Program of the Institute of Geology and Geophysics, Chinese Academy of Sciences (No. IGGCAS-201904).
Abstract: In recent years, artificial intelligence technology has exhibited great potential in seismic signal recognition, setting off a new wave of research. Vast amounts of high-quality labeled data are required to develop and apply artificial intelligence in seismology research. In this study, based on the 2013–2020 seismic cataloging reports of the China Earthquake Networks Center, we constructed an artificial intelligence seismological training dataset ("DiTing") with the largest known total time length. Data were recorded using broadband and short-period seismometers. The dataset includes 2,734,748 three-component waveform traces from 787,010 regional seismic events, the corresponding P- and S-phase arrival time labels, and 641,025 P-wave first-motion polarity labels. All waveforms were sampled at 50 Hz and cut to a time length of 180 s starting a random number of seconds before the occurrence of each earthquake. Each three-component waveform contains a considerable amount of descriptive information, such as the epicentral distance, back azimuth, and signal-to-noise ratios. The magnitudes of seismic events, epicentral distances, signal-to-noise ratios of P-wave data, and signal-to-noise ratios of S-wave data range from 0 to 7.7, 0 to 330 km, −0.05 to 5.31 dB, and −0.05 to 4.73 dB, respectively. The dataset compiled in this study can serve as a high-quality benchmark for machine learning model development and data-driven seismological research on earthquake detection, seismic phase picking, first-motion polarity determination, earthquake magnitude prediction, early warning systems, and strong ground-motion prediction. Such research will further promote the development and application of artificial intelligence in seismology.
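The windowing scheme described above (180 s at 50 Hz, starting a random number of seconds before the event) can be sketched as follows; the 30 s upper bound on the pre-event offset is an assumption for illustration, not a figure stated in the abstract:

```python
import random

def cut_window(trace, origin_idx, fs=50, length_s=180, max_pre_s=30, seed=None):
    """Cut a fixed-length window from a single-component trace, starting
    a random 1..max_pre_s seconds before the event origin sample.
    fs: sampling rate in Hz; 180 s at 50 Hz gives 9000 samples."""
    pre_s = random.Random(seed).randint(1, max_pre_s)
    start = max(0, origin_idx - pre_s * fs)
    return trace[start:start + length_s * fs]
```

Randomizing the pre-event offset keeps the phase arrivals from always landing at the same sample index, which discourages a picker from memorizing window positions.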
Funding: Supported by the National Key Research & Development Program of China (Grant No. 2023YFC3008404) and the Key Laboratory of Earth Fissures Geological Disaster, Ministry of Natural Resources, China (Grant Nos. EFGD20240609 and EFGD20240610).
Abstract: The recent upsurge in metro construction emphasizes the necessity of understanding the mechanical performance of metro shield tunnels subjected to the influence of ground fissures. In this study, a large-scale experiment, in combination with numerical simulation, was conducted to investigate the influence of ground fissures on a metro shield tunnel. The results indicate that the lining contact pressure at the vault increases in the hanging wall and decreases in the footwall, resulting in a two-dimensional stress state of vertical shear and axial tension-compression, and in simultaneous vertical dislocation and axial tilt of the segments around the ground fissure. In addition, the damage to curved bolts includes tensile yield, flexural yield, and shear twist, leading to obvious concrete lining damage, particularly at the vault, arch bottom, and haunch, indicating that the joints at these positions are weak areas. The shield tunnel orthogonal to the ground fissure ultimately experiences shear failure, suggesting that the maximum actual ground fissure dislocation the structure can withstand is approximately 20 cm; five segment rings in the hanging wall and six segment rings in the footwall also need to be reinforced. This study could provide a reference for metro design at ground fissure sites.
Funding: Supported by the National Key R&D Program of China (2024YFE0114000) and the Science and Technology Project of China National Petroleum Corporation (2024DJ8702).
Abstract: Based on an analysis of typical lacustrine shale oil zones in China and their geological characteristics, this study elucidates the fundamental differences between the enrichment patterns of shale oil sweet spots and those of conventional oil and gas. Key parameters and evaluation methods for assessing the large-scale production potential of lacustrine shale oil are proposed. The results show that shale oil is a petroleum resource that exists in organic-rich shale formations; in other words, it is preserved in its source bed, following a generation-accumulation-enrichment process different from that of conventional oil and gas. Thus, the concept of "reservoir" seems inapplicable to shale oil. In China, lacustrine shale oil is widely distributed, but the geological characteristics and sweet-spot enrichment patterns of shale oil vary significantly among lacustrine basins with distinct water environments, tectonic evolution, and diagenetic transformation frameworks. The core of lacustrine shale oil evaluation is the "sweet spot volume". The key factors for evaluating the large-scale production of continental shale oil are the oil storage capacity, oil-bearing capacity, and oil producing capacity, with total porosity, oil content, and free oil content as the corresponding key parameters. It is recommended to determine the total porosity of shale by combining helium porosity measurement with the nuclear magnetic resonance (NMR) method; the oil content of key layers by organic solvent extraction, NMR, and high-pressure mercury intrusion methods; and the free oil content by NMR fluid-distribution secondary spectral stripping decomposition and well logging. The research results contribute supplemental insights into continental shale oil deliverability in China and provide a scientific basis for the rapid exploration and large-scale production of lacustrine shale oil.
Funding: National Natural Science Foundation of China (52375378), National Key Laboratory of Metal Forming Technology and Heavy Equipment (S2308100.W12), and Huxiang High-Level Talent Gathering Project of Hunan Province (2021RC5001).
Abstract: The titanium alloy strut serves as a key load-bearing component of aircraft landing gear and is typically manufactured by forging. The friction condition has an important influence on material flow and cavity filling during the forging process. Using the previously optimized shape and initial position of the preform, the influence of the friction condition (friction factor m = 0.1–0.3) on material flow and cavity filling was studied numerically with a shear friction model. A novel filling index was defined to reflect material flow into the left and right flashes and to magnify friction-induced effects. The results indicate that the workpiece moves rigidly to the right, with the displacement decreasing as m increases. When m < 0.18, an underfilling defect occurs on the left side of the strut forging, while overflow occurs in the right forging die cavity. By combining the filling index with analyses of material flow and filling status, a reasonable friction factor interval of m = 0.21–0.24 can be determined. Within this interval, the cavity filling behavior is robust, and friction fluctuations exert minimal influence.
Funding: Supported by the Daqing City Philosophy and Social Sciences Planning Research Project (DSGB 2025011) and the Heilongjiang Province Education Science Planning Key Project (GJB1320229).
Abstract: Based on questionnaire surveys and field interviews conducted with various types of agricultural production organizations across five districts and four counties in Daqing City, this study combines relevant theoretical frameworks to systematically examine the evolution, performance, and influencing factors of governance mechanisms within these organizations. Using both quantitative and inductive analytical methods, the paper proposes innovative designs and supporting measures for improving governance mechanisms. The findings reveal that, amid large-scale farmland circulation, the governance mechanisms of agricultural production organizations in Daqing City are evolving from traditional to modern structures. However, challenges remain in areas such as decision-making efficiency, benefit distribution, and supervision mechanisms. In response, this study proposes innovative governance designs focusing on decision-making processes, profit-sharing mechanisms, and risk prevention. Corresponding policy recommendations are also provided to support the sustainable development of agricultural modernization in China.
Abstract: Standardized datasets are foundational to healthcare informatization, enhancing data quality and unleashing the value of data elements. Using bibliometrics and content analysis, this study examines China's healthcare dataset standards from 2011 to 2025. It analyzes their evolution across types, applications, institutions, and themes, highlighting key achievements including substantial growth in quantity, optimized typology, expansion into innovative application scenarios such as health decision support, and broadened institutional involvement. The study also identifies critical challenges, including imbalanced development, insufficient quality control, and a lack of essential metadata (such as authoritative data element mappings and privacy annotations) that hampers the delivery of intelligent services. To address these challenges, the study proposes a multi-faceted strategy focused on optimizing the standard system's architecture, enhancing quality and implementation, and advancing both data governance (through authoritative tracing and privacy protection) and intelligent service provision. These strategies aim to promote the application of dataset standards, thereby fostering and securing the development of new productive forces in healthcare.
Funding: Supported by the Natural Science Basic Research Program of Shaanxi (Program No. 2024JC-YBMS-026).
Abstract: When dealing with imbalanced datasets, the traditional support vector machine (SVM) tends to produce a classification hyperplane biased towards the majority class and exhibits poor robustness. This paper proposes a high-performance classification algorithm specifically designed for imbalanced datasets. The proposed method first uses a biased second-order cone programming support vector machine (B-SOCP-SVM) to identify the support vectors (SVs) and non-support vectors (NSVs) in the imbalanced data. It then applies the synthetic minority over-sampling technique (SV-SMOTE) to oversample the support vectors of the minority class and applies the random under-sampling technique (NSV-RUS) multiple times to undersample the non-support vectors of the majority class. Combining the minority-class dataset obtained above with the multiple majority-class datasets yields multiple new balanced datasets. Finally, SOCP-SVM classifies each dataset, and the final result is obtained through an ensemble algorithm. Experimental results demonstrate that the proposed method performs excellently on imbalanced datasets.
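The SMOTE-style oversampling step can be sketched as linear interpolation between random pairs of minority-class samples; this is a generic illustration of the interpolation idea, not the exact SV-SMOTE variant applied to support vectors:

```python
import random

def smote_like(samples, n_new, seed=0):
    """Generate n_new synthetic minority points, each placed at a random
    position on the segment between two randomly chosen real samples.
    Every synthetic point therefore stays inside the convex hull of the
    originals (per-coordinate, within the observed min/max range)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        a, b = rng.sample(samples, 2)
        t = rng.random()  # interpolation weight in [0, 1)
        out.append([ai + t * (bi - ai) for ai, bi in zip(a, b)])
    return out
```

Classic SMOTE interpolates towards k-nearest neighbors rather than arbitrary pairs; the simplification here keeps the sketch short.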
Funding: The Australian Research Council (DP200101197 and DP230101107).
Abstract: Formalizing the complex processes and phenomena of a real-world problem may require a large number of variables and constraints, resulting in what is termed a large-scale optimization problem. Nowadays, such large-scale optimization problems are solved using computing machines, which can require enormous computational time and may delay the derivation of timely solutions. Decomposition methods, which partition a large-scale optimization problem into lower-dimensional subproblems, represent a key approach to addressing these time-efficiency issues. There has been significant progress on this front in both applied mathematics and emerging artificial intelligence approaches. This work provides an overview of decomposition methods from both the mathematics and computer science points of view. We also remark on state-of-the-art developments and recent applications of decomposition methods, and discuss future research and development perspectives.
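The core idea of decomposition, splitting a high-dimensional problem into independent lower-dimensional subproblems, can be sketched for the simplest case of a fully separable objective, where each variable block is optimized on its own. The interface below is an illustrative assumption:

```python
def solve_separable(blocks):
    """If f(x) = sum_i f_i(x_i), the minimizer is found by minimizing each
    block independently. blocks: list of (candidates, f_i) pairs, where
    candidates is the finite feasible set of block i; each subproblem is
    brute-forced, which is cheap because each block is low-dimensional."""
    return [min(candidates, key=f_i) for candidates, f_i in blocks]
```

Real problems are rarely fully separable; practical methods (e.g., coordinate descent or dual decomposition) iterate over coupled subproblems, but the cost reduction from shrinking each search space is the same driving principle.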
Abstract: This article focuses on the management of large-scale machinery and equipment in highway construction, with the research objective of identifying issues at the management level and exploring more effective management measures. Through practical observation and logical analysis, this article elaborates on what the management of large-scale machinery and equipment in highway construction entails, affirming its management value from different perspectives. On this basis, it analyzes the problems existing in the management of large-scale machinery and equipment, providing a detailed interpretation of issues such as the weak foundation of the equipment management system and the disconnect between equipment selection and configuration and actual conditions. Based on the manifestations of these problems, this article proposes strategies such as strengthening the institutional foundation of equipment management and selecting and configuring equipment based on actual conditions, aiming to provide a reference for large-scale machinery and equipment management in relevant enterprises.
Funding: Supported by the Fundamental Research Funds for the Central Universities (Grant Nos. lzujbky-2021-62 and lzujbky-2024-jdzx06), the National Natural Science Foundation of China (Grant No. 12247101), the Natural Science Foundation of Gansu Province, China (Grant Nos. 22JR5RA389 and 23JRRA1740), and the '111 Center' Fund (Grant No. B20063).
Abstract: It has been argued that the human brain, as an information-processing machine, operates near a phase transition point in a non-equilibrium state, where it violates detailed balance, leading to entropy production. The assessment of irreversibility in brain networks can thus provide valuable insights into their non-equilibrium properties. In this study, we utilized an open-source whole-brain functional magnetic resonance imaging (fMRI) dataset from both resting and task states to evaluate the irreversibility of large-scale human brain networks. Our analysis revealed that the brain networks exhibited significant irreversibility, violating detailed balance and generating entropy. Notably, both physical and cognitive tasks increased the extent of this violation compared to the resting state. Regardless of the state (rest or task), interactions between pairs of brain regions were the primary contributors to this irreversibility. Moreover, we observed that as global synchrony increased within brain networks, so did irreversibility. The first derivative of irreversibility with respect to synchronization peaked near the phase transition point, characterized by moderate mean synchronization and maximized synchronization entropy of blood oxygenation level-dependent (BOLD) signals. These findings deepen our understanding of the non-equilibrium dynamics of large-scale brain networks, particularly in relation to their phase transition behaviors, and may have potential clinical applications for brain disorders.
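Irreversibility of a discretized signal can be estimated by comparing forward and time-reversed transition frequencies: a detailed-balance-respecting process has symmetric transition counts, so the divergence between the two is zero. The sketch below is a rough KL-style estimator (terms whose reverse transition was never observed are skipped, making it a crude lower bound) and is not the paper's exact method:

```python
import math
from collections import Counter

def irreversibility(seq):
    """KL-style divergence between forward and time-reversed transition
    frequencies of a symbolic sequence. Zero when every transition count
    matches its reverse (detailed balance); positive otherwise."""
    fwd = Counter(zip(seq, seq[1:]))
    n = sum(fwd.values())
    total = 0.0
    for (a, b), count in fwd.items():
        p = count / n
        q = fwd.get((b, a), 0) / n
        if q > 0:  # skip unseen reverse transitions (lower-bound sketch)
            total += p * math.log(p / q)
    return total
```

Applied to binarized BOLD time series, such an estimator quantifies how far the observed dynamics depart from statistical time-reversal symmetry.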
Funding: Supported by the National Outstanding Youth Science Fund Project of the National Natural Science Foundation of China (Grant No. 52122109) and the National Natural Science Foundation of China (Grant Nos. 51861165102 and 52039005).
Abstract: Offshore wind power plays a crucial role in energy strategies. The results of traditional small-scale physical models may be unreliable when extrapolated to large field scales. This study addressed this limitation by conducting large-scale (1:13) experiments to investigate the scour hole pattern and equilibrium scour depth around both slender and large monopiles under irregular waves. The experiments adopted Keulegan-Carpenter number (KC) values from 1.01 to 8.89 and diffraction parameter (D/L, where D is the diameter of the monopile and L is the wavelength) values from 0.016 to 0.056. The results showed that changes in the maximum scour location and scour hole shape around a slender monopile were associated with KC, with differences observed between irregular and regular waves. Improving the calculation of KC enhanced the accuracy of existing scour formulae under irregular waves. The maximum scour locations around a large monopile were consistently found on both sides, regardless of KC and D/L, but the scour hole topography was influenced by both parameters. Notably, the scour range around a large monopile was at least as large as the monopile diameter.
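The two governing parameters can be computed directly from their standard definitions; here U_m denotes the maximum wave orbital velocity at the bed and T the wave period (for irregular waves, representative values such as peak-period statistics are typically substituted):

```python
def keulegan_carpenter(u_max, period, diameter):
    """KC = U_m * T / D: ratio of the orbital excursion of water particles
    to the pile diameter; small KC means the pile is large relative to
    the wave motion."""
    return u_max * period / diameter

def diffraction_parameter(diameter, wavelength):
    """D/L: pile diameter over wavelength; diffraction effects become
    significant as D/L grows (the 'large monopile' regime)."""
    return diameter / wavelength
```

These dimensionless groups are what let the 1:13 laboratory results be mapped onto field-scale monopiles.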
Funding: Guangdong Major Project of Basic and Applied Basic Research (2020B0301030004), National Natural Science Foundation of China (42275041), and Hainan Province Science and Technology Special Fund (SOLZSKY2025006).
Abstract: In 2022, South China (SC) experienced record-breaking rainfall during its first rainy season, causing severe socioeconomic losses. This study examines the large-scale circulation anomalies responsible for this extreme event. Analysis reveals that the lower-tropospheric cyclonic anomaly over SC plays a crucial role. This cyclonic anomaly consists of extratropical northeasterly anomalies to the north of SC and tropical southwesterly anomalies to the south. Both components were particularly intense during the 2022 first rainy season, contributing to the heavy rainfall in SC. Moreover, the lower-tropospheric cyclonic anomaly is enhanced by its counterpart in the upper troposphere, which is associated with a wave train propagating from the North Atlantic to East Asia across the mid-high latitudes of the Eurasian continent. Further analysis indicates that the extratropical wave train correlates with sea surface temperature anomalies (SSTAs) in the North Atlantic. Additionally, the SSTAs over the North Indian Ocean also play a role in enhancing the tropical southwesterlies in the lower troposphere. This study highlights the combined influence of tropical and extratropical circulation anomalies, offering a comprehensive understanding of the record-breaking rainfall.
Funding: Funded by A'Sharqiyah University, Sultanate of Oman, under Research Project Grant No. BFP/RGP/ICT/22/490.
Abstract: Detecting faces under occlusion remains a significant challenge in computer vision due to variations caused by masks, sunglasses, and other obstructions. Addressing this issue is crucial for applications such as surveillance, biometric authentication, and human-computer interaction. This paper provides a comprehensive review of face detection techniques developed to handle occluded faces. Studies are categorized into four main approaches: feature-based, machine learning-based, deep learning-based, and hybrid methods. We analyzed state-of-the-art studies within each category, examining their methodologies, strengths, and limitations on widely used benchmark datasets and highlighting their adaptability to partial and severe occlusions. The review also identifies key challenges, including dataset diversity, model generalization, and computational efficiency. Our findings reveal that deep learning methods dominate recent studies, benefiting from their ability to extract hierarchical features and handle complex occlusion patterns. More recently, researchers have increasingly explored Transformer-based architectures, such as the Vision Transformer (ViT) and Swin Transformer, to further improve detection robustness under challenging occlusion scenarios. In addition, hybrid approaches, which combine traditional and modern techniques, are emerging as a promising direction for improving robustness. This review provides valuable insights for researchers aiming to develop more robust face detection systems and for practitioners seeking to deploy reliable solutions in real-world, occlusion-prone environments. Further improvements and broader datasets are required to develop more scalable, robust, and efficient models that can handle complex occlusions in real-world scenarios.
Abstract: This study employs deformation monitoring data acquired during the construction of the Haoji railway large-scale bridge to investigate the displacement behavior of the subgrades, catenary columns, and tracks. Emphasis is placed on data acquisition and processing methods using total stations and automated monitoring systems. Through a comprehensive analysis of lateral, longitudinal, and vertical displacement data from 26 subgrade monitoring points, catenary columns, and track sections, this research evaluates how construction activities influence railway structures. The results show that displacement variations in the subgrades, catenary columns, and tracks remained within the established alert thresholds, exhibiting stable deformation trends and indicating that any adverse environmental impact was effectively contained. Furthermore, this paper proposes an early warning mechanism based on an automated monitoring system, which can promptly detect abnormal deformations and initiate emergency response procedures, thereby ensuring the safe operation of the railway. The integration of big data analysis and deformation prediction models offers a practical foundation for future safety management in railway construction.
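The threshold-based alerting described above reduces, in minimal form, to flagging monitoring points whose displacement exceeds the alert value. The interface and threshold below are illustrative assumptions, not the study's actual system:

```python
def check_alerts(readings, threshold):
    """Return the indices of monitoring points whose absolute displacement
    (in the same units as threshold, e.g. mm) exceeds the alert value."""
    return [i for i, d in enumerate(readings) if abs(d) > threshold]
```

In an automated monitoring pipeline, a non-empty result would trigger the emergency response procedure for the flagged points.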