Journal Articles
191,097 articles found
A Preliminary Study on Bamboo-Wood MLS Sound-Absorbing Diffusers in the Acoustic Design of Extendable KTV Recreational Vehicles
1
Authors: 王波, 赵汉清, 李昱庭, 旦增德吉, 戴文林. 《城市建筑》, 2026, No. 1, pp. 179-181 (3 pages)
An MLS sound-absorbing diffuser is formed by cutting slits into the surface grooves of a conventional MLS diffuser panel and filling the slits with glass wool. By adjusting the slit width, the glass-wool bulk density and thickness, and the air cavity behind the glass wool, the absorption performance of the traditional MLS diffuser is improved and its absorption-diffusion spectrum can be tuned effectively. Drawing on conventional RV construction, this paper adds an extendable KTV seating area on one side to enlarge the usable space of the KTV RV. The side walls use bamboo-wood MLS sound-absorbing diffusers to form an absorptive boundary: the random diffusion and absorption characteristics of the MLS surface improve the reverberation quality and sound-field uniformity inside the extendable KTV RV, while the interior furnishings and side-window layout are arranged to avoid sound-coloration defects. The aim is to create an extendable KTV RV space that meets acoustic requirements, to extend the research and application of bamboo-wood MLS sound-absorbing diffusers, and to provide a reference for the acoustic design of extendable KTV RVs.
Keywords: extendable KTV recreational vehicle; bamboo-wood MLS sound-absorbing diffuser; acoustic design
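MLS diffusers take their well pattern from a maximum-length sequence; as a minimal illustrative aside (not the authors' design), such a sequence can be generated with a linear-feedback shift register, here assuming the degree-4 primitive polynomial x^4 + x^3 + 1:

```python
def mls(taps, n):
    """Generate a maximum-length sequence of length 2**n - 1
    from an n-bit linear-feedback shift register."""
    state = [1] * n                      # any nonzero seed works
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])            # output the last register bit
        fb = 0
        for t in taps:                   # XOR the tapped bits
            fb ^= state[t - 1]
        state = [fb] + state[:-1]        # shift right, insert feedback
    return seq

# Degree-4 primitive polynomial x^4 + x^3 + 1 -> taps (4, 3)
s = mls((4, 3), 4)
print(len(s))   # 15 = 2**4 - 1
print(sum(s))   # 8 ones vs. 7 zeros: near-balanced, as for any MLS
```

In a diffuser, each 1/0 of the sequence maps to a well or a flat strip of the panel surface.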
Advanced driver assistance system (ADAS) and machine learning (ML): The dynamic duo revolutionizing the automotive industry
2
Authors: Harsh SHAH, Karan SHAH, Kushagra DARJI, Adit SHAH, Manan SHAH. 《虚拟现实与智能硬件(中英文)》, 2025, No. 3, pp. 203-236 (34 pages)
The advanced driver assistance system (ADAS) primarily serves to assist drivers in monitoring the speed of the car and helps them make the right decisions, which leads to fewer fatal accidents and ensures higher safety. In the artificial intelligence domain, machine learning (ML) was developed to make inferences with a degree of accuracy similar to that of humans; however, enormous amounts of data are required. Machine learning enhances the accuracy of the decisions taken by ADAS by evaluating all the data received from various vehicle sensors. This study summarizes the critical algorithms used in ADAS technologies and presents the evolution of ADAS technology. Initially, ADAS technology is introduced, along with its evolution, to explain the objectives of developing this technology. Subsequently, the critical algorithms used in ADAS technology, including face detection, head-pose estimation, gaze estimation, and link detection, are discussed. A further discussion follows on the impact of ML on each algorithm in different environments, which increases accuracy at the expense of additional computation. The aim of this study is to evaluate the methods, with and without ML, for each algorithm.
Keywords: machine learning; face detection; advanced driver assistance system
Machine Learning on Blockchain (MLOB): A New Paradigm for Computational Security in Engineering
3
Authors: Zhiming Dong, Weisheng Lu. 《Engineering》, 2025, No. 4, pp. 250-263 (14 pages)
Machine learning (ML) has been increasingly adopted to solve engineering problems, with performance gauged by accuracy, efficiency, and security. Notably, blockchain technology (BT) has been added to ML when security is a particular concern. Nevertheless, there is a research gap: prevailing solutions focus primarily on data security using blockchain but ignore computational security, leaving the traditional ML process vulnerable to off-chain risks. The research objective is therefore to develop a novel ML-on-blockchain (MLOB) framework that secures both the data and the computational process. The central tenet is to place both on the blockchain, execute them as blockchain smart contracts, and protect the execution records on-chain. The framework is established by developing a prototype and further calibrated using a case study of industrial inspection. Compared with existing solutions that treat ML and BT in isolation, the MLOB framework is superior in security (successfully defending against corruption in six designed attack scenarios) and maintains accuracy (0.01% difference from the baseline), albeit with slightly compromised efficiency (a 0.231 s increase in latency). The key finding is that MLOB can significantly enhance the computational security of engineering computing without increasing computing-power demands, which alleviates concerns regarding the computational resource requirements of ML-BT integration. With proper adaptation, the MLOB framework can inform novel solutions for computational security in broader engineering challenges.
Keywords: engineering computing; machine learning; blockchain; blockchain smart contract; deployable framework
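The idea of protecting ML execution records on-chain can be illustrated, loosely and independently of the authors' implementation, with a hash chain: each record commits to the previous one, so any off-chain tampering with an earlier record is detectable.

```python
import hashlib
import json

def append_record(chain, payload):
    """Append an execution record whose hash commits to the previous record."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every hash; tampering anywhere breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = {"payload": rec["payload"], "prev": rec["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != digest:
            return False
        prev = rec["hash"]
    return True

log = []
append_record(log, {"step": "train", "acc": 0.983})   # hypothetical records
append_record(log, {"step": "infer", "label": "defect"})
print(verify(log))                 # True
log[0]["payload"]["acc"] = 0.5     # simulate an off-chain attack
print(verify(log))                 # False
```

A blockchain additionally replicates and orders such records across nodes; this sketch shows only the integrity property.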
Streamlining heart failure patient care with machine learning of thoracic cavity sound data
4
Authors: Rony Marethianto Santoso, Wilbert Huang, Ser Wee, Bambang Budi Siswanto, Amiliana Mardiani Soesanto, Wisnu Jatmiko, Aria Kekalih. 《World Journal of Cardiology》, 2025, No. 9, pp. 33-42 (10 pages)
Together, heart and lung sounds comprise the thoracic cavity sound, which provides informative details that reflect patient condition, particularly in heart failure (HF) patients. However, due to the limitations of human hearing, only a limited amount of information can be auscultated from thoracic cavity sounds. With the aid of artificial intelligence and machine learning, these features can be analyzed to aid the care of HF patients. Machine learning on thoracic cavity sound data involves pre-processing by denoising, resampling, segmentation, and normalization. The most crucial step that follows is feature extraction and selection, in which relevant features are chosen to train the model; the final steps are classification and evaluation of model performance. This review summarizes the currently available studies, which used different machine learning models, feature extraction and selection methods, and classifiers to generate the desired output. Most studies analyzed the heart sound component of the thoracic cavity sound to distinguish between normal subjects and HF patients. Some studies aimed to classify HF patients based on thoracic cavity sounds in their entirety, while others focused on risk stratification and prognostic evaluation of HF patients using thoracic cavity sounds. Overall, the results from these studies demonstrate a promisingly high level of accuracy. Future prospective studies should therefore incorporate these machine learning models to expedite their integration into daily clinical practice for managing HF patients.
Keywords: machine learning; heart failure; sound data; artificial intelligence; deep learning
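Two of the preprocessing steps the review lists (normalization and segmentation) can be sketched directly; the frame length and hop size below are hypothetical, and denoising/resampling are omitted.

```python
def normalize(x):
    """Peak-normalize a signal to the range [-1, 1]."""
    peak = max(abs(v) for v in x) or 1.0   # guard against an all-zero signal
    return [v / peak for v in x]

def segment(x, frame_len, hop):
    """Split a signal into overlapping frames (the segmentation step)."""
    return [x[i:i + frame_len]
            for i in range(0, len(x) - frame_len + 1, hop)]

sig = normalize([0.5, -2.0, 1.0, 0.25, -0.5, 1.5, -1.0, 0.75])
frames = segment(sig, frame_len=4, hop=2)
print(len(frames))                # 3 frames, starting at samples 0, 2, 4
print(max(abs(v) for v in sig))   # 1.0 after peak normalization
```

Each frame would then feed the feature-extraction stage (e.g., spectral features per frame).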
Synergistic machine learning and DFT screening strategy: Accelerating discovery of efficient perovskite passivators
5
Authors: Jianghao Liu, Hongyan Lv, Pengyang Wang, Guofu Hou, Ying Zhao, Xiaodan Zhang, Qian Huang. 《Journal of Energy Chemistry》, 2026, No. 1, pp. 56-63, I0003 (9 pages)
Efficient surface passivation is critical for achieving high-performance perovskite solar cells (PSCs), yet the discovery of optimal passivators remains a time-consuming, trial-and-error process. Here, we report a synergistic machine learning (ML) and density functional theory (DFT) approach that enables predictive and rapid identification of effective passivation materials. By training an XGBoost model (91.3% accuracy) with DFT-derived molecular descriptors and activity calculations, we identify 2-(4-aminophenyl)-3H-benzimidazol-5-amine (APBIA) as a promising passivator. Experimental validation demonstrates that APBIA effectively removes surface impurities and passivates defects within perovskite films, leading to a significant increase in power conversion efficiency (PCE) from 22.48% to 25.55% (certified 25.02%). This ML-DFT framework provides a generalizable pathway for accelerating the development of advanced functional materials for photovoltaic applications.
Keywords: perovskite solar cells; machine learning (ML); density functional theory (DFT); passivators; organic molecule
Insights and analysis of machine learning for benzene hydrogenation to cyclohexene
6
Authors: SUN Chao, ZHANG Bin. 《燃料化学学报(中英文)》 (PKU Core), 2026, No. 2, pp. 133-139 (7 pages)
Cyclohexene is an important raw material in the production of nylon, and selective hydrogenation of benzene is a key method for preparing it. However, the Ru catalysts used in current industrial processes still face challenges, including high metal usage, high process costs, and low cyclohexene yield. This study combines existing literature data with machine learning methods to analyze the factors influencing benzene conversion, cyclohexene selectivity, and yield in the hydrogenation of benzene to cyclohexene, constructing predictive models based on the XGBoost and Random Forest algorithms. The analysis found that reaction time, Ru content, and space velocity are key factors influencing cyclohexene yield, selectivity, and benzene conversion. Shapley Additive Explanations (SHAP) analysis and feature-importance analysis further revealed the contribution of each variable to the reaction outcomes. Additionally, we randomly generated one million variable combinations using the Dirichlet distribution to attempt to predict high-yield catalyst formulations. This paper provides new insights into the application of machine learning in heterogeneous catalysis and offers a reference for further research.
Keywords: machine learning; heterogeneous catalysis; hydrogenation of benzene; XGBoost
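Feature-importance analysis of the kind described above can be illustrated with permutation importance, a simpler stand-in for SHAP: shuffle one input column and measure how much the model's error grows. The toy model and data below are invented for illustration only.

```python
import random

def permutation_importance(model, X, y, col, trials=20, seed=0):
    """Importance of one feature = mean increase in squared error
    after shuffling that feature's column."""
    rng = random.Random(seed)
    base = sum((model(row) - t) ** 2 for row, t in zip(X, y)) / len(X)
    scores = []
    for _ in range(trials):
        shuffled = [row[col] for row in X]
        rng.shuffle(shuffled)
        Xp = [row[:col] + [v] + row[col + 1:] for row, v in zip(X, shuffled)]
        err = sum((model(row) - t) ** 2 for row, t in zip(Xp, y)) / len(X)
        scores.append(err - base)
    return sum(scores) / trials

# Toy "yield model": output depends strongly on feature 0, weakly on feature 1
model = lambda row: 3.0 * row[0] + 0.1 * row[1]
X = [[i % 7, (i * 3) % 5] for i in range(50)]
y = [model(row) for row in X]
imp0 = permutation_importance(model, X, y, col=0)
imp1 = permutation_importance(model, X, y, col=1)
print(imp0 > imp1)   # True: feature 0 dominates, as its coefficient suggests
```

SHAP goes further by attributing each individual prediction, but the ranking intuition is the same.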
Machine learning-based investigation of uplift resistance in special-shaped shield tunnels using numerical finite element modeling
7
Authors: ZHANG Wengang, YE Wenyu, SUN Weixin, LIU Zhicheng, LI Zhengchuan. 《土木与环境工程学报(中英文)》 (PKU Core), 2026, No. 1, pp. 1-13 (13 pages)
The uplift resistance of the soil overlying shield tunnels significantly impacts their anti-floating stability, yet research on the uplift resistance of special-shaped shield tunnels is limited. This study combines numerical simulation with machine learning techniques to explore the issue. It summarizes special-shaped tunnel geometries and introduces a shape coefficient. Using the finite element software Plaxis3D, the study simulates, across different conditions, six key parameters that impact uplift resistance: shape coefficient, burial depth ratio, the tunnel's longest horizontal length, internal friction angle, cohesion, and soil submerged bulk density. Employing XGBoost and ANN methods, the feature importance of each parameter was analyzed from the numerical simulation results. The findings demonstrate that a tunnel shape more closely resembling a circle leads to reduced uplift resistance in the overlying soil, whereas the other parameters exhibit the contrary effect. Furthermore, the study reveals a diminishing trend in the feature importance of burial depth ratio, internal friction angle, tunnel longest horizontal length, cohesion, soil submerged bulk density, and shape coefficient in influencing uplift resistance.
Keywords: special-shaped tunnel; shield tunnel; uplift resistance; numerical simulation; machine learning
An Intelligent Overflow (Kick) Monitoring Model for Drilling Based on TimeGAN-LSTM-MLP
8
Authors: 彭炽, 万兴, 林铁军, 李庆峰, 苏昱, 杨赟. 《钻采工艺》 (PKU Core), 2026, No. 1, pp. 171-183 (13 pages)
Because real kick (overflow) data from drilling sites are scarce, intelligent kick-monitoring models are difficult to train and suffer from poor accuracy and generalization. This paper therefore proposes a time-series data augmentation method for kick data based on a time-series generative adversarial network (TimeGAN), which generates artificial kick samples from real kick data. A long short-term memory network (LSTM) extracts multivariate time-series features at the wellhead, and a multilayer perceptron (MLP) performs the classification task, together forming an intelligent kick-monitoring model. Using real drilling data from deep shale-gas wells in the Sichuan Basin, the effects of imbalanced-data handling techniques and of the sample imbalance ratio on monitoring performance were analyzed, and ablation experiments examined each module's contribution to kick identification. The results show that TimeGAN outperforms other data-balancing techniques, and that the model achieves its highest accuracy, recall, precision, and F-score when the sample imbalance ratio is 1, indicating that class balance is key to building a reliable kick-monitoring model. Field validation in a Sichuan shale-gas well confirmed efficient and accurate real-time kick monitoring, demonstrating good application potential.
Keywords: TimeGAN; kick monitoring; machine learning; time series; imbalanced samples
Using mixed kernel support vector machine to improve the predictive accuracy of genome selection
9
Authors: Jinbu Wang, Wencheng Zong, Liangyu Shi, Mianyan Li, Jia Li, Deming Ren, Fuping Zhao, Lixian Wang, Ligang Wang. 《Journal of Integrative Agriculture》, 2026, No. 2, pp. 775-787 (13 pages)
The advantages of genome selection (GS) in animal and plant breeding are self-evident. Traditional parametric models struggle to fit increasingly large sequencing datasets and to capture complex effects accurately, and machine learning models have demonstrated remarkable potential in addressing these challenges. In this study, we introduced mixed kernel functions to explore the performance of support vector machine regression (SVR) in GS. Six single kernel functions (SVR_L, SVR_C, SVR_G, SVR_P, SVR_S, SVR_L) and four mixed kernel functions (SVR_GS, SVR_GP, SVR_LS, SVR_LP) were used to predict genomic breeding values. Prediction accuracy, mean squared error (MSE), and mean absolute error (MAE) served as evaluation indicators for comparison with two traditional parametric models (GBLUP, BayesB) and two popular machine learning models (RF, KcRR). The results indicate that in most cases the mixed-kernel models significantly outperform GBLUP, BayesB, and the single kernel functions. For instance, for T1 in the pig dataset, the predictive accuracy of SVR_GS is 10% higher than GBLUP's and approximately 4.4% and 18.6% higher than SVR_G's and SVR_S's, respectively. For E1 in the wheat dataset, SVR_GS achieves 13.3% higher prediction accuracy than GBLUP. Among the single kernel functions, the Laplacian and Gaussian kernels yield similar results, with the Gaussian kernel performing better. The mixed kernel functions notably reduce the MSE and MAE relative to all single kernel functions. Regarding runtime, the SVR_GS and SVR_GP mixed kernels run approximately three times faster than GBLUP on the pig dataset, with only a slight increase over the single-kernel models. In summary, the mixed-kernel SVR models are competitive in both speed and accuracy, and models such as SVR_GS have important application potential for GS.
Keywords: genome selection; machine learning; support vector machine; kernel function; mixed kernel function
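A mixed kernel of the kind described (e.g., SVR_GP, Gaussian plus polynomial) is commonly built as a convex combination of two base kernels, which remains a valid positive semi-definite kernel. A minimal sketch, assuming this construction and illustrative hyperparameters (the paper's exact form may differ):

```python
import math

def gaussian_kernel(x, z, gamma=0.5):
    """RBF kernel: exp(-gamma * ||x - z||^2)."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * d2)

def poly_kernel(x, z, degree=2, c=1.0):
    """Polynomial kernel: (<x, z> + c)^degree."""
    return (sum(a * b for a, b in zip(x, z)) + c) ** degree

def mixed_kernel(x, z, lam=0.7):
    """Convex combination of two PSD kernels is itself a PSD kernel."""
    return lam * gaussian_kernel(x, z) + (1 - lam) * poly_kernel(x, z)

x, z = [1.0, 0.0], [0.0, 1.0]
print(round(mixed_kernel(x, x), 6))              # 1.9 = 0.7*1 + 0.3*4
print(mixed_kernel(x, z) == mixed_kernel(z, x))  # True: kernels are symmetric
```

In an SVR, `mixed_kernel` would replace the single kernel when filling the Gram matrix, with `lam` tuned like any other hyperparameter.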
Machine learning-based dual-parameter inversion for estimating snowpack liquid water content and density using common offset GPR data
10
Authors: Zohaib AKBAR, Yuanjun JIANG, Ryan WEBB, Anja KLOTZSCHE, Yuanjia ZHU, Aftab ANWAR, Muhammad Mudassar REHMAN. 《Science China Earth Sciences》, 2026, No. 2, pp. 564-581 (18 pages)
Accurate assessment of snowpack volumetric liquid water content and bulk density is essential for understanding snow hydrology, avalanche risk management, and monitoring cryosphere changes. This study presents a novel dual-parameter inversion framework that integrates synthetic electromagnetic modelling, dimensionality reduction, and machine learning algorithms to extract relative permittivity and log-resistivity from ground-penetrating radar (GPR) data. Traditional snowpack measurements are invasive, labor-intensive, and limited to point observations. To overcome these limitations, we developed a non-invasive, scalable, data-driven framework that uses synthetic GPR datasets representing diverse snowpack conditions with variable moisture and density profiles. Synthetic 1D time-series reflections (A-scans) are generated using finite-difference time-domain simulations in the state-of-the-art electromagnetic simulator gprMax. Principal component analysis (PCA) is applied to compress each A-scan while preserving key features, which significantly enhanced model training efficiency. Four machine learning models, namely random forest, neural network, support vector machine, and eXtreme gradient boosting, are trained on the PCA-reduced features. Among these, the neural network achieved the best performance, with R^(2)>0.97 for permittivity and R^(2)>0.92 for resistivity. Gaussian noise (signal-to-noise ratio of 6 dB) is introduced to the synthetic data, and targeted domain adaptation is then employed to enhance generalization to field data. The framework is validated on two contrasting GPR transects in the Altay Mountains of the Chinese mainland, representing moist (T750) and wet (G125) snowpack conditions. The neural network predictions are most consistent with the GPR-derived estimates, Snowfork measurements, and snow-pit data, achieving a volumetric liquid water content deviation of ≤1.5% and bulk density errors within 30-84 kg m^(-3). The results demonstrate that machine learning-based inversion, supported by realistic simulations and data augmentation, enables scalable, non-invasive snowpack characterization with significant applications in hydrological forecasting, snow monitoring, and water resource management.
Keywords: snowpack; GPR; gprMax; machine learning; inversion
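The PCA compression step applied to A-scans can be sketched in pure Python via power iteration on the covariance matrix; the toy "A-scans" below are invented, and a library PCA would be used in practice.

```python
import math
import random

def leading_pc(X, iters=200, seed=1):
    """First principal component of the rows of X, by power iteration
    on the (biased) covariance matrix."""
    n, d = len(X), len(X[0])
    mean = [sum(col) / n for col in zip(*X)]
    C = [[sum((X[i][a] - mean[a]) * (X[i][b] - mean[b]) for i in range(n)) / n
          for b in range(d)] for a in range(d)]
    rng = random.Random(seed)
    v = [rng.random() for _ in range(d)]
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]       # renormalize each iteration
    return v, mean

def project(x, v, mean):
    """Compress one A-scan to its score along the leading component."""
    return sum((xi - mi) * vi for xi, mi, vi in zip(x, mean, v))

# Toy "A-scans": one direction (~[1, 2, 0]) carries nearly all the variance
scans = [[t, 2 * t + 0.01 * ((t * 7) % 3), 0.0] for t in range(-5, 6)]
v, mean = leading_pc(scans)
score = project(scans[0], v, mean)      # the 1-number compressed feature
print(abs(v[1]) > abs(v[0]) > abs(v[2]))   # True: component follows the variance
```

Keeping the first few such scores per A-scan is what shrinks the input fed to the downstream models.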
Revolutionizing sepsis therapy: Machine learning-driven co-crystallization reveals emodin's therapeutic potential
11
Authors: Shuang Li, Penghui Yuan, Xinyi Zhang, Meiru Liu, Dezhi Yang, Linglei Kong, Li Zhang, Yang Lu, Guanhua Dua. 《Chinese Chemical Letters》, 2026, No. 2, pp. 666-672 (7 pages)
In the pharmaceutical field, machine learning can play an important role in drug development, production, and treatment. Co-crystallization techniques have shown promising potential to enhance the properties of active pharmaceutical ingredients (APIs), such as solubility, permeability, and bioavailability, without altering their chemical structure. This approach opens new avenues for developing natural products into effective drugs, especially those previously challenging to formulate. Emodin, an anthraquinone-based natural product, is a notable example due to its diverse biological activities; however, its physicochemical limitations, such as poor solubility and easy sublimation, have restricted its clinical application. While various methods have improved emodin's physicochemical properties, research on its bioavailability remains limited. In our study, we survey cocrystals and salts produced through co-crystallization technology and identify piperazine as a favorable coformer. Conflicting conclusions between computational chemistry and molecular modeling methods on the one hand and machine learning methods on the other, regarding whether emodin and piperazine form a cocrystal or a salt, led us to validate these possibilities experimentally. Ultimately, we successfully obtained the emodin-piperazine cocrystal, which was characterized and evaluated by several in vitro methods and pharmacokinetic studies. Because experiments have shown that emodin has a therapeutic effect on sepsis, we also evaluated the cocrystal's biological activity in a sepsis model. The results demonstrate that co-crystallization significantly enhances emodin's solubility, permeability, and bioavailability. Pharmacodynamic studies indicate that the emodin-piperazine cocrystal improves sepsis symptoms and protects against the liver and kidney damage associated with sepsis. By advancing co-crystallization as a viable development approach, this study offers renewed hope for natural products with broad biological activities that are hindered by physicochemical limitations.
Keywords: co-crystallization; properties; bioavailability; sepsis; emodin; machine learning
Detection of human saliva using surface-enhanced Raman spectroscopy combined with fractionation processing and machine learning for noninvasive screening of nasopharyngeal carcinoma
12
Authors: Zijie Wu, Shihong Hou, Sufang Qiu, Youliang Weng, Duo Lin. 《Journal of Innovative Optical Health Sciences》, 2026, No. 1, pp. 87-95 (9 pages)
Nasopharyngeal carcinoma (NPC) is a malignant tumor prevalent in southern China and Southeast Asia, where early detection is crucial for improving patient prognosis and reducing mortality. However, existing screening methods suffer from limitations in accuracy and accessibility, hindering their application in large-scale population screening. In this work, a surface-enhanced Raman spectroscopy (SERS)-based method was established to explore the profiles of the different stratified components in saliva from NPC and healthy subjects after fractionation processing. The findings indicate that all fractionated samples exhibit disease-associated molecular differences, with the small-molecule fraction (molecular-weight cut-off of 10 kDa) demonstrating superior classification capability: sensitivity of 90.5%, specificity of 75.6%, and area under the receiver operating characteristic (ROC) curve of 0.925 ± 0.031. The primary objective of this study was to qualitatively explore patterns in saliva composition across groups. The proposed SERS detection strategy for fractionated saliva offers novel insights for enhancing the sensitivity and reliability of noninvasive NPC screening, laying the foundation for translational application in large-scale clinical settings.
Keywords: saliva; SERS; machine learning; nasopharyngeal carcinoma; screening
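The reported metrics (sensitivity, specificity, area under the ROC curve) can be computed directly from classifier scores and labels; a minimal sketch with hypothetical scores:

```python
def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity at a decision threshold (1 = disease, 0 = healthy)."""
    tp = sum(s >= threshold and l == 1 for s, l in zip(scores, labels))
    fn = sum(s < threshold and l == 1 for s, l in zip(scores, labels))
    tn = sum(s < threshold and l == 0 for s, l in zip(scores, labels))
    fp = sum(s >= threshold and l == 0 for s, l in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

def auc(scores, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of positive/negative pairs the classifier orders correctly."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # hypothetical classifier outputs
labels = [1,   1,   0,   1,   0,   0]
se, sp = sens_spec(scores, labels, 0.5)
print(round(se, 3), round(sp, 3))      # 0.667 0.667
print(round(auc(scores, labels), 3))   # 0.889 (8 of 9 pairs ordered correctly)
```

Sweeping the threshold traces the full ROC curve, of which the AUC is the summary.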
Evaluating land surface temperature trends and environmental interactions through machine learning and wavelet analysis
13
Authors: Zeeshan ZAFAR, Shiqiang ZHANG, Yuanyuan ZHA, Hammad GILANI. 《Science China Earth Sciences》, 2026, No. 2, pp. 528-551 (24 pages)
Accurate land surface temperature (LST) assessment is crucial for comprehending and reducing the impacts of climate change and for understanding land-use evolution. This study presents an innovative method that utilizes ensemble models, advanced correlation analysis, and trend analysis to investigate the environmental influences on LST. Google Earth Engine (GEE) was used to process datasets from Landsat-7 and Landsat-8 for five big cities of Punjab, Pakistan, from 2001 to 2023. The results show significant urban warming trends, and a strong correlation between environmental variables and LST was identified. Three ensemble-based machine learning models, XGBoost, AdaBoost, and random forest (RF), were adopted to improve the accuracy of LST evaluation. Although XGBoost and AdaBoost attained modest accuracy, with R^(2) values of 0.767 and 0.706, respectively, the RF model outperformed them, achieving an R^(2) of 0.796 and an RMSE of 0.476. Moreover, Pearson correlation analysis revealed negative relationships between LST and the normalized difference latent heat index (NDLI, r = -0.67), the normalized difference vegetation index (NDVI, r = -0.6), and the modified normalized difference water index (MNDWI, r = -0.57). In addition, wavelet analysis showed that vegetation and water offer long-term LST cooling, lasting up to 64 months, while built-up areas and bare soil contribute to short-term warming, lasting 4 to 8 months. Latent heat indicated variable cooling periods, surpassing 60 months in cities. These findings enhance the understanding of LST changes and the impact of climate change on the environment.
Keywords: machine learning; LST; GEE; sustainability; remote sensing
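A Pearson correlation such as the reported r = -0.6 between NDVI and LST is the covariance divided by the product of the standard deviations; a sketch with hypothetical pixel samples:

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical pixel samples: higher NDVI tends to mean cooler surfaces
ndvi = [0.1, 0.2, 0.35, 0.5, 0.6, 0.75]
lst = [41.0, 39.5, 38.0, 35.5, 34.0, 33.0]   # degrees Celsius
r = pearson_r(ndvi, lst)
print(r < -0.9)   # True: strongly negative, as the study reports for vegetation
```

The study's per-index r values come from the same computation applied pixel-wise over the Landsat scenes.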
Machine Learning and Deep Learning for Smart Urban Transportation Systems with GPS, GIS, and Advanced Analytics: A Comprehensive Analysis
14
Authors: E. Kalaivanan, S. Brindha. 《Journal of Harbin Institute of Technology (New Series)》, 2026, No. 1, pp. 81-96 (16 pages)
As urbanization continues to accelerate, the challenges of managing transportation in metropolitan areas become increasingly complex. The surge in population density contributes to traffic congestion, degrading travel experiences and posing safety risks. Smart urban transportation management emerges as a strategic solution, conceptualized here as a multidimensional big-data problem. The success of this strategy hinges on effectively collecting information from diverse, extensive, and heterogeneous data sources, necessitating full-stack Information and Communication Technology (ICT) solutions. The main aim of the work is to investigate current Intelligent Transportation Systems (ITS) technologies and enhance the safety of urban transportation systems. Machine learning models, trained on historical data, can predict traffic congestion, allowing preventive measures to be implemented. Deep learning architectures, with their ability to handle complex data representations, further refine traffic predictions, contributing to more accurate and dynamic transportation management. The background of this research underscores the challenges posed by traffic congestion in metropolitan areas and emphasizes the need for advanced technological solutions. By integrating GPS and GIS technologies with machine learning algorithms, this work contributes to the development of intelligent transportation systems that address current challenges and pave the way for future advancements in urban transportation management.
Keywords: machine learning; deep learning; smart transportation
Machine learning of chaotic characteristics in classical nonlinear dynamics using variational quantum circuit
15
Authors: Sheng-Chen Bai, Shi-Ju Ran. 《Chinese Physics B》, 2026, No. 2, pp. 322-328 (7 pages)
Replicating the chaotic characteristics inherent in nonlinear dynamical systems via machine learning (ML) is a key challenge in this rapidly advancing interdisciplinary field. In this work, we explore the potential of variational quantum circuits (VQC) for learning the stochastic properties of classical nonlinear dynamical systems. Specifically, we focus on the one- and two-dimensional logistic maps, which, while simple, remain under-explored in the context of learning dynamical characteristics. Our findings reveal that, even for such simple systems, accurately replicating long-term characteristics is hindered by a pronounced sensitivity to overfitting. While increasing the parameter complexity of the ML model typically enhances short-term prediction accuracy, it also degrades the model's ability to replicate long-term characteristics, primarily because overfitting harms generalization. By comparing the VQC with two widely recognized classical ML techniques, long short-term memory (LSTM) networks for time-series processing and reservoir computing, we demonstrate that the VQC outperforms these methods in replicating long-term characteristics. Our results suggest that ML of dynamics demands more compact and efficient models (such as VQC) rather than more complicated, large-scale ones.
Keywords: variational quantum circuit; machine learning; chaos
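The 1D logistic map that serves as the learning target above is generated in a few lines; a minimal sketch of the data-generation step at r = 4, the fully chaotic regime:

```python
def logistic_map(x0, r, n):
    """Iterate x_{t+1} = r * x_t * (1 - x_t), the 1D logistic map."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# r = 4.0 puts the map in the fully chaotic regime
traj = logistic_map(0.2, 4.0, 1000)
print(all(0.0 <= x <= 1.0 for x in traj))   # True: the orbit stays in [0, 1]

# Sensitivity to initial conditions: nearby orbits separate exponentially
a = logistic_map(0.2, 4.0, 40)
b = logistic_map(0.2 + 1e-9, 4.0, 40)
print(max(abs(x - y) for x, y in zip(a, b)) > 0.01)   # True
```

This sensitivity is exactly why short-term prediction accuracy and long-term statistical fidelity are distinct learning targets.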
Review of machine learning tight-binding models: Route to accurate and scalable electronic simulations
16
Authors: Jijie Zou, Zhanghao Zhouyin, Shishir Kumar Pandey, Qiangqiang Gu. 《Chinese Physics B》, 2026, No. 1, pp. 2-12 (11 pages)
The rapid advancement of machine learning-based tight-binding Hamiltonian (MLTB) methods has opened new avenues for efficient and accurate electronic structure simulations, particularly for large-scale systems and long-time scenarios. This review begins with a concise overview of traditional tight-binding (TB) models, including both (semi-)empirical and first-principles approaches, establishing the foundation for understanding MLTB developments. We then present a systematic classification of existing MLTB methodologies, grouped into two major categories: direct prediction of TB Hamiltonian elements and inference of empirical parameters. A comparative analysis with other ML-based electronic structure models is also provided, highlighting the advances of MLTB approaches. Finally, we explore the emerging MLTB application ecosystem, showing how the integration of MLTB models with a diverse suite of post-processing tools, from linear-scaling solvers to quantum transport frameworks and molecular dynamics interfaces, is essential for tackling complex scientific problems across domains. The continued advancement of this integrated paradigm promises to accelerate materials discovery and open new frontiers in the predictive simulation of complex quantum phenomena.
Keywords: machine learning; tight-binding model; electronic simulations
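A textbook prototype of the TB Hamiltonians such models predict is the 1D nearest-neighbor chain, whose open-boundary eigenpairs are analytic; a sketch (assuming hopping t = 1 and zero on-site energy) that builds H and checks H v = E v:

```python
import math

def tb_chain(n, t=1.0):
    """Nearest-neighbor tight-binding Hamiltonian of an open n-site chain."""
    H = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        H[i][i + 1] = H[i + 1][i] = -t   # hopping between adjacent sites
    return H

def analytic_eigenpair(n, m, t=1.0):
    """m-th eigenpair of the open chain: E_m = -2t cos(m*pi/(n+1)),
    v_i = sin(m*pi*i/(n+1)) (unnormalized)."""
    E = -2.0 * t * math.cos(m * math.pi / (n + 1))
    v = [math.sin(m * math.pi * (i + 1) / (n + 1)) for i in range(n)]
    return E, v

n, m = 6, 2
H = tb_chain(n)
E, v = analytic_eigenpair(n, m)
Hv = [sum(H[i][j] * v[j] for j in range(n)) for i in range(n)]
residual = max(abs(h - E * c) for h, c in zip(Hv, v))
print(residual < 1e-12)   # True: (E, v) is an eigenpair of H
```

An MLTB model's job is, in effect, to predict matrices like `H` (or their parameters) for realistic materials, after which such eigenproblems yield the electronic structure.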
Viscosity prediction of refining slag based on machine learning with domain knowledge
17
Authors: Jianhua Chen, Yijie Feng, Yixin Zhang, Jun Luan, Xionggang Lu, Zhigang Yu, Kuochih Chou. 《International Journal of Minerals, Metallurgy and Materials》, 2026, No. 2, pp. 555-566 (12 pages)
The viscosity of refining slags plays a critical role in metallurgical processes, yet obtaining accurate viscosity data remains challenging due to the complexities of high-temperature experiments, forcing reliance on empirical models with limited predictive capability. This study focuses on the influence of optical basicity on the viscosity of CaO-Al_(2)O_(3)-based refining slags, leveraging machine learning to address data scarcity and improve prediction accuracy. An automated framework for algorithm integration, parameter tuning, and evaluation ranking (Auto-APE) is employed to develop customized data-driven models for various slag systems, including CaO-Al_(2)O_(3)-SiO_(2), CaO-Al_(2)O_(3)-CaF_(2), CaO-Al_(2)O_(3)-SiO_(2)-MgO, and CaO-Al_(2)O_(3)-SiO_(2)-MgO-CaF_(2). By incorporating optical basicity as a key feature, the models achieve an average validation error of 8.0% to 15.1%, significantly outperforming traditional empirical models. Additionally, symbolic regression is introduced to rapidly construct domain-specific features, such as optical basicity-like descriptors, offering a potential breakthrough in performance prediction for small datasets. This work highlights the critical role of domain-specific knowledge in understanding and predicting viscosity, providing a robust machine learning-based approach for optimizing refining slag properties.
Keywords: refining slag; viscosity prediction; machine learning; domain knowledge
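The optical-basicity feature highlighted above is conventionally computed as an oxygen-weighted average of component basicities, Λ = Σx_i·n_i·Λ_i / Σx_i·n_i, where n_i is the number of oxygen atoms per formula unit. A sketch using commonly quoted Λ values; these inputs are illustrative, and the paper's exact values may differ.

```python
# Commonly quoted optical basicity values and oxygens per formula unit
# (illustrative inputs; the paper's exact values may differ)
LAMBDA = {"CaO": 1.00, "SiO2": 0.48, "Al2O3": 0.60, "MgO": 0.78}
N_OXY = {"CaO": 1, "SiO2": 2, "Al2O3": 3, "MgO": 1}

def optical_basicity(mole_fractions):
    """Oxygen-weighted average: Lambda = sum(x*n*L) / sum(x*n)."""
    num = sum(x * N_OXY[c] * LAMBDA[c] for c, x in mole_fractions.items())
    den = sum(x * N_OXY[c] for c, x in mole_fractions.items())
    return num / den

slag = {"CaO": 0.50, "Al2O3": 0.40, "SiO2": 0.10}   # hypothetical composition
print(round(optical_basicity(slag), 3))   # 0.693 for this CaO-Al2O3-SiO2 mix
```

One such scalar per composition is the kind of physics-informed descriptor the study feeds to its data-driven models alongside raw compositions.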
MELODI: An explainable machine learning method for mechanistic disentanglement of battery calendar aging
18
Authors: Wenkai Ye, Xiaoru Chen, Xu Hao, Yilin Xie, Fuda Gong, Liangxi He, Xuebing Han, Hewu Wang, Minggao Ouyang. Journal of Energy Chemistry, 2026, Issue 1, pp. 804-813, I0018 (11 pages)
Lithium-ion batteries (LIBs) are widely deployed, from grid-scale storage to electric vehicles. LIBs remain stationary for most of their service life, where calendar aging degrades capacity. Understanding the mechanisms of LIB calendar aging is crucial for extending battery lifespan. However, LIB calendar aging is influenced by multiple factors, including battery material, its state, and storage environment. Calendar aging experiments are also time-consuming, costly, and lack standardized testing conditions. This study employs a data-driven approach to establish a cross-scale database linking materials, side-reaction mechanisms, and calendar aging of LIBs. MELODI (Mechanism-informed, Explainable, Learning-based Optimization for Degradation Identification) is proposed to identify calendar aging mechanisms and quantify the effects of multi-scale factors. Results reveal that cathode material loss drives up to 91.42% of calendar aging degradation in high-nickel (Ni) batteries, while solid electrolyte interphase growth dominates in lithium iron phosphate (LFP) and low-Ni batteries, contributing up to 82.43% of degradation in LFP batteries and 99.10% of decay in low-Ni batteries, respectively. This study systematically quantifies calendar aging in commercial LIBs under varying materials, states of charge, and temperatures. These findings offer quantitative guidance for experimental design and battery use, with implications for emerging applications such as aerial robotics, vehicle-to-grid, and embodied intelligence systems.
Keywords: data-driven model; degradation mechanism; lithium-ion battery; machine learning
Machine Learning-Driven Prediction of the Glass Transition Temperature of Styrene-Butadiene Rubber
19
Authors: Zhanglei Wang, Shuo Yan, Jingyu Gao, Haoyu Wu, Baili Wang, Xiuying Zhao, Shikai Hu. Computers, Materials & Continua, 2026, Issue 4, pp. 532-547 (16 pages)
The glass transition temperature (T_(g)) of styrene-butadiene rubber (SBR) is a key parameter determining its low-temperature flexibility and processing performance. Accurate prediction of T_(g) is crucial for material design and application optimization. Addressing the limitations of traditional experimental measurements and theoretical models in terms of efficiency, cost, and accuracy, this study proposes a machine learning prediction framework that integrates multi-model ensembling and Bayesian optimization by constructing a multi-component feature dataset and an algorithm optimization strategy. Based on a high-quality dataset containing 96 SBR samples, nine machine learning models were employed to predict the T_(g) of SBR and compare their prediction performance. Ultimately, a GPR-XGBoost mixed model was constructed through model ensembling, achieving high-precision prediction with R^(2) values greater than 0.9 on both the training and test sets. Further feature attribution and local effect analysis were conducted using methods such as SHAP and ALE, revealing the nonlinear influence patterns of various components on T_(g) and providing a theoretical basis for SBR formulation design and T_(g) regulation. The machine learning prediction framework established in this study combines high-precision prediction with interpretability, significantly enhancing prediction of the T_(g) of SBR. It offers an efficient tool for SBR molecular design and holds great potential for wider application.
Keywords: machine learning; styrene-butadiene rubber; glass transition temperature
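A mixed model like the GPR-XGBoost ensemble above blends the predictions of two regressors. The minimal sketch below shows one common blending scheme, weighting each model inversely to its validation error; the weighting rule and all numbers are hypothetical stand-ins, not the paper's actual method:

```python
# Hedged sketch of blending two regressors into a mixed model.
# Inverse-validation-error weighting is an illustrative assumption.

def blend_weights(err_a, err_b):
    """Weight each model inversely to its validation error; weights sum to 1."""
    inv_a, inv_b = 1.0 / err_a, 1.0 / err_b
    total = inv_a + inv_b
    return inv_a / total, inv_b / total

def blended_prediction(pred_a, pred_b, w_a, w_b):
    """Element-wise weighted average of two models' predictions."""
    return [w_a * a + w_b * b for a, b in zip(pred_a, pred_b)]

# Hypothetical validation RMSEs (K) for a GPR-like and an XGBoost-like model:
w_gpr, w_xgb = blend_weights(3.2, 4.8)          # → (0.6, 0.4)
# Hypothetical Tg predictions (K) for two SBR samples from each model:
tg = blended_prediction([221.0, 248.0], [219.0, 252.0], w_gpr, w_xgb)
print([round(t, 1) for t in tg])                 # → [220.2, 249.6]
```

Blending lets a smooth interpolator and a tree ensemble compensate for each other's weaknesses, which is one plausible reason mixed models outperform either base learner alone.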
Landslide susceptibility on the Qinghai-Tibet Plateau:Key driving factors identified through machine learning
20
Authors: YANG Wanqing, GE Quansheng, TAO Zexing, XU Duanyang, WANG Yuan, HAO Zhixin. Journal of Geographical Sciences, 2026, Issue 1, pp. 199-218 (20 pages)
Landslides pose a formidable natural hazard across the Qinghai-Tibet Plateau (QTP), endangering both ecosystems and human life. Identifying the driving factors behind landslides and accurately assessing susceptibility are key to mitigating disaster risk. This study integrated multi-source historical landslide data with 15 predictive factors and used several machine learning models, namely Random Forest (RF), Gradient Boosting Regression Trees (GBRT), Extreme Gradient Boosting (XGBoost), and Categorical Boosting (CatBoost), to generate susceptibility maps. The Shapley additive explanation (SHAP) method was applied to quantify factor importance and explore their nonlinear effects. The results showed that: (1) CatBoost was the best-performing model (CA = 0.938, AUC = 0.980) in assessing landslide susceptibility, with altitude emerging as the most significant factor, followed by distance to roads and earthquake sites, precipitation, and slope; (2) the SHAP method revealed critical nonlinear thresholds, demonstrating that historical landslides were concentrated at mid-altitudes (1400-4000 m) and decreased markedly above 4000 m, with a parallel reduction in probability beyond 700 m from roads; and (3) landslide-prone areas, comprising 13% of the QTP, were concentrated in the southeastern and northeastern parts of the plateau. By integrating machine learning and SHAP analysis, this study revealed landslide hazard-prone areas and their driving factors, providing insights to support disaster management strategies and sustainable regional planning.
Keywords: landslide susceptibility; machine learning; SHAP; driving factors; nonlinear effects
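SHAP attributions of the kind used above are Shapley values: each feature's average marginal contribution over all coalitions of the other features. For a tiny model they can be computed exactly by enumeration. The sketch below does so for a made-up three-factor "susceptibility score"; the model, baseline, and thresholds are illustrative stand-ins, not the paper's CatBoost model (real SHAP libraries approximate this efficiently for tree ensembles):

```python
from itertools import combinations
from math import factorial

def model(x):
    # Toy susceptibility score: a mid-altitude band effect plus linear terms.
    # All coefficients and thresholds are made up for illustration.
    altitude, road_dist, slope = x
    band = 1.0 if 1400 <= altitude <= 4000 else 0.2
    return band + 0.001 * slope - 0.0005 * road_dist

def shapley_values(x, baseline):
    """Exact Shapley values: missing features are set to the baseline."""
    n = len(x)

    def v(coalition):
        z = list(baseline)
        for i in coalition:
            z[i] = x[i]
        return model(z)

    phi = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        total = 0.0
        for k in range(n):
            for s in combinations(others, k):
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += w * (v(set(s) | {i}) - v(set(s)))
        phi.append(total)
    return phi

# Hypothetical site (mid-altitude, near a road) vs. a high, remote baseline:
phi = shapley_values(x=(2500, 200, 30), baseline=(5000, 2000, 10))
# By the efficiency property, sum(phi) == model(x) - model(baseline).
```

The per-feature `phi` values are what SHAP summary plots aggregate; plotting them against the raw feature values is how threshold effects like the 1400-4000 m altitude band become visible.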