Accuracy allocation is crucial in the accuracy design of machining tools. Current accuracy allocation methods primarily focus on positional deviation, with little consideration for tool direction deviation. To address this issue, we propose a geometric error cost sensitivity-based accuracy allocation method for five-axis machine tools. A geometric error model consisting of 41 error components is constructed based on homogeneous transformation matrices. Volumetric points with positional and tool direction deviations are randomly sampled to evaluate the accuracy of the machine tool. The sensitivity of each error component at these sampling points is analyzed using the Sobol method. To balance the needs of geometric precision and manufacturing cost, a geometric error cost sensitivity function is developed to estimate the required cost. By allocating the error components affecting tool direction deviation first and the remaining components second, this allocation scheme ensures that both deviations meet the requirements. We also perform a numerical simulation of a BC-type (B-axis and C-axis type) five-axis machine tool to validate the method. The results show that the new allocation scheme reduces the total geometric error cost by 27.8% compared to a uniform allocation scheme, while yielding the same positional and tool direction machining accuracies.
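To make the modelling idea concrete, the following is a minimal sketch of how homogeneous transformation matrices (HTMs) with small-angle error terms can be chained to obtain both a positional deviation and a tool direction deviation at the tool tip. The two error matrices, their numeric values, and the nominal tool geometry are all hypothetical, not taken from the paper's 41-component model.

```python
import numpy as np

def htm(rx=0.0, ry=0.0, rz=0.0, tx=0.0, ty=0.0, tz=0.0):
    """First-order (small-angle) homogeneous transformation matrix for one
    set of error components: rx/ry/rz are small angular errors (rad),
    tx/ty/tz are translational errors."""
    return np.array([
        [1.0, -rz,  ry, tx],
        [ rz, 1.0, -rx, ty],
        [-ry,  rx, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ])

# Hypothetical error components for two axes of the kinematic chain.
T_axis1 = htm(rz=20e-6, tx=5e-6)   # e.g. yaw error + positioning error
T_axis2 = htm(rx=15e-6, ty=3e-6)   # e.g. tilt error + straightness error
T_total = T_axis1 @ T_axis2        # chain the transformations

tool_tip = np.array([0.0, 0.0, 100.0, 1.0])  # nominal tool tip (point)
tool_dir = np.array([0.0, 0.0, 1.0, 0.0])    # nominal tool axis (direction)

pos_dev = (T_total @ tool_tip - tool_tip)[:3]  # positional deviation
dir_dev = (T_total @ tool_dir - tool_dir)[:3]  # tool direction deviation
print(pos_dev, dir_dev)
```

Note that angular errors contribute to both deviations (amplified by the tool length), while translational errors affect only the position, which is why the paper allocates direction-affecting components first.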
Aiming at the problems of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling, and compensation of a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced for the selection of the temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is thus proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experimental system is developed, and the prediction results of LSR (least squares regression), ANN, and ABC-NN are compared with the measured spindle thermal errors. Experimental results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, and the residual error is smaller than 3 μm, so the new modeling method is feasible. The proposed research provides guidance for compensating thermal errors and improving the machining accuracy of NC machine tools.
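Grey relational analysis ranks candidate temperature sensors by how closely their (normalised) sequences track the thermal-error sequence. The sketch below implements the textbook grey relational grade with the conventional distinguishing coefficient ρ = 0.5; the three "sensor" series and the drift curve are synthetic stand-ins, not data from the paper.

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey relational grade of each candidate sequence w.r.t. the reference.
    Sequences are min-max normalised; delta min/max are taken globally over
    all candidates, as in the standard formulation."""
    def norm(x):
        x = np.asarray(x, dtype=float)
        return (x - x.min()) / (x.max() - x.min())

    ref = norm(reference)
    deltas = np.array([np.abs(ref - norm(c)) for c in candidates])
    d_min, d_max = deltas.min(), deltas.max()
    xi = (d_min + rho * d_max) / (deltas + rho * d_max)  # relational coefficients
    return xi.mean(axis=1)                               # grade = mean coefficient

# Hypothetical data: the error tracks sensor A closely, B loosely, C is noise.
t = np.linspace(0.0, 1.0, 50)
error = 10.0 * (1.0 - np.exp(-3.0 * t))            # thermal drift (μm)
sensor_a = 25.0 + 8.0 * (1.0 - np.exp(-3.1 * t))   # strongly related (°C)
sensor_b = 25.0 + 5.0 * t                          # weakly related
rng = np.random.default_rng(0)
sensor_c = 25.0 + rng.normal(0.0, 1.0, t.size)     # unrelated

grades = grey_relational_grades(error, [sensor_a, sensor_b, sensor_c])
print(grades)
```

Sensors with the highest grades would then be kept as inputs to the thermal-error model, which helps avoid collinear or irrelevant temperature variables.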
In order to satisfy the machining requirements of aero-engine casing in the modern aviation industry, this paper investigates three main issues in the design and development of a five-axis machine tool with high accuracy, stiffness, and efficiency: whole-structure design, key-component design, and supporting stiffness design. First, an appropriate structure of the five-axis machine tool is determined considering the processing characteristics of aero-engine casing. Then, a dual-drive swing head and a compact motorized spindle are designed with sufficient drive capability and stiffness, and the related structure, assembly method, cooling technology, and performance simulation are given in detail. Next, a design method for the supporting stiffness of the guide is proposed through deformation prediction of the spindle end. Based on the above work, a prototype of the machine tool is developed, and experiments are carried out, including performance tests of the swing head and motorized spindle, and machining of a simulated aero-engine casing workpiece. All experimental results show that the machine tool has satisfactory accuracy, stiffness, and efficiency, meeting the machining requirements of aero-engine casing. The main work can serve as a reference for engineers and technicians and is meaningful in practice.
Compared with traditional non-cutting measurement, machining tests more accurately reflect the kinematic errors of five-axis machine tools in the actual machining process. However, the measurement and calculation of the machining tests reported in the literature are quite difficult and time-consuming. A new machining-test method for the trunnion axis of a five-axis machine tool is therefore proposed. Firstly, a simple mathematical model of the cradle-type five-axis machine tool was established by optimizing the coordinate system settings based on robot kinematics. Then, machining tests based on error-sensitive directions were proposed to identify the kinematic errors of the trunnion axis. By adopting the error-sensitive vectors in the matrix calculation, the functional relationship between the machining errors of the test piece in the error-sensitive directions and the kinematic errors of the C-axis and A-axis of the rotary table was established based on the kinematic error model. According to our previous work, the kinematic errors of the C-axis can be treated as known quantities, and the kinematic errors of the A-axis can then be obtained from the equations. The method was tested on a Mikron UCP600 vertical machining center: the machining errors in the error-sensitive directions were obtained by CMM inspection of the finished test piece to identify the kinematic errors of the trunnion axis. Experimental results demonstrate that the proposed method substantially reduces the complexity, cost, and time consumed, and has wider applicability.
Material removal is one of the most widely used processes in manufacturing. Five-axis CNC machines are believed to be the best tools for sculptured surface machining. In this study, a generic and unified kinematic model was developed as a viable alternative to the particular solutions that are only applicable to individual machine configurations. This versatile model is then used to verify the feasibility of the two rotational joints within the kinematic chain of the three main types of five-axis machine tools. The model is also useful in the design of five-axis machine tools.
In this paper, the definition of the NURBS curve and a speed-controlled interpolation, in which the feed rate is automatically adjusted to meet a specified chord error limit, are discussed. In addition, a definition of the linear interpolation error of post-processed data is proposed; this error deserves attention because it not only reduces surface quality but may also cause interference and other unexpected trouble. To control this error, a robust algorithm is proposed that meets a desired error limit by interpolating some essential CL data. The reliability and self-adaptiveness of the proposed algorithm have been demonstrated by simulation results.
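The speed-controlled idea can be sketched with the classic chord-error bound: when a curve segment of local curvature radius ρ is approximated by a chord traversed in one interpolation period T, the sagitta (chord error) δ limits the chord length to L = 2·sqrt(δ(2ρ − δ)), and hence the feed rate to L/T. This is a generic derivation, not the paper's specific NURBS interpolator; the numeric values are illustrative.

```python
import math

def chord_error_limited_feedrate(radius, delta_max, period, v_cmd):
    """Feed rate that keeps the per-period chord error below delta_max.

    radius:    local radius of curvature of the path
    delta_max: allowed chord error (sagitta)
    period:    interpolation period T
    v_cmd:     commanded feed rate; the limiter only ever slows down
    """
    # Max chord length for an arc of radius rho and sagitta delta:
    # L = 2 * sqrt(delta * (2 * rho - delta))
    l_max = 2.0 * math.sqrt(delta_max * (2.0 * radius - delta_max))
    return min(v_cmd, l_max / period)

# Tight curvature forces the feed rate down; gentle curvature leaves it alone.
v_tight = chord_error_limited_feedrate(radius=2.0, delta_max=0.001,
                                       period=0.001, v_cmd=200.0)
v_gentle = chord_error_limited_feedrate(radius=500.0, delta_max=0.001,
                                        period=0.001, v_cmd=200.0)
print(v_tight, v_gentle)
```

In a full interpolator, ρ would be evaluated from the NURBS curve's first and second derivatives at each parameter value, so the feed rate adapts continuously along the path.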
Thermal errors in CNC machine tools, particularly those involving the spindle, significantly affect machining accuracy and performance. These errors, caused by temperature fluctuations in the spindle and surrounding components, result in dimensional deviations that can lead to poor part quality and reduced precision in high-speed manufacturing processes. This paper explores thermal error modeling and compensation methods for the spindle of five-axis CNC machine tools. A detailed analysis of heat generation, heat transfer mechanisms, and finite element analysis (FEA) is presented to develop accurate thermal error models. Compensation techniques, such as model-based methods, sensor-based methods, real-time compensation algorithms, and hybrid approaches, are critically reviewed. This study also discusses the challenges in real-time compensation and the integration of thermal error compensation with machine tool control systems. The objective is to provide a comprehensive understanding of thermal error phenomena and their compensation strategies, ultimately contributing to the enhancement of machining accuracy in advanced manufacturing applications.
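A minimal model-based compensation example: spindle thermal growth is often approximated by a first-order exponential, δ(t) ≈ δ∞(1 − e^(−t/τ)), which can be fitted to warm-up data and then subtracted from the commanded position. This is a deliberately simplified stand-in for the FEA and hybrid models the review covers; the 12 μm asymptote, 20 min time constant, and noise level are invented.

```python
import numpy as np

def fit_first_order_drift(t, drift, taus):
    """Fit drift(t) ~= d_inf * (1 - exp(-t/tau)) by scanning tau and solving
    for d_inf in closed form (1-D least squares) at each candidate tau."""
    best = (np.inf, None, None)
    for tau in taus:
        basis = 1.0 - np.exp(-t / tau)
        d_inf = np.dot(basis, drift) / np.dot(basis, basis)
        resid = np.sum((drift - d_inf * basis) ** 2)
        if resid < best[0]:
            best = (resid, tau, d_inf)
    return best[1], best[2]

# Synthetic spindle growth: 12 μm asymptote, 20 min time constant, noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 120.0, 61)                        # minutes
true_drift = 12.0 * (1.0 - np.exp(-t / 20.0))          # μm
measured = true_drift + rng.normal(0.0, 0.2, t.size)   # sensor noise

tau_hat, d_inf_hat = fit_first_order_drift(t, measured,
                                           np.linspace(5.0, 60.0, 111))
# Model-based compensation: subtract the predicted drift.
compensated = measured - d_inf_hat * (1.0 - np.exp(-t / tau_hat))
print(tau_hat, d_inf_hat, np.abs(compensated).max())
```

The residual after compensation drops to roughly the sensor-noise level, which is the basic promise of model-based compensation when the model form matches the physics.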
A tracking interferometer based on a bi-rotary milling head is a novel scheme for volumetric accuracy measurement of a five-axis machine tool. The laser beam direction of the interferometer can be regulated to follow the retroreflector by moving the bi-rotary head. This is a low-cost implementation of multilateration measurement, and its measurement accuracy is mainly affected by the error motion of the rotary axes. This paper proposes an improved multilateration principle to identify the position-independent geometric errors of the rotary axes and the laser beam, and to minimize their impact on the measurement uncertainty. A closed-loop tracking interferometer system installed on the spindle is developed to perform the measurement with high tracking accuracy. The device can be installed on an ordinary five-axis machine tool without modifying the machine tool structure. The proposed scheme is conducive to improving the accuracy and practical application of the tracking interferometer based on a bi-rotary milling head. Experiments with the corresponding closed-loop tracking interferometer and an uncertainty analysis are conducted to verify the performance of the proposed measurement scheme.
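The core of multilateration is recovering a point from its distances to several known base positions. Subtracting the first sphere equation from the others linearises the problem, so it reduces to a small least-squares solve. The base positions and target below are hypothetical, and real interferometer readings would carry measurement noise and a dead-path offset that this sketch omits.

```python
import numpy as np

def multilaterate(bases, dists):
    """Recover a 3-D point from its distances to known base positions.

    |p - b_i|^2 = d_i^2; subtracting the i = 0 equation from the rest gives
    the linear system  2 (b_i - b_0) . p = |b_i|^2 - |b_0|^2 - d_i^2 + d_0^2.
    """
    bases = np.asarray(bases, dtype=float)
    dists = np.asarray(dists, dtype=float)
    A = 2.0 * (bases[1:] - bases[0])
    b = (np.sum(bases[1:] ** 2, axis=1) - np.sum(bases[0] ** 2)
         - dists[1:] ** 2 + dists[0] ** 2)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Hypothetical base positions (mm) and a retroreflector point to recover.
bases = np.array([[0.0, 0.0, 0.0],
                  [800.0, 0.0, 0.0],
                  [0.0, 800.0, 0.0],
                  [0.0, 0.0, 600.0]])
target = np.array([250.0, 310.0, 120.0])
dists = np.linalg.norm(bases - target, axis=1)  # ideal distance readings

estimate = multilaterate(bases, dists)
print(estimate)
```

With more than four bases the same least-squares form over-determines the point, which is what lets redundant measurements suppress the rotary-axis error motions the paper is concerned with.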
Cyclohexene is an important raw material in the production of nylon.Selective hydrogenation of benzene is a key method for preparing cyclohexene.However,the Ru catalysts used in current industrial processes still face...Cyclohexene is an important raw material in the production of nylon.Selective hydrogenation of benzene is a key method for preparing cyclohexene.However,the Ru catalysts used in current industrial processes still face challenges,including high metal usage,high process costs,and low cyclohexene yield.This study utilizes existing literature data combined with machine learning methods to analyze the factors influencing benzene conversion,cyclohexene selectivity,and yield in the benzene hydrogenation to cyclohexene reaction.It constructs predictive models based on XGBoost and Random Forest algorithms.After analysis,it was found that reaction time,Ru content,and space velocity are key factors influencing cyclohexene yield,selectivity,and benzene conversion.Shapley Additive Explanations(SHAP)analysis and feature importance analysis further revealed the contribution of each variable to the reaction outcomes.Additionally,we randomly generated one million variable combinations using the Dirichlet distribution to attempt to predict high-yield catalyst formulations.This paper provides new insights into the application of machine learning in heterogeneous catalysis and offers some reference for further research.展开更多
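Feature importance of the kind the study reports can be illustrated with permutation importance: shuffle one feature at a time and measure how much the model's R² drops. The sketch below uses a toy dataset and a plain least-squares model as the predictor (the study itself uses XGBoost/Random Forest with SHAP); feature names and coefficients are invented for illustration.

```python
import numpy as np

def permutation_importance(model_predict, X, y, rng, n_repeats=10):
    """Mean drop in R^2 when each feature column is shuffled in turn."""
    def r2(y_true, y_pred):
        ss_res = np.sum((y_true - y_pred) ** 2)
        ss_tot = np.sum((y_true - y_true.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    base = r2(y, model_predict(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break feature j only
            drops.append(base - r2(y, model_predict(Xp)))
        importances[j] = np.mean(drops)
    return importances

# Toy stand-in for the catalysis data: "yield" depends strongly on
# "reaction time" (x0), weakly on "Ru content" (x1), not at all on x2.
rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(300, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 0.1, 300)

# Least-squares linear model as the predictor.
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
predict = lambda M: np.c_[M, np.ones(len(M))] @ coef

imp = permutation_importance(predict, X, y, rng)
print(imp)
```

The ranking recovered this way is model-agnostic, which is why permutation importance and SHAP values are commonly reported side by side.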
The uplift resistance of the soil overlying shield tunnels significantly impacts their anti-floating stability. However, research on the uplift resistance of special-shaped shield tunnels is limited. This study combines numerical simulation with machine learning techniques to explore this issue. It presents a summary of special-shaped tunnel geometries and introduces a shape coefficient. Using the finite element software Plaxis3D, the study simulates six key parameters that impact uplift resistance across different conditions: shape coefficient, burial depth ratio, the tunnel's longest horizontal length, internal friction angle, cohesion, and soil submerged bulk density. Employing XGBoost and ANN methods, the feature importance of each parameter was analyzed based on the numerical simulation results. The findings demonstrate that a tunnel shape more closely resembling a circle leads to reduced uplift resistance in the overlying soil, whereas the other parameters exhibit the opposite effect. Furthermore, the study reveals a diminishing trend in the feature importance of burial depth ratio, internal friction angle, tunnel longest horizontal length, cohesion, soil submerged bulk density, and shape coefficient in influencing uplift resistance.
The advantages of genomic selection (GS) in animal and plant breeding are self-evident. Traditional parametric models struggle to fit increasingly large sequencing datasets and to capture complex effects accurately. Machine learning models have demonstrated remarkable potential in addressing these challenges. In this study, we introduced the concept of mixed kernel functions to explore the performance of support vector machine regression (SVR) in GS. Six single kernel functions (SVR_L, SVR_C, SVR_G, SVR_P, SVR_S, SVR_L) and four mixed kernel functions (SVR_GS, SVR_GP, SVR_LS, SVR_LP) were used to predict genomic breeding values. Prediction accuracy, mean squared error (MSE), and mean absolute error (MAE) were used as evaluation indicators for comparison with two traditional parametric models (GBLUP, BayesB) and two popular machine learning models (RF, KcRR). The results indicate that in most cases the performance of the mixed kernel function models significantly outperforms that of GBLUP, BayesB, and the single kernel functions. For instance, for T1 in the pig dataset, the predictive accuracy of SVR_GS is improved by 10% compared with GBLUP, and by approximately 4.4% and 18.6% compared with SVR_G and SVR_S, respectively. For E1 in the wheat dataset, SVR_GS achieves 13.3% higher prediction accuracy than GBLUP. Among the single kernel functions, the Laplacian and Gaussian kernels yield similar results, with the Gaussian kernel performing better. The mixed kernel functions notably reduce the MSE and MAE compared with all single kernel functions. Furthermore, regarding runtime, the SVR_GS and SVR_GP mixed kernel functions run approximately three times faster than GBLUP on the pig dataset, with only a slight increase in runtime compared with the single kernel function models. In summary, the mixed kernel function models of SVR are competitive in speed and accuracy, and models such as SVR_GS have important application potential for GS.
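The mixed-kernel idea rests on the fact that a convex combination of valid kernels is itself a valid kernel. The sketch below builds a Gaussian + polynomial mixture and plugs it into kernel ridge regression (an SVR with the same kernel would build the identical Gram matrix); the "genomic" data, the mixing weight α = 0.6, and the hyperparameters are all invented for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.2):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def poly_kernel(A, B, degree=2, c=1.0):
    return (A @ B.T + c) ** degree

def mixed_kernel(A, B, alpha=0.6):
    """Convex combination of two valid kernels is itself a valid kernel."""
    return alpha * rbf_kernel(A, B) + (1.0 - alpha) * poly_kernel(A, B)

def fit_predict(X_train, y_train, X_test, lam=0.1, alpha=0.6):
    """Kernel ridge regression with the mixed kernel."""
    K = mixed_kernel(X_train, X_train, alpha)
    w = np.linalg.solve(K + lam * np.eye(len(K)), y_train)
    return mixed_kernel(X_test, X_train, alpha) @ w

# Toy "marker" data: the phenotype is a nonlinear function of two markers.
rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, size=(200, 5))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(0.0, 1.0, 200)

pred = fit_predict(X[:150], y[:150], X[150:])
corr = np.corrcoef(pred, y[150:])[0, 1]  # "prediction accuracy" as in GS
print(corr)
```

The polynomial component captures the global quadratic trend while the Gaussian component handles local nonlinearity, which is the intuition behind pairing a global and a local kernel in the mixed-kernel models.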
Accurate assessment of snowpack volumetric liquid water content and bulk density is essential for understanding snow hydrology, managing avalanche risk, and monitoring cryosphere changes. This study presents a novel dual-parameter inversion framework that integrates synthetic electromagnetic modelling, dimensionality reduction, and machine learning algorithms to extract relative permittivity and log-resistivity from ground-penetrating radar (GPR) data. Traditional snowpack measurements are invasive, labor-intensive, and limited to point observations. To overcome these limitations, we developed a non-invasive, scalable, data-driven framework that uses synthetic GPR datasets representing diverse snowpack conditions with variable moisture and density profiles. Synthetic 1D time-series reflections (A-scans) are generated using finite-difference time-domain simulations in the state-of-the-art electromagnetic simulator gprMax. Principal component analysis (PCA) is applied to compress each A-scan while preserving key features, which significantly improved model training efficiency. Four machine learning models, including random forest, neural network, support vector machine, and eXtreme gradient boosting, are trained on the PCA-reduced features. Among these, the neural network model achieved the best performance, with R^(2)>0.97 for permittivity and R^(2)>0.92 for resistivity. Gaussian noise (signal-to-noise ratio of 6 dB) is introduced into the synthetic data, and targeted domain adaptation is then employed to enhance generalization to field data. The framework is validated on two contrasting GPR transects in the Altay Mountains of the Chinese mainland, representing moist (T750) and wet (G125) snowpack conditions. The neural network model predictions are most consistent with the GPR-derived estimates, Snowfork measurements, and snow pit data, achieving a volumetric liquid water content deviation of ≤1.5% and a bulk density error within the range of 30-84 kg m^(-3). The results demonstrate that machine learning-based inversion, supported by realistic simulations and data augmentation, enables scalable, non-invasive snowpack characterization with significant applications in hydrological forecasting, snow monitoring, and water resource management.
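The PCA-then-regress pipeline can be sketched end to end: generate synthetic pulse-like "A-scans" whose arrival time encodes a target parameter, compress them with PCA via SVD, and regress the parameter from the compressed features. Everything here is a toy stand-in (Gaussian pulses instead of FDTD traces, a nearest-neighbour regressor instead of the study's neural network, invented constants), and for brevity the PCA is fitted on all scans rather than on the training split alone.

```python
import numpy as np

def pca_fit_transform(X, n_components):
    """Project rows of X onto the top principal components (via SVD)."""
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return (X - mean) @ Vt[:n_components].T

# Toy A-scans: a reflection whose delay encodes the "permittivity" target.
rng = np.random.default_rng(4)
n, length = 400, 256
t = np.arange(length)
perm = rng.uniform(1.5, 6.0, n)                  # target parameter
delay = 40.0 + 15.0 * np.sqrt(perm)              # travel time grows as sqrt(eps)
X = np.exp(-0.5 * ((t[None, :] - delay[:, None]) / 4.0) ** 2)
X += rng.normal(0.0, 0.02, X.shape)              # sensor noise

Z = pca_fit_transform(X, n_components=10)        # 256 samples -> 10 features

# Nearest-neighbour regression on the compressed features.
Z_tr, Z_te = Z[:300], Z[300:]
d2 = ((Z_te[:, None, :] - Z_tr[None, :, :]) ** 2).sum(axis=-1)
pred = perm[:300][np.argmin(d2, axis=1)]
r2 = 1.0 - (np.sum((perm[300:] - pred) ** 2)
            / np.sum((perm[300:] - perm[300:].mean()) ** 2))
print(r2)
```

The point of the compression step is visible in the shapes: each 256-sample scan becomes a 10-dimensional feature vector, which is what makes training the downstream models cheap.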
In the pharmaceutical field, machine learning can play an important role in drug development, production, and treatment. Co-crystallization techniques have shown promising potential to enhance the properties of active pharmaceutical ingredients (APIs), such as solubility, permeability, and bioavailability, without altering their chemical structure. This approach opens new avenues for developing natural products into effective drugs, especially those previously challenging to formulate. Emodin, an anthraquinone-based natural product, is a notable example due to its diverse biological activities; however, physicochemical limitations such as poor solubility and easy sublimation restrict its clinical application. While various methods have improved emodin's physicochemical properties, research on its bioavailability remains limited. In our study, we summarize cocrystals and salts produced through co-crystallization technology and identify piperazine as a favorable coformer. Conflicting conclusions from computational chemistry and molecular modeling methods on the one hand, and machine learning methods on the other, regarding the formation of an emodin-piperazine cocrystal or salt led us to validate these possibilities experimentally. Ultimately, we successfully obtained the emodin-piperazine cocrystal, which was characterized and evaluated by several in vitro methods and pharmacokinetic studies. In addition, because experiments have shown that emodin has a therapeutic effect on sepsis, we also evaluated the biological activity of emodin-piperazine in a sepsis model. The results demonstrate that co-crystallization significantly enhances emodin's solubility, permeability, and bioavailability. Pharmacodynamic studies indicate that the emodin-piperazine cocrystal improves sepsis symptoms and provides protective effects against the liver and kidney damage associated with sepsis. By advancing co-crystallization as a viable development approach, this study offers renewed hope for natural products with broad biological activities that are hindered by physicochemical limitations.
Nasopharyngeal carcinoma (NPC) is a malignant tumor prevalent in southern China and Southeast Asia, where its early detection is crucial for improving patient prognosis and reducing mortality rates. However, existing screening methods suffer from limitations in accuracy and accessibility, hindering their application in large-scale population screening. In this work, a surface-enhanced Raman spectroscopy (SERS)-based method was established to explore the profiles of different stratified components in saliva from NPC patients and healthy subjects after fractionation processing. The findings indicate that all fractionated samples exhibit disease-associated molecular differences, with the small-molecule fraction (molecular weight cut-off of 10 kDa) demonstrating superior classification capability: a sensitivity of 90.5%, a specificity of 75.6%, and an area under the receiver operating characteristic (ROC) curve of 0.925±0.031. The primary objective of this study was to qualitatively explore patterns in saliva composition across groups. The proposed SERS detection strategy for fractionated saliva offers novel insights for enhancing the sensitivity and reliability of noninvasive NPC screening, laying the foundation for translational application in large-scale clinical settings.
Accurate land surface temperature (LST) assessment is crucial for comprehending and reducing the impacts of climate change and understanding land use evolution. This study presents an innovative method that utilizes ensemble models, advanced correlation analysis, and trend analysis to investigate the environmental influences on LST. Google Earth Engine (GEE) was used to process datasets from Landsat-7 and Landsat-8 for five major cities of Punjab, Pakistan, from 2001 to 2023. Results show significant urban warming trends, and a strong correlation between environmental variables and LST was identified. Three ensemble-based machine learning models, XGBoost, AdaBoost, and random forest (RF), were adopted to improve the accuracy of LST evaluation. While XGBoost and AdaBoost attained modest levels of accuracy, with R^(2) values of 0.767 and 0.706, respectively, the RF model outperformed them, achieving an R^(2) of 0.796 and an RMSE of 0.476. Moreover, Pearson correlation analysis revealed a negative relationship between LST and the normalized difference latent heat index (NDLI) with r=-0.67, the normalized difference vegetation index (NDVI) with r=-0.6, and the modified normalized difference water index (MNDWI) with r=-0.57. In addition, wavelet analysis showed that vegetation and water offer long-term LST cooling, lasting up to 64 months, while built-up areas and bare soil contribute to short-term warming, lasting 4 to 8 months. Latent heat indicated variable cooling periods, surpassing 60 months in cities. These findings enhance the understanding of LST changes and the impact of climate change on the environment.
As urbanization continues to accelerate, the challenges associated with managing transportation in metropolitan areas become increasingly complex. The surge in population density contributes to traffic congestion, impacting travel experiences and posing safety risks. Smart urban transportation management emerges as a strategic solution, conceptualized here as a multidimensional big data problem. The success of this strategy hinges on the effective collection of information from diverse, extensive, and heterogeneous data sources, necessitating the implementation of full-stack Information and Communication Technology (ICT) solutions. The main idea of this work is to investigate the current technologies of Intelligent Transportation Systems (ITS) and to enhance the safety of urban transportation systems. Machine learning models, trained on historical data, can predict traffic congestion, allowing for the implementation of preventive measures. Deep learning architectures, with their ability to handle complex data representations, further refine traffic predictions, contributing to more accurate and dynamic transportation management. The background of this research underscores the challenges posed by traffic congestion in metropolitan areas and emphasizes the need for advanced technological solutions. By integrating GPS and GIS technologies with machine learning algorithms, this work focuses on the development of intelligent transportation systems that not only address current challenges but also pave the way for future advancements in urban transportation management.
Replicating the chaotic characteristics inherent in nonlinear dynamical systems via machine learning (ML) is a key challenge in this rapidly advancing interdisciplinary field. In this work, we explore the potential of variational quantum circuits (VQC) for learning the stochastic properties of classical nonlinear dynamical systems. Specifically, we focus on the one- and two-dimensional logistic maps, which, while simple, remain under-explored in the context of learning dynamical characteristics. Our findings reveal that, even for such simple dynamical systems, accurately replicating long-term characteristics is hindered by a pronounced sensitivity to overfitting. While increasing the parameter complexity of the ML model typically enhances short-term prediction accuracy, it also degrades the model's ability to replicate long-term characteristics, primarily due to the detrimental effects of overfitting on generalization power. By comparing the VQC with two widely recognized classical ML techniques, long short-term memory (LSTM) networks for time-series processing and reservoir computing, we demonstrate that the VQC outperforms these methods at replicating long-term characteristics. Our results suggest that the ML of dynamics demands more compact and efficient models (such as VQC) rather than more complicated, large-scale ones.
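One of the classical baselines mentioned, reservoir computing, is compact enough to sketch: a fixed random recurrent network provides nonlinear features of the input history, and only a linear readout is trained. Below it learns one-step prediction of the chaotic logistic map; the reservoir size, spectral radius, and r = 3.9 are conventional illustrative choices, not the paper's settings, and no claim about the VQC comparison is made.

```python
import numpy as np

def logistic_map(r, x0, n):
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

# Minimal echo state network: fixed random reservoir, ridge-trained readout.
rng = np.random.default_rng(5)
N = 100                                            # reservoir size
W_in = rng.uniform(-0.5, 0.5, (N, 1))              # input weights (fixed)
W = rng.normal(0.0, 1.0, (N, N))                   # recurrent weights (fixed)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1

series = logistic_map(r=3.9, x0=0.4, n=1200)       # chaotic regime
states = np.zeros((len(series), N))
s = np.zeros(N)
for i, u in enumerate(series):
    s = np.tanh(W_in[:, 0] * u + W @ s)            # drive the reservoir
    states[i] = s

# Train a linear readout to predict x_{t+1} from the state at time t
# (the first 100 steps are discarded as washout).
X_tr, y_tr = states[100:1000], series[101:1001]
X_te, y_te = states[1000:1199], series[1001:1200]
W_out = np.linalg.solve(X_tr.T @ X_tr + 1e-6 * np.eye(N), X_tr.T @ y_tr)
pred = X_te @ W_out
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
print(rmse)
```

Low one-step error like this is exactly the "short-term accuracy" regime the paper distinguishes from faithful long-term statistics, which require closing the loop and comparing invariant measures rather than pointwise errors.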
The rapid advancement of machine learning-based tight-binding Hamiltonian (MLTB) methods has opened new avenues for efficient and accurate electronic structure simulations, particularly in large-scale systems and long-time scenarios. This review begins with a concise overview of traditional tight-binding (TB) models, including both (semi-)empirical and first-principles approaches, establishing the foundation for understanding MLTB developments. We then present a systematic classification of existing MLTB methodologies, grouped into two major categories: direct prediction of TB Hamiltonian elements and inference of empirical parameters. A comparative analysis with other ML-based electronic structure models is also provided, highlighting the advancement of MLTB approaches. Finally, we explore the emerging MLTB application ecosystem, highlighting how the integration of MLTB models with a diverse suite of post-processing tools, from linear-scaling solvers to quantum transport frameworks and molecular dynamics interfaces, is essential for tackling complex scientific problems across different domains. The continued advancement of this integrated paradigm promises to accelerate materials discovery and open new frontiers in the predictive simulation of complex quantum phenomena.
The viscosity of refining slags plays a critical role in metallurgical processes. However, obtaining accurate viscosity data remains challenging due to the complexity of high-temperature experiments, which often forces reliance on empirical models with limited predictive capability. This study focuses on the influence of optical basicity on the viscosity of CaO-Al_(2)O_(3)-based refining slags, leveraging machine learning to address data scarcity and improve prediction accuracy. An automated framework for algorithm integration, parameter tuning, and evaluation ranking (Auto-APE) is employed to develop customized data-driven models for various slag systems, including CaO-Al_(2)O_(3)-SiO_(2), CaO-Al_(2)O_(3)-CaF_(2), CaO-Al_(2)O_(3)-SiO_(2)-MgO, and CaO-Al_(2)O_(3)-SiO_(2)-MgO-CaF_(2). By incorporating optical basicity as a key feature, the models achieve an average validation error of 8.0% to 15.1%, significantly outperforming traditional empirical models. Additionally, symbolic regression is introduced to rapidly construct domain-specific features, such as optical basicity-like descriptors, offering a potential breakthrough in performance prediction for small datasets. This work highlights the critical role of domain-specific knowledge in understanding and predicting viscosity, providing a robust machine learning-based approach for optimizing refining slag properties.
Lithium-ion batteries (LIBs) are widely deployed, from grid-scale storage to electric vehicles. LIBs remain stationary for most of their service life, during which calendar aging degrades capacity. Understanding the mechanisms of LIB calendar aging is crucial for extending battery lifespan. However, LIB calendar aging is influenced by multiple factors, including battery material, battery state, and storage environment, and calendar aging experiments are time-consuming, costly, and lack standardized testing conditions. This study employs a data-driven approach to establish a cross-scale database linking materials, side-reaction mechanisms, and calendar aging of LIBs. MELODI (Mechanism-informed, Explainable, Learning-based Optimization for Degradation Identification) is proposed to identify calendar aging mechanisms and quantify the effects of multi-scale factors. Results reveal that cathode material loss drives up to 91.42% of calendar aging degradation in high-nickel (Ni) batteries, while solid electrolyte interphase growth dominates in lithium iron phosphate (LFP) and low-Ni batteries, contributing up to 82.43% of degradation in LFP batteries and 99.10% of decay in low-Ni batteries. This study systematically quantifies calendar aging in commercial LIBs under varying materials, states of charge, and temperatures. These findings offer quantitative guidance for experimental design and battery use, with implications for emerging applications such as aerial robotics, vehicle-to-grid, and embodied intelligence systems.
Funding: Supported by the Key R&D Program of Zhejiang Province (Nos. 2023C01166 and 2024SJCZX0046), the Zhejiang Provincial Natural Science Foundation of China (Nos. LDT23E05013E05 and LD24E050009), and the Natural Science Foundation of Ningbo (No. 2021J150), China.
Abstract: Accuracy allocation is crucial in the accuracy design of machine tools. Current accuracy allocation methods primarily focus on positional deviation, with little consideration for tool direction deviation. To address this issue, we propose a geometric error cost sensitivity-based accuracy allocation method for five-axis machine tools. A geometric error model consisting of 41 error components is constructed based on homogeneous transformation matrices. Volumetric points with positional and tool direction deviations are randomly sampled to evaluate the accuracy of the machine tool. The sensitivity of each error component at these sampling points is analyzed using the Sobol method. To balance the needs of geometric precision and manufacturing cost, a geometric error cost sensitivity function is developed to estimate the required cost. By allocating the error components affecting tool direction deviation first and the remaining components second, the allocation scheme ensures that both deviations meet the requirements. We also perform a numerical simulation of a BC-type (B-axis and C-axis type) five-axis machine tool to validate the method. The results show that the new allocation scheme reduces the total geometric error cost by 27.8% compared to a uniform allocation scheme, while yielding the same positional and tool direction machining accuracies.
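The Sobol analysis described above can be sketched in miniature. The snippet below estimates first-order Sobol indices with the classic pick-freeze Monte Carlo estimator for a toy linear stand-in for the geometric error model; the three weights and the model itself are hypothetical, not the paper's 41-component model, and the analytic indices for a linear model serve as a check.

```python
import random

def sobol_first_order(model, dim, n=100_000, seed=0):
    """First-order Sobol indices via the pick-freeze estimator:
    S_i = Cov(Y_A, Y_Ci) / Var(Y), where C_i takes column i from A
    and all other columns from an independent sample B."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        yC = [model(B[k][:i] + [A[k][i]] + B[k][i + 1:]) for k in range(n)]
        cov = sum(yA[k] * yC[k] for k in range(n)) / n - mean ** 2
        indices.append(cov / var)
    return indices

# Toy "error model": tool deviation as a weighted sum of three error components.
weights = [3.0, 2.0, 1.0]
deviation = lambda x: sum(w * xi for w, xi in zip(weights, x))
S = sobol_first_order(deviation, dim=3)
```

For a linear model with independent uniform inputs the exact indices are w_i^2 / sum(w_j^2), i.e. 9/14, 4/14, and 1/14 here; real usage would replace `deviation` with the volumetric error model evaluated at the sampled points.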
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51305244) and the Shandong Provincial Natural Science Foundation of China (Grant No. ZR2013EEL015).
Abstract: Aiming at the problems of low machining accuracy and uncontrollable thermal errors in NC machine tools, spindle thermal error measurement, modeling, and compensation of a two-turntable five-axis machine tool are investigated. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced for the selection of temperature variables used in thermal error modeling. To analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; the resulting ABC-NN (artificial bee colony-based neural network) modeling method is used to predict spindle thermal errors. To test the prediction performance of the ABC-NN model, an experimental system is developed, and the predictions of LSR (least squares regression), ANN, and ABC-NN are compared with measured spindle thermal errors. The results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, with residual errors smaller than 3 μm, confirming that the new modeling method is feasible. This research provides guidance for compensating thermal errors and improving the machining accuracy of NC machine tools.
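The grey relational analysis step used for temperature-variable selection reduces to a short calculation: normalize each sequence, take pointwise deviations from the reference (thermal error) sequence, and average Deng's relational coefficients. A minimal sketch with hypothetical sensor readings, using the customary distinguishing coefficient rho = 0.5:

```python
def grey_relational_grades(reference, candidates, rho=0.5):
    """Deng's grey relational grade of each candidate sequence vs. the reference."""
    def norm(seq):  # min-max normalisation to [0, 1]
        lo, hi = min(seq), max(seq)
        return [(v - lo) / (hi - lo) for v in seq]
    ref = norm(reference)
    deltas = [[abs(r - c) for r, c in zip(ref, norm(cand))] for cand in candidates]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Hypothetical data: spindle thermal error vs. three temperature sensors.
error = [0.0, 2.1, 4.0, 5.8, 7.1, 8.0]
sensors = [
    [20.0, 22.1, 24.2, 26.0, 27.5, 28.4],  # tracks the error closely
    [20.0, 20.2, 21.0, 23.5, 27.0, 29.0],  # lags behind
    [25.0, 24.0, 26.5, 23.8, 25.5, 24.2],  # ambient, weakly related
]
grades = grey_relational_grades(error, sensors)
best = max(range(len(sensors)), key=lambda i: grades[i])
```

Sensors with the highest grades would be kept as inputs to the thermal error model; the threshold is a modeling choice.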
Funding: Co-supported by the Natural Science Foundation of Beijing (No. 3214043), the Project of the State Key Lab of Tribology of Tsinghua University (No. SKLT2021D16), and the National Natural Science Foundation of China (No. 51975319).
Abstract: To satisfy the machining requirements of aero-engine casings in the modern aviation industry, this paper investigates three main issues in the design and development of a five-axis machine tool with high accuracy, stiffness, and efficiency: whole-structure design, key component design, and supporting stiffness design. First, an appropriate structure for the five-axis machine tool is determined considering the processing characteristics of aero-engine casings. Then, a dual-drive swing head and a compact motorized spindle are designed with sufficient drive capability and stiffness, and the related structure, assembly method, cooling technology, and performance simulation are described in detail. Next, a design method for the supporting stiffness of the guide is proposed based on deformation prediction at the spindle end. On this basis, a prototype machine tool is developed and experiments are carried out, including performance tests of the swing head and motorized spindle and machining of a simulated aero-engine casing workpiece. All experimental results show that the machine tool has satisfactory accuracy, stiffness, and efficiency, meeting the machining requirements of aero-engine casings. This work can serve as a practical reference for engineers and technicians.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51175461), the Science Fund for Creative Research Groups of the National Natural Science Foundation of China (Grant No. 51221004), and the Program for Zhejiang Leading Team of S&T Innovation of China (Grant No. 2009R50008).
Abstract: Compared with traditional non-cutting measurement, machining tests more accurately reflect the kinematic errors of five-axis machine tools in the actual machining process. However, the measurements and calculations of the machining tests reported in the literature are difficult and time-consuming. A new machining-test method for the trunnion axis of a five-axis machine tool is proposed. First, a simple mathematical model of the cradle-type five-axis machine tool was established by optimizing the coordinate system settings based on robot kinematics. Then, machining tests based on error-sensitive directions were proposed to identify the kinematic errors of the trunnion axis. By adopting error-sensitive vectors in the matrix calculation, functional relationships between the machining errors of the test piece in the error-sensitive directions and the kinematic errors of the C-axis and A-axis of the rotary table were established based on the kinematic error model. According to our previous work, the kinematic errors of the C-axis can be treated as known quantities, and the kinematic errors of the A-axis can then be obtained from the equations. The method was tested on a Mikron UCP600 vertical machining center: the machining errors in the error-sensitive directions were obtained by CMM inspection of the finished test piece to identify the kinematic errors of the trunnion axis. Experimental results demonstrate that the proposed method substantially reduces complexity, cost, and time, and has wider applicability.
Abstract: Material removal is one of the most widely used processes in manufacturing, and five-axis CNC machines are regarded as the best tools for sculptured surface machining. In this study, a generic and unified kinematic model was developed as a viable alternative to particular solutions that are only applicable to individual machine configurations. This versatile model is used to verify the feasibility of the two rotational joints within the kinematic chain of the three main types of five-axis machine tools, and is also useful in the design of five-axis machine tools.
Abstract: This paper discusses the definition of NURBS curves and a speed-controlled interpolation in which the feed rate is automatically adjusted to meet a specified chord error limit. It also proposes a definition of the linear interpolation error of post-processed data, which deserves attention because this error not only reduces surface quality but may also cause interference and other unexpected trouble. To control the error, a robust algorithm is proposed that meets a desired error limit by interpolating essential CL data. Simulation results demonstrate the reliability and self-adaptiveness of the proposed algorithm.
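The speed-controlled interpolation idea, feed rate limited by a chord error bound, follows from circle geometry: a chord of length v*T on an arc of radius rho deviates by delta = rho - sqrt(rho^2 - (v*T/2)^2), so v_max = (2/T) * sqrt(2*rho*delta - delta^2). A sketch with hypothetical numbers (1 ms interpolation period, 1 µm chord tolerance), where the local radius stands in for the NURBS curvature radius:

```python
import math

def chord_error_feed_limit(radius, delta_max, T):
    """Max feed rate (mm/s) so that the chord traversed in one interpolation
    period T stays within delta_max of an arc of the given radius (mm)."""
    return (2.0 / T) * math.sqrt(2.0 * radius * delta_max - delta_max ** 2)

def adjusted_feed(v_cmd, radius, delta_max, T):
    """Speed-controlled interpolation: clamp the commanded feed if needed."""
    return min(v_cmd, chord_error_feed_limit(radius, delta_max, T))

T, delta = 0.001, 0.001  # period in s, chord tolerance in mm (= 1 µm)
v_tight = adjusted_feed(100.0, radius=0.5, delta_max=delta, T=T)    # tight curve
v_flat = adjusted_feed(100.0, radius=500.0, delta_max=delta, T=T)   # near-straight
```

On the tight curve the commanded 100 mm/s is clamped to about 63 mm/s, while on the near-straight span it passes through unchanged.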
Abstract: Thermal errors in CNC machine tools, particularly those involving the spindle, significantly affect machining accuracy and performance. These errors, caused by temperature fluctuations in the spindle and surrounding components, result in dimensional deviations that can lead to poor part quality and reduced precision in high-speed manufacturing processes. This paper explores thermal error modeling and compensation methods for the spindle of five-axis CNC machine tools. A detailed analysis of heat generation, heat transfer mechanisms, and finite element analysis (FEA) is presented to develop accurate thermal error models. Compensation techniques, such as model-based methods, sensor-based methods, real-time compensation algorithms, and hybrid approaches, are critically reviewed. This study also discusses the challenges of real-time compensation and the integration of thermal error compensation with machine tool control systems. The objective is to provide a comprehensive understanding of thermal error phenomena and their compensation strategies, ultimately contributing to enhanced machining accuracy in advanced manufacturing applications.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51875357), the State Key Program of the National Natural Science Foundation of China (Grant No. U21B2081), and the National Defense Science and Technology Excellence Youth Foundation (Grant No. 2020-JCJQ-ZQ-079).
Abstract: A tracking interferometer based on a bi-rotary milling head is a novel scheme for volumetric accuracy measurement of five-axis machine tools. The laser beam direction of the interferometer can be regulated to follow the retroreflector by moving the bi-rotary head. This is a low-cost implementation of multilateration measurement, and its accuracy is mainly affected by the error motion of the rotary axes. This paper proposes an improved multilateration principle to identify the position-independent geometric errors of the rotary axes and the laser beam, and to minimize their impact on the measurement uncertainty. A closed-loop tracking interferometer system installed on the spindle is developed to perform the measurement with high tracking accuracy. The device can be installed on an ordinary five-axis machine tool without modifying the machine tool structure. The proposed scheme improves the accuracy and practical applicability of tracking interferometers based on bi-rotary milling heads. Experiments with the closed-loop tracking interferometer and an uncertainty analysis are conducted to verify the performance of the proposed measurement scheme.
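The multilateration principle behind such tracking interferometers can be sketched as a least-squares problem: with known base positions and measured ranges, subtracting one range equation from the others linearizes the system in the unknown point. The station layout and tool point below are hypothetical, and the real scheme additionally identifies the rotary-axis error parameters.

```python
import numpy as np

def multilaterate(stations, dists):
    """Least-squares position from distances to >= 4 known stations,
    linearised by subtracting the first range equation from the others:
    2*(p0 - pi) . x = di^2 - d0^2 - |pi|^2 + |p0|^2."""
    p0, d0 = stations[0], dists[0]
    A = 2.0 * (p0 - stations[1:])
    b = (dists[1:] ** 2 - d0 ** 2
         - np.sum(stations[1:] ** 2, axis=1) + np.sum(p0 ** 2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Hypothetical base positions (mm) and a known tool point to recover.
stations = np.array([[0., 0., 0.], [800., 0., 0.],
                     [0., 800., 0.], [0., 0., 600.]])
tool = np.array([123.0, 456.0, 78.0])
dists = np.linalg.norm(stations - tool, axis=1)
est = multilaterate(stations, dists)
```

With exact ranges the recovered point matches the true one; with noisy ranges the same least-squares form yields the best linearised estimate.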
Funding: Supported by the CAS Basic and Interdisciplinary Frontier Scientific Research Pilot Project (XDB1190300, XDB1190302), the Youth Innovation Promotion Association CAS (Y2021056), the Joint Fund of Yulin University and the Dalian National Laboratory for Clean Energy (YLU-DNL Fund 2022007), and the Special Fund for Science and Technology Innovation Teams of Shanxi Province (202304051001007).
Abstract: Cyclohexene is an important raw material in the production of nylon, and selective hydrogenation of benzene is a key method for preparing it. However, the Ru catalysts used in current industrial processes still face challenges, including high metal usage, high process costs, and low cyclohexene yield. This study combines existing literature data with machine learning methods to analyze the factors influencing benzene conversion, cyclohexene selectivity, and yield in the hydrogenation of benzene to cyclohexene, and constructs predictive models based on the XGBoost and Random Forest algorithms. The analysis shows that reaction time, Ru content, and space velocity are the key factors influencing cyclohexene yield, selectivity, and benzene conversion. Shapley Additive Explanations (SHAP) analysis and feature importance analysis further revealed the contribution of each variable to the reaction outcomes. Additionally, one million variable combinations were randomly generated using the Dirichlet distribution in an attempt to predict high-yield catalyst formulations. This paper provides new insights into the application of machine learning in heterogeneous catalysis and offers a reference for further research.
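Feature importance of the kind reported here (SHAP values, tree-based importances) can be approximated model-agnostically by permutation importance: shuffle one input column and record how much the prediction error grows. The sketch below uses a known synthetic response in place of a trained XGBoost model, so the "model" is the ground-truth function itself; the variable roles are invented for illustration.

```python
import random

def permutation_importance(model, X, y, n_repeats=30, seed=0):
    """Mean increase in MSE when one feature column is shuffled;
    a model-agnostic stand-in for tree-based feature importances."""
    rng = random.Random(seed)
    def mse(rows):
        return sum((model(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)
    base = mse(X)
    importances = []
    for j in range(len(X[0])):
        worsened = 0.0
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)
            worsened += mse([r[:j] + [c] + r[j + 1:] for r, c in zip(X, col)]) - base
        importances.append(worsened / n_repeats)
    return importances

# Synthetic response: "yield" depends strongly on x0, weakly on x1, not on x2.
rng = random.Random(1)
X = [[rng.random(), rng.random(), rng.random()] for _ in range(400)]
truth = lambda r: 5.0 * r[0] + 1.0 * r[1]
y = [truth(r) for r in X]
imp = permutation_importance(truth, X, y)
```

The unused feature gets zero importance exactly, and the strongly weighted one dominates, mirroring how reaction time or Ru content would surface in the paper's ranking.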
Funding: Supported by the Guangzhou Metro Scientific Research Project (No. JT204-100111-23001), the Chongqing Municipal Special Project for Technological Innovation and Application Development (No. CSTB2022TIAD-KPX0101), and the Science and Technology Research and Development Program of China State Railway Group Co., Ltd. (No. N2023G045).
Abstract: The uplift resistance of the soil overlying shield tunnels significantly impacts their anti-floating stability, yet research on uplift resistance for special-shaped shield tunnels is limited. This study combines numerical simulation with machine learning techniques to explore the issue. It summarizes special-shaped tunnel geometries and introduces a shape coefficient. Using the finite element software Plaxis3D, the study simulates six key parameters that impact uplift resistance across different conditions: shape coefficient, burial depth ratio, the tunnel's longest horizontal length, internal friction angle, cohesion, and soil submerged bulk density. Employing XGBoost and ANN methods, the feature importance of each parameter was analyzed based on the numerical simulation results. The findings demonstrate that a tunnel shape more closely resembling a circle leads to reduced uplift resistance in the overlying soil, whereas the other parameters have the opposite effect. Furthermore, the study reveals a diminishing trend in the feature importance of burial depth ratio, internal friction angle, tunnel longest horizontal length, cohesion, soil submerged bulk density, and shape coefficient in influencing uplift resistance.
Funding: Supported by the China Agriculture Research System of MOF and MARA, the National Natural Science Foundation of China (31872337 and 31501919), and the Agricultural Science and Technology Innovation Project, China (ASTIP-IAS02).
Abstract: The advantages of genomic selection (GS) in animal and plant breeding are self-evident. Traditional parametric models struggle to fit increasingly large sequencing datasets and to capture complex effects accurately, whereas machine learning models have demonstrated remarkable potential in addressing these challenges. In this study, we introduced mixed kernel functions to explore the performance of support vector machine regression (SVR) in GS. Six single kernel functions (SVR_L, SVR_C, SVR_G, SVR_P, SVR_S, SVR_L) and four mixed kernel functions (SVR_GS, SVR_GP, SVR_LS, SVR_LP) were used to predict genomic breeding values. Prediction accuracy, mean squared error (MSE), and mean absolute error (MAE) were used as evaluation indicators for comparison with two traditional parametric models (GBLUP, BayesB) and two popular machine learning models (RF, KcRR). The results indicate that in most cases the mixed kernel function models significantly outperform GBLUP, BayesB, and the single kernel functions. For instance, for T1 in the pig dataset, the predictive accuracy of SVR_GS is 10% higher than that of GBLUP, and approximately 4.4% and 18.6% higher than those of SVR_G and SVR_S, respectively. For E1 in the wheat dataset, SVR_GS achieves 13.3% higher prediction accuracy than GBLUP. Among the single kernel functions, the Laplacian and Gaussian kernels yield similar results, with the Gaussian kernel performing better. The mixed kernel functions notably reduce the MSE and MAE compared with all single kernel functions. Furthermore, regarding runtime, the SVR_GS and SVR_GP mixed kernel functions run approximately three times faster than GBLUP on the pig dataset, with only a slight increase over the single kernel function models. In summary, the mixed kernel function SVR models are competitive in both speed and accuracy, and models such as SVR_GS have important application potential for GS.
Funding: Supported by the National Key R&D Program of China (Grant Nos. 2023YFC3008300 & 2023YFC3008305), the National Natural Science Foundation of China (Grant No. 42172320), the Key Laboratory of Mountain Hazards and Engineering Resilience, Institute of Mountain Hazards and Environment, Chinese Academy of Sciences (Grant Nos. KLMHER-Z06 & KLMHER-T07), and the Science and Technology Research Program of the Institute of Mountain Hazards and Environment, Chinese Academy of Sciences (Grant No. IMHE-CXTD.04).
Abstract: Accurate assessment of snowpack volumetric liquid water content and bulk density is essential for understanding snow hydrology, managing avalanche risk, and monitoring cryosphere changes. This study presents a novel dual-parameter inversion framework that integrates synthetic electromagnetic modelling, dimensionality reduction, and machine learning algorithms to extract relative permittivity and log-resistivity from ground-penetrating radar (GPR) data. Traditional snowpack measurements are invasive, labor-intensive, and limited to point observations. To overcome these limitations, we developed a non-invasive, scalable, data-driven framework that uses synthetic GPR datasets representing diverse snowpack conditions with variable moisture and density profiles. Synthetic 1D time-series reflections (A-scans) are generated using finite-difference time-domain simulations in the state-of-the-art electromagnetic simulator gprMax. Principal component analysis (PCA) is applied to compress each A-scan while preserving key features, which significantly enhanced model training efficiency. Four machine learning models, including random forest, neural network, support vector machine, and eXtreme gradient boosting, are trained on the PCA-reduced features. Among these, the neural network achieved the best performance, with R^(2)>0.97 for permittivity and R^(2)>0.92 for resistivity. Gaussian noise (signal-to-noise ratio of 6 dB) is introduced to the synthetic data, and targeted domain adaptation is employed to enhance generalization to field data. The framework is validated on two contrasting GPR transects in the Altay Mountains of the Chinese mainland, representing moist (T750) and wet (G125) snowpack conditions. The neural network predictions are most consistent with the GPR-derived estimates, Snowfork measurements, and snow pit data, achieving volumetric liquid water content deviations of ≤1.5% and bulk density errors within the range of 30-84 kg m^(-3). The results demonstrate that machine learning-based inversion, supported by realistic simulations and data augmentation, enables scalable, non-invasive snowpack characterization with significant applications in hydrological forecasting, snow monitoring, and water resource management.
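The PCA compression step can be sketched with synthetic traces: damped, delayed sinusoids standing in for gprMax A-scans, compressed by SVD-based PCA. The waveform parameters and dataset size below are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "A-scans": causal damped sinusoids with varying arrival delay,
# a crude stand-in for FDTD-simulated GPR traces, plus a little noise.
t = np.linspace(0, 1, 256)
delays = rng.uniform(0.2, 0.6, size=300)
scans = np.array([np.exp(-8 * np.clip(t - d, 0, None))
                  * np.sin(40 * (t - d)) * (t >= d) for d in delays])
scans += 0.01 * rng.standard_normal(scans.shape)

# PCA by SVD of the mean-centred data matrix.
mean = scans.mean(axis=0)
U, s, Vt = np.linalg.svd(scans - mean, full_matrices=False)
k = 20
features = (scans - mean) @ Vt[:k].T      # compressed features, shape (300, 20)
recon = features @ Vt[:k] + mean          # back-projection for inspection
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
```

The 20-component features, rather than the raw 256-sample traces, would then feed the downstream regressors, which is what makes the training tractable.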
Funding: Funded by the National Natural Science Foundation of China (No. 22278443), the CAMS Innovation Fund for Medical Sciences (No. 2022-I2M-1-015), the Key R&D Program of Shandong Province (No. 2021ZDSYS26), the Xinjiang Uygur Autonomous Region Innovation Environment Construction Special Fund and Technology Innovation Base Construction Key Laboratory Open Project (No. 2023D04065), the 2023 Xinjiang Uygur Autonomous Region Innovation Tianchi Talent Introduction Program, and the Key Project of Natural Science of Bengbu Medical University (No. 2024byzd138).
Abstract: In the pharmaceutical field, machine learning can play an important role in drug development, production, and treatment. Co-crystallization techniques have shown promising potential to enhance the properties of active pharmaceutical ingredients (APIs), such as solubility, permeability, and bioavailability, without altering their chemical structure. This approach opens new avenues for developing natural products into effective drugs, especially those previously difficult to formulate. Emodin, an anthraquinone-based natural product, is a notable example: despite its diverse biological activities, physicochemical limitations such as poor solubility and easy sublimation have restricted its clinical application. While various methods have improved emodin's physicochemical properties, research on its bioavailability remains limited. In this study, we summarize cocrystals and salts produced through co-crystallization technology and identify piperazine as a favorable coformer. Conflicting conclusions from computational chemistry and molecular modeling on the one hand, and machine learning on the other, regarding the formation of an emodin-piperazine cocrystal or salt led us to validate these possibilities experimentally. Ultimately, we obtained the emodin-piperazine cocrystal, which was characterized and evaluated by several in vitro methods and pharmacokinetic studies. Because experiments have shown that emodin has a therapeutic effect on sepsis, we also evaluated the biological activity of emodin-piperazine in a sepsis model. The results demonstrate that co-crystallization significantly enhances emodin's solubility, permeability, and bioavailability. Pharmacodynamic studies indicate that the emodin-piperazine cocrystal improves sepsis symptoms and protects against the liver and kidney damage associated with sepsis. By advancing co-crystallization as a viable development approach, this study offers renewed hope for natural products with broad biological activities that are hindered by physicochemical limitations.
Funding: Financially supported by the National Natural Science Foundation of China (No. 12374405), the Provincial Science Foundation for Distinguished Young Scholars of Fujian (No. 2024J010024), the Natural Science Foundation of Fujian Province of China (No. 2023J011267), and the Major Research Projects for Young and Middle-aged Researchers of the Fujian Provincial Health Commission (No. 2021ZQNZD010).
Abstract: Nasopharyngeal carcinoma (NPC) is a malignant tumor prevalent in southern China and Southeast Asia, where its early detection is crucial for improving patient prognosis and reducing mortality. However, existing screening methods suffer from limitations in accuracy and accessibility, hindering their application in large-scale population screening. In this work, a surface-enhanced Raman spectroscopy (SERS)-based method was established to explore the profiles of different stratified components in saliva from NPC patients and healthy subjects after fractionation processing. The findings indicate that all fractionated samples exhibit disease-associated molecular differences, with the small-molecule fraction (molecular weight cut-off value of 10 kDa) demonstrating superior classification capability: a sensitivity of 90.5%, a specificity of 75.6%, and an area under the receiver operating characteristic (ROC) curve of 0.925 ± 0.031. The primary objective of this study was to qualitatively explore patterns in saliva composition across groups. The proposed SERS detection strategy for fractionated saliva offers novel insights for enhancing the sensitivity and reliability of noninvasive NPC screening, laying the foundation for translational application in large-scale clinical settings.
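The area under the ROC curve quoted above is, by the Mann-Whitney identity, the probability that a randomly chosen positive case scores above a randomly chosen negative one. A minimal sketch with hypothetical classifier scores (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """AUC via the rank-sum identity:
    AUC = P(score_pos > score_neg) + 0.5 * P(tie)."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical scores for NPC vs. healthy saliva spectra.
pos = [0.91, 0.85, 0.80, 0.72, 0.66, 0.35]
neg = [0.70, 0.52, 0.48, 0.40, 0.30, 0.22]
value = auc(pos, neg)  # 31 of 36 positive-negative pairs are ranked correctly
```

This O(n*m) pairwise form is fine for small samples; production code would use the sorted-rank formulation.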
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52479045, 52279042), the Key Research and Development Program in Guangxi (Grant No. AB23026021), and the Open Research Fund of the Guangxi Key Laboratory of Water Engineering Materials and Structures, Guangxi Institute of Water Resources Research (Grant No. GXHRIWEMS-2022-07).
Abstract: Accurate land surface temperature (LST) assessment is crucial for comprehending and reducing the impacts of climate change and for understanding land use evolution. This study presents an innovative method that utilizes ensemble models, advanced correlation analysis, and trend analysis to investigate environmental influences on LST. Google Earth Engine (GEE) was utilized to process Landsat-7 and Landsat-8 datasets for five major cities of Punjab, Pakistan, from 2001 to 2023. The results show significant urban warming trends, and a strong correlation between environmental variables and LST was identified. Three ensemble-based machine learning models, XGBoost, AdaBoost, and random forest (RF), were adopted to improve the accuracy of LST evaluation. Although XGBoost and AdaBoost attained modest accuracy, with R^(2) values of 0.767 and 0.706, respectively, the RF model outperformed them, achieving an R^(2) of 0.796 and an RMSE of 0.476. Moreover, Pearson correlation analysis revealed negative relationships between LST and the normalized difference latent heat index (NDLI, r = -0.67), the normalized difference vegetation index (NDVI, r = -0.6), and the modified normalized difference water index (MNDWI, r = -0.57). In addition, wavelet analysis showed that vegetation and water provide long-term LST cooling, lasting up to 64 months, while built-up areas and bare soil contribute to short-term warming, lasting 4 to 8 months. Latent heat exhibited variable cooling periods, surpassing 60 months in cities. These findings enhance the understanding of LST changes and the impact of climate change on the environment.
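The reported index correlations (e.g. r = -0.6 between NDVI and LST) are plain sample Pearson coefficients. A sketch with hypothetical pixel samples in which greener pixels are cooler:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-pixel samples: vegetation index vs. surface temperature (C).
ndvi = [0.10, 0.15, 0.25, 0.40, 0.55, 0.70, 0.82]
lst = [44.1, 43.0, 41.2, 38.5, 36.0, 33.9, 31.8]
r = pearson_r(ndvi, lst)  # strongly negative: greener pixels run cooler
```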
Abstract: As urbanization continues to accelerate, the challenges of managing transportation in metropolitan areas become increasingly complex. The surge in population density contributes to traffic congestion, degrading travel experiences and posing safety risks. Smart urban transportation management emerges as a strategic solution, conceptualized here as a multidimensional big data problem. The success of this strategy hinges on effectively collecting information from diverse, extensive, and heterogeneous data sources, necessitating full-stack Information and Communication Technology (ICT) solutions. The main aim of this work is to investigate current Intelligent Transportation System (ITS) technologies and enhance the safety of urban transportation systems. Machine learning models trained on historical data can predict traffic congestion, allowing preventive measures to be implemented. Deep learning architectures, with their ability to handle complex data representations, further refine traffic predictions, contributing to more accurate and dynamic transportation management. The background of this research underscores the challenges posed by traffic congestion in metropolitan areas and emphasizes the need for advanced technological solutions. By integrating GPS and GIS technologies with machine learning algorithms, this work contributes to the development of intelligent transportation systems that not only address current challenges but also pave the way for future advancements in urban transportation management.
Funding: Supported in part by the Beijing Natural Science Foundation (Grant No. 1232025), the Peng Huanwu Visiting Professor Program, and the Academy for Multidisciplinary Studies, Capital Normal University.
Abstract: Replicating the chaotic characteristics inherent in nonlinear dynamical systems via machine learning (ML) is a key challenge in this rapidly advancing interdisciplinary field. In this work, we explore the potential of variational quantum circuits (VQC) for learning the stochastic properties of classical nonlinear dynamical systems. Specifically, we focus on the one- and two-dimensional logistic maps, which, while simple, remain under-explored in the context of learning dynamical characteristics. Our findings reveal that, even for such simple dynamical systems, accurately replicating long-term characteristics is hindered by a pronounced sensitivity to overfitting. While increasing the parameter complexity of the ML model typically enhances short-term prediction accuracy, it also degrades the model's ability to replicate long-term characteristics, primarily because overfitting harms generalization. By comparing the VQC with two widely recognized classical ML techniques, long short-term memory (LSTM) networks for time-series processing and reservoir computing, we demonstrate that the VQC outperforms these methods in replicating long-term characteristics. Our results suggest that the ML of dynamics demands more compact and efficient models (such as VQC) rather than more complicated, large-scale ones.
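The long-term characteristic at stake for the logistic map can be made concrete: at r = 4 the map x_{n+1} = r*x*(1-x) is fully chaotic with Lyapunov exponent exactly ln 2, a statistic that a model fitted only for short-term prediction can easily miss. The sketch estimates the exponent from the orbit-averaged log-derivative ln|r*(1-2x)|; the seed value is arbitrary.

```python
import math

def logistic_traj(x0, r=4.0, n=100_000, burn=100):
    """Iterate the logistic map, discarding an initial transient."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

def lyapunov(x0, r=4.0, n=100_000):
    """Largest Lyapunov exponent from the mean log-derivative |f'(x)| = |r(1-2x)|."""
    acc = 0.0
    for x in logistic_traj(x0, r, n):
        # max() guards the measure-zero case x == 0.5 where the derivative vanishes
        acc += math.log(max(abs(r * (1 - 2 * x)), 1e-300))
    return acc / n

lam = lyapunov(0.1234)  # exact value at r = 4 is ln 2, about 0.693
```

Comparing this invariant for true and model-generated orbits is one way to test long-term replication independently of pointwise prediction error.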
Funding: Supported by the Advanced Materials-National Science and Technology Major Project (Grant No. 2025ZD0618401), the National Natural Science Foundation of China (Grant No. 12504285), the Natural Science Foundation of Jiangsu Province (Grant No. BK20250472), and an NFSG grant from BITS-Pilani, Dubai campus.
Abstract: The rapid advancement of machine learning-based tight-binding Hamiltonian (MLTB) methods has opened new avenues for efficient and accurate electronic structure simulations, particularly for large-scale systems and long-time scenarios. This review begins with a concise overview of traditional tight-binding (TB) models, including both (semi-)empirical and first-principles approaches, establishing the foundation for understanding MLTB developments. We then present a systematic classification of existing MLTB methodologies, grouped into two major categories: direct prediction of TB Hamiltonian elements and inference of empirical parameters. A comparative analysis with other ML-based electronic structure models is also provided, highlighting the advancement of MLTB approaches. Finally, we explore the emerging MLTB application ecosystem, highlighting how the integration of MLTB models with a diverse suite of post-processing tools, from linear-scaling solvers to quantum transport frameworks and molecular dynamics interfaces, is essential for tackling complex scientific problems across different domains. The continued advancement of this integrated paradigm promises to accelerate materials discovery and open new frontiers in the predictive simulation of complex quantum phenomena.
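A tight-binding Hamiltonian of the kind MLTB models predict is, at its simplest, a sparse matrix of on-site energies and hopping integrals. The sketch builds a nearest-neighbour chain and checks its spectrum against the analytic band E(k) = eps + 2t*cos(k); this is textbook TB for illustration, not any specific MLTB method.

```python
import numpy as np

def tb_chain(n, onsite=0.0, hopping=-1.0, periodic=True):
    """Nearest-neighbour tight-binding Hamiltonian of an n-site chain."""
    H = np.diag([onsite] * n).astype(float)
    for i in range(n - 1):
        H[i, i + 1] = H[i + 1, i] = hopping
    if periodic:
        H[0, n - 1] = H[n - 1, 0] = hopping  # ring boundary condition
    return H

n = 100
H = tb_chain(n)
eigs = np.linalg.eigvalsh(H)  # ascending eigenvalues

# Analytic band of the periodic chain: E(k) = eps + 2*t*cos(k), k = 2*pi*m/n.
analytic = np.sort(2 * (-1.0) * np.cos(2 * np.pi * np.arange(n) / n))
```

An MLTB model would replace the constant `onsite`/`hopping` entries with environment-dependent predictions; the downstream diagonalization (or a linear-scaling solver) stays the same.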
Funding: Supported by the National Key Research and Development Program of China (No. 2023YFB3712401), the National Natural Science Foundation of China (No. 52274301), the Aeronautical Science Foundation of China (No. 2023Z0530S6005), and the Ningbo Yongjiang Talent-Introduction Programme (No. 2022A-023-C).
Abstract: The viscosity of refining slags plays a critical role in metallurgical processes. However, obtaining accurate viscosity data remains challenging due to the complexities of high-temperature experiments, and practice often relies on empirical models with limited predictive capability. This study focuses on the influence of optical basicity on the viscosity of CaO-Al_(2)O_(3)-based refining slags, leveraging machine learning to address data scarcity and improve prediction accuracy. An automated framework for algorithm integration, parameter tuning, and evaluation ranking (Auto-APE) is employed to develop customized data-driven models for various slag systems, including CaO-Al_(2)O_(3)-SiO_(2), CaO-Al_(2)O_(3)-CaF_(2), CaO-Al_(2)O_(3)-SiO_(2)-MgO, and CaO-Al_(2)O_(3)-SiO_(2)-MgO-CaF_(2). By incorporating optical basicity as a key feature, the models achieve average validation errors of 8.0% to 15.1%, significantly outperforming traditional empirical models. Additionally, symbolic regression is introduced to rapidly construct domain-specific features, such as optical basicity-like descriptors, offering a potential breakthrough in performance prediction for small datasets. This work highlights the critical role of domain-specific knowledge in understanding and predicting viscosity, providing a robust machine learning-based approach for optimizing refining slag properties.
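Optical basicity, the key feature above, is conventionally computed as an oxygen-weighted average of per-oxide Duffy-Ingram values. The Λ values and the example composition below are assumptions (commonly quoted numbers), not taken from this paper:

```python
# Commonly quoted Duffy-Ingram optical basicity values (treat as assumptions).
LAMBDA = {"CaO": 1.00, "Al2O3": 0.605, "SiO2": 0.48, "MgO": 0.78}
N_OXYGEN = {"CaO": 1, "Al2O3": 3, "SiO2": 2, "MgO": 1}  # O atoms per formula unit

def optical_basicity(mole_fracs):
    """Oxygen-weighted average optical basicity of an oxide melt:
    Lambda = sum(x_i * nO_i * Lambda_i) / sum(x_i * nO_i)."""
    num = sum(x * N_OXYGEN[ox] * LAMBDA[ox] for ox, x in mole_fracs.items())
    den = sum(x * N_OXYGEN[ox] for ox, x in mole_fracs.items())
    return num / den

# Hypothetical CaO-Al2O3-based refining slag composition (mole fractions).
slag = {"CaO": 0.50, "Al2O3": 0.35, "SiO2": 0.10, "MgO": 0.05}
lam = optical_basicity(slag)
```

Higher-basicity (more depolymerized) melts are generally less viscous, which is why this single descriptor carries so much signal for the data-driven models.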
Funding: Supported by the National Key Research and Development Program of China (2024YFE0213000), the Postdoctoral Innovative Talents Support Program (BX20240232), the Natural Science Foundation of China for Young Scholars (72304031), and the Fundamental Research Funds for the Central Universities (FRF-TP-22-024A1).
Abstract: Lithium-ion batteries (LIBs) are widely deployed, from grid-scale storage to electric vehicles. LIBs remain stationary for most of their service life, during which calendar aging degrades capacity, so understanding the mechanisms of LIB calendar aging is crucial for extending battery lifespan. However, LIB calendar aging is influenced by multiple factors, including the battery material, its state, and the storage environment, and calendar aging experiments are time-consuming, costly, and lack standardized testing conditions. This study employs a data-driven approach to establish a cross-scale database linking materials, side-reaction mechanisms, and calendar aging of LIBs. MELODI (Mechanism-informed, Explainable, Learning-based Optimization for Degradation Identification) is proposed to identify calendar aging mechanisms and quantify the effects of multi-scale factors. The results reveal that cathode material loss drives up to 91.42% of calendar aging degradation in high-nickel (Ni) batteries, while solid electrolyte interphase growth dominates in lithium iron phosphate (LFP) and low-Ni batteries, contributing up to 82.43% of degradation in LFP batteries and 99.10% in low-Ni batteries. This study systematically quantifies calendar aging in commercial LIBs under varying materials, states of charge, and temperatures. The findings offer quantitative guidance for experimental design and battery use, with implications for emerging applications such as aerial robotics, vehicle-to-grid, and embodied intelligence systems.
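A common mechanism-informed baseline for calendar aging (not the MELODI pipeline itself) is the parabolic SEI-growth law Q_loss(t) = k*sqrt(t), which admits a one-line least-squares fit for the rate constant. The storage data below are invented for illustration.

```python
import math

def fit_sqrt_t(times, losses):
    """Least-squares k for Q_loss(t) = k * sqrt(t); for a single-parameter
    linear model, k = sum(y*s) / sum(s^2) with s = sqrt(t)."""
    s = [math.sqrt(t) for t in times]
    return sum(y * si for y, si in zip(losses, s)) / sum(si * si for si in s)

# Hypothetical storage data: capacity loss (%) after t days at fixed SOC and T.
days = [30, 90, 180, 365, 730]
loss = [1.1, 1.9, 2.7, 3.9, 5.4]
k = fit_sqrt_t(days, loss)
pred_2y = k * math.sqrt(730)  # extrapolated two-year loss under the sqrt-t law
```

Deviations of measured data from this sqrt-t baseline are one signal that a different mechanism (e.g. cathode material loss in high-Ni cells) is contributing.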