Funding: National Natural Science Foundation of China (Grant No. 61877045).
Abstract: The dendritic cell algorithm (DCA) is an excellent prototype for developing machine learning inspired by the function of the powerful natural immune system. Too many parameters increase complexity and have drawn plenty of criticism of the signal fusion procedure of DCA. The loss function of DCA is ambiguous due to its complexity. To reduce the uncertainty, several researchers simplified the algorithm; some introduced gradient descent to optimize parameters; some utilized search methods to find the optimal parameter combination. However, these approaches are either time-consuming or ill-suited to non-convex functions. To overcome these problems, this study models parameter optimization as a black-box optimization problem requiring no information about the loss function. It hybridizes Bayesian optimization Hyperband (BOHB) with DCA to propose a novel DCA variant, BHDCA, which accomplishes parameter optimization in the signal fusion process. BHDCA utilizes the Bayesian optimization (BO) component of BOHB to find promising parameter configurations and applies the Hyperband component to allocate a suitable budget to each potential configuration. The experimental results show that the proposed algorithm has significant advantages over other DCA expansion algorithms in terms of signal fusion.
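The Hyperband half of BOHB can be illustrated with a minimal successive-halving sketch. The three fusion weights and the toy loss below are hypothetical stand-ins for DCA's actual black-box signal-fusion loss, not the paper's implementation:

```python
import random

def dca_fusion_loss(weights):
    # Hypothetical stand-in for the DCA signal-fusion loss, treated
    # as a black box as in the paper's formulation.
    w1, w2, w3 = weights
    return (w1 - 0.5) ** 2 + (w2 - 0.3) ** 2 + (w3 - 0.8) ** 2

def evaluate(config, budget):
    # A larger budget yields a less noisy estimate of the black-box loss.
    noise = random.gauss(0, 1.0 / budget)
    return dca_fusion_loss(config) + noise

def successive_halving(configs, min_budget=1, eta=3):
    """Hyperband-style budget allocation: score all configurations on a
    small budget, keep the best 1/eta, repeat with an eta-fold budget."""
    budget = min_budget
    while len(configs) > 1:
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = scored[:max(1, len(configs) // eta)]
        budget *= eta
    return configs[0]

random.seed(0)
candidates = [tuple(random.random() for _ in range(3)) for _ in range(27)]
best = successive_halving(candidates)
print(best, dca_fusion_loss(best))
```

In full BOHB, the random sampling of `candidates` would be replaced by draws from a BO surrogate model fitted to previous evaluations.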
Funding: Funded by the Natural Science Foundation of Chongqing (Grant No. CSTB2022NSCQ-MSX0594) and the Humanities and Social Sciences Research Project of the Ministry of Education (Grant No. 16YJCZH061).
Abstract: Boosting algorithms have been widely utilized in the development of landslide susceptibility mapping (LSM) studies. However, these algorithms possess distinct computational strategies and hyperparameters, making it challenging to propose an ideal LSM model. To investigate the impact of different boosting algorithms and hyperparameter optimization algorithms on LSM, this study constructed a geospatial database comprising 12 conditioning factors, such as elevation, stratum, and annual average rainfall. The XGBoost (XGB), LightGBM (LGBM), and CatBoost (CB) algorithms were employed to construct the LSM model. Furthermore, the Bayesian optimization (BO), particle swarm optimization (PSO), and Hyperband optimization (HO) algorithms were applied to optimize the LSM model. The boosting algorithms exhibited varying performance, with CB demonstrating the highest precision, followed by LGBM, and XGB showing the poorest precision. The hyperparameter optimization algorithms likewise differed, with HO outperforming PSO and BO performing worst. The HO-CB model achieved the highest precision, with an accuracy of 0.764, an F1-score of 0.777, an area under the curve (AUC) value of 0.837 for the training set, and an AUC value of 0.863 for the test set. The model was interpreted using SHapley Additive exPlanations (SHAP), revealing that slope, curvature, topographic wetness index (TWI), degree of relief, and elevation significantly influenced landslides in the study area. This study examines the utilization of various boosting and hyperparameter optimization algorithms in Wanzhou District and proposes the HO-CB-SHAP framework as an effective approach to accurately forecast landslide disasters and interpret LSM models, offering a scientific reference for LSM and disaster prevention research. However, limitations remain concerning the generalizability of the model and the data processing, which require further exploration in subsequent studies.
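The AUC metric used to compare the models above has a simple rank-based definition: the probability that a randomly chosen positive (landslide) sample is scored above a randomly chosen negative one. A minimal sketch, with illustrative toy labels and scores rather than the study's data:

```python
def auc(y_true, scores):
    # Area under the ROC curve via the rank-sum (Mann-Whitney) formulation:
    # ties between a positive and a negative score count as half a win.
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy landslide (1) / non-landslide (0) labels and susceptibility scores.
y = [1, 1, 1, 0, 0, 0]
s = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
print(auc(y, s))  # 8 of 9 positive-negative pairs are ranked correctly
```

An AUC of 0.863 on the test set, as reported for HO-CB, means roughly 86% of such pairs are ordered correctly.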
Funding: Supported by the National Natural Science Foundation of China (Grants No. 41901296 and 62202147).
Abstract: Hyperparameters play a vital role in the performance of most machine learning algorithms, and it is a challenge for traditional methods to configure the hyperparameters of a capsule network manually to obtain high performance. Some swarm intelligence and evolutionary computation algorithms have been effectively employed to seek optimal hyperparameters as a combinatorial optimization problem. However, these algorithms are prone to getting trapped in local optima because random search strategies are adopted. The hybrid rice optimization (HRO) algorithm is inspired by the three-line hybrid rice breeding technology in China and has the advantages of easy implementation, few parameters, and fast convergence. In this paper, genetic search is combined with the hybrid rice optimization algorithm (GHRO) and employed to obtain the optimal hyperparameters of the capsule network automatically; that is, a probability search technique and a hybridization strategy are incorporated into the basic HRO. Thirteen benchmark functions are used to evaluate the performance of GHRO. Furthermore, the MNIST, Chest X-Ray (pneumonia), and Chest X-Ray (COVID-19 & pneumonia) datasets are utilized to evaluate the capsule network learned by GHRO. The experimental results show that GHRO is an effective method for optimizing the hyperparameters of the capsule network and is able to boost its performance on image classification.
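The three-line idea behind HRO can be sketched as a population split into maintainer, restorer, and sterile lines, with hybridization and mutation as the renewal operators. This is a heavily simplified illustration, not the authors' GHRO: the objective, hyperparameter ranges, and operator details are all hypothetical stand-ins (a real run would train and validate the capsule network at each evaluation):

```python
import random

def objective(hp):
    # Hypothetical proxy for capsule-network validation accuracy,
    # peaked at learning rate 0.001 and 3 routing iterations.
    lr, routing_iters = hp
    return -((lr - 0.001) ** 2 * 1e6 + (routing_iters - 3) ** 2)

def mutate(hp):
    lr, r = hp
    return (max(1e-5, lr + random.gauss(0, 2e-4)),
            max(1, r + random.choice((-1, 0, 1))))

def crossover(a, b):
    # Hybridization between a sterile-line and a maintainer-line
    # individual, loosely mirroring HRO's three-line scheme.
    return tuple(random.choice(pair) for pair in zip(a, b))

def ghro_search(pop_size=12, generations=30):
    pop = [(random.uniform(1e-4, 1e-2), random.randint(1, 6))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective, reverse=True)
        third = pop_size // 3
        maintainers = pop[:third]          # elite, kept unchanged
        restorers = pop[third:2 * third]   # renewed by mutation
        steriles = pop[2 * third:]         # renewed by hybridization
        steriles = [crossover(s, random.choice(maintainers)) for s in steriles]
        restorers = [mutate(r) for r in restorers]
        pop = maintainers + restorers + steriles
    return max(pop, key=objective)

random.seed(1)
best = ghro_search()
print(best)
```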
Funding: The National Key Research and Development Program of China (No. 2022ZD0119003) and the National Natural Science Foundation of China (No. 61834005).
Abstract: Hyperparameter optimization is considered one of the most challenging problems in deep learning and largely determines the precision of a model. Recent proposals have tried to solve this issue through particle swarm optimization (PSO), but its native defects may result in trapping in local optima and convergence difficulty. In this paper, genetic operations are introduced into the PSO, making it easier to locate the best hyperparameter combination for a specific network architecture. Specifically, to prevent problems caused by differing data types and value ranges, a mixed coding method is used to ensure the effectiveness of particles. Moreover, crossover and mutation operations are added to the particle-update process to increase the diversity of particles and avoid local optima during the search. Verified on three benchmark datasets, MNIST, Fashion-MNIST, and CIFAR10, the proposed scheme achieves accuracies of 99.58%, 93.39%, and 78.96%, respectively, improving accuracy by about 0.1%, 0.5%, and 2%, respectively, compared with the plain PSO.
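The hybrid update, standard PSO velocity/position steps followed by crossover and mutation, can be sketched on a continuous toy objective. This is a minimal illustration under assumed operator choices (crossover with the global best, Gaussian mutation), not the paper's mixed-coding implementation:

```python
import random

def sphere(x):
    # Stand-in objective; the paper's true objective is the validation
    # accuracy of a network trained with hyperparameters x.
    return sum(v * v for v in x)

def gpso(dim=3, swarm=20, iters=60, w=0.7, c1=1.5, c2=1.5, pm=0.1):
    """PSO with crossover and mutation injected into the particle update."""
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    P = [x[:] for x in X]                  # personal bests
    g = min(P, key=sphere)[:]              # global best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])
                           + c2 * random.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            # Genetic operators: uniform crossover with the global best,
            # then a low-probability Gaussian mutation for diversity.
            X[i] = [random.choice(pair) for pair in zip(X[i], g)]
            if random.random() < pm:
                X[i][random.randrange(dim)] += random.gauss(0, 0.5)
            if sphere(X[i]) < sphere(P[i]):
                P[i] = X[i][:]
                if sphere(P[i]) < sphere(g):
                    g = P[i][:]
    return g

random.seed(0)
best = gpso()
print(best, sphere(best))
```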
Funding: Funded by Anhui NARI ZT Electric Co., Ltd., under the project "Research on the Shared Operation and Maintenance Service Model for Metering Equipment and Platform Development for the Modern Industrial Chain" (Grant No. 524636250005).
Abstract: With the rapid adoption of artificial intelligence (AI) in domains such as power, transportation, and finance, the number of machine learning and deep learning models has grown exponentially. However, challenges such as delayed retraining, inconsistent version management, insufficient drift monitoring, and limited data security still hinder efficient and reliable model operations. To address these issues, this paper proposes the Intelligent Model Lifecycle Management Algorithm (IMLMA). The algorithm employs a dual-trigger mechanism based on both data-volume thresholds and time intervals to automate retraining, and applies Bayesian optimization for adaptive hyperparameter tuning to improve performance. A multi-metric replacement strategy, incorporating MSE, MAE, and R2, ensures that new models replace existing ones only when performance improvements are guaranteed. A versioning and traceability database supports comparison and visualization, while real-time monitoring with stability analysis enables early warnings of latency and drift. Finally, hash-based integrity checks secure both model files and datasets. Experimental validation in a power metering operation scenario demonstrates that IMLMA reduces model update delays, enhances predictive accuracy and stability, and maintains low latency under high concurrency. This work provides a practical, reusable, and scalable solution for intelligent model lifecycle management, with broad applicability to complex systems such as smart grids.
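Two of the mechanisms above, the dual-trigger retraining and the hash-based integrity check, can be sketched compactly. The class name, thresholds, and interface below are illustrative assumptions, not IMLMA's actual design:

```python
import hashlib
import time

def sha256_of(data: bytes) -> str:
    # Hash-based integrity check: compare this digest against a stored
    # value to detect tampering with model files or datasets.
    return hashlib.sha256(data).hexdigest()

class RetrainTrigger:
    """Dual-trigger retraining: fire when EITHER the accumulated sample
    count or the elapsed time exceeds its threshold (thresholds here
    are illustrative)."""

    def __init__(self, max_samples=1000, max_seconds=3600):
        self.max_samples = max_samples
        self.max_seconds = max_seconds
        self.reset()

    def reset(self):
        self.samples = 0
        self.start = time.time()

    def observe(self, n_new):
        self.samples += n_new

    def should_retrain(self, now=None):
        now = time.time() if now is None else now
        return (self.samples >= self.max_samples
                or now - self.start >= self.max_seconds)

trigger = RetrainTrigger(max_samples=100, max_seconds=3600)
trigger.observe(40)
print(trigger.should_retrain())   # False: neither threshold reached
trigger.observe(60)
print(trigger.should_retrain())   # True: 100 samples accumulated
```

In a full pipeline, a fired trigger would launch retraining with Bayesian hyperparameter tuning, and the new model would replace the old one only if it improved on the MSE/MAE/R2 replacement criteria.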