Abstract: In machine learning, the choice of optimization algorithm can significantly affect the success of model training and the accuracy of predictions. This paper presents a systematic comparison of widely adopted optimization techniques on the notoriously challenging Rosenbrock function, a benchmark whose deceptive curvature and narrow curved valley expose subtle differences in algorithmic behavior. The study covers classical gradient descent, its stochastic variant (SGD), and gradient descent with momentum, and extends to the adaptive methods RMSprop, AdaGrad, and Adam. By analyzing and visualizing optimization paths, convergence rates, and gradient norms, the paper identifies the strengths and limitations of each technique and offers practical guidance for deploying these algorithms in complex, real-world optimization problems.
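The paper's own code is not reproduced in this abstract; the following Python is a minimal sketch of the kind of comparison it describes, pitting plain gradient descent, momentum, and Adam against the 2-D Rosenbrock function (RMSprop and AdaGrad follow the same update-rule pattern). The starting point, learning rates, and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rosenbrock(p, a=1.0, b=100.0):
    """Rosenbrock function f(x, y) = (a - x)^2 + b*(y - x^2)^2."""
    x, y = p
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

def rosenbrock_grad(p, a=1.0, b=100.0):
    """Analytic gradient of the Rosenbrock function."""
    x, y = p
    return np.array([-2 * (a - x) - 4 * b * x * (y - x ** 2),
                     2 * b * (y - x ** 2)])

def run(update, steps=20000, x0=(-1.5, 2.0)):
    """Iterate an optimizer given as update(state, grad) -> (state, step)."""
    p, state = np.array(x0, dtype=float), None
    for _ in range(steps):
        state, step = update(state, rosenbrock_grad(p))
        p = p + step
    return p, rosenbrock(p)

def gd(state, g, lr=5e-4):
    """Plain gradient descent (illustrative learning rate)."""
    return state, -lr * g

def momentum(state, g, lr=5e-4, beta=0.9):
    """Gradient descent with heavy-ball momentum."""
    v = beta * (state if state is not None else 0.0) - lr * g
    return v, v

def adam(state, g, lr=1e-2, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first- and second-moment estimates."""
    m, v, t = state if state is not None else (0.0, 0.0, 0)
    t += 1
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    mhat, vhat = m / (1 - b1 ** t), v / (1 - b2 ** t)
    return (m, v, t), -lr * mhat / (np.sqrt(vhat) + eps)

for name, upd in [("GD", gd), ("Momentum", momentum), ("Adam", adam)]:
    p, f = run(upd)
    print(f"{name:8s} final f = {f:.3e} at {p}")
```

On this surface the plain methods tend to crawl along the curved valley floor while Adam's per-coordinate scaling makes steadier progress, which is the kind of contrast the abstract's path and gradient-norm visualizations examine.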
Funding: the National Natural Science Foundation of China (No. 19871080)
Abstract: A class of parallel Rosenbrock methods for differential-algebraic equations is presented in this paper. The local truncation errors are defined and the order conditions are established using DA-trees and DA-series. The paper also establishes the convergence of the parallel Rosenbrock methods as h → 0 and gives bounds for their global errors. Particular methods are obtained by solving the order equations, and a numerical example is given in which the theoretical orders are actually observed.
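The paper's parallel DA methods and their order conditions are not reproduced in the abstract; as background, the sketch below implements the basic one-stage (serial) Rosenbrock idea for an ODE y' = f(y): each step solves a single linear system (I − hγJ)k = f(y_n) instead of a nonlinear implicit equation, and γ = 1/2 makes this one-stage scheme second order. For a DAE in mass-matrix form My' = f(y), the matrix (I − hγJ) is replaced by (M − hγJ). The test problem and step size are illustrative.

```python
import numpy as np

def rosenbrock_step(f, jac, y, h, gamma=0.5):
    """One step of a one-stage Rosenbrock (linearly implicit) method:
    solve (I - h*gamma*J) k = f(y), then advance y_new = y + h*k.
    gamma = 1/2 makes this one-stage scheme second order."""
    J = jac(y)
    k = np.linalg.solve(np.eye(y.size) - h * gamma * J, f(y))
    return y + h * k

# Stiff linear test problem y' = A y (stiffness ratio 1000): explicit Euler
# would need h < 0.002 to remain stable; the Rosenbrock step does not.
A = np.array([[-1.0, 0.0], [0.0, -1000.0]])
f = lambda y: A @ y
jac = lambda y: A

y, h = np.array([1.0, 1.0]), 0.01
for _ in range(100):
    y = rosenbrock_step(f, jac, y, h)
print(y)  # both components have decayed stably
```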
Abstract: Condition assessment of bridges has become increasingly important. To simulate a real bridge accurately, the finite element (FE) model updating method is often applied. This paper presents the calibration of the FE model of a reinforced concrete tied-arch bridge using the Douglas-Reid method in combination with the Rosenbrock optimization algorithm. Based on the original drawings and a topographic survey, an FE model of the investigated bridge is created. Eight global vibration modes of the bridge are identified from ambient vibration tests using the frequency-domain decomposition technique. Eight structural parameters are then selected for the FE model updating procedure through sensitivity analysis. Finally, the optimal structural parameters are identified using the Rosenbrock optimization algorithm. The results show that although the identified parameters yield a near-perfect agreement between approximate and measured natural frequencies, they may not be the variables that minimize the differences between numerical and experimental modal data; nevertheless, a satisfactory agreement between them is still obtained. Hence, FE model updating based on the Douglas-Reid method and the Rosenbrock optimization algorithm can serve as an alternative to more complex updating procedures.
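The Rosenbrock optimization algorithm referred to here is the derivative-free rotating-direction search. The sketch below is a simplified version of that search applied to a toy stand-in for the updating objective: squared relative differences between predicted and "measured" frequencies. The surrogate model, frequency values, and algorithm constants are hypothetical and only illustrate the mechanics; the paper's actual FE objective is far more involved.

```python
import numpy as np

def rosenbrock_search(obj, x0, step=0.1, alpha=3.0, beta=0.5,
                      n_stages=50, tol=1e-10):
    """Simplified Rosenbrock rotating-direction search (derivative-free).
    Successful trial moves are expanded by alpha, failed ones reversed and
    contracted by beta; after each stage the direction set is rotated toward
    the accumulated progress via Gram-Schmidt."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    D = np.eye(n)                       # search directions, initially the axes
    fx = obj(x)
    for _ in range(n_stages):
        steps = np.full(n, step)
        progress = np.zeros(n)          # net signed move along each direction
        for _ in range(30):             # trial passes within one stage
            improved = False
            for i in range(n):
                trial = x + steps[i] * D[i]
                ft = obj(trial)
                if ft < fx:             # success: accept and expand
                    x, fx = trial, ft
                    progress[i] += steps[i]
                    steps[i] *= alpha
                    improved = True
                else:                   # failure: reverse and contract
                    steps[i] *= -beta
            if not improved:
                break
        if np.linalg.norm(progress) < tol:
            break
        # Rosenbrock's rotation: a_i = sum_{j >= i} progress_j * D_j, then
        # Gram-Schmidt the a_i into a new orthonormal direction set.
        A = [sum(progress[j] * D[j] for j in range(i, n)) for i in range(n)]
        newD = []
        for i, a in enumerate(A):
            for d in newD:
                a = a - (a @ d) * d
            nrm = np.linalg.norm(a)
            newD.append(a / nrm if nrm > 1e-12 else D[i])  # keep old if degenerate
        D = np.array(newD)
    return x

# Toy stand-in for the updating objective: squared relative differences
# between predicted and "measured" frequencies (all numbers hypothetical).
measured = np.array([1.2, 3.4, 4.8])
predicted = lambda th: np.array([th[0], th[1], th[0] + th[1]])
obj = lambda th: np.sum(((predicted(th) - measured) / measured) ** 2)
theta = rosenbrock_search(obj, x0=[2.0, 1.0])
print(theta, obj(theta))
```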
Abstract: To address the standard grey wolf optimization (GWO) algorithm's slow late-stage convergence, limited solution accuracy, and tendency toward premature convergence, a hybrid grey wolf optimization (hybrid GWO, HGWO) algorithm based on an opposition-based learning strategy and Rosenbrock local search is proposed. The algorithm first replaces random initialization with an opposition-based learning strategy to generate the initial population, ensuring population diversity; it then applies Rosenbrock local search to the best individual in the current population to strengthen local exploitation and accelerate convergence; finally, to avoid premature convergence, an elite opposition-based learning method is used to generate elite opposite individuals. Simulation experiments on six standard test functions, with comparisons against other algorithms, show that the HGWO algorithm converges quickly and attains high solution accuracy.
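The HGWO update rules are not given in the abstract; the sketch below illustrates only the opposition-based initialization step it describes: draw a random population, form each point's opposite x_opp = lb + ub − x, and keep the fittest half of the union. The population size, bounds, and benchmark function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def obl_init(obj, n_pop, lb, ub):
    """Opposition-based initialization: sample a random population, form the
    opposite population x_opp = lb + ub - x, and keep the n_pop fittest
    individuals from the union of both."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = lb + (ub - lb) * rng.random((n_pop, lb.size))
    X_opp = lb + ub - X                          # opposite of every individual
    pool = np.vstack([X, X_opp])
    fitness = np.apply_along_axis(obj, 1, pool)
    return pool[np.argsort(fitness)[:n_pop]]

# Demo on the 2-D Rosenbrock function, a typical benchmark in such tests.
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
wolves = obl_init(rosen, n_pop=20, lb=[-5, -5], ub=[5, 5])
print(wolves[0], rosen(wolves[0]))               # fittest initial wolf
```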