Journal Articles
2 articles found
1. Anisotropic temperature-dependent lattice parameters and elastic constants from first principles
Authors: Samare Rostami, Matteo Giantomassi, Xavier Gonze 《npj Computational Materials》 2025, Issue 1, pp. 2936–2950 (15 pages)
Abstract: We present an efficient implementation of the Zero Static Internal Stress Approximation (ZSISA) within the Quasi-Harmonic Approximation framework to compute anisotropic thermal expansion and elastic constants from first principles. By replacing the costly multidimensional minimization with a gradient-based method that leverages second-order derivatives of the vibrational free energy, the number of required phonon band structure calculations is significantly reduced: only six are needed for hexagonal, trigonal, and tetragonal systems, and 10–28 for lower-symmetry systems, to determine the temperature dependence of lattice parameters and thermal expansion. This approach enables accurate modeling of anisotropic thermal expansion while substantially lowering computational cost compared to the standard ZSISA method. The implementation is validated on a range of materials with symmetries from cubic to triclinic and is extended to compute temperature-dependent elastic constants with only a few additional phonon band structure calculations.
Keywords: phonon band structure calculations; multidimensional minimization; zero static internal stress approximation (ZSISA); gradient-based method; anisotropic thermal expansion; elastic constants; first principles
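The abstract's central idea, replacing a full free-energy minimization over lattice parameters with a Newton-type search that uses first and second derivatives of the free energy, can be sketched in one dimension. Everything below (the toy static energy, the single-mode vibrational term, the mode-softening parameter `g`) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

# Toy quasi-harmonic free energy: static energy E(a) plus a single-mode
# vibrational term F_vib(a, T) whose frequency softens as the lattice
# parameter a grows. All functional forms are illustrative stand-ins.
KB = 8.617333e-5  # Boltzmann constant, eV/K

def free_energy(a, T, a0=4.0, k=5.0, w0=0.03, g=0.02):
    E_static = 0.5 * k * (a - a0) ** 2        # harmonic static energy (eV)
    w = w0 * (1.0 - g * (a - a0))             # Grüneisen-like mode softening
    F_vib = 0.5 * w + KB * T * np.log1p(-np.exp(-w / (KB * T)))
    return E_static + F_vib

def equilibrium_lattice(T, a=4.0, h=1e-4, tol=1e-10):
    # Newton iteration: uses first AND second derivatives of F(a, T),
    # mirroring the gradient-based replacement of a full minimization.
    for _ in range(100):
        f0, fp, fm = (free_energy(x, T) for x in (a, a + h, a - h))
        grad = (fp - fm) / (2 * h)            # dF/da (finite difference)
        hess = (fp - 2 * f0 + fm) / h ** 2    # d2F/da2
        step = grad / hess
        a -= step
        if abs(step) < tol:
            break
    return a

a_cold = equilibrium_lattice(10.0)
a_hot = equilibrium_lattice(1000.0)
print(a_cold, a_hot)
```

Raising the temperature deepens the vibrational term's pull toward larger `a`, so the Newton search settles at a larger equilibrium lattice parameter: the one-dimensional analogue of the thermal expansion the paper computes anisotropically.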
2. Speed up Training of the Recurrent Neural Network Based on Constrained Optimization Techniques
Authors: 陈珂, 包威权, 迟惠生 《Journal of Computer Science & Technology》 SCIE EI CSCD 1996, Issue 6, pp. 581–588 (8 pages)
Abstract: In this paper, the constrained optimization technique is explored for a substantial problem: accelerating training of the globally recurrent neural network. Unlike most previous methods for feedforward neural networks, the authors adopt the constrained optimization technique to improve the gradient-based algorithm of the globally recurrent neural network, yielding an adaptive learning rate during training. Using the recurrent network with the improved algorithm, experiments on two real-world problems, namely filtering additive noise in acoustic data and classification of temporal signals for speaker identification, have been performed. The experimental results show that the recurrent neural network with the improved learning algorithm trains significantly faster and achieves satisfactory performance.
Keywords: recurrent neural network; adaptive learning rate; gradient-based algorithm
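The idea of adapting the learning rate through a constraint on each update can be sketched with a toy scalar recurrent network. The constrained-step rule `eta_t = min(eta0, delta / ||g_t||)`, the sine-prediction task, and all parameter values below are assumptions for illustration; the paper's actual constrained-optimization scheme is not reproduced here:

```python
import numpy as np

# Single-unit recurrent network trained to predict the next sample of a
# sine wave. The learning rate is chosen per step under a constraint on
# the update norm, so no step moves the weights by more than delta.
T_LEN = 60
x = np.sin(0.3 * np.arange(T_LEN))   # input sequence
y = x[1:]                            # target: next value

def predictions(w_in, w_rec, w_out):
    h, out = 0.0, []
    for t in range(T_LEN - 1):
        h = np.tanh(w_in * x[t] + w_rec * h)   # recurrent hidden state
        out.append(w_out * h)
    return np.array(out)

def loss(params):
    return float(np.mean((predictions(*params) - y) ** 2))

def grad(params, eps=1e-5):
    # central finite differences keep the sketch short; BPTT in practice
    g = np.zeros(len(params))
    for i in range(len(params)):
        pp, pm = list(params), list(params)
        pp[i] += eps
        pm[i] -= eps
        g[i] = (loss(pp) - loss(pm)) / (2 * eps)
    return g

params = [0.5, 0.3, 0.5]             # w_in, w_rec, w_out
eta0, delta = 0.05, 0.1
initial_loss = loss(params)
for _ in range(200):
    g = grad(params)
    eta = min(eta0, delta / (np.linalg.norm(g) + 1e-12))  # constrained step
    params = [p - eta * gi for p, gi in zip(params, g)]
final_loss = loss(params)
print(initial_loss, final_loss)
```

Capping the step size plays the role of the constraint: when the gradient is large the effective learning rate shrinks, and when it is small the rate relaxes back to `eta0`, which is one simple way an adaptive rate can stabilize recurrent-network training.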