Journal articles — 1 result found
Fast and Fourier features for transfer learning of interatomic potentials
Authors: Pietro Novelli, Giacomo Meanti, Pedro J. Buigues, Lorenzo Rosasco, Michele Parrinello, Massimiliano Pontil, Luigi Bonati. npj Computational Materials, 2025, Issue 1, pp. 3189-3201 (13 pages)
Training machine learning interatomic potentials that are both computationally and data-efficient is a key challenge for enabling their routine use in atomistic simulations. To this end, we introduce franken, a scalable and lightweight transfer learning framework that extracts atomic descriptors from pre-trained graph neural networks and transfers them to new systems using random Fourier features — an efficient and scalable approximation of kernel methods. It also provides a closed-form fine-tuning strategy for general-purpose potentials such as MACE-MP0, enabling fast and accurate adaptation to new systems or levels of quantum mechanical theory with minimal hyperparameter tuning. On a benchmark dataset of 27 transition metals, franken outperforms optimized kernel-based methods in both training time and accuracy, reducing model training from tens of hours to minutes on a single GPU. We further demonstrate the framework's strong data-efficiency by training stable and accurate potentials for bulk water and the Pt(111)/water interface using just tens of training structures. Our open-source implementation (https://franken.readthedocs.io) offers a fast and practical solution for training potentials and deploying them for molecular dynamics simulations across diverse systems.
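The abstract describes fitting random Fourier features on top of fixed descriptors with a closed-form solve. The sketch below is not franken's actual API; it is a minimal, self-contained illustration of the underlying technique — random Fourier features approximating an RBF kernel, followed by closed-form ridge regression — using synthetic stand-ins for the GNN descriptors and target energies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for descriptors from a pre-trained GNN:
# 200 structures, each summarized by a 64-dimensional feature vector.
X = rng.normal(size=(200, 64))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # synthetic "energies"

def random_fourier_features(X, n_features=512, gamma=0.1, seed=0):
    """Map X to random Fourier features approximating the RBF kernel
    k(x, z) = exp(-gamma * ||x - z||^2)."""
    r = np.random.default_rng(seed)
    d = X.shape[1]
    W = r.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = r.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

Z = random_fourier_features(X)

# Closed-form ridge regression in feature space:
# w = (Z^T Z + lam * I)^{-1} Z^T y  -- no iterative training needed.
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)

pred = Z @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Because the features are fixed and the head is linear, the fit reduces to one linear solve — which is why this style of transfer learning trains in minutes rather than hours.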
Keywords: transfer learning; kernel methods; interatomic potentials; atomic descriptors; atomistic simulations; graph neural networks