Abstract
Efficiently creating a concise but comprehensive data set for training machine-learned interatomic potentials (MLIPs) is an under-explored problem. Active learning, which uses biased or unbiased molecular dynamics (MD) to generate candidate pools, aims to address this challenge. Existing biased and unbiased MD-simulation methods, however, are prone to miss either rare events or extrapolative regions, i.e., areas of the configurational space where unreliable predictions are made. This work demonstrates that MD, when biased by the MLIP's energy uncertainty, simultaneously captures extrapolative regions and rare events, which is crucial for developing uniformly accurate MLIPs. Furthermore, exploiting automatic differentiation, we enhance bias-forces-driven MD with the concept of bias stress.
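As a rough illustration of the mechanism (not the paper's implementation), the minimal JAX sketch below uses a small committee of pair potentials with slightly perturbed parameters as a stand-in for an MLIP ensemble, takes the committee's energy standard deviation as the uncertainty, and obtains bias forces and a bias stress by differentiating a biased energy of the assumed form E_mean − κσ with respect to the atomic positions and a cell strain. The committee, the bias form, and all parameter values are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Toy stand-in for an MLIP committee member: a Lennard-Jones-like pair energy.
# Positions are given in fractional coordinates so the cell enters explicitly.
def member_energy(eps, sigma, frac, cell):
    R = frac @ cell                          # Cartesian positions, shape (N, 3)
    diff = R[:, None, :] - R[None, :, :]
    r2 = jnp.sum(diff**2, axis=-1)
    mask = ~jnp.eye(R.shape[0], dtype=bool)
    r2 = jnp.where(mask, r2, 1.0)            # avoid division by zero on the diagonal
    inv6 = (sigma**2 / r2) ** 3
    pair = 4.0 * eps * (inv6**2 - inv6)
    return 0.5 * jnp.sum(jnp.where(mask, pair, 0.0))

def biased_energy(frac, strain, cell0, eps_ens, sigma_ens, kappa):
    # Apply a symmetric strain to the reference cell; differentiating the energy
    # with respect to this strain gives (bias) stress times the cell volume.
    cell = cell0 @ (jnp.eye(3) + strain)
    energies = jax.vmap(member_energy, in_axes=(0, 0, None, None))(
        eps_ens, sigma_ens, frac, cell)
    mean_e = jnp.mean(energies)
    uncertainty = jnp.std(energies)          # ensemble energy uncertainty
    return mean_e - kappa * uncertainty      # bias toward high-uncertainty regions

# Bias forces and bias stress follow from automatic differentiation.
grad_fn = jax.grad(biased_energy, argnums=(0, 1))

cell0 = 6.0 * jnp.eye(3)
frac = jnp.array([[0.1, 0.1, 0.1],
                  [0.4, 0.2, 0.3],
                  [0.7, 0.6, 0.5]])
eps_ens = jnp.array([0.98, 1.00, 1.02])      # hypothetical committee spread
sigma_ens = jnp.array([1.01, 1.00, 0.99])
strain0 = jnp.zeros((3, 3))

dE_dfrac, dE_dstrain = grad_fn(frac, strain0, cell0, eps_ens, sigma_ens, 0.5)
forces = -dE_dfrac @ jnp.linalg.inv(cell0).T         # Cartesian biased forces
volume = jnp.abs(jnp.linalg.det(cell0))
stress = 0.5 * (dE_dstrain + dE_dstrain.T) / volume  # symmetrized biased stress
print(forces)
print(stress)
```

In practice, the negative uncertainty term lowers the effective potential in poorly sampled regions, so the biased dynamics is pulled toward configurations where the ensemble disagrees, and the same automatic-differentiation call also supplies the cell-level bias stress needed for variable-cell simulations.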
Funding
Funded by Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy - EXC 2075 - 390740016.