On-the-fly fine-tuning of foundational neural network potentials: a Bayesian neural network approach

Abstract

Because evaluating interatomic forces from first principles is computationally expensive, the construction of machine-learned interatomic force fields has become a highly active field of research. However, generating training datasets of sufficient size and diversity carries its own computational burden, which can make this approach impractical for modeling rare events or systems with a large configuration space. Fine-tuning foundation models pre-trained on large-scale materials or molecular databases offers a promising route to reducing the amount of training data needed to reach a desired level of accuracy. Yet even when less training data is required overall, assembling a suitable training dataset can remain very challenging, especially for systems with rare events and for end-users without an extensive background in machine learning. In on-the-fly learning, dataset creation is largely automated: model uncertainty during the simulation decides whether the model is accurate enough or whether a structure should be recalculated with quantum-chemical methods and used to update the model. A key obstacle to applying this form of active learning to the fine-tuning of foundation models is assessing the uncertainty of those models during the fine-tuning process, since most foundation models lack any form of uncertainty quantification. In this paper, we overcome this obstacle by introducing a fine-tuning approach based on Bayesian neural network methods, together with an on-the-fly workflow that automatically fine-tunes the model while maintaining a pre-specified accuracy. The workflow can also detect rare events such as transition states and sample them at an increased rate relative to their natural occurrence.
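The uncertainty-gated loop described above can be sketched schematically. The sketch below is illustrative only, not the authors' implementation: the posterior samples, reference calculation, fine-tuning step, MD step, and the `UNCERTAINTY_THRESHOLD` value are all hypothetical stand-ins. It shows the core decision: when the spread of predictions across Bayesian posterior samples exceeds a pre-specified threshold, the structure is recomputed with the reference (quantum-chemical) method and added to the training set.

```python
UNCERTAINTY_THRESHOLD = 0.05  # assumed accuracy target, e.g. in eV per atom


def predict_with_uncertainty(model_samples, structure):
    """Mean and spread of predictions over posterior samples of a
    Bayesian neural network (modeled here as a list of callables)."""
    preds = [m(structure) for m in model_samples]
    mean = sum(preds) / len(preds)
    var = sum((p - mean) ** 2 for p in preds) / len(preds)
    return mean, var ** 0.5


def on_the_fly_md(model_samples, structure, n_steps,
                  reference_calc, fine_tune, md_step):
    """Run an MD trajectory, recalculating and fine-tuning whenever the
    model's predictive uncertainty exceeds the threshold."""
    dataset = []
    for _ in range(n_steps):
        energy, sigma = predict_with_uncertainty(model_samples, structure)
        if sigma > UNCERTAINTY_THRESHOLD:
            # Model is too uncertain: fall back to the reference method
            # and use the new label to update (fine-tune) the model.
            energy = reference_calc(structure)
            dataset.append((structure, energy))
            model_samples = fine_tune(model_samples, dataset)
        structure = md_step(structure, energy)
    return model_samples, dataset
```

Because the reference calculation is triggered only where the model is uncertain, rare configurations such as transition states — which the model has seen least — are sampled into the training set at an increased rate relative to their occurrence in the trajectory.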

Graphical abstract

Article information

Article type: Paper
Submitted: 01 Sep 2025
Accepted: 17 Mar 2026
First published: 07 Apr 2026
Open Access: Creative Commons BY license

Digital Discovery, 2026, Advance Article

T. Rensmeyer, D. Kramer and O. Niggemann, Digital Discovery, 2026, Advance Article, DOI: 10.1039/D5DD00392J

This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. You can use material from this article in other publications without requesting further permissions from the RSC, provided that the correct acknowledgement is given.
