Transfer learning of GW Bethe–Salpeter equation excitation energies
Abstract
A persistent challenge in machine learning for electronic-structure calculations is the sharp imbalance between abundant low-fidelity data, such as DFT and TDDFT results, and scarce high-fidelity labels from many-body perturbation theory. We show that transfer learning provides an effective route to bridge this gap: graph neural networks pretrained on DFT and TDDFT properties can be finetuned with limited qsGW and qsGW-BSE data to yield accurate predictions of quasiparticle and excitation energies. Assessing both full-model and readout-only finetuning across chemically diverse test sets, we find that pretraining improves accuracy, reduces reliance on costly qsGW data, and mitigates large predictive outliers, even for molecules larger than or chemically distinct from those seen during finetuning. Our results demonstrate that multi-fidelity transfer learning can substantially extend the reach of many-body-level predictions across chemical space.