Issue 5, 2024

Molecular graph transformer: stepping beyond ALIGNN into long-range interactions

Abstract

Graph Neural Networks (GNNs) have revolutionized material property prediction by learning directly from the structural information of molecules and materials. However, conventional GNN models rely solely on local atomic interactions, such as bond lengths and angles, neglecting crucial long-range electrostatic forces that affect certain properties. To address this, we introduce the Molecular Graph Transformer (MGT), a novel GNN architecture that combines local attention mechanisms with message passing on both bond graphs and their line graphs, explicitly capturing long-range interactions. Benchmarking on MatBench and Quantum MOF (QMOF) datasets demonstrates that MGT's improved understanding of electrostatic interactions significantly enhances the prediction accuracy of properties like exfoliation energy and refractive index, while maintaining state-of-the-art performance on all other properties. This breakthrough paves the way for the development of highly accurate and efficient materials design tools across diverse applications.
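The abstract's central architectural idea, message passing on a bond graph and its line graph combined with an attention step so that atoms far apart in the bond topology can still exchange information, can be sketched as follows. This is not the authors' implementation: the module names (GraphMessagePassing, LocalGlobalBlock), the plain-PyTorch sum-aggregation scheme, and the single self-attention layer over atom features are illustrative assumptions, intended only to show how the two graph passes and the attention step fit together.

```python
# Hypothetical, simplified sketch (plain PyTorch) of one block that runs
# message passing on a bond graph and its line graph, then mixes atom
# features with self-attention so distant atoms can interact.
import torch
import torch.nn as nn


class GraphMessagePassing(nn.Module):
    """Sum-aggregation message passing: each node is updated from its
    neighbours' features and the features of the connecting edge."""

    def __init__(self, dim):
        super().__init__()
        self.message_mlp = nn.Sequential(nn.Linear(3 * dim, dim), nn.SiLU())
        self.update_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.SiLU())

    def forward(self, x, edge_index, edge_attr):
        src, dst = edge_index                                 # (2, E) index pairs
        msg = self.message_mlp(torch.cat([x[src], x[dst], edge_attr], dim=-1))
        agg = torch.zeros_like(x).index_add_(0, dst, msg)     # sum messages per node
        return self.update_mlp(torch.cat([x, agg], dim=-1))


class LocalGlobalBlock(nn.Module):
    """Illustrative block: line-graph pass (bond/angle features), then a
    bond-graph pass (atom/bond features), then multi-head self-attention
    over all atoms as a stand-in for long-range interactions."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.line_graph_mp = GraphMessagePassing(dim)   # nodes = bonds, edges = bond pairs (angles)
        self.bond_graph_mp = GraphMessagePassing(dim)   # nodes = atoms, edges = bonds
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, atom_x, bond_index, bond_x, angle_index, angle_x):
        # 1) Refine bond features on the line graph (encodes angular information).
        bond_x = self.line_graph_mp(bond_x, angle_index, angle_x)
        # 2) Update atom features on the bond graph using the refined bond features.
        atom_x = self.bond_graph_mp(atom_x, bond_index, bond_x)
        # 3) Self-attention across all atoms of one structure (no batching here).
        a, _ = self.attn(atom_x[None], atom_x[None], atom_x[None])
        return self.norm(atom_x + a.squeeze(0)), bond_x


# Toy example: 6 atoms, 5 directed bonds, 4 bond-pair (angle) edges.
dim = 16
block = LocalGlobalBlock(dim)
atom_x = torch.randn(6, dim)
bond_index = torch.tensor([[0, 1, 2, 3, 4], [1, 2, 3, 4, 5]])
bond_x = torch.randn(5, dim)
angle_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
angle_x = torch.randn(4, dim)
atom_x, bond_x = block(atom_x, bond_index, bond_x, angle_index, angle_x)
```

In an ALIGNN-style stack, several such blocks would typically be chained and followed by a pooling/readout layer to yield a scalar property prediction; details such as edge gating, distance and angle encodings, and batching across structures are omitted from this sketch.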

Article information

Article type: Paper
Submitted: 19 Jan 2024
Accepted: 23 Apr 2024
First published: 23 Apr 2024
Open Access: Creative Commons BY license

Digital Discovery, 2024, 3, 1048-1057

M. Anselmi, G. Slabaugh, R. Crespo-Otero and D. Di Tommaso, Digital Discovery, 2024, 3, 1048, DOI: 10.1039/D4DD00014E

This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. You can use material from this article in other publications without requesting further permissions from the RSC, provided that the correct acknowledgement is given.
