Abstract
Graph Neural Networks (GNNs) have revolutionized material property prediction by learning directly from the structural information of molecules and materials. However, conventional GNN models rely solely on local atomic interactions, such as bond lengths and angles, neglecting crucial long-range electrostatic forces that affect certain properties. To address this, we introduce the Molecular Graph Transformer (MGT), a novel GNN architecture that combines local attention mechanisms with message passing on both bond graphs and their line graphs, explicitly capturing long-range interactions. Benchmarking on MatBench and Quantum MOF (QMOF) datasets demonstrates that MGT's improved understanding of electrostatic interactions significantly enhances the prediction accuracy of properties like exfoliation energy and refractive index, while maintaining state-of-the-art performance on all other properties. This breakthrough paves the way for the development of highly accurate and efficient materials design tools across diverse applications. Code is available at: https://github.com/MolecularGraphTransformer/MGT
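The sketch below is a minimal, hypothetical illustration of the dual-graph idea described in the abstract: one message-passing step on the bond graph, one on its line graph, followed by a local self-attention layer over the atoms. It is not the authors' implementation; all module names, feature sizes, and the toy connectivity are assumptions made only for illustration.

```python
# Minimal sketch (not the MGT codebase) of message passing on a bond graph and its
# line graph, combined with multi-head attention over atoms. Shapes and names are
# illustrative assumptions.
import torch
import torch.nn as nn


class MessagePassingStep(nn.Module):
    """One generic message-passing step: each node aggregates messages from its neighbours."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)   # message from (source, destination) features
        self.upd = nn.GRUCell(dim, dim)      # node update from aggregated messages

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        src, dst = edge_index                              # edge_index: (2, num_edges)
        m = self.msg(torch.cat([x[src], x[dst]], dim=-1))  # per-edge messages
        agg = torch.zeros_like(x).index_add_(0, dst, m)    # sum messages at each node
        return self.upd(agg, x)


class DualGraphAttentionBlock(nn.Module):
    """Bond-graph and line-graph message passing, then local attention over atoms."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.bond_mp = MessagePassingStep(dim)  # atom-atom edges (bonds)
        self.line_mp = MessagePassingStep(dim)  # bond-bond edges (angles, line graph)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, atom_x, bond_x, bond_graph, line_graph):
        bond_x = self.line_mp(bond_x, line_graph)  # refine bond features via angle connectivity
        atom_x = self.bond_mp(atom_x, bond_graph)  # refine atom features via bond connectivity
        # Attention across all atoms in the structure models longer-range coupling.
        a, _ = self.attn(atom_x[None], atom_x[None], atom_x[None])
        return self.norm(atom_x + a.squeeze(0)), bond_x


# Toy usage: 4 atoms, 3 bonds, hand-written connectivity.
atoms, bonds = torch.randn(4, 64), torch.randn(3, 64)
bond_graph = torch.tensor([[0, 1, 2], [1, 2, 3]])  # atom-atom edges
line_graph = torch.tensor([[0, 1], [1, 2]])        # bond-bond (angle) edges
block = DualGraphAttentionBlock(64)
atoms_out, bonds_out = block(atoms, bonds, bond_graph, line_graph)
```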
Supplementary materials
Title
Electronic Supporting Information
Description
Supporting Information for “Molecular Graph Transformer: Stepping Beyond ALIGNN Into Long-Range Interactions”