ChemRxiv

Path-Augmented Graph Transformer Network

preprint
submitted on 03.06.2019, 21:27 and posted on 04.06.2019, 15:57 by Benson Chen, Regina Barzilay, Tommi S Jaakkola
Much of the recent work on learning molecular representations has been based on Graph Convolution Networks (GCN). These models rely on local aggregation operations and can therefore miss higher-order graph properties. To remedy this, we propose Path-Augmented Graph Transformer Networks (PAGTN) that are explicitly built on longer-range dependencies in graph-structured data. Specifically, we use path features in molecular graphs to create global attention layers. We compare our PAGTN model against the GCN model and show that our model consistently outperforms GCNs on molecular property prediction datasets including quantum chemistry (QM7, QM8, QM9), physical chemistry (ESOL, Lipophilicity) and biochemistry (BACE, BBBP).
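The core idea described in the abstract can be sketched in a few lines: attention is computed globally over all atom pairs, and path information between pairs biases the attention scores. The sketch below is illustrative only and is not the paper's implementation; the actual PAGTN uses learned path features, whereas here a scalar shortest-path-distance bias (the `path_bias_scale` parameter and both helper functions are hypothetical names introduced for this example) stands in for them.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def shortest_path_lengths(adj):
    # All-pairs shortest paths via Floyd-Warshall on the adjacency matrix.
    n = adj.shape[0]
    dist = np.where(adj > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    return dist

def path_augmented_attention(node_feats, adj, W_q, W_k, W_v,
                             path_bias_scale=-1.0):
    # Global self-attention over ALL atom pairs (not just bonded neighbors),
    # with an additive bias derived from shortest-path distances. This is a
    # simplified stand-in for the paper's learned path features.
    q, k, v = node_feats @ W_q, node_feats @ W_k, node_feats @ W_v
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)
    scores = scores + path_bias_scale * shortest_path_lengths(adj)
    attn = softmax(scores, axis=-1)   # each row sums to 1
    return attn @ v

# Toy usage: a 3-atom chain (e.g. the carbon skeleton of propane).
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
x = rng.normal(size=(3, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
out = path_augmented_attention(x, adj, W_q, W_k, W_v)
```

Because the distance bias is negative, atom pairs separated by longer paths receive lower attention weights, while the global (all-pairs) score matrix still lets information flow beyond the one-hop neighborhood a GCN layer would see.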

History

Email Address of Submitting Author

bensonc@csail.mit.edu

Institution

MIT

Country

USA

ORCID For Submitting Author

0000-0002-0129-7868

Declaration of Conflict of Interest

No conflict of interest
