These are preliminary reports that have not been peer-reviewed. They should not be regarded as conclusive, guide clinical practice/health-related behavior, or be reported in news media as established information. For more information, please see our FAQs.

Predicting Binding from Screening Assays with Transformer Network Embeddings

Submitted on 15.01.2020, 22:27 and posted on 24.01.2020, 21:34 by Paul Morris, Rachel St Clair, Elan Barenholtz, William Edward Hahn
Cheminformatics aims to assist in chemistry applications that depend on molecular interactions, structural characteristics, and functional properties. The arrival of deep learning and the abundance of easily accessible chemical data from repositories like PubChem have enabled advancements in computer-aided drug discovery. Virtual High-Throughput Screening (vHTS) is one such technique that integrates chemical domain knowledge to perform in silico biomolecular simulations, but prediction of binding affinity is restricted by the limited availability of ground-truth binding assay results. Here, text representations of 83,000,000 molecules are leveraged to enable single-target binding affinity prediction directly on the outcome of screening assays. The embedding of an end-to-end Transformer neural network, trained to encode the structural characteristics of a molecule via a text-based translation task, is repurposed through transfer learning to classify binding affinity to a single target. Classifiers trained on the embedding outperform those trained on SMILES strings across multiple tasks, achieving AUCs between 0.67 and 0.99. Visualization reveals organization of structural and functional properties in the learned embedding useful for binding prediction. The proposed model is suitable for parallel computing, enabling rapid screening as a complement to virtual screening techniques when limited data is available.
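The transfer-learning recipe the abstract describes can be sketched as: tokenize a SMILES string, map tokens through a pretrained embedding, pool into a single molecule vector, and train a small classifier head on those frozen vectors against assay labels. The sketch below is illustrative only, not the paper's model: the random embedding table stands in for the trained Transformer encoder, and the vocabulary, dimensions, molecules, and binding labels are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical character vocabulary and embedding size (illustrative only).
VOCAB = {ch: i for i, ch in enumerate("CNOSPclnos()=#+-[]@123456789")}
D = 16

# Stand-in for the pretrained Transformer embedding: in the paper this
# comes from a network trained on a text-based SMILES translation task.
EMB = rng.normal(size=(len(VOCAB), D))

def embed(smiles: str) -> np.ndarray:
    """Mean-pool per-token embedding vectors into one molecule vector."""
    idx = [VOCAB[c] for c in smiles if c in VOCAB]
    return EMB[idx].mean(axis=0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy molecules and toy single-target binding labels (placeholders; real
# labels would come from the outcome of a screening assay).
mols = ["CCO", "c1ccccc1", "CC(=O)O", "CCN"]
X = np.stack([embed(m) for m in mols])
y = np.array([0, 1, 0, 1])

# Logistic-regression head trained on the frozen embeddings by gradient
# descent; the pretrained embedding table itself is never updated.
w = np.zeros(D)
b = 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    g = p - y
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds.tolist())
```

Because only the small head is trained while the embedding stays frozen, the approach needs far fewer labeled assay results than training a molecule encoder from scratch, which is the setting the abstract targets.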


Florida Atlantic University, United States of America