
Toward Learned Chemical Perception of Force Field Typing Rules

Revised on 12.10.2018, 20:29 and posted on 15.10.2018, 17:44 by Camila Zanette, Caitlin C. Bannan, Christopher I. Bayly, Josh Fass, Michael Gilson, Michael R. Shirts, John Chodera, David Mobley
Molecular mechanics force fields define how the energy and forces of a molecular system are computed from its atomic positions, and enable the study of such systems through computational methods like molecular dynamics and Monte Carlo simulations. Despite progress toward automated force field parameterization, considerable human expertise is required to develop or extend force fields.
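To make the parameter-assignment step concrete, here is a minimal, purely illustrative sketch of a harmonic bond term whose parameters are looked up by atom-type pair. The type labels and constants below are made up for illustration and are not any actual force field's parameters:

```python
import math

# Illustrative sketch: a molecular-mechanics harmonic bond term.
# Parameters are looked up by atom-type pair -- this lookup is the
# assignment step that atom-typing rules control. The type labels
# and constants here are hypothetical, not real force field values.
BOND_PARAMS = {
    ("c3", "c3"): (300.0, 1.526),  # (k in kcal/mol/A^2, r0 in Angstrom)
    ("c3", "o1"): (570.0, 1.229),
}

def bond_energy(type_i, type_j, xyz_i, xyz_j):
    """Harmonic bond energy E = k * (r - r0)**2, with (k, r0) chosen
    by the (order-independent) pair of atom types."""
    k, r0 = BOND_PARAMS[tuple(sorted((type_i, type_j)))]
    r = math.dist(xyz_i, xyz_j)
    return k * (r - r0) ** 2

# At the equilibrium bond length the energy is zero:
e = bond_energy("c3", "c3", (0.0, 0.0, 0.0), (1.526, 0.0, 0.0))
```

Which atoms receive which type labels, and hence which `(k, r0)` pair is retrieved, is exactly what the typing rules decide.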
In particular, human input has long been required to define atom types, which encode the chemically distinct environments that determine which parameters are assigned. However, relying on humans to establish atom types is suboptimal: the resulting atom types are often unjustified from a statistical perspective, leading to over- or under-fitting; they are difficult to extend systematically and consistently when new chemistries must be modeled or new data become available; and human effort does not scale when force fields must be generated for new (bio)polymers or materials. We aim to replace human specification of atom types with an automated approach, grounded in sound statistics and driven by experimental and/or quantum chemical reference data. Here, we describe a novel technology for this purpose, termed SMARTY, which generalizes atom typing by using direct chemical perception with SMARTS strings and adopting a hierarchical approach to type assignment. SMARTY enables the creation of a move set in atom-typing space that can drive a Monte Carlo optimization of atom types. We demonstrate the power of this approach with a fully automated procedure that re-discovers the human-defined atom types in the traditional small-molecule force field parm99/parm@Frosst. Furthermore, we show how an extension of this approach, termed SMIRKY, uses SMIRKS strings to match multiple atoms, allowing us to take full advantage of the advances in direct chemical perception for valence types (bonds, angles, and torsions) afforded by the recently proposed SMIRNOFF direct chemical perception force field typing language. We assess these approaches on several molecular datasets, including one covering a diverse subset of DrugBank.
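The core idea — hierarchical type assignment plus a Monte Carlo move set over typing hierarchies — can be sketched in miniature. This is not the authors' SMARTY implementation: real SMARTY moves operate on SMARTS strings matched by a cheminformatics toolkit, whereas the "patterns" below are hypothetical Python predicates over toy atom records, so only the sampling structure is illustrated:

```python
import math
import random

# Toy atoms as (element, connectivity), with reference types standing
# in for a human-curated force field's atom types (all hypothetical).
ATOMS = [("C", 4), ("C", 3), ("O", 1), ("O", 2), ("N", 3)]
REF_TYPES = ["c3", "c2", "o1", "o2", "n3"]

# Candidate patterns the move set draws from: hypothetical stand-ins
# for SMARTS strings, from generic ("any") to specific ("C3").
PATTERNS = {
    "any": lambda a: True,
    "C":   lambda a: a[0] == "C",
    "O":   lambda a: a[0] == "O",
    "C3":  lambda a: a == ("C", 3),
    "O1":  lambda a: a == ("O", 1),
}

def assign(hierarchy):
    """Hierarchical type assignment: the last matching pattern wins."""
    types = []
    for atom in ATOMS:
        label = None
        for name in hierarchy:
            if PATTERNS[name](atom):
                label = name
        types.append(label)
    return types

def score(hierarchy):
    """Pairwise agreement with the reference typing: a pair of atoms
    counts when proposal and reference agree on whether the two atoms
    share a type. The maximum here is C(5, 2) = 10."""
    t = assign(hierarchy)
    return sum(
        (t[i] == t[j]) == (REF_TYPES[i] == REF_TYPES[j])
        for i in range(len(ATOMS)) for j in range(i + 1, len(ATOMS))
    )

def mc_optimize(steps=2000, seed=0):
    """Metropolis Monte Carlo over hierarchies: propose adding or
    removing a non-root pattern; accept improvements always and
    regressions with probability exp(delta) (effective kT = 1)."""
    rng = random.Random(seed)
    hierarchy, best = ["any"], ["any"]
    for _ in range(steps):
        proposal = list(hierarchy)
        name = rng.choice(list(PATTERNS))
        if name in proposal and name != "any":
            proposal.remove(name)       # "destroy type" move
        elif name not in proposal:
            proposal.append(name)       # "create type" move
        delta = score(proposal) - score(hierarchy)
        if delta >= 0 or rng.random() < math.exp(delta):
            hierarchy = proposal
        if score(hierarchy) > score(best):
            best = list(hierarchy)
    return best, score(best)

best_hierarchy, best_score = mc_optimize()
```

A hierarchy containing all five patterns in generic-to-specific order assigns every toy atom a distinct type and scores a perfect 10; the sampler starts from the single generic type and discovers splits that improve agreement with the reference typing.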


DLM, CCB, and CZ appreciate financial support from the National Science Foundation (CHE 1352608) and the National Institutes of Health (1R01GM108889-01), and computing support from the UCI GreenPlanet cluster, supported in part by NSF Grant CHE-0840513. CCB is also supported by a fellowship from The Molecular Sciences Software Institute under NSF grant ACI-1547580. CZ also appreciates the Brazilian Science without Borders scholarship (Capes - BEX 1865612-9). MKG appreciates financial support from the NIH (GM061300). JDC appreciates support from the Sloan Kettering Institute, NIH grant P30 CA008748, and Cycle for Survival. JF acknowledges support from NSF grant CHE-1738979. MRS acknowledges support from NSF grant CHE-173897. CIB acknowledges support by OpenEye Scientific Software for his sabbatical term contributing to this work.



University of California, Irvine


United States of America


Declaration of Conflict of Interest

MKG has an equity interest in, and is a co-founder and scientific advisor of, VeraChem LLC. DLM serves on the scientific advisory board of OpenEye Scientific Software. JDC serves on the scientific advisory board of Schrödinger. To the best of our knowledge, no conflict of interest exists, as this work is independent of these entities and is free and open source.