Parametrically Managed Activation Function for Fitting a Neural Network Potential with Physical Behavior Enforced by a Low-Dimensional Potential

27 April 2023, Version 2
This content is a preprint and has not undergone peer review at the time of posting.

Abstract

Machine-learned representations of potential energy surfaces generated in the output layer of a feedforward neural network are becoming increasingly popular. One difficulty with neural-network output is that it is often unreliable in regions where training data is missing or sparse. Human-designed potentials often build in proper extrapolation behavior through the choice of functional form. Because machine learning is very efficient, it is desirable to learn how to add human intelligence to machine-learned potentials in a convenient way. One example is the well-understood feature of interaction potentials that they vanish when subsystems are too far separated to interact. In this article, we present a way to add a new kind of activation function to a neural network to enforce low-dimensional constraints. In particular, the activation function depends parametrically on all the input variables. We illustrate the use of this approach by showing how it can force an interaction potential to go to zero at large subsystem separations without requiring either the input of a specific functional form for the potential or the addition of data to the training set in the asymptotic region of geometries where the subsystems are separated. In the process of illustrating this, we present an improved set of potential energy surfaces for the 14 lowest 3A′ states of O3. The method is more general than this example, and it may be used to add other low-dimensional knowledge or lower-level knowledge to machine-learned potentials. In addition to the O3 example, we present a more general method called parametrically managed diabatization by deep neural network (PM-DDNN), which is an improvement on our previously presented permutationally restrained diabatization by deep neural network (PR-DDNN).
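To make the central idea concrete, the following is a minimal, hypothetical sketch, not the authors' code and not necessarily the functional form used in the article. It assumes a switching function with illustrative cutoff distances, uses the second-smallest internuclear distance as a crude stand-in for the subsystem separation in a triatomic system, and the names switching_function and parametrically_managed_output are invented for this example. The point it illustrates is only the qualitative mechanism: an output-layer "activation" that depends parametrically on the input geometry and smoothly drives the prediction to a known low-dimensional limit at large separation.

    # Hypothetical sketch (not the authors' implementation): an output-layer
    # activation whose behavior depends parametrically on the input geometry,
    # so the predicted interaction energy is smoothly driven to a prescribed
    # asymptotic value when the subsystems are far apart.
    import numpy as np

    def switching_function(r, r_on=6.0, r_off=9.0):
        """Smoothly go from 1 (interacting) to 0 (separated).

        r_on and r_off are illustrative cutoff distances, not values from the paper.
        """
        if r <= r_on:
            return 1.0
        if r >= r_off:
            return 0.0
        x = (r - r_on) / (r_off - r_on)
        return 1.0 - x * x * (3.0 - 2.0 * x)  # reversed cubic smoothstep, C1 continuous

    def parametrically_managed_output(nn_value, distances, asymptotic_value=0.0):
        """Output-layer 'activation' that depends parametrically on the geometry.

        nn_value         -- raw value from the last hidden layer of the network
        distances        -- internuclear distances describing the input geometry
        asymptotic_value -- limit enforced at large separation (zero for an
                            interaction potential, or a low-dimensional potential
                            evaluated at this geometry)
        """
        d = np.sort(np.asarray(distances, dtype=float))
        r_sep = d[1]  # second-smallest distance as a crude stand-in for the
                      # atom-diatom separation (illustrative assumption only)
        s = switching_function(r_sep)
        return s * nn_value + (1.0 - s) * asymptotic_value

    # Example: the raw network output is retained in the interaction region and
    # smoothly replaced by the asymptotic value as the subsystems separate.
    print(parametrically_managed_output(0.8, [2.3, 3.0, 4.1]))    # interacting: ~0.8
    print(parametrically_managed_output(0.8, [2.3, 12.0, 13.5]))  # separated:  ~0.0

Because the switch is built into the activation rather than applied as a post-processing step, the constraint is present during training as well as prediction; the specific parametrization above is a guess for illustration, not the one developed in the article.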

Keywords

Potential Energy Surface
Nonadiabatic
Neural Network
Machine Learning
Oxygen
Activation Function
