TY - JOUR
T1 - Parametrically Managed Activation Function for Fitting a Neural Network Potential with Physical Behavior Enforced by a Low-Dimensional Potential
AU - Akher, Farideh Badichi
AU - Shu, Yinan
AU - Varga, Zoltan
AU - Bhaumik, Suman
AU - Truhlar, Donald G.
N1 - Publisher Copyright:
© 2023 American Chemical Society.
PY - 2023/6/22
Y1 - 2023/6/22
N2 - Machine-learned representations of potential energy surfaces generated in the output layer of a feedforward neural network are becoming increasingly popular. One difficulty with neural network output is that it is often unreliable in regions where training data is missing or sparse. Human-designed potentials often build in proper extrapolation behavior by choice of functional form. Because machine learning is very efficient, it is desirable to learn how to add human intelligence to machine-learned potentials in a convenient way. One example is the well-understood feature of interaction potentials that they vanish when subsystems are too far separated to interact. In this article, we present a way to add a new kind of activation function to a neural network to enforce low-dimensional constraints. In particular, the activation function depends parametrically on all of the input variables. We illustrate the use of this step by showing how it can force an interaction potential to go to zero at large subsystem separations without either inputting a specific functional form for the potential or adding data to the training set in the asymptotic region of geometries where the subsystems are separated. In the process of illustrating this, we present an improved set of potential energy surfaces for the 14 lowest ³A′ states of O₃. The method is more general than this example, and it may be used to add other low-dimensional knowledge or lower-level knowledge to machine-learned potentials. In addition to the O₃ example, we present a greater-generality method called parametrically managed diabatization by deep neural network (PM-DDNN) that is an improvement on our previously presented permutationally restrained diabatization by deep neural network (PR-DDNN).
AB - Machine-learned representations of potential energy surfaces generated in the output layer of a feedforward neural network are becoming increasingly popular. One difficulty with neural network output is that it is often unreliable in regions where training data is missing or sparse. Human-designed potentials often build in proper extrapolation behavior by choice of functional form. Because machine learning is very efficient, it is desirable to learn how to add human intelligence to machine-learned potentials in a convenient way. One example is the well-understood feature of interaction potentials that they vanish when subsystems are too far separated to interact. In this article, we present a way to add a new kind of activation function to a neural network to enforce low-dimensional constraints. In particular, the activation function depends parametrically on all of the input variables. We illustrate the use of this step by showing how it can force an interaction potential to go to zero at large subsystem separations without either inputting a specific functional form for the potential or adding data to the training set in the asymptotic region of geometries where the subsystems are separated. In the process of illustrating this, we present an improved set of potential energy surfaces for the 14 lowest ³A′ states of O₃. The method is more general than this example, and it may be used to add other low-dimensional knowledge or lower-level knowledge to machine-learned potentials. In addition to the O₃ example, we present a greater-generality method called parametrically managed diabatization by deep neural network (PM-DDNN) that is an improvement on our previously presented permutationally restrained diabatization by deep neural network (PR-DDNN).
UR - http://www.scopus.com/inward/record.url?scp=85163756970&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85163756970&partnerID=8YFLogxK
U2 - 10.1021/acs.jpca.3c02627
DO - 10.1021/acs.jpca.3c02627
M3 - Article
C2 - 37307218
AN - SCOPUS:85163756970
SN - 1089-5639
VL - 127
SP - 5287
EP - 5297
JO - Journal of Physical Chemistry A
JF - Journal of Physical Chemistry A
IS - 24
ER -