Neural Networks in Chemical Reaction Dynamics

Author(s): Lionel Raff, Ranga Komanduri, Martin Hagan, Satish Bukkapatnam

This monograph presents recent advances in neural network (NN) approaches and applications to chemical reaction dynamics. Topics covered include: (i) the development of ab initio potential-energy surfaces (PES) for complex multichannel systems using modified novelty sampling and feedforward NNs; (ii) methods for sampling the regions of configuration space that are of critical importance, such as trajectory and novelty sampling methods and gradient fitting methods; (iii) parametrization of interatomic potential functions using a genetic algorithm accelerated with a NN; (iv) parametrization of analytic interatomic potential functions using NNs; (v) self-starting methods for obtaining analytic PES from ab initio electronic structure calculations using direct dynamics; (vi) development of a novel method, the combined function derivative approximation (CFDA), for simultaneous fitting of a PES and its corresponding force fields using feedforward NNs; (vii) development of generalized PES using many-body expansions, NNs, and moiety energy approximations; (viii) NN methods for data analysis, reaction probabilities, and statistical error reduction in chemical reaction dynamics; (ix) accurate prediction of higher-level electronic structure energies (e.g., MP4 or higher) for large databases using NNs, lower-level (Hartree-Fock) energies, and small subsets of the higher-level database; and finally (x) illustrative examples of NN applications to chemical reaction dynamics of increasing complexity, starting from simple near-equilibrium structures (vibrational state studies) and progressing to more complex non-adiabatic reactions. The monograph was prepared by an interdisciplinary group of researchers who have worked as a team for nearly two decades at Oklahoma State University, Stillwater, OK, with expertise in gas-phase reaction dynamics; neural networks; various aspects of molecular dynamics (MD) and Monte Carlo (MC) simulations of nanometric cutting, tribology, and material properties at the nanoscale; scaling laws from the atomistic to the continuum; and NN applications to chemical reaction dynamics. It is anticipated that this emerging field of NNs in chemical reaction dynamics will play an increasingly important role in MD, MC, and quantum mechanical studies in the years to come.
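Item (ix) above is concrete enough to sketch in code. The following is a minimal illustration, not the authors' implementation: the data, array shapes, and network size are hypothetical, and scikit-learn's MLPRegressor stands in for whatever feedforward network the monograph uses. The workflow is the one the abstract describes: train on a small subset of the high-level database, using cheap lower-level (Hartree-Fock) energies as inputs, then predict high-level (MP4-like) energies for the full database.

```python
# Minimal sketch of item (ix): predicting higher-level (e.g., MP4) energies
# from lower-level (Hartree-Fock) energies with a feedforward NN trained on
# only a small subset of the database. All data here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

n_total, n_train = 5000, 500  # large database, small high-level training subset
# Placeholder HF energies (hartree); real inputs would include structure variables.
e_hf = rng.uniform(-80.0, -75.0, size=(n_total, 1))
# Placeholder "true" higher-level energies, a smooth correction to the HF values.
e_mp4 = e_hf[:, 0] - 0.3 + 0.01 * np.sin(10.0 * e_hf[:, 0])

# Draw the small training subset from the full database.
subset = rng.choice(n_total, size=n_train, replace=False)

# Feedforward network (multilayer perceptron) with one hidden layer of 20 neurons.
net = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(e_hf[subset], e_mp4[subset])

# Predict higher-level energies for the entire database from the cheap inputs.
e_pred = net.predict(e_hf)
rmse = np.sqrt(np.mean((e_pred - e_mp4) ** 2))
print(f"RMSE over full database: {rmse:.4f} hartree")
```

In practice the network inputs would also include the atomic structure variables; the sketch shows only the small-subset training workflow, under the stated assumptions.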

Author(s): Lionel Raff, Ranga Komanduri, Martin Hagan, Satish Bukkapatnam

In this section, we give a brief introduction to neural networks (NNs). It is written for readers who are not familiar with neural networks but are curious about how they can be applied to practical problems in chemical reaction dynamics. The field of neural networks is very broad, and it is not possible to discuss all types of networks here. Instead, we will concentrate on the most common neural network architecture, the multilayer perceptron (MLP). We will describe the basics of this architecture, discuss its capabilities, and show how it has been used on several different chemical reaction dynamics problems (for introductions to other types of networks, the reader is referred to References 105-107).

For the purposes of this document, we will view neural networks as function approximators. As shown in Figure 3-1, we have some unknown function that we wish to approximate. We want to adjust the parameters of the network so that it will produce the same response as the unknown function when the same input is applied to both systems. For our applications, the unknown function may correspond to the relationship between the atomic structure variables and the resulting potential energy and forces.

The multilayer perceptron is built up of simple components. We will begin with a single-input neuron, which we will then extend to multiple inputs. We will next stack these neurons together to produce layers. Finally, we will cascade the layers together to form the network. A single-input neuron is shown in Figure 3-2. The scalar input p is multiplied by the scalar weight w to form wp, one of the terms that is sent to the summer. The other input, 1, is multiplied by a bias b and then passed to the summer. The summer output n, often referred to as the net input, goes into a transfer function f, which produces the scalar neuron output a.
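In equation form, the single-input neuron computes a = f(wp + b). A minimal sketch in Python follows; the function names and the choice of a log-sigmoid transfer function are ours, for illustration only:

```python
# Minimal sketch of the single-input neuron described above: the net input
# n = w*p + b is passed through a transfer function f to give the output a.
import numpy as np

def logsig(n):
    """Log-sigmoid transfer function, one common choice (illustrative here)."""
    return 1.0 / (1.0 + np.exp(-n))

def single_input_neuron(p, w, b, f=logsig):
    """Scalar input p, weight w, bias b; returns the neuron output a = f(w*p + b)."""
    n = w * p + b  # net input: weighted input plus bias
    return f(n)

# Example: a neuron with weight 2.0 and bias -1.0 applied to the input 0.5.
a = single_input_neuron(p=0.5, w=2.0, b=-1.0)
print(a)  # net input is 0.0, so logsig gives 0.5
```

Extending to multiple inputs replaces the single product wp with a sum of weighted inputs, and stacking such neurons gives the layers and networks described in what follows.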


