Neurodynamics and Neural Networks
Individual neurons are modelled as nonlinear oscillators that rely on bistability and homoclinic orbits to produce spiking action potentials. Simplified mathematical models, such as the FitzHugh–Nagumo and NaK models, capture successively more sophisticated single-neuron behavior, including excitability thresholds and spiking. Artificial neurons are introduced that combine three simple features: summation of inputs, comparison with a threshold, and a saturating output. Networks of artificial neurons are defined through specific architectures that include the perceptron, feedforward networks with hidden layers trained using the Delta Rule, and recurrent networks with feedback. A prevalent example of a recurrent network is the Hopfield network, which performs operations such as associative memory recall. The dynamic trajectories of the Hopfield network have basins of attraction in state space that correspond to the stored memories.
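The threshold and spiking behavior described above can be illustrated with a minimal forward-Euler simulation of the FitzHugh–Nagumo model. This is a sketch under assumed parameter values (a = 0.7, b = 0.8, eps = 0.08, and the two stimulus currents), which are common textbook choices rather than values taken from this chapter: a subthreshold current leaves the neuron near its resting state, while a suprathreshold current drives repetitive spiking.

```python
import numpy as np

def simulate_fhn(I_ext, v0=-1.2, w0=-0.62, a=0.7, b=0.8, eps=0.08,
                 dt=0.01, t_max=200.0):
    """Forward-Euler integration of the FitzHugh-Nagumo model,
    started near the resting equilibrium (illustrative parameters)."""
    n = int(t_max / dt)
    v, w = v0, w0
    v_trace = np.empty(n)
    for k in range(n):
        dv = v - v**3 / 3 - w + I_ext   # fast voltage-like variable
        dw = eps * (v + a - b * w)      # slow recovery variable
        v, w = v + dt * dv, w + dt * dw
        v_trace[k] = v
    return v_trace

rest = simulate_fhn(I_ext=0.0)    # subthreshold: stays near rest
spike = simulate_fhn(I_ext=0.5)   # suprathreshold: repetitive spiking
print(rest.max(), spike.max())
```

Sweeping `I_ext` between these two values locates the excitability threshold, where the resting state loses stability and the spiking limit cycle appears.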
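The three features of an artificial neuron, combined with error-driven weight updates in the style of the Delta Rule, can be sketched with a single threshold unit learning the linearly separable logical AND function. The learning rate, epoch count, and use of a step output (which makes this the perceptron form of the rule) are illustrative assumptions, not details from the text:

```python
import numpy as np

def neuron(x, w, b):
    # summation of inputs, comparison with a threshold (bias),
    # and a saturating (step) output
    return 1.0 if np.dot(w, x) + b > 0 else 0.0

# error-driven training of a single neuron on logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
w, b, eta = np.zeros(2), 0.0, 0.1
for _ in range(50):
    for x, t in zip(X, y):
        err = t - neuron(x, w, b)
        w += eta * err * x   # weight change proportional to error x input
        b += eta * err
print([neuron(x, w, b) for x in X])  # → [0.0, 0.0, 0.0, 1.0]
```

A single unit like this can only separate its inputs with a hyperplane; the feedforward networks with hidden layers mentioned above are what lift that restriction.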
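Associative recall in a Hopfield network can be sketched as follows: patterns are stored with a Hebbian outer-product rule, and asynchronous threshold updates let a corrupted input fall into the basin of attraction of the nearest stored memory. The specific bipolar patterns and the 8-unit network size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# store two bipolar (+1/-1) patterns with the Hebbian outer-product rule
patterns = np.array([[1, 1, 1, -1, -1, -1, 1, -1],
                     [-1, -1, 1, 1, 1, -1, -1, 1]])
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)  # no self-connections

def recall(state, steps=20):
    # asynchronous updates: each flip moves downhill on the energy
    # landscape, so the state settles into a basin of attraction
    state = state.copy()
    for _ in range(steps):
        for i in rng.permutation(n):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

corrupted = patterns[0].copy()
corrupted[:2] *= -1             # flip two bits of the first memory
print(recall(corrupted))        # recovers patterns[0]
```

Each stored pattern sits at a local minimum of the network's energy function, which is why the recall dynamics converge to it from nearby (partially corrupted) states.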