I. Introduction to Neural Networks
I. Supervised Learning

A. Introduction to Perceptron

Perceptron refers to a class of neural networks developed more than forty years ago. The learning rule associated with it is a modification of an earlier algorithm called the Hebb rule. A learning rule is a means of finding a suitable set of weights to match the input patterns to their target outputs. The process is an iterative one in which a sequence of patterns is presented to the network and the weights, initially set to zero or to random values, are updated according to the particular rule used. The patterns are presented repeatedly until the outputs match, or are arbitrarily close to, the target values. Each complete pass through the set of training patterns is referred to as a training epoch.
The Perceptron algorithm is shown below. It assumes bipolar outputs and is based on the rationale that adding to, or subtracting from, each weight the product of the input and target output values on the associated line moves the output toward its target value.
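To make the procedure concrete, the following Python sketch implements the training loop just described for a single output unit with bipolar targets. The function name perceptron_train, the learning rate, the threshold theta, and the stopping criterion (no weight change over a full epoch) are illustrative assumptions, not details taken from the algorithm listing referred to above.

```python
import numpy as np

def perceptron_train(patterns, targets, lr=1.0, theta=0.0, max_epochs=100):
    """Train a single-layer perceptron with bipolar (+1/-1) targets."""
    weights = np.zeros(patterns.shape[1])  # weights start at zero, as described above
    bias = 0.0
    for epoch in range(1, max_epochs + 1):
        changed = False
        for x, t in zip(patterns, targets):
            net = bias + np.dot(weights, x)          # net input to the output unit
            y = 1 if net > theta else (-1 if net < -theta else 0)
            if y != t:
                # Add (or subtract) the product of the input and target values
                weights = weights + lr * t * x
                bias = bias + lr * t
                changed = True
        if not changed:   # stop once a full epoch produces no weight changes
            return weights, bias, epoch
    return weights, bias, max_epochs

# Example: learning the AND function with bipolar inputs and targets
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
t = np.array([1, -1, -1, -1])
w, b, epochs = perceptron_train(X, t)
```

In this sketch the weight change for a misclassified pattern is lr * t * x, so each weight is increased or decreased by the product of its input value and the target output, which is the rationale stated above.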