Neural nets
Principles

The neuron is the basic element of every neural network; it processes input data according to a chosen method.

Fig. 7.2: Neuron

According to the type of input data they process, neurons are classified as binary or continuous, the latter processing multi-valued information. The term "continuous" refers to the underlying philosophy of signal processing; in the absolute majority of cases the signal is in fact discretized, since neurons and neural networks are simulated on a computer. Of the original hardware implementations, only the class of optical implementations remains important.
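As an illustration of both neuron types, the following Python sketch computes the internal potential of a neuron as a weighted sum of its inputs and produces either a binary or a continuous output. The weights, threshold, and function names are illustrative assumptions, not taken from the text.

    import math

    def internal_potential(inputs, weights, bias=0.0):
        # Internal potential: weighted sum of the inputs plus a bias term.
        return sum(x * w for x, w in zip(inputs, weights)) + bias

    def binary_neuron(inputs, weights, threshold=0.0):
        # Binary neuron: fires (outputs 1) when the potential exceeds the threshold.
        return 1 if internal_potential(inputs, weights) > threshold else 0

    def continuous_neuron(inputs, weights, bias=0.0):
        # Continuous neuron: a sigmoid maps the potential into (0, 1).
        return 1.0 / (1.0 + math.exp(-internal_potential(inputs, weights, bias)))

    print(binary_neuron([0.5, 0.8], [1.0, -0.4]))      # 1
    print(continuous_neuron([0.5, 0.8], [1.0, -0.4]))  # approx 0.545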

An important term in the field of neural networks is the transfer function, sometimes referred to as the activation function. This function converts the internal potential of the neuron into the range of output values. The most important functions are shown in the following figures.

Fig. 7.3: Linear transfer function
Fig. 7.4: Sigmoid transfer function
Fig. 7.5: Bounded function
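The three functions can be sketched in Python as follows. The gain, slope, and output bounds are illustrative parameters, and the bounded function of Fig. 7.5 is read here as a saturating linear function, which is one plausible interpretation.

    import math

    def linear(potential, slope=1.0):
        # Linear transfer function (Fig. 7.3): output proportional to the potential.
        return slope * potential

    def sigmoid(potential, gain=1.0):
        # Sigmoid transfer function (Fig. 7.4): smooth mapping into (0, 1).
        return 1.0 / (1.0 + math.exp(-gain * potential))

    def bounded(potential, low=0.0, high=1.0):
        # Bounded transfer function (Fig. 7.5, read as saturating linear):
        # the output is clipped to stay within [low, high].
        return max(low, min(high, potential))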

Neural networks operate in two basic phases – a learning phase (adaptation) and an active phase (relaxation).

During learning, the NN is changed in such a way that the network becomes adapted to the solution of the given problem. Learning is realized by setting the weights between nodes. In practice, this starts with the assignment of initial values, either random, chosen from experience, or taken from a similar problem solved previously. This is followed by the introduction of a training input.
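A minimal sketch of this initialization step, assuming small random initial weights; the scale, seed, and dimensions are illustrative choices:

    import random

    random.seed(42)  # fixed seed only to make the example reproducible

    def init_weights(n_inputs, scale=0.1):
        # Small random initial weights; choosing them from experience or
        # from a previously solved problem would replace this step.
        return [random.uniform(-scale, scale) for _ in range(n_inputs)]

    weights = init_weights(3)
    training_input = [1.0, 0.0, 1.0]  # a training input is then presented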

Learning is further divided into supervised learning and unsupervised learning.

During supervised learning, the NN learns by comparing the actual output with the required output, and the weights are adapted towards the best match. Decreasing the difference is controlled by a learning algorithm.
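The text names no particular learning algorithm; the delta rule below is one standard example of such error-driven weight adaptation, with illustrative inputs, target, and learning rate:

    def train_step(weights, inputs, target, rate=0.1):
        # Actual output of a simple linear neuron.
        actual = sum(x * w for x, w in zip(inputs, weights))
        # The difference between required and actual output drives the
        # weight adaptation (delta rule).
        error = target - actual
        return [w + rate * error * x for w, x in zip(weights, inputs)]

    weights = [0.0, 0.0]
    for _ in range(50):
        weights = train_step(weights, [1.0, 2.0], target=1.0)
    # the output now matches the required output almost exactly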

Unsupervised learning, on the other hand, has no specific validity criterion. The learning proceeds such that the algorithm searches the input data for samples with similar properties. Such learning is sometimes called self-organization.
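A hypothetical minimal example of such self-organization is competitive learning, where each input sample pulls the most similar reference vector towards itself; all names and values below are illustrative:

    def closest(centers, sample):
        # Index of the center most similar to the sample (squared distance).
        dists = [sum((c - s) ** 2 for c, s in zip(center, sample))
                 for center in centers]
        return dists.index(min(dists))

    def self_organize(centers, samples, rate=0.2):
        # Each sample pulls its nearest center towards itself, so the
        # centers drift towards groups of samples with similar properties.
        for sample in samples:
            i = closest(centers, sample)
            centers[i] = [c + rate * (s - c) for c, s in zip(centers[i], sample)]
        return centers

    centers = [[0.0, 0.0], [1.0, 1.0]]
    samples = [[0.1, 0.2], [0.9, 1.1], [0.0, 0.1], [1.2, 0.9]]
    centers = self_organize(centers, samples)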

During the active phase (relaxation), the input data bring about a non-equilibrium state in the output layer. The values stored within the neurons then begin to change under the influence of the other neurons until a stable equilibrium is established.
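One concrete setting in which this relaxation occurs is a Hopfield-type network; the sketch below assumes bipolar states (±1) and symmetric weights, which are assumptions of this illustration rather than details given in the text:

    def relax(states, weights, max_steps=100):
        # Neurons repeatedly update under the influence of the others
        # until no state changes, i.e. a stable equilibrium is reached.
        for _ in range(max_steps):
            changed = False
            for i in range(len(states)):
                potential = sum(weights[i][j] * states[j]
                                for j in range(len(states)) if j != i)
                new_state = 1 if potential >= 0 else -1
                if new_state != states[i]:
                    states[i] = new_state
                    changed = True
            if not changed:
                break  # equilibrium: no neuron wants to change its state
        return states

    # Symmetric weights that favour equal states of the two neurons.
    weights = [[0.0, 1.0], [1.0, 0.0]]
    print(relax([1, -1], weights))  # [-1, -1], a stable state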