
Neural network approach

One of the most widely used neural network models is the multi-layer perceptron [6]. The network has a layer of input elements and a layer of output elements, called neurons; several "hidden" layers can be located between them. Each neuron in a layer is connected to the neurons of the adjacent layers by a modifiable connection, but there are no connections between neurons of the same layer. An input vector is applied to the input layer, passes through the hidden layers, and arrives at the output layer. The corresponding output vector is determined by computing the activity level of the neurons in each layer from the already known activity values of the previous layer.
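The layer-by-layer propagation described above can be sketched as follows. This is a minimal sketch, not the authors' implementation: the sigmoid activation function and the layer sizes are our illustrative assumptions, since the paper does not specify them.

```python
import numpy as np

def sigmoid(x):
    """Common choice of smooth neuron activation function (an assumption here)."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, layers):
    """Propagate an input vector through the network.

    layers: list of (W, b) pairs, one per connection layer; W maps the
    activities of the previous layer to the pre-activations of the next.
    """
    a = x
    for W, b in layers:
        # activity of each layer is computed from the already known
        # activity of the previous layer
        a = sigmoid(W @ a + b)
    return a  # output vector

# example: a 2-3-1 perceptron (one hidden layer of 3 neurons)
rng = np.random.default_rng(0)
layers = [(rng.normal(0.0, 0.1, (3, 2)), np.zeros(3)),
          (rng.normal(0.0, 0.1, (1, 3)), np.zeros(1))]
y = forward(np.array([0.4, -0.7]), layers)
```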

To train such a perceptron, we used the back-propagation algorithm. Before training began, small random values were assigned to the connections. Training was an iterative process, and each iteration consisted of two phases. In the first phase, an input vector was applied to the network; the signal then passed through the network and generated an output vector. The resulting output vector was compared with the required one. If they coincided, no training of the perceptron was carried out for this input vector. Otherwise, the difference between the actual and required output values was computed and propagated back, layer by layer, from the output layer to the input one. Based on this information, the connections were modified using the generalized delta rule [6]. The weight modification was applied to each input-output pair, and training continued until the errors dropped below a given tolerance.
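The two-phase iteration above can be sketched in code. This is a hedged sketch of generic back-propagation with the delta rule, not the authors' code: the sigmoid activation, the learning rate, the squared-error measure, and the layer sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_layers(sizes):
    # small random values assigned to connections before training
    return [(rng.normal(0.0, 0.1, (m, n)), np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def train_step(layers, x, t, lr=0.5):
    """One iteration for one input-output pair (x, t)."""
    # phase 1: forward pass, storing each layer's activity
    acts = [x]
    for W, b in layers:
        acts.append(sigmoid(W @ acts[-1] + b))
    # phase 2: send the output error back, layer by layer;
    # for sigmoid units the local error is (actual - required) * a * (1 - a)
    delta = (acts[-1] - t) * acts[-1] * (1.0 - acts[-1])
    for i in reversed(range(len(layers))):
        W, b = layers[i]
        # delta-rule weight change: local error times presynaptic activity
        layers[i] = (W - lr * np.outer(delta, acts[i]), b - lr * delta)
        if i > 0:
            delta = (W.T @ delta) * acts[i] * (1.0 - acts[i])
    return layers

# train on a single illustrative pair until the output nears the target
layers = init_layers([2, 3, 1])
x, t = np.array([0.5, 0.2]), np.array([0.9])
for _ in range(2000):
    layers = train_step(layers, x, t)
```

In practice the stopping test would compare the error against the given tolerance for every pair in the training set, rather than running a fixed number of iterations.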

The perceptron was trained on a limited set of input/output parameters. In the process of training it built a surface in the space of these parameters, thereby defining and memorizing a continuous function that associates the input and output parameters.
root 2003-04-11