
The image in the top frame represents a backpropagation network. Click on the different parts of the net to get an explanation.






Input pattern:

contains the values that encode the stimulus. Often the pattern contains binary values.
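
As a minimal illustration (the stimulus and its features are hypothetical, not taken from this page), a binary input pattern in Python might look like this:

    # a hypothetical binary input pattern: each position encodes one
    # feature of the stimulus (1 = feature present, 0 = absent)
    input_pattern = [1, 0, 0, 1, 1, 0]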




Input layer:

contains the input units, which receive the values of the input pattern and compute the value to send to the hidden units. In a backpropagation network a sigmoid, or logistic, function is used to compute this value.
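
A minimal Python sketch of the logistic function (this is the standard definition, not something specific to this page):

    import math

    def sigmoid(x):
        # logistic function: maps any real number into the interval (0, 1)
        return 1.0 / (1.0 + math.exp(-x))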




Weights:

every connection between the different layers is weighted. The value of each input unit is multiplied by the weight of the connection, and the sum of these weighted values is received as input by the hidden units.
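
As a sketch, the weighted sum that a receiving unit computes could be written as follows (the names are illustrative):

    def net_input(values, weights):
        # multiply each sending unit's value by its connection weight
        # and sum the results
        return sum(v * w for v, w in zip(values, weights))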




Hidden layer:

contains the hidden units, which receive the weighted values of the input units and compute the value to send to the output units. The values of the hidden units, together with the weights of the connections, form the internal representation of the network, and are computed entirely by the network, without human interference. This internal representation contains implicit knowledge of the relations that hold between the different patterns. Some researchers focus on extracting explicit knowledge from this implicit knowledge, for example Clark and Karmiloff-Smith, who put forward the representational redescription hypothesis in 1993.
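
Reusing the sigmoid and net_input sketches above, the activations of a whole layer can be computed as below. The weight values are arbitrary illustrative numbers, and weights_input_hidden is a hypothetical matrix with one row of weights per hidden unit:

    def layer_activations(values, weight_matrix):
        # one row of weights per unit in the receiving layer
        return [sigmoid(net_input(values, row)) for row in weight_matrix]

    # arbitrary illustrative weights for a net with two hidden units
    weights_input_hidden = [[0.1, -0.4, 0.2, 0.7, -0.3, 0.5],
                            [0.6, 0.1, -0.2, -0.5, 0.4, 0.3]]

    # the hidden activations constitute the network's internal
    # representation of the stimulus
    hidden = layer_activations(input_pattern, weights_input_hidden)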




Weights:

every connection between the different layers is weighted. The value of each hidden unit is multiplied by the weight of the connection, and the sum of these weighted values is received as input by the output units.




Output layer:

the output units in the output layer receive the weighted output of the hidden units and then compute the output pattern, using the sigmoid function.
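
Under the same assumptions as the sketches above, the output pattern is just another application of layer_activations, this time to the hidden activations. weights_hidden_output is again a hypothetical matrix, with one row of weights per output unit:

    # arbitrary illustrative weights: one row per output unit,
    # one entry per hidden unit
    weights_hidden_output = [[0.8, -0.6],
                             [-0.3, 0.9]]

    output = layer_activations(hidden, weights_hidden_output)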




Output pattern:

the output units produce the output pattern, which represents the response to the received stimulus. In a backpropagation network, the output pattern is compared to a target output pattern. This comparison yields an error value, which is propagated back to the hidden and input layers, and the weights of the connections are adjusted so that, in the following cycle, the error value will have diminished.
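
A hedged sketch of one such training cycle, assuming a sum-of-squares error measure and sigmoid units (so the derivative of each activation is out * (1 - out)); all names are illustrative and reuse the helpers sketched above, with bias terms omitted for brevity:

    def train_step(pattern, target, w_ih, w_ho, lr=0.5):
        # forward pass: stimulus -> hidden layer -> output pattern
        hidden = layer_activations(pattern, w_ih)
        output = layer_activations(hidden, w_ho)

        # error signal of each output unit: (target - output) scaled by
        # the sigmoid derivative out * (1 - out)
        delta_out = [(t - o) * o * (1 - o) for t, o in zip(target, output)]

        # propagate the error back to the hidden units through the
        # hidden-to-output weights
        delta_hid = [h * (1 - h) * sum(d * w_ho[k][j]
                                       for k, d in enumerate(delta_out))
                     for j, h in enumerate(hidden)]

        # adjust the weights so that the error diminishes next cycle
        for k, d in enumerate(delta_out):
            for j, h in enumerate(hidden):
                w_ho[k][j] += lr * d * h
        for j, d in enumerate(delta_hid):
            for i, v in enumerate(pattern):
                w_ih[j][i] += lr * d * v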

Thomas Riga, University of Genoa, Italy