Monday, October 27, 2003
11:00 AM
CSB 209
Geoffrey Hinton
U. Toronto
Contrastive Backpropagation
I will describe a way of modeling high-dimensional data, such as images, by using an unsupervised, non-linear, multilayer neural network in which the activity of each neuron-like unit makes an additive contribution to a global energy score that indicates how surprised the network is by the current data. Units whose activities represent violations of learned constraints contribute positively to the global energy, and units whose activities represent the presence of familiar features contribute negatively. The connection weights, which determine how the activity of each unit depends on the activities in earlier layers, are learned by minimizing the energy assigned to data-vectors that are actually observed and maximizing the energy assigned to "confabulations." The confabulations are generated by perturbing an observed data-vector in a direction that decreases its energy under the current model. This learning rule eliminates any systematic differences between the data and the confabulations. Backpropagation of energy derivatives through the multilayer network is used both for computing the derivatives that are needed to adjust the weights and for computing how to perturb an observed data-vector to produce a confabulation that has lower energy. When shown patches of natural images, the learning procedure generates nice topographic maps of oriented filters that exhibit continuity in orientation, scale, and position. When given the 3-D coordinates of the joints in an arm, the learning procedure discovers the highly non-linear constraints that result from the fixed lengths of the links, and it works well even if some of the coordinates are missing.
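
As a rough illustration of the learning rule described in the abstract, here is a minimal JAX sketch: backpropagation of the energy's derivatives supplies both the weight gradients and the perturbation that turns an observed data-vector into a lower-energy confabulation. The two-layer network, the per-unit energy term log(1 + a^2), and all step sizes are illustrative assumptions rather than details of the model presented in the talk.

import jax
import jax.numpy as jnp

def init_params(key, d_in, d_hid):
    k1, k2 = jax.random.split(key)
    return {
        "W1": 0.1 * jax.random.normal(k1, (d_in, d_hid)),
        "W2": 0.1 * jax.random.normal(k2, (d_hid, d_hid)),
    }

def energy(params, x):
    # Each unit's activity makes an additive contribution to the global
    # energy; the smooth penalty log(1 + a^2) is an illustrative stand-in
    # for a learned constraint-violation / feature-presence score.
    h1 = jnp.tanh(x @ params["W1"])
    h2 = h1 @ params["W2"]
    return jnp.sum(jnp.log1p(h2 ** 2))

def confabulate(params, x, step=0.05, n_steps=10):
    # Perturb the observed data-vector downhill in energy: the same
    # backpropagation that yields weight derivatives also yields dE/dx.
    grad_x = jax.grad(energy, argnums=1)
    for _ in range(n_steps):
        x = x - step * grad_x(params, x)
    return x

@jax.jit
def update(params, x, lr=0.01):
    x_conf = confabulate(params, x)
    # Contrastive rule: lower the energy assigned to the observed data
    # and raise the energy assigned to its confabulation.
    loss = lambda p: energy(p, x) - energy(p, jax.lax.stop_gradient(x_conf))
    grads = jax.grad(loss)(params)
    return {k: params[k] - lr * grads[k] for k in params}

key = jax.random.PRNGKey(0)
params = init_params(key, d_in=16, d_hid=32)
x = jax.random.normal(key, (16,))   # stand-in for an observed data-vector
for _ in range(100):
    params = update(params, x)

Stopping the gradient through the confabulation treats it as fixed data in the negative phase, which is the usual practice when training energy-based models contrastively; only the direct dependence of the two energies on the weights drives the update.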