NeuroProse preprint announcement

Chris Webber webber at signal.dra.hmg.gb
Tue Jan 25 04:25:54 EST 1994


FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/webber.self-org.ps.Z


The file "webber.self-org.ps.Z" is available for
copying from the Neuroprose preprint archive: 

TITLE: 	Self-organization of transformation-invariant 
	neural detectors for constituents which recur
	within different perceptual patterns

AUTHOR:	Chris J.S. Webber (Cambridge University)

(21 pages; preprint of an article submitted to the journal "Network".)


ABSTRACT:

A simple self-organizing dynamics for governing 
the adaptation of individual neural perception units 
to the statistics of their input patterns is presented. 
The dynamics has a single adjustable parameter 
associated with each neuron, which directly 
controls the proportion of the patterns experienced 
that can induce a response in the neuron, 
and thereby determines the nature of the neuron's 
response-preferences once its adaptation 
has converged. 
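The abstract does not give the update rule itself, but the role of the per-neuron parameter can be illustrated with a hypothetical sketch: one way a single parameter can directly control the proportion of experienced patterns that induce a response is to place the neuron's firing threshold at the corresponding quantile of its activations. All names and the quantile mechanism below are illustrative assumptions, not the paper's actual dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

patterns = rng.normal(size=(1000, 16))   # patterns experienced by the neuron
w = rng.normal(size=16)                  # the neuron's synaptic vector
response_fraction = 0.1                  # the single adjustable parameter

activations = patterns @ w
# Set the threshold so that `response_fraction` of the experienced
# patterns produce an activation above it.
threshold = np.quantile(activations, 1.0 - response_fraction)
responds = activations > threshold

print(responds.mean())   # close to response_fraction
```

Under this reading, raising or lowering the parameter makes the neuron more or less selective, which is what shapes its response-preferences after convergence.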

Neurons are driven by this dynamics to develop into 
detectors for the various individual pattern-constituents 
that recur frequently within the different patterns 
experienced: the elementary building-blocks 
which, in various combinations, make up those patterns. 
A detector develops so as to respond invariantly 
to those patterns which contain its trigger constituent. 
The development of discriminating detectors for specific 
faces, through adaptation to many photo-montages 
of combinations of different faces, is demonstrated. 

The characteristic property observed in the convergent states 
of this dynamics is that a neuron's synaptic vector becomes 
aligned symmetrically between pattern-vectors 
to which the neuron responds, so that those patterns 
project equal lengths onto the synaptic vector. 
Consequently, the neuron's response becomes invariant 
under the transformations which relate those patterns 
to one another. 
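The equal-projection property described above can be checked numerically. A minimal sketch, assuming unit-norm pattern vectors (the variable names are illustrative, not from the paper): a synaptic vector aligned symmetrically between two patterns is their angular bisector, and both patterns then project equal lengths onto it, so the neuron's response is the same for either pattern.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two unit-norm pattern vectors to which the neuron responds.
p1 = rng.normal(size=8)
p1 /= np.linalg.norm(p1)
p2 = rng.normal(size=8)
p2 /= np.linalg.norm(p2)

# Symmetric alignment: the normalised angular bisector of p1 and p2.
w = p1 + p2
w /= np.linalg.norm(w)

proj1 = p1 @ w   # length of p1's projection onto the synaptic vector
proj2 = p2 @ w   # length of p2's projection onto the synaptic vector

# The projections coincide, so the neuron's activation is invariant
# under the transformation relating p1 to p2.
print(proj1, proj2)
```

The two printed values agree to floating-point precision, which is the sense in which the response becomes invariant under the transformations relating the patterns to one another.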

Transformation invariances that can develop 
in multi-layered systems of neurons, adapting 
according to this dynamics, include shape tolerance 
and local position tolerance. This is demonstrated 
using a two-level hierarchy, adapted to 
montages of cartoon faces generated to exhibit 
variability in facial expression and shape: 
neurons at the higher level of this hierarchy  
can discriminate between different faces 
invariantly with respect to expression, 
shape deformation, and local shift in position. 
These tolerances develop so as to correspond to the 
variability experienced during adaptation: 
the development of transformation invariances is driven 
entirely by statistical associations 
within patterns from the environment, 
and is not enforced by any constraints imposed on 
the architecture of neural connections. 
