New TR available

Dr L S Smith (Staff) lss at compsci.stirling.ac.uk
Wed Apr 13 09:33:50 EDT 1994


***DO NOT FORWARD TO OTHER GROUPS***

University of Stirling (Scotland), Centre for Cognitive and Computational 
Neuroscience....

CCCN Technical report CCCN-15

Activation Functions, Computational Goals and Learning Rules
for Local Processors with Contextual Guidance.

Information about context can enable local processors to discover
latent variables that are relevant to the context within which they
occur, and it can also guide short-term processing. For example,
Becker and Hinton (1992) have shown how context can guide learning,
and Hummel and Biederman (1992) have shown how it can guide processing
in a large neural net for object recognition. This paper therefore
studies the basic capabilities of a local processor with two distinct
classes of inputs: receptive field inputs that provide the primary
drive, and contextual inputs that modulate their effects. The
contextual predictions must guide processing without being confused
with the receptive field inputs, so the processor's transfer function
must distinguish these two roles. Given these two classes of input,
the information in the output can be decomposed into four disjoint
components, providing a space of possible goals in which the
unsupervised learning of Linsker (1988) and the internally supervised
learning of Becker and Hinton (1992) are special cases. Learning rules
are derived from an information-theoretic objective function, and
simulations show that a local processor trained with these rules,
using an appropriate activation function, has the elementary
properties required.
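To illustrate the kind of transfer function the abstract describes, here is a minimal sketch of a modulatory activation in which contextual input rescales the receptive field drive but cannot drive the output on its own. The particular functional form below is an assumption chosen to exhibit that property, not necessarily the one used in the report:

```python
import math

def modulatory_activation(r, c):
    """Activation for a local processor with receptive-field drive r
    and contextual input c.  Context rescales the drive but cannot
    create output by itself: if r == 0 the activation is 0 for any c.
    (Illustrative form, assumed; not taken verbatim from the report.)"""
    return 0.5 * r * (1.0 + math.exp(r * c))

# Context alone produces no output:
print(modulatory_activation(0.0, 5.0))   # 0.0
# Congruent context amplifies the receptive-field drive...
print(modulatory_activation(1.0, 1.0) > modulatory_activation(1.0, 0.0))
# ...while conflicting context attenuates it without flipping its sign.
print(0.0 < modulatory_activation(1.0, -2.0) < modulatory_activation(1.0, 0.0))
```

The key design property is that the sign of the output is always set by the receptive field inputs, so the contextual inputs can only strengthen or weaken what the primary drive already signals.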

This report is available by anonymous FTP from

ftp.cs.stir.ac.uk

in the directory

pub/tr/cccn

The filename is

TR15.ps.Z

(As usual, the file must be uncompressed and the resulting PostScript
printed.)
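A typical retrieval session looks like the following (an illustrative transcript, assuming a standard anonymous-FTP client and the `uncompress` and `lpr` utilities):

```
$ ftp ftp.cs.stir.ac.uk        (log in as "anonymous")
ftp> cd pub/tr/cccn
ftp> binary
ftp> get TR15.ps.Z
ftp> quit
$ uncompress TR15.ps.Z         (leaves TR15.ps)
$ lpr TR15.ps                  (send the PostScript to a printer)
```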

As a last resort, hard copy may be available: email lss at cs.stir.ac.uk
with your postal address...
