TRs announcement

Javier Movellan jm2z+ at andrew.cmu.edu
Wed Jan 22 12:25:25 EST 1992


             **** DO NOT FORWARD TO OTHER GROUPS ****

We have recently produced two technical reports, the first in a new
series devoted to issues in PDP and Cognitive Neuroscience.  They are
described below, followed by instructions for obtaining copies.

--------------------------------------------------------------


                 TOWARD A THEORY OF INFORMATION PROCESSING IN
                    GRADED, RANDOM, INTERACTIVE NETWORKS

                             James L. McClelland

                         Technical Report PDP.CNS.91.1

A set of principles for information processing in parallel distributed
processing systems is described.  In brief, the principles state that
processing is graded and gradual, interactive and competitive, and
subject to intrinsic variability.  Networks that adhere to these
principles are called GRAIN networks.  Four goals of a theory of
information processing based on these principles are described: 1) to
unify the asymptotic-accuracy, reaction-time, and time-accuracy
paradigms; 2) to examine how simple general laws might emerge from
systems conforming to the principles; 3) to predict and/or explain
cases in which the general laws do not hold; and 4) to explore the
explanatory role of the various principles in different aspects of
observed empirical phenomena.  Two case studies are presented.  In the
first, a general regularity here called Morton's independence law of
the joint effects of context and stimulus information on perceptual
identification is shown to be an emergent property of GRAIN networks
that obey a specific architectural constraint.  In the second case
study, the general shape of time-accuracy curves produced by networks
that conform to the principles is examined.  A regularity here called
Wickelgren's law, concerning the approximate shape of time-accuracy
curves, is found to be consistent with some GRAIN networks.  While the
exact conditions that give rise to standard time-accuracy curves
remain to be established, one observation is made concerning
conditions that can lead to a strong violation of Wickelgren's law,
and an experimental study that can be interpreted as meeting this
condition is simulated.  In both case studies, the joint dependency of
the emergent characteristics of information processing on the
different principles is considered. 
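For readers unfamiliar with the principles, the following is a minimal
illustrative sketch (not code from the report; all names, the logistic
activation, and the noise parameters are assumptions) of one settling step
in a graded, random, interactive network: activations are continuous,
change gradually, receive recurrent input from other units, and are
perturbed by intrinsic variability.

```python
import numpy as np

def grain_step(a, W, ext, rng, dt=0.05, noise_sd=0.1):
    """One Euler step of a graded, random, interactive network (sketch).

    a    : current activations (graded, continuous values)
    W    : recurrent weight matrix (interactive propagation)
    ext  : external input to each unit
    rng  : NumPy random generator supplying intrinsic variability
    """
    net = W @ a + ext                      # interactive input from other units
    target = 1.0 / (1.0 + np.exp(-net))    # graded (logistic) activation
    noise = noise_sd * np.sqrt(dt) * rng.standard_normal(a.shape)
    return a + dt * (target - a) + noise   # gradual approach plus noise

# A mutually supportive pair of units, with external input to the first:
rng = np.random.default_rng(0)
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
a = np.full(2, 0.5)
for _ in range(200):
    a = grain_step(a, W, ext=np.array([1.0, 0.0]), rng=rng)
```

Under these assumptions the externally driven unit settles toward a high
graded activation while fluctuating around it, which is the kind of
stochastic settling behavior the time-accuracy analyses in the report
concern.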

---------------------------------------------------------------


      LEARNING CONTINUOUS PROBABILITY DISTRIBUTIONS WITH THE CONTRASTIVE
                             HEBBIAN ALGORITHM

                           Javier R. Movellan
                                   &
                           James L. McClelland

                      Technical Report PDP.CNS.91.2

We show that contrastive Hebbian learning (CHL), a well known learning
rule previously used to train Boltzmann machines and Hopfield models,
can also be used to train networks that adhere to the principles of
graded, random, and interactive propagation of information. We show
that when applied to such networks, CHL performs gradient descent on a
contrastive function which captures the difference between desired and
obtained continuous multivariate probability distributions. This
allows the learning algorithm to go beyond expected values of output
units and to approximate complete probability distributions on
continuous multivariate activation spaces.  Simulations show that such
networks can indeed be trained with the CHL rule to approximate
discrete and continuous probability distributions of various types. We
briefly discuss the implications of stochasticity in our
interpretation of information processing concepts.
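As a rough illustration of the learning rule discussed (a generic sketch of
contrastive Hebbian learning, not the report's implementation; the variable
names, learning rate, and example activations are assumptions): the weight
change contrasts co-activity gathered in a "clamped" phase, with output
units fixed to their targets, against co-activity from a "free" phase in
which the network settles on its own.

```python
import numpy as np

def chl_update(W, s_clamped, s_free, lr=0.1):
    """Contrastive Hebbian weight change (sketch): a Hebbian term from
    the clamped phase minus an anti-Hebbian term from the free phase."""
    dW = lr * (np.outer(s_clamped, s_clamped) - np.outer(s_free, s_free))
    np.fill_diagonal(dW, 0.0)   # no self-connections
    return W + dW

# Hypothetical settled activations for a three-unit network:
W = np.zeros((3, 3))
s_plus = np.array([1.0, 0.5, 1.0])   # clamped phase (output fixed to target)
s_minus = np.array([1.0, 0.5, 0.2])  # free phase (output settles freely)
W = chl_update(W, s_plus, s_minus)
```

Because both terms are outer products of the same settled state, the update
keeps the weight matrix symmetric, a property the Boltzmann-machine and
Hopfield settings mentioned above also rely on.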

--------------------------------------------------------------

To Obtain Copies:

To minimize printing/mailing costs, we strongly encourage interested readers
of this mailing list to get the reports via ftp. 

The filenames for the reports are 

* pdp.cns.91.1.ps.Z for the McClelland paper (42 pages, no figures)

* pdp.cns.91.2.ps.Z for the Movellan and McClelland paper (42 pages,
  includes figures).

Full instructions are given below.

For those who do not have access to ftp, physical copies can be
requested from:

bd1q+ at andrew.cmu.edu

You can also request just the figures of the McClelland paper.

In your email please indicate exactly what you are requesting.

Figures for the McClelland paper will be sent promptly; physical
copies of either complete TR will be sent within a few weeks of
receipt of the request.

Instructions:

To obtain copies via ftp use the following commands:

unix> ftp 128.2.248.152
Name: anonymous
Password: pdp.cns
ftp> cd pub/pdp.cns
ftp> binary
ftp> get pdp.cns.91.1.ps.Z (or pdp.cns.91.2.ps.Z)
ftp> quit
unix> zcat pdp.cns.91.1.ps.Z | lpr

------------------------------------------------------------------------------
If you need further help, please contact me:

Javier R. Movellan ........ jm2z at andrew.cmu.edu
Department of Psychology .. 412/268-5145 (voice)
Carnegie Mellon University  412/268-5060 (fax)
Pittsburgh, PA  15213-3890





				- Javier



