Two new Tech Reports

Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU
Mon May 13 13:31:04 EDT 1991


The following two tech reports have been placed in the neuroprose database
at Ohio State.  Instructions for accessing them via anonymous FTP are
included at the end of this message.  (Maybe everyone should copy down
these instructions once and for all so that we can stop repeating them
with each announcement.)
---------------------------------------------------------------------------
Tech Report CMU-CS-91-100

The Recurrent Cascade-Correlation Architecture

Scott E. Fahlman

Recurrent Cascade-Correlation (RCC) is a recurrent version of the
Cascade-Correlation learning architecture of Fahlman and Lebiere (1990).
RCC can learn from examples to map a sequence of
inputs into a desired sequence of outputs.  New hidden units with recurrent
connections are added to the network one at a time, as they are needed
during training.  In effect, the network builds up a finite-state machine
tailored specifically for the current problem.  RCC retains the advantages
of Cascade-Correlation: fast learning, good generalization, automatic
construction of a near-minimal multi-layered network, and the ability to
learn complex behaviors through a sequence of simple lessons.
The power of RCC is demonstrated on two tasks: learning a finite-state
grammar from examples of legal strings, and learning to recognize
characters in Morse code.
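
As a very rough illustration (not taken from the TR itself), each hidden
unit that RCC adds receives, in addition to its ordinary inputs, a
connection from its own previous output.  The Python sketch below shows
one such unit stepped over a sequence; the function and variable names
and the choice of a sigmoid activation are illustrative assumptions:

  import numpy as np

  def recurrent_unit_output(inputs, w_in, w_self, n_steps):
      # Hypothetical sketch of one RCC-style hidden unit: its net input
      # combines the current incoming values with its own previous
      # output through a single self-recurrent weight, w_self.
      v = 0.0                                  # previous output, V(t-1)
      outputs = []
      for t in range(n_steps):
          net = np.dot(w_in, inputs[t]) + w_self * v
          v = 1.0 / (1.0 + np.exp(-net))       # sigmoid activation (assumed)
          outputs.append(v)
      return outputs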

Note: This TR is essentially the same as the paper of the same name in
the NIPS 3 proceedings (due to appear very soon).  The TR version includes
some additional experimental data and a few explanatory diagrams that had to
be cut in the NIPS version.
---------------------------------------------------------------------------
Tech report CMU-CS-91-130

Learning with Limited Numerical Precision Using the Cascade-Correlation
Algorithm

Markus Hoehfeld and Scott E. Fahlman

A key question in the design of specialized hardware for simulation of
neural networks is whether fixed-point arithmetic of limited numerical
precision can be used with existing learning algorithms.  We present an
empirical study of the effects of limited precision in Cascade-Correlation
networks on three different learning problems.  We show that learning can
fail abruptly as the precision of network weights or weight-update
calculations is reduced below 12 bits.  We introduce techniques for dynamic
rescaling and probabilistic rounding that allow reliable convergence down
to 6 bits of precision, with only a gradual reduction in the quality of the
solutions.
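
For concreteness, probabilistic (stochastic) rounding of a value onto a
fixed-point grid can be sketched in Python as below.  This is a generic
illustration of the technique, not code from the TR, and the names and
parameters are assumptions:

  import numpy as np

  def probabilistic_round(x, n_frac_bits):
      # Hypothetical sketch: quantize x to a grid with n_frac_bits
      # fractional bits, rounding up with probability equal to the
      # fractional remainder so the rounding error is zero on average.
      scale = 2.0 ** n_frac_bits
      scaled = np.asarray(x, dtype=float) * scale
      lower = np.floor(scaled)
      frac = scaled - lower
      round_up = np.random.random(scaled.shape) < frac
      return (lower + round_up) / scale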

Note: The experiments described here were conducted during a visit by
Markus Hoehfeld to Carnegie Mellon in the fall of 1990.  Markus Hoehfeld's
permanent address is Siemens AG, ZFE IS INF 2, Otto-Hahn-Ring 6, W-8000
Munich 83, Germany.
---------------------------------------------------------------------------
To access these tech reports in postscript form via anonymous FTP,
do the following:

unix> ftp cheops.cis.ohio-state.edu  (or, ftp 128.146.8.62)
  Name: anonymous
  Password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get <filename.ps.Z>
ftp> quit
unix> uncompress <filename.ps.Z>
unix> lpr <filename.ps>   (use whatever flag your printer needs for PostScript)

The TRs described above are stored as "fahlman.rcc.ps.Z" and
"hoehfeld.precision.ps.Z".  Older reports "fahlman.quickprop-tr.ps.Z" and
"fahlman.cascor-tr.ps.Z" may also be of interest.

Your local version of ftp and other unix utilities may be different.
Consult your local system wizards for details.
---------------------------------------------------------------------------
Hardcopy versions are now being printed and will be available soon, but
because of the high demand and tight budget, our school has (reluctantly)
instituted a charge for mailing out tech reports in hardcopy:
$3 per copy within the U.S. and $5 per copy elsewhere, and the payment must
be in U.S. dollars.  To order hardcopies, contact:

Ms. Catherine Copetas
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
U.S.A.
