new paper: Random wired cascade-correlation

henrik@robots.ox.ac.uk
Mon Dec 21 14:02:42 EST 1992


I have just placed this preprint (the paper is submitted to MicroNeuro 93)
in the neuroprose archive, as file klagges.rndwired-cascor.ps.Z. There is
also an accompanying picture of a sample network topology created by LFCC,
called klagges.rndwired-topology.GIF (a GIF file).

Cheers, Henrik (henrik at robots.ox.ac.uk)

Abstract((

The success of new learning algorithms like Cascade Correlation (CC)
lies partly in topology construction strategies which are difficult
to map onto SIMD-parallel neurocomputers. A CC variation that limits
the connection fan-in and randomly wires the neurons was invented to
ease the SIMD implementation. Surprisingly, the method produced
superior and very compact networks with improved generalization. In
particular, solutions of the 2-spirals problem improved from 133 +- 27
total weights for standard CC down to 60 +- 10, with 75% fewer connection
crossings.
Performance increased with candidate pool size and was correlated with a
reduction of artefacts in the receptive field visualizations. We argue that,
for general neural network learning, construction algorithms are as important
as weight adaptation rules. This requires sparse matrix support from
neurocomputer hardware.
))
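For readers who want the flavor of the idea before fetching the paper: the
core trick the abstract describes is that each candidate unit in the pool
receives only a small, randomly chosen subset of the available source units
(inputs plus previously installed hidden units), instead of full fan-in as in
standard CC. The sketch below is NOT the paper's LFCC code; it is a minimal
illustration under assumed names (random_wire_candidate, build_candidate_pool
are hypothetical helpers).

```python
import random

def random_wire_candidate(num_sources, fan_in_limit, rng):
    """Wire one candidate unit: pick a random, fan-in-limited subset of
    source units (network inputs + already-installed hidden units)."""
    k = min(fan_in_limit, num_sources)
    return sorted(rng.sample(range(num_sources), k))

def build_candidate_pool(pool_size, num_sources, fan_in_limit, seed=0):
    """Each candidate in the pool gets its own independent random wiring;
    as in standard CC, the best-correlating candidate would then be
    trained and installed, growing num_sources by one for the next round."""
    rng = random.Random(seed)
    return [random_wire_candidate(num_sources, fan_in_limit, rng)
            for _ in range(pool_size)]

# Example: a pool of 8 candidates, 20 available sources, fan-in capped at 5.
pool = build_candidate_pool(pool_size=8, num_sources=20, fan_in_limit=5)
```

Because every candidate sees a different sparse slice of the network, a larger
pool explores more wirings, which is one plausible reading of why performance
increased with pool size.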

FTP instructions:
$ ftp archive.cis.ohio-state.edu

ftp> user ftp
ftp> password <your email-address>
ftp> binary
ftp> cd pub/neuroprose
ftp> get Getps
ftp> bye

$ chmod +x Getps
$ Getps klagges.rndwired-cascor.ps.Z
$ Getps klagges.rndwired-topology.GIF
$ uncompress kl*.ps.Z
$ lpr -Plp kl*.ps      (or whatever command you use for your PostScript printer)
$ xview kl*.GIF        (or whatever you use for viewing GIFs)
=========


More information about the Connectionists mailing list