Mon Jun 5 16:42:55 EDT 2006
Abstract
----------
This paper describes how graph grammars with attributes may be used to
grow neural networks. The grammar provides a compact, declarative
description of every aspect of a neural architecture; this matters
from a software/neural engineering point of view, since such
descriptions are much easier to write and maintain than programs
written in a high-level language such as C++, and require no
programming ability.
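
To make the idea concrete, here is a minimal sketch (ours, not the
paper's actual formalism) of attributed graph rewriting in Python: a
rule matches a node by its label, rewrites it, and repeated
application grows a network graph. All names here (Node, grow,
spawn_layer) are hypothetical.

    # A node carries a label and a dictionary of attributes.
    class Node:
        def __init__(self, label, attrs):
            self.label, self.attrs = label, attrs

    def grow(nodes, edges, rules, steps):
        """Apply the first matching rule to each node, 'steps' times."""
        for _ in range(steps):
            for node in list(nodes):
                for match, rewrite in rules:
                    if match(node):
                        rewrite(node, nodes, edges)
                        break
        return nodes, edges

    # Example rule: a 'grow' node spawns a child node and becomes inert,
    # so repeated application grows a chain.
    def is_growing(n):
        return n.label == 'grow'

    def spawn_layer(n, nodes, edges):
        child = Node('grow', {'depth': n.attrs['depth'] + 1})
        nodes.append(child)
        edges.append((n, child))
        n.label = 'done'

    seed = Node('grow', {'depth': 0})
    nodes, edges = grow([seed], [], [(is_growing, spawn_layer)], steps=3)
    print(len(nodes), 'nodes,', len(edges), 'edges')   # 4 nodes, 3 edges

In a real attributed grammar the rules would also set weights,
transfer functions, and so on through the node attributes; the point
of the sketch is only the match-and-rewrite growth step.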
The output of the growth process is a neural network
that can be transformed into a PostScript representation
for display, simulated using a separate neural network
simulation program, or, in some cases, mapped directly
into hardware.
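
As a rough illustration of the display path, the sketch below dumps a
node/edge graph to a bare PostScript file; the layout scheme and
coordinates are invented for the example, not taken from the paper.

    def to_postscript(positions, edges, path='net.ps'):
        """positions: {node_id: (x, y)}; edges: [(src_id, dst_id)]."""
        with open(path, 'w') as f:
            f.write('%!PS-Adobe-3.0\n')
            for a, b in edges:                       # draw connections
                (x1, y1), (x2, y2) = positions[a], positions[b]
                f.write(f'{x1} {y1} moveto {x2} {y2} lineto stroke\n')
            for x, y in positions.values():          # draw units as circles
                f.write(f'{x} {y} 5 0 360 arc stroke\n')
            f.write('showpage\n')

    to_postscript({0: (100, 100), 1: (200, 150)}, [(0, 1)])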
In this approach, there is no separate learning algorithm; learning
proceeds (if at all) as an intrinsic part of the network behaviour.
This has interesting applications in the evolution of neural nets,
since it becomes possible to evolve all aspects of a network
(including the learning 'algorithm') within a single unified
paradigm. As an example, a grammar is given for growing a
multi-layer perceptron with active weights that
has the error back-propagation learning algorithm embedded
in its structure.
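
The following sketch illustrates, under our own assumptions, what an
'active weight' might look like: each connection forwards a signal and
applies the delta rule to itself when an error signal flows back, so
learning is part of the unit's own dynamics rather than an external
training loop. The class and the AND task are illustrative, not the
paper's grammar-generated network.

    import math, random

    class ActiveWeight:
        def __init__(self, lr=0.5):
            self.w = random.uniform(-1, 1)
            self.lr = lr
            self.x = 0.0                  # input remembered from the forward pass

        def forward(self, x):
            self.x = x
            return self.w * x

        def backward(self, delta):
            err_up = delta * self.w              # error to pass upstream
            self.w -= self.lr * delta * self.x   # local delta-rule update
            return err_up

    # One sigmoid unit built from active weights, learning AND.
    weights = [ActiveWeight() for _ in range(3)]    # two inputs plus bias
    data = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
    for _ in range(2000):
        x, t = random.choice(data)
        y = 1 / (1 + math.exp(-sum(w.forward(xi) for w, xi in zip(weights, x))))
        delta = (y - t) * y * (1 - y)   # output error times sigmoid derivative
        for w in weights:
            w.backward(delta)           # each weight trains itself locally

    for x, t in data:
        y = 1 / (1 + math.exp(-sum(w.forward(xi) for w, xi in zip(weights, x))))
        print(x[:2], round(y, 2))       # outputs approach 0, 0, 0, 1

Because the update lives inside the weight, a grammar that grows the
network has, by construction, also specified how it learns, which is
what makes the unified evolutionary treatment possible.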
This paper is available through my web page:
http://esewww.essex.ac.uk/~sml
or via anonymous ftp:
ftp tarifa.essex.ac.uk
cd /images/sml/reports
get esann95.ps