Tim Ash
ash%cs at ucsd.edu
Tue Mar 14 19:15:54 EST 1989
-----------------------------------------------------------------------
The following technical report is now available.
-----------------------------------------------------------------------
DYNAMIC NODE CREATION
IN
BACKPROPAGATION NETWORKS
Timur Ash
ash at ucsd.edu
Abstract
Large backpropagation (BP) networks are very difficult
to train. This fact complicates the process of iteratively
testing networks of different sizes (i.e., networks with
different numbers of hidden-layer units) to find one that
provides a good mapping approximation. This paper introduces
a new method called Dynamic Node Creation (DNC) that attacks
both of these issues (training large networks and testing
networks with different numbers of hidden-layer units). DNC
sequentially adds nodes one at a time to the hidden layer(s)
of the network until the desired approximation accuracy is
achieved. Simulation results for parity, symmetry, binary
addition, and the encoder problem are presented. The
procedure was capable of finding known minimal topologies in
many cases, and was always within three nodes of the
minimum. Computational expense for finding the solutions was
comparable to training normal BP networks with the same
final topologies. Starting out with fewer nodes than needed
to solve the problem actually seems to help find a solution.
The method yielded a solution for every problem tried. BP
applied to the same large networks with randomized initial
weights was unable, after repeated attempts, to replicate
some minimum solutions found by DNC.
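The growth procedure described in the abstract can be sketched roughly as
follows. This is an illustrative reconstruction, not the report's code: the
report's plateau-based trigger for adding a node is simplified here to a fixed
training budget per stage, and all names, learning rates, and thresholds are
assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DNCNet:
    """One-hidden-layer sigmoid network trained by plain backpropagation."""

    def __init__(self, n_in, n_hidden, n_out, rng):
        self.rng = rng
        # +1 column in each weight matrix holds the bias weight
        self.W1 = rng.uniform(-0.5, 0.5, (n_hidden, n_in + 1))
        self.W2 = rng.uniform(-0.5, 0.5, (n_out, n_hidden + 1))

    def forward(self, x):
        h = sigmoid(self.W1 @ np.append(x, 1.0))
        y = sigmoid(self.W2 @ np.append(h, 1.0))
        return h, y

    def train_epoch(self, X, T, lr=0.5):
        """One pass of per-pattern backprop; returns summed squared error."""
        err = 0.0
        for x, t in zip(X, T):
            h, y = self.forward(x)
            err += float(np.sum((t - y) ** 2))
            dy = (y - t) * y * (1 - y)                  # output deltas
            dh = (self.W2[:, :-1].T @ dy) * h * (1 - h)  # hidden deltas
            self.W2 -= lr * np.outer(dy, np.append(h, 1.0))
            self.W1 -= lr * np.outer(dh, np.append(x, 1.0))
        return err

    def add_node(self):
        # DNC growth step: append one randomly initialised hidden unit
        cols = self.W1.shape[1]  # inputs + bias
        self.W1 = np.vstack([self.W1, self.rng.uniform(-0.5, 0.5, (1, cols))])
        # give W2 a new input column for the unit, just before its bias column
        new_col = self.rng.uniform(-0.5, 0.5, self.W2.shape[0])
        self.W2 = np.insert(self.W2, self.W2.shape[1] - 1, new_col, axis=1)

def dnc_train(X, T, target_err=0.04, epochs_per_stage=3000, max_nodes=6, seed=1):
    """Start with ONE hidden node; grow until the error target is met."""
    rng = np.random.default_rng(seed)
    net = DNCNet(X.shape[1], 1, T.shape[1], rng)
    for _ in range(max_nodes):
        for _ in range(epochs_per_stage):
            err = net.train_epoch(X, T)
            if err < target_err:
                return net, err
        net.add_node()  # budget exhausted, error still too high: grow
    return net, err

# XOR (2-bit parity): one hidden unit is provably insufficient, so DNC must grow
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)
net, err = dnc_train(X, T)
```

Starting from a single hidden unit mirrors the abstract's observation that
beginning with fewer nodes than needed seems to help: each stage trains a
small, tractable network before the next unit is added.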
-----------------------------------------------------------------------
Requests for reprints (ICS Report 8901) should be directed to:
Claudia Fernety
Institute for Cognitive Science
C-015
University of California, San Diego
La Jolla, CA 92093.
-----------------------------------------------------------------------