Subtractive network design

Ken Laws LAWS at ai.sri.com
Tue Jun 6 06:52:25 EDT 2006


There's been some discussion of whether networks should grow
or shrink.  This reminds me of the stepwise-inclusion and
stepwise-deletion debate for multiple regression.  As I recall,
there were demonstrable benefits from combining the two.  Stepwise
inclusion was used for speed, but with stepwise deletion of variables
that were thereby made redundant.  The selection process was
simplified, over the years, by advances in the theory of canonical
correlation.  The theory of minimal encoding has lately been
invoked to improve stopping criteria for the search.
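[The combined procedure described above can be sketched roughly as follows. This is a generic illustration, not code from the original discussion; the tolerances, the SSE-based scoring, and the helper names are all assumptions made for the example.]

```python
import numpy as np

def sse(X, y, cols):
    # Residual sum of squares after a least-squares fit on the chosen columns.
    if not cols:
        return float(np.sum((y - y.mean()) ** 2))
    A = X[:, sorted(cols)]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def stepwise(X, y, add_tol=1e-6, drop_tol=1e-6):
    """Stepwise inclusion, with stepwise deletion of variables that a
    later inclusion has made redundant (illustrative sketch only)."""
    selected = set()
    while True:
        # Inclusion step: greedily add the column that most reduces SSE.
        best, gain = None, add_tol
        for j in set(range(X.shape[1])) - selected:
            g = sse(X, y, selected) - sse(X, y, selected | {j})
            if g > gain:
                best, gain = j, g
        if best is None:
            break
        selected.add(best)
        # Deletion step: drop any earlier variable whose removal now
        # barely increases the residual error.
        for j in sorted(selected - {best}):
            if sse(X, y, selected - {j}) - sse(X, y, selected) < drop_tol:
                selected.remove(j)
    return selected

# Demo: x2 is nearly collinear with x0 + x1, so it is picked first
# for speed, then deleted once x0 and x1 make it redundant.
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
x1 = rng.normal(size=200)
x2 = x0 + x1 + 0.1 * rng.normal(size=200)
X = np.column_stack([x0, x1, x2])
y = x0 + x1
chosen = stepwise(X, y)
```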

Neural-network researchers don't like globally computed statistics
or stored states, so you can't set up an A* search within a single
network training run.  You do seem willing, however, to use genetic
algorithms or multiple training runs to find sufficiently good
networks for a target application.  Brute-force search techniques
in the space of permitted connectivities may be necessary.  Stepwise
growth alternated with stepwise death may be a useful strategy
for reducing search time.
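[One reading of that grow-then-prune strategy, sketched below as a toy search over hidden units with random fixed input weights and a least-squares output layer. This is a hypothetical illustration under my own assumptions (the random-feature trick, the candidate count, and the drop tolerance are not from the post), not a specific published algorithm.]

```python
import numpy as np

def fit_out(H, y):
    # SSE after refitting only the output weights by least squares.
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return float(np.sum((y - H @ w) ** 2))

def grow_and_prune(X, y, rounds=6, grow=4, drop_tol=1e-3, seed=0):
    """Alternate stepwise growth with stepwise death of hidden units.

    Input weights are random and fixed; only the output layer is
    refit, so each candidate connectivity is cheap to score.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], 1))       # one column per hidden unit
    for _ in range(rounds):
        # Growth step: try a few random candidate units, keep the best.
        cands = rng.normal(size=(X.shape[1], grow))
        j = min(range(grow), key=lambda k: fit_out(
            np.tanh(X @ np.column_stack([W, cands[:, k]])), y))
        W = np.column_stack([W, cands[:, j]])
        # Death step: prune the unit whose removal hurts least,
        # if growth has made it (nearly) redundant.
        if W.shape[1] > 1:
            base = fit_out(np.tanh(X @ W), y)
            losses = [fit_out(np.tanh(X @ np.delete(W, k, axis=1)), y)
                      for k in range(W.shape[1])]
            k = int(np.argmin(losses))
            if losses[k] - base < drop_tol:
                W = np.delete(W, k, axis=1)
    return W

# Demo on a toy 1-D target that a few tanh units can represent.
X = np.linspace(-2, 2, 100).reshape(-1, 1)
y = np.tanh(1.5 * X[:, 0])
W = grow_and_prune(X, y)
```

The point of the sketch is only the alternation: each growth step may leave an older unit redundant, and pruning it immediately keeps the search from carrying dead weight through later rounds.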

					-- Ken
-------