No subject

Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU
Thu Dec 13 16:47:03 EST 1990


Sure, there are many interesting network architectures that add and
subtract network structure at the same time.  Most people at the post-NIPS
workshop on this topic (including me) seemed to feel that these hybrid
approaches were the most promising of all.

The real problem I have with "Ontogenic" is that the term is so closely
associated in most people's minds with biological development.  Many people
will assume that an "Ontogenic Neural Network" is a serious attempt to
model the embryonic development of some real biological nervous system.
That may happen some day soon, and we probably want to save the term
"Ontogenic" for such applications, rather than co-opting it to refer to
any old net that modifies its own topology during learning.

[Beware!  Attempted humor follows:]

I wonder if ontogenic neural nets would, in the course of learning,
recapitulate the phylogeny of neural nets.  You start with a simple
perceptron -- two layers of cells -- which then begins growing a hidden
layer.  Unfortunately, the hidden layer is not functional.  Some symbolic
antibodies then attack the embryo and try to kill it off by leeching off
all the nutrients, but a few isolated cells remain.  The cells regroup, but
very loosely.  One part buds off, names itself "Art", and develops an
elaborate, cryptic language of its own.  The rest of the blob turns into a
Hopfield net, heats up and cools down a few times, and finally develops the
organs necessary for back-propagation.  We don't know what happens after
that because the back-propagation phase is so slow that it hasn't converged
yet...

-- Scott
