STEERABLE GenNets - A Query

JANSSEN Jacques CADEPS at BBRNSF11.BITNET
Tue Jun 11 08:56:05 EDT 1991


                   STEERABLE GenNets - A Query.
 
Abstract:
            One can evolve a GenNet (a neural net evolved with the genetic
algorithm) to display two separate behaviors depending upon the setting of
a clamped input control variable. By using an intermediate control value,
one obtains an intermediate behavior. For example, let the behaviors be
sinusoidal oscillations of periods T1 and T2, obtained with control settings
of 0.5 and -0.5 respectively. By using a control value of 0.3, one gets a
sinusoid with a period between T1 and T2. Why? Has anyone out there had
similar experiences (i.e. of this sort of generalised behavioral learning),
and does anybody have any idea why GenNets are capable of such a phenomenon?
If I receive some interesting replies, I'll prepare a summary and report
back.
 
 
Further details.
 
                  One of the great advantages of GenNets (i.e. using the GA to
teach your neural nets their behaviors) over traditional NN paradigms such as
backprop, Hopfield, etc. is that the GA treats your NN as a black box; it
doesn't matter how complex the internal dynamics of the NN are. All that
counts is the result: how well did the NN perform? If it did well, the
bitstring which codes for the NN's weights survives. This allows the creation
of GenNets which can cope with inputs and outputs which vary constantly. One
does not need stationary output values a la Hopfield etc. Hence NNs become
much more "dynamic", compared to the more "static" nature of traditional
paradigms. One can thus evolve dynamics (behaviors) on NNs (GenNets). This
opens up a new world of NN possibilities. If one can evolve a GenNet to
express one behavior, why not two? And if two, can one evolve a continuum of
behaviors depending upon the setting of a clamped input control value? The
variable frequency generator GenNet mentioned above shows that this is
possible. But I'm damned if I know why. What's going on? Have any of you had
similar experiences? Any clues for a theoretical explanation of this
extraordinary phenomenon?
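 
To make the black-box idea concrete, here is a rough Python sketch of the
kind of GA loop I mean. The population size, mutation scheme, net size and
so on are illustrative choices only; the point is simply that the GA sees
nothing but a fitness score.
 
# Sketch of the GA-as-black-box loop: the GA never inspects the net's
# internals; it only sees the fitness score returned by the black box.
import random

N_NEURONS   = 8                        # illustrative net size
N_WEIGHTS   = N_NEURONS * N_NEURONS    # fully connected
POP_SIZE    = 50
MUT_RATE    = 0.05
GENERATIONS = 200

def random_chromosome():
    # one real-valued gene per weight (a bitstring coding works equally well)
    return [random.uniform(-1.0, 1.0) for _ in range(N_WEIGHTS)]

def mutate(chrom):
    return [g + random.gauss(0.0, 0.1) if random.random() < MUT_RATE else g
            for g in chrom]

def crossover(a, b):
    cut = random.randrange(1, N_WEIGHTS)
    return a[:cut] + b[cut:]

def evolve(fitness):
    # 'fitness' is the black box: chromosome -> behavioral score
    pop = [random_chromosome() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        ranked = sorted(pop, key=fitness, reverse=True)
        elite = ranked[:POP_SIZE // 2]          # the fitter chromosomes survive
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(POP_SIZE - len(elite))]
    return max(pop, key=fitness)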
 
P.S. To evolve this GenNet, use a fully connected net, with all external
inputs set at zero except for two. Clamp the first of these at 0.5 throughout,
and clamp the second (the control input) at 0.5 in the first "experiment" and
at -0.5 in the second. The fitness is the inverse of the sum, over the two
experiments, of the summed squares of the differences between the desired
output and the actual output at each clock cycle. Assign one neuron to be the
output neuron.
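 
For concreteness, here is a rough Python sketch of the fitness evaluation
just described. The squashing function, the way the two clamped inputs are
wired in, the target periods T1 and T2, and the number of clock cycles are
all illustrative choices, not a specification of the actual GenNet dynamics.
 
import math

N_NEURONS = 8              # illustrative net size (must match the chromosome)
T1, T2    = 20.0, 40.0     # illustrative target periods (in clock cycles)
N_CYCLES  = 100            # clock cycles per "experiment" (illustrative)

def run_net(weights, control, n_cycles):
    # Fully connected net, updated synchronously once per clock cycle.
    # Neuron 0 is the designated output neuron.  The two non-zero external
    # inputs (one clamped at 0.5, the other at 'control') are simply added
    # to every neuron's weighted sum -- an illustrative wiring choice.
    w = [weights[i * N_NEURONS:(i + 1) * N_NEURONS] for i in range(N_NEURONS)]
    act = [0.0] * N_NEURONS
    outputs = []
    for _ in range(n_cycles):
        act = [math.tanh(sum(w[i][j] * act[j] for j in range(N_NEURONS))
                         + 0.5 + control)
               for i in range(N_NEURONS)]
        outputs.append(act[0])
    return outputs

def fitness(weights):
    # Inverse of the sum (over the two experiments) of the summed squares of
    # the differences between desired and actual output at each clock cycle.
    sse = 0.0
    for control, period in ((0.5, T1), (-0.5, T2)):
        out = run_net(weights, control, N_CYCLES)
        sse += sum((out[t] - math.sin(2.0 * math.pi * t / period)) ** 2
                   for t in range(N_CYCLES))
    return 1.0 / (sse + 1e-9)   # tiny epsilon only to guard divide-by-zero

Feeding a fitness function of this kind into the evolve() sketch above, and
then re-running the evolved net with the control input clamped at an
intermediate value such as 0.3, is how the intermediate period shows up.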
 
Cheers,
 
Hugo de Garis,
University of Brussels, Belgium,
George Mason University, VA, USA.

