analog outputs
Dave.Touretzky@C.CS.CMU.EDU
Tue Jun 6 06:52:25 EDT 2006
There are plenty of neural net models that produce analog outputs, e.g.,
backpropagation nets. A more interesting question is whether there are
associative memories for analog vectors. Hopfield nets and BSB (Brain
State in a Box), both matrix models, work only for boolean memories; they
use nonlinearity to force their units' states to be 0 or 1.
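To make that concrete, here's a minimal sketch of Hopfield-style recall
in Python (a toy example of my own: bipolar +1/-1 coding, Hebbian
outer-product storage, asynchronous threshold updates; the patterns are
made up for illustration):

    import numpy as np

    # Two bipolar patterns stored by Hebbian outer products.
    patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                         [ 1,  1,  1, -1, -1, -1]])
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns)
    np.fill_diagonal(W, 0)

    def recall(x, sweeps=5):
        x = x.copy()
        for _ in range(sweeps):
            for i in range(n):                     # asynchronous updates
                x[i] = 1 if W[i] @ x >= 0 else -1  # hard threshold
        return x

    noisy = patterns[0].copy()
    noisy[0] *= -1                                 # flip one bit
    print(recall(noisy))                           # snaps back to patterns[0]

The hard threshold in the update rule is precisely what restricts the
attractors to boolean vectors.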
There are a few models that can learn analog memories, but they tend to involve
competitive learning (winner-take-all nets) and generate a grandmother cell for
each pattern to be learned. One example is Hecht-Nielsen's counter-propagation
net, which can learn to associate one analog pattern with another and produce
an exactly correct output given only an approximate input. Kohonen's
self-organizing feature maps are also based on competitive winner-take-all
behavior.
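For comparison, grandmother-cell recall in the counter-propagation
spirit boils down to a nearest-prototype lookup. Here's a hypothetical
sketch (the stored key/value pairs are invented, and this is not
Hecht-Nielsen's exact formulation):

    import numpy as np

    # One "grandmother cell" per stored pair: the winner's stored
    # output is reproduced exactly.
    keys   = np.array([[0.2, 0.9], [0.7, 0.1]])            # analog inputs
    values = np.array([[0.5, 0.5, 0.0], [0.1, 0.8, 0.3]])  # associated outputs

    def recall(x):
        winner = np.argmin(((keys - x) ** 2).sum(axis=1))  # winner-take-all
        return values[winner]

    print(recall(np.array([0.25, 0.85])))   # -> values[0], exactly

Note that the output is exactly the stored value even though the input
key is only approximate.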
An interesting generalization of this idea is to allow, say, k distinct
winners, and then combine their outputs somehow, e.g., by using a weighted
average. This is the principle behind Baum, Moody, and Wilczek's ACAM, or
Associative Content-Addressable Memory, which I believe has been generalized
to work on analog patterns. Hecht-Nielsen also discusses the idea of
generalizing counter-propagation to permit k winners; see his article in
the December 1987 issue of Applied Optics.
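Schematically, k-winner recall might look like the following (the
inverse-distance weighting is an illustrative choice of mine, not the
published ACAM rule, and the data are invented):

    import numpy as np

    keys   = np.array([[0.2, 0.9], [0.7, 0.1], [0.5, 0.5]])
    values = np.array([[0.5, 0.5], [0.1, 0.8], [0.9, 0.2]])

    def recall_k(x, k=2, eps=1e-9):
        d = np.sqrt(((keys - x) ** 2).sum(axis=1))
        idx = np.argsort(d)[:k]           # the k winning units
        w = 1.0 / (d[idx] + eps)          # closer keys get more weight
        w /= w.sum()
        return w @ values[idx]            # weighted average of their outputs

    print(recall_k(np.array([0.3, 0.8])))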
But these multi-grandmother-cell approaches are still not as distributed as a
Hopfield or backprop model. (Of course one can train ordinary backprop nets to
associate analog inputs with analog outputs, but unlike a true associative
memory, there is no guarantee that the backprop network will produce the exact
same output if the input is perturbed slightly, because it has no attractor
states. In fact, this lack of attractor states is being exploited when people
use backprop nets to do function approximation by interpolating between
training instances.)
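Here's a toy illustration of that contrast (the weights are random,
purely to show that the map is continuous):

    import numpy as np

    # A feedforward net computes a smooth function of its input, so a
    # nearby input yields a nearby, but different, output.
    rng = np.random.default_rng(0)
    W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))

    def forward(x):
        return np.tanh(W2 @ np.tanh(W1 @ x))

    x = np.array([0.3, -0.2, 0.8])
    print(forward(x))
    print(forward(x + 0.01))   # slightly perturbed input, slightly
                               # perturbed output

Unlike the Hopfield example above, nothing pulls the perturbed input
back to a stored pattern.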
So the question remains: are there fully-distributed associative memories whose
attractor states are analog vectors? Perhaps some of the recent work on
generalizing backprop to recurrent networks, introducing the potential for
attractor behavior, will lead to a solution to this problem.
-- Dave