Stochastic learning chip

Joshua Alspector josh at flash.bellcore.com
Fri Jul 8 15:23:07 EDT 1988


To clarify some previous postings, we have fabricated a learning chip
based on a modified Boltzmann algorithm.  It can learn
an XOR function in a few milliseconds.  Patterns can be presented to it
at about 100,000 per second.  It is a test chip containing a small network
in one corner that consists of 6 neurons and 15 two-way synapses for
potentially full connectivity.  We can initialize the network to any
weight configuration and permanently disable some connections.  We have
also demonstrated unsupervised competitive learning in addition to
supervised learning.
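
For those without access to the hardware, a rough software analogue of this
kind of stochastic, Boltzmann-style learning on a 6-unit, 15-synapse network
might look like the sketch below (in Python).  The 2-input / 3-hidden /
1-output layout, the +1/-1 unit coding, the learning rate, and the settling
schedule are all assumptions chosen for illustration, not parameters of the
chip, and convergence in so few sweeps is not guaranteed.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 6                   # assumed layout: units 0-1 inputs, 2-4 hidden, 5 output
    W = np.zeros((N, N))    # 15 symmetric ("two-way") synapses; diagonal unused

    def settle(s, clamped, T=1.0, steps=30):
        # Asynchronous stochastic updates of the unclamped units at temperature T.
        s = s.copy()
        for _ in range(steps):
            for i in range(N):
                if i in clamped:
                    continue
                p_on = 1.0 / (1.0 + np.exp(-2.0 * (W[i] @ s) / T))
                s[i] = 1.0 if rng.random() < p_on else -1.0
        return s

    xor_patterns = [([-1, -1], -1), ([-1, 1], 1), ([1, -1], 1), ([1, 1], -1)]
    eta = 0.05

    for epoch in range(300):
        for x, y in xor_patterns:
            s = rng.choice([-1.0, 1.0], size=N)
            s[:2] = x
            s[5] = y
            s_plus = settle(s, clamped={0, 1, 5})      # clamped ("teacher") phase
            s_minus = settle(s_plus, clamped={0, 1})   # free-running phase
            # Local correlational rule: strengthen synapses whose clamped
            # co-activity exceeds their free-running co-activity.
            W += eta * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
            np.fill_diagonal(W, 0.0)

    # Read out the trained network's answer for each input pattern.
    for x, y in xor_patterns:
        s = rng.choice([-1.0, 1.0], size=N)
        s[:2] = x
        print(x, "target", y, "output", settle(s, clamped={0, 1}, T=0.5, steps=60)[5])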

It turns out that our noise amplifiers do not give Gaussian uncorrelated
noise as we had hoped, but the noise they do produce seems sufficient for
learning.  This bears out earlier simulation results showing that the noise
distribution does not matter much as long as it provides a stochastic
element.  Therefore, it is not completely accurate to call the chip a
Boltzmann machine or even a simulated annealing machine.
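
As a toy illustration of that point, the stochastic element can come from
noise of almost any shape added to a unit's summed input; the particular
distributions and scales below are arbitrary choices for the sketch, not
measured characteristics of our noise amplifiers.

    import numpy as np

    rng = np.random.default_rng(1)

    def stochastic_unit(net_input, noise, scale=1.0):
        # Binary threshold unit with additive noise on its summed input.
        return 1.0 if net_input + scale * noise() > 0 else -1.0

    noise_sources = {
        "gaussian": lambda: rng.normal(),
        "uniform":  lambda: rng.uniform(-1.7, 1.7),   # roughly unit variance
        "logistic": lambda: rng.logistic(),
    }

    # The firing probability for a fixed net input varies only modestly
    # across noise sources; what matters is that a stochastic element exists.
    for name, fn in noise_sources.items():
        p_on = np.mean([stochastic_unit(0.5, fn) > 0 for _ in range(10000)])
        print(name, "P(on) ~", round(float(p_on), 2))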

We can do only toy problems because of the small number of neurons,
but we plan to build much larger networks in the future that can
span multiple chips.

Previous papers describing the implementation and extensions of the
stochastic learning technique are:

J. Alspector and R.B. Allen, "A Neuromorphic VLSI Learning System," in
Advanced Research in VLSI: Proceedings of the 1987 Stanford Conference,
edited by P. Losleben (MIT Press, Cambridge, MA, 1987), pp. 313-349.

J. Alspector, R.B. Allen, V. Hu, and S. Satyanarayana, "Stochastic Learning
Networks and their Electronic Implementation," in Proceedings of the 1987
NIPS Conference, edited by D.Z. Anderson.

josh

