Parallel Implementation of Genetic Algorithm and Simulated Annealing

Javier Movellan jm2z+ at andrew.cmu.edu
Mon Oct 29 11:06:38 EST 1990


Yu,

There is a third related procedure called "sharpening". My first contact
with the sharpening procedure was through the work of Akiyama et al (see
reference) on what they called Gaussian machines (continuous Hopfield
networks with Gaussian noise injected into the net inputs).
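A minimal sketch of what one unit update in such a Gaussian machine might look like, assuming a logistic squashing function and using the noise standard deviation as the "temperature" (the function name and schedule are my own illustration, not the authors' exact equations):

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_machine_step(a, W, bias, temperature):
    """One settling step of a hypothetical Gaussian machine:
    Gaussian noise with standard deviation `temperature` is added
    to the deterministic net input before the logistic squash."""
    net = W @ a + bias + temperature * rng.standard_normal(a.shape)
    return 1.0 / (1.0 + np.exp(-net))
```

At temperature zero this reduces to an ordinary deterministic continuous Hopfield update.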

Sharpening is also used in Mean Field Networks (the continuous Hopfield
model with the Contrastive Hebbian Learning algorithm). Sharpening may
be seen as a deterministic, continuous approximation to annealing in
stochastic Boltzmann Machines. It works by starting the settling process
with logistic activations of very low gain and increasing the gain as
settling progresses. Unlike "true annealing", sharpening is
deterministic and thus may be faster. A similar procedure is used with
elastic networks solving the TSP.
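The settling-with-rising-gain idea can be sketched as follows; the weight matrix, gain schedule, and damped update rule are illustrative assumptions, not any specific published recipe:

```python
import numpy as np

def logistic(x, gain):
    # Low gain gives a nearly linear, "soft" unit (high temperature);
    # high gain approaches a step function (low temperature).
    return 1.0 / (1.0 + np.exp(-gain * x))

def settle_with_sharpening(W, bias, n_steps=200,
                           gain_start=0.1, gain_end=10.0, dt=0.2):
    """Deterministic settling of a continuous Hopfield-style network,
    raising the logistic gain geometrically as settling progresses."""
    a = np.full(W.shape[0], 0.5)  # start activations at the midpoint
    for gain in np.geomspace(gain_start, gain_end, n_steps):
        net = W @ a + bias              # deterministic net input
        a = a + dt * (logistic(net, gain) - a)  # damped update
    return a

# Example: two mutually inhibitory units settle to a winner-take-all
# state once the gain becomes high; the unit with the larger bias wins.
W = np.array([[0.0, -2.0],
              [-2.0, 0.0]])
bias = np.array([1.0, 0.5])
a = settle_with_sharpening(W, bias)
```

Compared to stochastic annealing, no noise is injected anywhere; the gradual rise in gain plays the role that a falling temperature plays in a Boltzmann Machine.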

References: 

Peterson, C., & Anderson, J. (1987). A mean field theory learning
algorithm for neural networks. Complex Systems, 1, 995-1019.

Akiyama, Y., Yamashita, A., Kajiura, M., & Aiso, H. (1989). Combinatorial
optimization with Gaussian machines. Proceedings of the IJCNN, 1,
533-540.

Hinton, G. E. (1989). Deterministic Boltzmann learning performs steepest
descent in weight space. Neural Computation, 1, 143-150.

Galland, C., & Hinton, G. (1989). Deterministic Boltzmann learning in
networks with asymmetric connectivity. University of Toronto, Department
of Computer Science Technical Report CRG-TR-89-6.

Movellan, J. R. (1990). Contrastive Hebbian learning in the continuous
Hopfield model. Proceedings of the 1990 Connectionist Summer School.


Javier

