Two Papers

G. Kohring kohring at hlrserv.hlrz.kfa-juelich.de
Sat Oct 3 07:55:17 EDT 1992


The paper "On the Q-State Neuron Problem in Attractor Neural Networks",
whose abstract appears below, has recently been accepted for publication in
the journal Neural Networks. It discusses recent results which
demonstrate that the use of analog neurons in attractor neural networks is
not practical. An abbreviated account of this work recently appeared
in the Letters section of Journal de Physique (Journal de Physique I, 2
(1992) p. 1549); the abstract for that paper is also given below. If you
do not have access to these journals and would like a copy of these
papers, please send a request to the following address.

G.A. Kohring
HLRZ an der KFA Juelich
Postfach 1913
D-5170 Juelich, Germany
e-mail: kohring at hlrsun.hlrz.kfa-juelich.de



        On the Q-State Neuron Problem in Attractor Neural Networks

                    (Neural Networks, in press)

                           ABSTRACT

The problems encountered when using multi-state neurons in attractor neural
networks are discussed. In particular, straightforward implementations of
neurons with Q states lead to information storage capacities, E, that
decrease like E ~ log_2 Q/Q^2. More sophisticated schemes yield
capacities that decrease like E ~ log_2 Q/Q, but with retrieval
times increasing proportional to Q. There also exist schemes whereby the
information capacity reaches its maximum value of unity, but the retrieval
time grows with the number of neurons, N, like O(N^3) instead of O(N^2) as in
conventional models. Furthermore, since Q-state models approximate analog
neurons when Q is large, these results demonstrate that the use of analog
neurons is not feasible. After discussing these problems, a solution is
proposed in which the information capacity is independent of Q and the
retrieval time increases proportional to log_2 Q. The retrieval
properties of this model, i.e., the basins of attraction, etc., are calculated
and shown to be in agreement with simple theoretical arguments. Finally, a
critical discussion of this approach is given.
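To make the scaling relations quoted in the abstract concrete, here is a minimal numeric sketch (not from the paper; constants of proportionality are omitted, so only the Q-dependence of each scheme is illustrated):

```python
import math

def capacity_naive(q):
    # Straightforward Q-state implementation: E ~ log_2(Q) / Q^2
    return math.log2(q) / q**2

def capacity_sophisticated(q):
    # More sophisticated schemes: E ~ log_2(Q) / Q
    return math.log2(q) / q

def capacity_proposed(q):
    # Proposed solution: E independent of Q (maximum value of unity)
    return 1.0

for q in (2, 4, 16, 256):
    print(f"Q={q:4d}  naive={capacity_naive(q):.2e}  "
          f"sophisticated={capacity_sophisticated(q):.2e}  "
          f"proposed={capacity_proposed(q):.1f}")
```

For Q = 256 the naive scheme's capacity has fallen by over four orders of magnitude relative to Q = 2, while the proposed scheme is unaffected, which is the gap the paper addresses.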




                On the Problems of Neural Networks 
                   with Multi-state Neurons 

              (Journal de Physique I, 2 (1992) p. 1549)
                      
                           ABSTRACT

For realistic neural network applications, the storage and recognition of
gray-tone patterns, i.e., patterns where each neuron in the network can take
one of Q different values, is more important than the storage of black and
white patterns, although the latter have been more widely studied. Recently,
several groups have shown the former task to be problematic with current
techniques, since the useful storage capacity, ALPHA, generally decreases like
ALPHA ~ Q^{-2}. In this paper one solution to this problem is proposed, which
leads to a storage capacity decreasing like ALPHA ~ (log_2 Q)^{-1}.
For realistic situations, where Q = 256, this implies an increase of nearly
four orders of magnitude in the storage capacity. The price paid is
that the time needed to recall a pattern increases like log_2 Q. This
price can be partially offset by an efficient parallel program which runs at
1.4 Gflops on a 32-processor iPSC/860 Hypercube.
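The "nearly four orders of magnitude" figure follows directly from the two scaling relations in the abstract; a quick check (constants of proportionality again omitted):

```python
import math

Q = 256
alpha_old = Q**-2             # ALPHA ~ Q^{-2}            = 1/65536
alpha_new = 1 / math.log2(Q)  # ALPHA ~ (log_2 Q)^{-1}    = 1/8
gain = alpha_new / alpha_old  # 65536 / 8 = 8192

print(f"gain = {gain:.0f}  (~10^{math.log10(gain):.1f})")
```

The ratio is 8192, i.e. about 10^3.9, consistent with the claim of nearly four orders of magnitude.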



