Paper

Bo Xu ITGT500%INDYCMS.BITNET at vma.cc.cmu.edu
Tue Oct 15 10:22:41 EDT 1991


Following is the abstract of a paper accepted by IJCNN'91-SINGAPORE.
The main purpose of this paper was to attack the problems associated with
the original back-propagation neural nets, namely the slow rate of
convergence, local minima, and the inability to learn (under certain
preset criteria), from an alternative viewpoint, topology, rather than the
learning algorithm or the units' response characteristics. It was shown in
this paper that, besides the already studied factors such as the learning
algorithm and the unit characteristics, topology is an important factor
limiting the performance of back-propagation neural networks.
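To make the baseline concrete, here is a minimal sketch of the kind of
back-propagation network the paper critiques: a 2-2-1 sigmoid net trained on
XOR by plain gradient descent. This is an illustrative sketch only; the
paper's PPNN and its stereotopology are not specified in this post, so
nothing below implements them, and all names and parameters here are
assumptions of the sketch, not the paper's.

```python
import math
import random

# Minimal 2-2-1 back-propagation network trained on XOR.
# Plain gradient descent over many sweeps illustrates the slow
# convergence the paper attacks; with few hidden units, runs can
# also stall in local minima.

random.seed(0)

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# weights: input -> 2 hidden units, hidden -> 1 output unit
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sig(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j]) for j in range(2)]
    y = sig(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_init = total_error()
lr = 0.5
for epoch in range(20000):            # many sweeps: plain BP converges slowly
    for x, t in data:
        h, y = forward(x)
        d_o = (y - t) * y * (1 - y)   # chain rule through the output sigmoid
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            w_o[j] -= lr * d_o * h[j]
            b_h[j] -= lr * d_h[j]
            for i in range(2):
                w_h[j][i] -= lr * d_h[j] * x[i]
        b_o -= lr * d_o

err_final = total_error()
preds = [round(forward(x)[1]) for x, _ in data]
print("error: %.4f -> %.4f, predictions: %s" % (err_init, err_final, preds))
```

The thousands of epochs needed even for a four-pattern problem are exactly
the kind of cost the paper attributes, in part, to the planar topology of
the network rather than to the learning rule alone.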

All comments are welcome.



        PPNN: A Faster Learning and Better Generalizing Neural Net


                                  Bo Xu
                           Indiana University

                               Liqing Zheng
                            Purdue University


  Abstract----It was pointed out in this paper that the planar topology of
the current back-propagation neural network (BPNN) limits solutions to the
slow convergence rate, local minima, and other problems associated with
BPNN.  The parallel probabilistic neural network (PPNN), which uses a new
neural network topology called stereotopology, was proposed to overcome
these problems.  The learning and generalization abilities of BPNN and
PPNN were compared on several problems.  The simulation results show that
PPNN was capable of learning all kinds of problems much faster than BPNN
and also generalized better.  Analysis showed that the faster, universal
learnability of PPNN was due to the parallel characteristic of PPNN's
stereotopology, and that the better generalization ability came from the
probabilistic characteristic of PPNN's memory-retrieval rule.


Bo Xu
Indiana University
itgt500 at indycms.iupui.edu
