sankar@bach.rutgers.edu
Tue Apr 23 18:49:46 EDT 1991
The following two papers are now available via FTP from the neuroprose
archives. Both will be presented at IJCNN-Seattle, 1991. These papers
describe a new approach that combines Neural Networks and Decision Trees
to form a classifier that grows the neurons as it learns.
****************************************************************************
SPEAKER INDEPENDENT VOWEL RECOGNITION USING NEURAL TREE NETWORKS
Ananth Sankar and Richard Mammone
CAIP Center and Dept. of Electrical Engg.
Rutgers University, P.O. Box 1390
Piscataway, NJ 08855-1390
Speaker independent vowel recognition is a difficult pattern
recognition problem. Recently there has been much research using
Multi-Layer Perceptrons (MLPs) and Decision Trees for this task. This
paper presents a new approach to this problem. A new neural
architecture and learning algorithm called Neural Tree Networks (NTN)
are developed. This network uses a tree structure with a neural
network at each tree node. The NTN architecture offers a more
efficient hardware implementation than MLPs. The NTN algorithm
grows the neurons during learning, unlike backpropagation, where
the number of neurons must be fixed before learning can begin.
The new algorithm is guaranteed to converge on the
training set whereas backpropagation can get stuck in local minima. A
gradient descent technique is used to grow the NTN. This approach is
more efficient than the exhaustive search techniques used in standard
decision tree algorithms. We present simulation results on a speaker
independent vowel recognition task. These results show that the new
method is superior to both MLP and decision tree methods.
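The growing procedure described in the abstract can be sketched as follows.
This is an illustrative reconstruction, not the authors' code: the class
name, the tanh/squared-error gradient step, the one-vs-rest split targets,
and the stopping rules are all assumptions made for the sketch.

```python
import numpy as np

class NTNNode:
    """One tree node: either a leaf label or a perceptron that splits the data.

    Illustrative sketch of a Neural Tree Network (NTN); training details
    (loss, targets, learning rate) are assumptions, not the paper's method.
    """

    def __init__(self, X, y, lr=0.5, epochs=300):
        labels, counts = np.unique(y, return_counts=True)
        if len(labels) == 1:                       # class-pure node: stop growing
            self.label, self.w = labels[0], None
            return
        # One-vs-rest split target: most frequent class goes to the +1 side.
        t = np.where(y == labels[np.argmax(counts)], 1.0, -1.0)
        Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
        w = np.zeros(Xb.shape[1])
        for _ in range(epochs):                    # gradient descent on a
            out = np.tanh(Xb @ w)                  # squared-error surface
            w -= lr * Xb.T @ ((out - t) * (1.0 - out ** 2)) / len(X)
        side = Xb @ w >= 0
        if side.all() or not side.any():           # degenerate split:
            self.label, self.w = labels[np.argmax(counts)], None  # make a leaf
            return
        self.label, self.w = None, w               # grow two children
        self.left = NTNNode(X[side], y[side], lr, epochs)
        self.right = NTNNode(X[~side], y[~side], lr, epochs)

    def predict(self, x):
        if self.w is None:
            return self.label
        child = self.left if np.append(x, 1.0) @ self.w >= 0 else self.right
        return child.predict(x)
```

On a toy three-class 1-D problem this grows a depth-two tree: the root
perceptron peels one class off, and a child perceptron separates the
remaining two — neurons are added only where the data demand them.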
*****************************************************************************
OPTIMAL PRUNING OF NEURAL TREE NETWORKS FOR IMPROVED GENERALIZATION
Ananth Sankar and Richard Mammone
CAIP Center and Dept. of Electrical Engg.
Rutgers University, P.O. Box 1390
Piscataway, NJ 08855-1390
An optimal pruning algorithm is presented for the recently developed
Neural Tree Network (NTN). The NTN is grown by a
constructive learning algorithm that decreases the classification
error on the training data recursively. The optimal pruning algorithm
is then used to improve generalization. The pruning algorithm is shown
to be computationally inexpensive. Simulation results on a speaker
independent vowel recognition task are presented to show the improved
generalization using the pruning algorithm.
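The idea of pruning for generalization can be sketched with reduced-error
pruning on a generic binary classification tree; this is an assumption for
illustration — the paper's optimality criterion may differ. A subtree is
collapsed to its majority-class leaf whenever doing so does not hurt
accuracy on a held-out set, which requires only one bottom-up pass.

```python
# Illustrative reduced-error pruning, not the paper's exact algorithm.
class Node:
    def __init__(self, label, split=None, left=None, right=None):
        self.label = label        # majority class at this node
        self.split = split        # (feature, threshold); None for a leaf
        self.left, self.right = left, right

    def predict(self, x):
        if self.split is None:
            return self.label
        f, thresh = self.split
        return (self.left if x[f] < thresh else self.right).predict(x)

def accuracy(tree, data):
    return sum(tree.predict(x) == y for x, y in data) / len(data)

def prune(node, root, heldout):
    """Bottom-up: tentatively collapse each subtree to its majority leaf."""
    if node.split is None:
        return
    prune(node.left, root, heldout)
    prune(node.right, root, heldout)
    before = accuracy(root, heldout)
    saved, node.split = node.split, None   # collapse to a leaf ...
    if accuracy(root, heldout) < before:   # ... and undo the collapse if
        node.split = saved                 # held-out accuracy got worse
```

For example, a split that only fits training noise leaves held-out accuracy
unchanged (or improved) when collapsed, so it is removed, while a split the
held-out data actually needs is restored.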
***************************************************************************
To retrieve:
unix> ftp cheops.cis.ohio-state.edu (or 128.146.8.62)
Name: anonymous
Password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get sankar.ijcnn91_1.ps.Z
ftp> get sankar.ijcnn91_2.ps.Z
ftp> quit
unix> uncompress sankar.ijcnn91_1.ps.Z sankar.ijcnn91_2.ps.Z
unix> lpr sankar.ijcnn91_1.ps sankar.ijcnn91_2.ps
Thanks to Jordan Pollack for making this service available!