No subject

Jenq-Neng Hwang hwang at pierce.ee.washington.edu
Tue Oct 5 18:34:57 EDT 1993


Technical Report available from neuroprose:  26 single-spaced pages
                         (13 pages of text and 13 pages of figures)


    WHAT'S WRONG WITH A CASCADED CORRELATION LEARNING NETWORK:
       A  PROJECTION PURSUIT LEARNING PERSPECTIVE

   Jenq-Neng Hwang, Shih-Shien You, Shyh-Rong Lay, I-Chang Jou

             Information Processing Laboratory
         Department of Electrical Engineering, FT-10,
         University of Washington, Seattle, WA 98195.  
 
                 Telecommunication Laboratories
          Ministry of Transportation and Communications
            P.O. Box 71, Chung-Li, Taiwan 320, R.O.C.


ABSTRACT:

Cascaded correlation is a popular supervised learning architecture
that dynamically grows layers of hidden neurons with fixed nonlinear
activations (e.g., sigmoids), so that the network topology (size,
depth) can be efficiently determined. Similar to a cascaded
correlation learning network (CCLN), a projection pursuit learning
network (PPLN) also dynamically grows its hidden neurons. Unlike a
CCLN, where cascaded connections from the existing hidden units to the
new candidate hidden unit are required to establish high-order
nonlinearity in approximating the residual error, a PPLN approximates
the high-order nonlinearity by using (more flexible) trainable
nonlinear nodal activation functions. Moreover, the maximum
correlation training criterion used in a CCLN results in a poorer
estimate of the hidden weights than the minimum mean squared error
criterion used in a PPLN. The CCLN is thus ill-suited to most
regression applications, where smooth interpolation of functional
values is highly desired. Furthermore, it is shown that the PPLN
also achieves much better performance in solving the two-spiral
classification benchmark using a comparable number of weight parameters.
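To make the contrast between the two training criteria concrete, the
following toy sketch (not from the report; all names and the data are
illustrative) computes the cascade-correlation candidate objective --
the magnitude of the covariance between a candidate unit's output V and
the residual error E (Fahlman & Lebiere's criterion) -- and, for
comparison, the mean squared error left after the best linear scaling of
V toward E, which is closer in spirit to the minimum-MSE fitting used in
projection pursuit learning:

```python
import numpy as np

def correlation_score(v, e):
    """Cascade-correlation candidate objective: |sum_p (V_p - Vbar)(E_p - Ebar)|."""
    return abs(np.sum((v - v.mean()) * (e - e.mean())))

def residual_mse(v, e):
    """Mean squared error remaining after the best linear fit a*v + b to e."""
    a, b = np.polyfit(v, e, 1)            # least-squares slope and intercept
    return np.mean((e - (a * v + b)) ** 2)

rng = np.random.default_rng(0)
e = rng.normal(size=50)                    # residual errors on 50 training patterns
v1 = 2.0 * e + 0.1 * rng.normal(size=50)   # candidate tracking e closely in shape
v2 = np.sign(e)                            # candidate that merely covaries with e

# Both candidates score well under the correlation criterion, but only v1
# can drive the squared error close to zero after linear scaling.
for v in (v1, v2):
    print(correlation_score(v, e), residual_mse(v, e))
```

The point of the sketch: a high covariance score does not guarantee a
small residual MSE, which is one way to see why maximizing correlation
can yield poorer hidden-weight estimates than minimizing MSE directly.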


================
To obtain copies of the postscript file, please use Jordan Pollack's service
(no hardcopies will be provided):

Example:
unix> ftp archive.cis.ohio-state.edu                 (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.cclppl.ps.Z
ftp> quit
unix> uncompress  hwang.cclppl.ps

Now print "hwang.cclppl.ps" as you would any other (postscript) file.

In case your printer has limited memory, you can divide this file into
two smaller files after the uncompress:

unix> head -42429 hwang.cclppl.ps > file1.ps
unix> tail +42430 hwang.cclppl.ps > file2.ps

Then print "file1.ps" and "file2.ps" separately.



