back-propagation and projection pursuit learning
Jenq-Neng Hwang
hwang at pierce.ee.washington.edu
Mon May 3 15:17:00 EDT 1993
Technical Report available from neuroprose:
REGRESSION MODELING IN BACK-PROPAGATION AND PROJECTION PURSUIT LEARNING
Jenq-Neng Hwang, Shyh-Rong Lay
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195
and
Martin Maechler, Doug Martin, Jim Schimert
Department of Statistics, GN-22
University of Washington, Seattle, WA 98195
ABSTRACT
In this paper we study and compare two types of connectionist learning methods
for model-free regression problems. One is the popular
"back-propagation" learning (BPL), well known in the artificial
neural networks literature; the other is "projection
pursuit" learning (PPL), which has emerged in recent years in the statistical
estimation literature. Both the BPL and the PPL are based on
projections of the data in directions determined from interconnection
weights. However, unlike the BPL, which uses fixed nonlinear activations
(usually sigmoidal) for the hidden neurons, the PPL
systematically approximates the unknown nonlinear activations.
Moreover, the BPL estimates all the weights simultaneously at each
iteration, while the PPL estimates the weights cyclically
(neuron-by-neuron and layer-by-layer) at each iteration. Although the
BPL and the PPL have comparable training speed when based on a
Gauss-Newton optimization algorithm, the PPL proves more parsimonious
in that it requires fewer hidden neurons to approximate the
true function. To further improve the statistical performance of the
PPL, an orthogonal polynomial approximation is used in place of the
supersmoother method originally proposed for nonlinear activation
approximation in the PPL.
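
For readers who want a concrete picture of the cyclic, neuron-by-neuron
estimation described above, here is a minimal sketch (ours, not the authors'
code) of projection-pursuit-style regression in Python. It substitutes a
plain least-squares polynomial fit for the orthogonal-polynomial /
supersmoother activation estimate and a simple gradient step for the
Gauss-Newton direction update; all function names and parameter values are
illustrative.

    import numpy as np

    def fit_ridge_term(X, r, degree=3, n_iter=20, step=0.5):
        """Fit one ridge term g(a'x) to the residual r; return (a, poly_coef)."""
        n, d = X.shape
        a = np.random.randn(d)
        a /= np.linalg.norm(a)
        for _ in range(n_iter):
            z = X @ a                            # projections of the data
            coef = np.polyfit(z, r, degree)      # polynomial "activation" g
            g = np.polyval(coef, z)
            dg = np.polyval(np.polyder(coef), z)
            grad = X.T @ ((g - r) * dg) / n      # gradient of mean squared error
            a -= step * grad
            a /= np.linalg.norm(a)
        return a, coef

    def ppl_fit(X, y, n_terms=3, n_sweeps=5):
        """Cyclic (backfitting) estimation of a sum of ridge terms."""
        y = np.asarray(y, dtype=float)
        terms = [None] * n_terms
        yhat = np.zeros_like(y)
        for _ in range(n_sweeps):
            for k in range(n_terms):
                if terms[k] is not None:         # withdraw term k's contribution
                    a, coef = terms[k]
                    yhat -= np.polyval(coef, X @ a)
                a, coef = fit_ridge_term(X, y - yhat)
                terms[k] = (a, coef)
                yhat += np.polyval(coef, X @ a)  # add back the refitted term
        return terms, yhat

For example, with a data matrix X of shape (n, d) and a response vector y,
"terms, yhat = ppl_fit(X, y)" returns the fitted projection directions, their
polynomial activations, and the in-sample fit.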
================
To obtain copies of the PostScript file, please use Jordan Pollack's service
(no hardcopies will be provided):
Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.bplppl.ps.Z
ftp> quit
unix> uncompress hwang.bplppl.ps
Now print "hwang.bplppl.ps" as you would any other (PostScript) file.