New ICSI TR on incremental learning
Ethem Alpaydin
ethem at ICSI.Berkeley.EDU
Tue May 21 13:56:40 EDT 1991
The following TR is available in PostScript by anonymous ftp from
icsi-ftp.berkeley.edu (128.32.201.55). Instructions for ftp and
uncompress follow the abstract.
Hard copies may be requested by writing to either of the addresses
below:
ethem at icsi.berkeley.edu
Ethem Alpaydin
ICSI 1947 Center St. Suite 600
Berkeley CA 94704-1105 USA
------------------------------------------------------------------------------
GAL:
Networks that grow when they learn and
shrink when they forget
Ethem Alpaydin
International Computer Science Institute
Berkeley, CA
TR 91-032
Abstract
Learning limited to the modification of some parameters has limited
scope; the capability to modify the system structure is also needed to
widen the range of what can be learned. In the case of artificial
neural networks, learning by iterative adjustment of synaptic weights
can only succeed if the network designer predefines an appropriate
network structure, i.e., the number of hidden layers and units and the
size and shape of their receptive and projective fields. This paper
advocates the view that the network structure should not be determined
by trial and error, as is usually done, but should be computed by the
learning algorithm itself. Incremental learning algorithms can modify
the network structure by adding and/or removing units and/or links. A
survey of the current connectionist literature along this line of
thought is given.
``Grow and Learn'' (GAL) is a new algorithm that learns an association
in one shot because it is incremental and uses a local representation.
During the so-called ``sleep'' phase, units that were stored earlier
but are no longer necessary due to more recent modifications are
removed to minimize network complexity. The incrementally constructed
network can later be fine-tuned off-line to improve performance.
Another proposed method, which greatly increases recognition accuracy,
is to train several networks and vote over their responses. The
algorithm and its variants are tested on the recognition of
handwritten numerals and appear promising, especially in terms of
learning speed. This makes the algorithm attractive for on-line
learning tasks, e.g., in robotics. The biological plausibility of
incremental learning is also discussed briefly.
Keywords
Incremental learning, supervised learning, classification, pruning,
destructive methods, growth, constructive methods, nearest neighbor.
--------------------------------------------------------------------------
Instructions to ftp the above-mentioned TR (assuming you are under
UNIX and have a PostScript printer; messages in parentheses indicate
the system's responses):
ftp 128.32.201.55
(Connected to 128.32.201.55.
220 icsi-ftp (icsic) FTP server (Version 5.60 local) ready.
Name (128.32.201.55:ethem):)anonymous
(331 Guest login ok, send ident as password.
Password:)(your email address)
(230 Guest login Ok, access restrictions apply.
ftp>)cd pub/techreports
(250 CWD command successful.
ftp>)bin
(200 Type set to I.
ftp>)get tr-91-032.ps.Z
(200 PORT command successful.
150 Opening BINARY mode data connection for tr-91-032.ps.Z (153915 bytes).
226 Transfer complete.
local: tr-91-032.ps.Z remote: tr-91-032.ps.Z
153915 bytes received in 0.62 seconds (2.4e+02 Kbytes/s)
ftp>)quit
(221 Goodbye.)
(back to Unix)
uncompress tr-91-032.ps.Z
lpr tr-91-032.ps
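If you prefer a non-interactive transfer, the session above can also
be scripted, e.g. with Python's ftplib (a sketch; substitute your own
email address as the password):

    from ftplib import FTP

    ftp = FTP('128.32.201.55')               # icsi-ftp.berkeley.edu
    ftp.login('anonymous', 'your@email')     # your email address as password
    ftp.cwd('pub/techreports')
    with open('tr-91-032.ps.Z', 'wb') as f:
        # RETR in binary mode, matching the ``bin'' step above
        ftp.retrbinary('RETR tr-91-032.ps.Z', f.write)
    ftp.quit()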
Happy reading; I hope you enjoy it.