learning a synaptic learning rule. TR available.
Yoshua BENGIO
yoshua at HOMER.MACH.CS.CMU.EDU
Wed Nov 21 19:41:36 EST 1990
The following technical report is now available by ftp from neuroprose:
Bengio Y. and Bengio S. (1990). Learning a synaptic learning rule.
Technical Report #751, Universite de Montreal, Departement d'informatique
et de recherche operationnelle.
Learning a synaptic learning rule
Yoshua Bengio                             Samy Bengio
McGill University,                        Universite de Montreal,
School of Computer Science,               Departement d'informatique
3480 University street,                   et de recherche operationnelle,
Montreal, Qc, Canada, H3A 2A7             Montreal, Qc, Canada, H3C 3J7
yoshua at cs.mcgill.ca                    bengio at iro.umontreal.ca
An original approach to neural modeling is presented, based on
the idea of searching for, and tuning with learning methods, a
synaptic learning rule that is biologically plausible and yields
networks capable of learning to perform difficult tasks.
The method treats the synaptic modification rule DeltaW() as a
parametric function. This function has only local inputs and is
the same for many neurons. Its parameters can be estimated with
known learning methods; for this optimization, we give particular
attention to gradient descent and genetic algorithms. Estimation of
these parameters consists of a joint global optimization of
(a) the synaptic modification function, and
(b) the networks that are learning to perform some tasks, using this function.
We show how to compute the gradient of an optimization criterion
with respect to the parameters of DeltaW().
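As a rough illustration of this scheme (a minimal sketch, not the
report's actual formulation), the Python/NumPy fragment below treats
DeltaW() as a parametric function of signals local to each synapse
(pre-synaptic activity, post-synaptic activity, and a post-synaptic
teaching signal), with parameters theta shared by all synapses. The
report derives the gradient analytically; here it is merely estimated
by central finite differences, and the form of the rule, the toy
tasks, and all names are illustrative assumptions.

  import numpy as np

  def delta_w(theta, pre, post, target):
      # Parametric synaptic modification rule DeltaW(). Every term uses
      # only signals local to synapse (i, j): pre[i], post[j], target[j].
      pre = pre[:, None]                        # shape (n_in, 1)
      return (theta[0] * pre * (target - post)  # error-correcting term
              + theta[1] * pre * post           # Hebbian (correlational) term
              + theta[2] * pre                  # pre-synaptic term
              + theta[3])                       # constant drift term

  def meta_loss(theta, tasks, steps=30, lr=0.1):
      # Joint criterion: let each network learn its task with the shared
      # rule, then measure how well the tasks were learned.
      total = 0.0
      for X, Y in tasks:
          rng = np.random.default_rng(0)
          W = rng.normal(scale=0.1, size=(X.shape[1], Y.shape[1]))
          for _ in range(steps):
              for x, y in zip(X, Y):
                  post = np.tanh(x @ W)
                  W += lr * delta_w(theta, x, post, y)
          total += np.mean((Y - np.tanh(X @ W)) ** 2)
      return total / len(tasks)

  def meta_grad(theta, tasks, eps=1e-4):
      # Central finite differences stand in for the report's analytic
      # gradient of the criterion with respect to the rule parameters.
      g = np.zeros_like(theta)
      for k in range(theta.size):
          e = np.zeros_like(theta)
          e[k] = eps
          g[k] = (meta_loss(theta + e, tasks)
                  - meta_loss(theta - e, tasks)) / (2 * eps)
      return g

  # A small population of toy tasks (random smooth input-output maps).
  rng = np.random.default_rng(1)
  tasks = []
  for _ in range(3):
      X = rng.normal(size=(20, 4))
      Y = np.tanh(X @ rng.normal(size=(4, 2)))
      tasks.append((X, Y))

  theta = np.array([0.5, 0.0, 0.0, 0.0])      # near a delta-rule setting
  for _ in range(20):
      theta -= 0.5 * meta_grad(theta, tasks)  # descend on the rule itself

The report also considers genetic algorithms for the same search; a
minimal population-based alternative (truncation selection with
Gaussian mutation only, for brevity) could reuse meta_loss:

  def evolve_rule(tasks, pop_size=16, generations=10, sigma=0.2):
      # Evolve rule parameters instead of following the gradient:
      # keep the best quarter of the population, mutate it to refill.
      rng = np.random.default_rng(2)
      population = rng.normal(scale=0.5, size=(pop_size, 4))
      for _ in range(generations):
          scores = np.array([meta_loss(t, tasks) for t in population])
          parents = population[np.argsort(scores)[: pop_size // 4]]
          children = parents[rng.integers(len(parents), size=pop_size)]
          population = children + sigma * rng.normal(size=children.shape)
      scores = np.array([meta_loss(t, tasks) for t in population])
      return population[np.argmin(scores)]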
Both the network architecture and the learning function can be
designed within constraints derived from biological knowledge.
To prevent DeltaW() from becoming too specialized, this function is forced
to be the same for a large number of synapses, in a population of
networks learning to perform different tasks. To enforce efficiency
constraints, some of these networks should learn complex mappings
(as in pattern recognition). Others should learn to reproduce
behavioral phenomena, such as associative conditioning, and
neurological phenomena, such as habituation, recovery, dishabituation
and sensitization. The architecture of the networks reproducing
these biological phenomena can be designed based on well-studied
circuits, such as those involved in associations in Aplysia,
Hermissenda, or the rabbit eyelid closure response. Multiple synaptic
modification functions accommodate the diverse types of synapses
(e.g. inhibitory, excitatory). Models of pre-, epi- and
post-synaptic mechanisms can be used to bootstrap DeltaW(), so that
it initially consists of a combination of simpler modules, each
emulating a particular synaptic mechanism.
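To make this last point concrete, a hedged sketch of such a bootstrap
follows, in the same illustrative Python style as above: DeltaW()
starts as a weighted combination of modules that each emulate one
synaptic mechanism, and learning then tunes the mixing coefficients.
The module forms and names are assumptions, not the report's models.

  import numpy as np

  def hebbian(pre, post, w):
      # Correlational (Hebb-like) module: strengthens co-active pairs.
      return pre * post

  def habituation(pre, post, w):
      # Use-dependent decay of the weight, as in habituation.
      return -np.abs(pre) * w

  def sensitization(pre, post, w):
      # Non-associative facilitation driven by activity alone.
      return np.abs(post)

  MODULES = (hebbian, habituation, sensitization)

  def delta_w(theta, pre, post, w):
      # Bootstrapped DeltaW(): a weighted sum of mechanism modules.
      # Estimating theta (e.g. as sketched above) tunes the mixture.
      return sum(t * m(pre, post, w) for t, m in zip(theta, MODULES))

  # One coefficient vector per synapse type, matching the idea of
  # separate modification functions for excitatory and inhibitory synapses.
  theta = {"excitatory": np.array([0.3, 0.1, 0.05]),
           "inhibitory": np.array([-0.3, 0.1, 0.0])}
  print(delta_w(theta["excitatory"], pre=0.8, post=0.5, w=0.2))

Starting from such modules gives the optimization a biologically
grounded initial point rather than an arbitrary one.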
---------------------------------------------------------------------------
Copies of the postscript file bengio.learn.ps.Z may be obtained from the
pub/neuroprose directory on cheops.cis.ohio-state.edu. Either use the Getps
script or do this:
unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Connected to cheops.cis.ohio-state.edu.
Name (cheops.cis.ohio-state.edu:): anonymous
331 Guest login ok, sent ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
ftp> binary
ftp> get bengio.learn.ps.Z
ftp> quit
unix-2> uncompress bengio.learn.ps.Z
unix-3> lpr -P(your_local_postscript_printer) bengio.learn.ps
Or, order a hardcopy by sending your physical mail address to
bengio at iro.umontreal.ca, mentioning Technical Report #751.
Please do this only if you cannot use the ftp method described above.
----------------------------------------------------------------------------