Learning with "realistic" neurons
Peter Rowat
prowat at UCSD.EDU
Thu Jan 24 19:05:13 EST 1991
Gary Cottrell recently referred to work I am doing with models of
the gastric mill network in the lobster's stomatogastric ganglion. This
work is not yet published, but I do have a related paper which, among other
things, generalizes backpropagation to arbitrarily complex models,
and which is now available by ftp from the neuroprose archive.
Namely:
Peter Rowat and Allen Selverston (1990). Learning algorithms for
oscillatory networks with gap junctions and membrane currents.
To appear in: Network: Computation in Neural Systems, Volume 2,
Issue 1, February 1991.
Abstract:
We view the problem of parameter adjustment in oscillatory neural networks
as the minimization of the difference between two limit cycles.
Backpropagation is described as the application of gradient descent to
an error function that computes this difference. A mathematical
formulation is given that is applicable to any type of network model, and
it is applied to several models. Starting from a neuron equivalent circuit, the
standard connectionist model of a neuron is extended to allow gap junctions
between cells and to include membrane currents. Learning algorithms are
derived for a two-cell network with a single gap junction, and for a pair
of mutually inhibitory neurons, each having a simplified membrane current.
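(A minimal sketch of the general idea, not the paper's actual derivation: treat
the mismatch between a target limit cycle and the model's simulated trajectory
as an error function of the parameters and follow its gradient. The Euler
integrator, the squared-error measure, and the finite-difference gradient below
are my own illustrative assumptions; the paper derives the gradient
analytically from the model's differential equations.

    import numpy as np

    def simulate(params, x0, dt, n_steps, dxdt):
        """Integrate dx/dt = f(x, params) with forward Euler (illustrative only)."""
        xs = [np.array(x0, dtype=float)]
        for _ in range(n_steps):
            xs.append(xs[-1] + dt * dxdt(xs[-1], params))
        return np.array(xs)

    def error(params, target, x0, dt, dxdt):
        """Squared difference between the model trajectory and the target limit cycle."""
        traj = simulate(params, x0, dt, len(target) - 1, dxdt)
        return np.sum((traj - target) ** 2) * dt

    def learn(params, target, x0, dt, dxdt, lr=1e-3, eps=1e-5, n_iters=100):
        """Gradient descent on the trajectory error, using a crude
        finite-difference gradient in place of an analytic one."""
        params = np.array(params, dtype=float)
        for _ in range(n_iters):
            base = error(params, target, x0, dt, dxdt)
            grad = np.zeros_like(params)
            for i in range(len(params)):
                bumped = params.copy()
                bumped[i] += eps
                grad[i] = (error(bumped, target, x0, dt, dxdt) - base) / eps
            params -= lr * grad
        return params
)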
For example, when learning in a network in which all cells
have a common, adjustable bias current, the value of the bias is adjusted
at a rate proportional to the difference between the sum of the target
outputs and the sum of the actual outputs. When learning in a network
of n cells where a target output is given for every cell, the learning
algorithm splits into n independent learning algorithms, one per cell.
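(As an illustration of that bias rule only; the rate constant, discrete update,
and variable names here are my assumptions, not the paper's:

    def update_bias(bias, target_outputs, actual_outputs, eta=0.01):
        """Adjust a shared bias current at a rate proportional to the
        difference between summed target and summed actual outputs."""
        return bias + eta * (sum(target_outputs) - sum(actual_outputs))
)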
For networks containing gap junctions, a gap junction is modelled as a
conductance times the potential difference between the two adjacent cells.
The requirement that a conductance g must be positive is enforced by
replacing g by a function pos(g*) whose value is always positive, for example
exp(0.1 g*), and deriving an algorithm that adjusts the parameter g* in
place of g. When a target output is specified for every cell in a network
with gap junctions, the learning algorithm splits into fewer independent
components, one for each gap-connected subset of the network. The learning
algorithm for a gap-connected set of cells cannot be parallelized further.
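(A sketch of that reparametrization: the exponential form exp(0.1 g*) is the
example given above, but the step size and function names below are
illustrative assumptions; the paper derives the full update from the network
equations.

    import math

    def pos(g_star, k=0.1):
        """Always-positive conductance, g = exp(k * g_star)."""
        return math.exp(k * g_star)

    def update_g_star(g_star, dE_dg, eta=0.01, k=0.1):
        """Gradient step on g_star rather than g: by the chain rule,
        dE/dg_star = dE/dg * k * exp(k * g_star)."""
        dE_dg_star = dE_dg * k * pos(g_star, k)
        return g_star - eta * dE_dg_star
)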
As a final example, a learning algorithm is derived for a mutually inhibitory
two-cell network in which each cell has a membrane current.
This generalized approach to backpropagation allows one to
derive a learning algorithm for almost any model neural network given in
terms of differential equations. It is one solution to the problem of
parameter adjustment in small but complex network models.
---------------------------------------------------------------------------
Copies of the postscript file rowat.learn-osc.ps.Z may be obtained from the
pub/neuroprose directory on cheops.cis.ohio-state.edu. Either use the
Getps script or do this:
unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Connected to cheops.cis.ohio-state.edu.
Name (cheops.cis.ohio-state.edu:): anonymous
331 Guest login ok, send ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
ftp> binary
ftp> get rowat.learn-osc.ps.Z
ftp> quit
unix-2> uncompress rowat.learn-osc.ps.Z
unix-3> lpr -P(your_local_postscript_printer) rowat.learn-osc.ps
(The file starts with 7 bitmapped figures which are slow to print.)