tech report
Mathew Yeates
mathew at elroy.Jpl.Nasa.Gov
Wed Mar 13 12:38:45 EST 1991
The following technical report (JPL Publication) is available
for anonymous ftp from the neuroprose directory at
cheops.cis.ohio-state.edu. It is a shortened version of an earlier
paper, "An Architecture With Neural Network Characteristics for Least
Squares Problems," and has appeared in various forms at several
conferences.
There are two ideas that may be of interest:
1) By making the input layer of a single-layer Perceptron fully
connected, the learning scheme approximates Newton's algorithm
instead of steepest descent (see the first sketch after this list).
2) By allowing local interactions between synapses, the network can
handle time-varying behavior. Specifically, the network can
implement the Kalman filter for estimating the state of a linear
system (see the second sketch after this list).
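To illustrate idea (1), here is a minimal numerical sketch, in Python
with NumPy, contrasting a steepest descent weight update with a Newton
step on a linear least squares problem. The code and its variable
names are mine, not from the report; it only shows why premultiplying
the gradient by the inverse input correlation matrix (the role played
by the fully connected input layer) converges so much faster.

import numpy as np

# Toy least squares problem: find w minimizing ||X w - t||^2.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                 # input keys, one per row
t = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=50)

def grad(w):
    return X.T @ (X @ w - t)                 # gradient of squared error

# Steepest descent (Perceptron-style rule): w <- w - eta * grad(w).
w_sd = np.zeros(3)
for _ in range(100):
    w_sd -= 0.01 * grad(w_sd)

# Newton step: premultiply the gradient by the inverse correlation
# matrix (X^T X)^{-1}.  For a quadratic error this lands on the
# least squares solution in a single step.
w_newton = -np.linalg.solve(X.T @ X, grad(np.zeros(3)))

print(w_sd)        # approaches the least squares solution
print(w_newton)    # reaches it in one step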
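And a similarly hedged sketch of idea (2): a conventional Kalman
filter update for a linear system with a single (scalar) output. This
is standard textbook algebra, not the synaptic implementation from the
report; the function and its parameter names are mine.

import numpy as np

# State model x_{k+1} = F x_k + w, scalar output y_k = h.x_k + v,
# with process noise variance q and measurement noise variance r.
def kalman_step(x, P, y, F, h, q, r):
    # Measurement update: with one output the gain K is just a vector.
    S = h @ P @ h + r                  # innovation variance (scalar)
    K = P @ h / S                      # Kalman gain
    x = x + K * (y - h @ x)            # correct the state estimate
    P = P - np.outer(K, h @ P)         # shrink the error covariance
    # Time update: propagate through the dynamics.
    x = F @ x
    P = F @ P @ F.T + q * np.eye(len(x))
    return x, P

# Track a constant-velocity model from noisy position measurements.
rng = np.random.default_rng(2)
F = np.array([[1.0, 1.0], [0.0, 1.0]])
h = np.array([1.0, 0.0])
x_true = np.array([0.0, 1.0])
x, P = np.zeros(2), np.eye(2)
for _ in range(50):
    x_true = F @ x_true + 0.01 * rng.normal(size=2)
    y = h @ x_true + 0.1 * rng.normal()
    x, P = kalman_step(x, P, y, F, h, q=1e-4, r=1e-2)
print(x, x_true)                       # estimate tracks the true state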
Get both yeates.pseudo-kalman.ps.Z and
yeates.pseudo-kalman-fig.ps.Z.
A Neural Network for Computing the Pseudo-Inverse of a Matrix
and Applications to Kalman Filtering
Mathew C. Yeates
California Institute of Technology
Jet Propulsion Laboratory
ABSTRACT
A single-layer linear neural network for associative memory is
described. The matrix that best maps a set of input keys to desired
output targets is computed recursively by the network, using a parallel
implementation of Greville's algorithm. This model differs from the
Perceptron in that the input layer is fully interconnected, leading
to a parallel approximation to Newton's algorithm. This is in contrast
to the steepest descent algorithm implemented by the Perceptron.
By further extending the model to allow synapse updates to interact
locally, a biologically plausible addition, the network implements
Kalman filtering for a single-output system.
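Since the abstract leans on Greville's algorithm, here is a minimal
sketch of the underlying column-recursive pseudo-inverse update, again
in plain NumPy and written by me rather than taken from the report;
the network is described as implementing this recursion in parallel.

import numpy as np

def greville_pinv(A, tol=1e-10):
    # Build the pseudo-inverse of A one column at a time
    # (Greville's recursion).
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    a1 = A[:, :1]
    s = float(a1.T @ a1)
    Ap = a1.T / s if s > tol else np.zeros((1, m))
    for k in range(1, n):
        ak = A[:, k:k+1]
        d = Ap @ ak                      # coefficients of ak in the span so far
        c = ak - A[:, :k] @ d            # component of ak outside that span
        if float(c.T @ c) > tol:
            b = c.T / float(c.T @ c)     # new column is independent
        else:
            b = (d.T @ Ap) / (1.0 + float(d.T @ d))
        Ap = np.vstack([Ap - d @ b, b])  # updated pseudo-inverse
    return Ap

A = np.random.default_rng(1).normal(size=(5, 3))
print(np.allclose(greville_pinv(A), np.linalg.pinv(A)))   # True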