Paper announcements
Thanasis Kehagias
ST401843@brownvm.brown.edu
Tue Mar 19 18:29:54 EST 1991
I have just placed two papers of mine in the ohio-state archive.
The first one is in the file kehagias.srn1.ps.Z and the relevant
figures in the companion file kehagias.srn1fig.ps.Z.
The second one is in the file kehagias.srn2.ps.Z and the relevant
figures in the companion file kehagias.srn2fig.ps.Z.
Detailed instructions for getting and printing these files are
included at the end of this message.
Some of you have previously received versions of these files by email.
If so, read the postscript at the end of this message.
-----------------------------------------------------------------------
Stochastic Recurrent Network training
by the Local Backward-Forward Algorithm
Ath. Kehagias
Brown University
Div. of Applied Mathematics
We introduce Stochastic Recurrent Networks, which
are collections of interconnected finite state units.
At every discrete time step, each unit goes into a new state,
following a probability law that is conditional on the
states of the neighboring units at the previous time step.
A network of this type can learn a
stochastic process, where ``learning'' means
maximizing the Likelihood function of the model. A new
learning (i.e. Likelihood maximization) algorithm is
introduced: the Local Backward-Forward Algorithm.
The new algorithm is based on the Baum Backward-Forward Algorithm
(for Hidden Markov Models) and improves speed of learning substantially.
Essentially, the local Backward-Forward Algorithm is
a version of Baum's algorithm which estimates local
transition probabilities rather than the global transition
probability matrix. Using the local BF algorithm, we train
SRN's that solve the 8-3-8 encoder problem and the phoneme
modelling problem.
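To make the state update concrete, here is a minimal sketch in Python
(illustrative only: the ring topology, the table layout, and all names
are my own assumptions, not code or notation from the paper). Each unit
samples its next state from a local conditional table indexed by its
neighbors' states at the previous time step:

    import numpy as np

    rng = np.random.default_rng(0)
    n_units, n_states = 4, 3

    # Ring topology: each unit sees its left and right neighbor.
    neighbors = [((i - 1) % n_units, (i + 1) % n_units)
                 for i in range(n_units)]

    # One local conditional table per unit: row = joint neighbor
    # configuration at time t, column = the unit's state at t+1.
    tables = [rng.dirichlet(np.ones(n_states), size=n_states ** 2)
              for _ in range(n_units)]

    def step(state):
        # All units update simultaneously; unit i samples its new state
        # from P(next state | neighbors' states at the previous step).
        new = np.empty_like(state)
        for i, (a, b) in enumerate(neighbors):
            row = state[a] * n_states + state[b]  # encode neighbor config
            new[i] = rng.choice(n_states, p=tables[i][row])
        return new

    state = rng.integers(n_states, size=n_units)
    for t in range(5):
        state = step(state)

Training then amounts to estimating each unit's local table from data,
which is what the Local Backward-Forward Algorithm does in place of
estimating one global transition matrix over the whole network's state.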
This is the paper in kehagias.srn1.ps.Z, with figures in kehagias.srn1fig.ps.Z.
The paper srn1 has undergone significant revision: it had too many typos
and bad notation, and it needed reorganization; all of these have been
fixed. Thanks to N. Chater, S. Nowlan, A.T. Tsoi, and M. Perrone
for many useful suggestions along these lines.
--------------------------------------------------------------------
Stochastic Recurrent Network training
Prediction and Classification of Time Series
Ath. Kehagias
Brown University
Div. of Applied Mathematics
We use Stochastic Recurrent Networks of the type introduced
in [Keh91a] as models of finite-alphabet time series.
We develop the Maximum Likelihood Prediction Algorithm
and the Maximum A Posteriori Classification Algorithm
(which can both be implemented in recurrent PDP form).
The prediction problem is:
given the output up to the present time, Y^1,...,Y^t, and the
input up to the immediate future, U^1,...,U^{t+1}, predict
with Maximum Likelihood the output Y^{t+1} that the SRN will produce
in the immediate future. The classification problem is:
given the output up to the present time, Y^1,...,Y^t, and the
input up to the present time, U^1,...,U^t, as well as a number
of candidate SRN's, M_1, M_2, ..., M_K, find the network that
has Maximum Posterior Probability of producing
Y^1,...,Y^t. We apply our algorithms to prediction and classification
of speech waveforms.
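As a rough illustration of both problems, here is a sketch under my own
assumptions (this is not the paper's recurrent PDP implementation; the
log_likelihood function and all names are hypothetical). Both tasks
reduce to scoring candidates by Likelihood:

    import numpy as np

    # log_likelihood(model, Y, U) is assumed to return
    # log P(Y^1,...,Y^t | U, model); the paper computes such
    # quantities with the SRN itself, in recurrent PDP form.

    def ml_predict(model, Y, U, outputs, log_likelihood):
        # Maximum Likelihood prediction: pick the symbol y that makes
        # the extended sequence Y^1,...,Y^t, y most likely, given the
        # inputs up to U^{t+1}.
        scores = [log_likelihood(model, Y + [y], U) for y in outputs]
        return outputs[int(np.argmax(scores))]

    def map_classify(models, priors, Y, U, log_likelihood):
        # Maximum A Posteriori classification: the posterior is
        # proportional to likelihood times prior (Bayes' rule);
        # work in log space for numerical stability.
        scores = [log_likelihood(m, Y, U) + np.log(p)
                  for m, p in zip(models, priors)]
        return int(np.argmax(scores))  # index of the MAP model M_k

With uniform priors, map_classify simply picks the model under which the
observed sequence is most likely.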
This is the paper in kehagias.srn2.ps.Z, with figures in kehagias.srn2fig.ps.Z.
-----------------------------------------------------------------------
To get these files, do the following:
gvax> ftp cheops.cis.ohio-state.edu
220 cheops.cis.ohio-state.edu FTP server ready.
Name: anonymous
331 Guest login ok, send ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
ftp> binary
200 Type set to I.
ftp> get kehagias.srn1.ps.Z
ftp> get kehagias.srn1fig.ps.Z
ftp> get kehagias.srn2.ps.Z
ftp> get kehagias.srn2fig.ps.Z
ftp> quit
gvax> uncompress kehagias.srn1.ps.Z
gvax> uncompress kehagias.srn1fig.ps.Z
gvax> uncompress kehagias.srn2.ps.Z
gvax> uncompress kehagias.srn2fig.ps.Z
gvax> lqp kehagias.srn1.ps
gvax> lqp kehagias.srn1fig.ps
gvax> lqp kehagias.srn2.ps
gvax> lqp kehagias.srn2fig.ps
(lqp is a local PostScript print command; use whatever prints
PostScript at your site, e.g. lpr.)
POSTSCRIPT: All of the people who sent a request (about a month
ago) for srn1 in its original form are on my mailing list, and most
got copies of the new versions of srn1 and srn2 by email. Some of these
files did not make it through the Internet because of size restrictions
etc., so you may want to ftp them now. Incidentally, if you want to be
removed from the mailing list (for when the next paper in the
series comes out), send me mail.
Thanasis Kehagias