Paper available in Neuroprose

Arun Maskara (arun at hertz.njit.edu)
Mon Mar 16 13:44:56 EST 1992


The following paper is now available by ftp from the neuroprose archive:


	Forcing Simple Recurrent Neural Networks to Encode Context

	Arun Maskara, New Jersey Institute of Technology,
	Department of Computer and Information Sciences
	University Heights, Newark, NJ 07102, arun at hertz.njit.edu 
 
	Andrew Noetzel, The William Paterson College,
	Department of Computer Science, Wayne, NJ 07470

			Abstract

The Simple Recurrent Network (SRN) is a neural network model designed for the
recognition of symbol sequences. It is a back-propagation network with a single
hidden layer of units. The symbols of a sequence are presented one at a time at
the input layer, and the activation pattern of the hidden units from the
previous input symbol is also presented as an auxiliary input. Previous
research has shown that the SRN can be trained to behave as a finite state
automaton (FSA) that accepts the valid strings of a particular grammar and
rejects the invalid strings. It does this by predicting each successive symbol
in the input string.

However, the SRN architecture sometimes fails to encode the context necessary
to predict the next input symbol. This happens when two different states of the
FSA generating the strings produce the same output, and the SRN develops
similar hidden-layer encodings for these states. The failure occurs more often
when the number of units in the hidden layer is limited. We have developed a
new architecture, called the Forced Simple Recurrent Network (FSRN), that
solves this problem. This architecture contains additional output units, which
are trained to reproduce the current input and the previous context. Simulation
results show that for certain classes of FSA with $u$ states, the SRN with
$\lceil \log_2 u \rceil$ units in the hidden layer fails, whereas the FSRN with
the same number of hidden-layer units succeeds.
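
To make the two architectures concrete, the following is a minimal NumPy
sketch (not taken from the paper; the layer sizes, weight names, and sigmoid
nonlinearity are illustrative assumptions) of one forward step of an
Elman-style SRN, together with the extra output units that the FSRN adds to
force distinct hidden encodings for distinct states.

	# Illustrative sketch only, not the authors' implementation.
	import numpy as np

	def sigmoid(x):
	    return 1.0 / (1.0 + np.exp(-x))

	n_symbols, n_hidden = 8, 3          # e.g. ceil(log2(u)) hidden units
	rng = np.random.default_rng(0)

	W_in  = rng.normal(size=(n_hidden, n_symbols))   # input -> hidden
	W_ctx = rng.normal(size=(n_hidden, n_hidden))    # previous hidden (context) -> hidden
	W_out = rng.normal(size=(n_symbols, n_hidden))   # hidden -> next-symbol prediction

	# FSRN only: extra output units trained to reproduce the current input
	# and the previous context.
	W_force_in  = rng.normal(size=(n_symbols, n_hidden))
	W_force_ctx = rng.normal(size=(n_hidden, n_hidden))

	def step(x_onehot, context):
	    """One time step: x_onehot is the current symbol, context the previous hidden pattern."""
	    hidden = sigmoid(W_in @ x_onehot + W_ctx @ context)
	    prediction = sigmoid(W_out @ hidden)        # SRN output: predicted next symbol
	    forced_in  = sigmoid(W_force_in @ hidden)   # FSRN target: current input
	    forced_ctx = sigmoid(W_force_ctx @ hidden)  # FSRN target: previous context
	    return hidden, prediction, forced_in, forced_ctx

	context = np.zeros(n_hidden)
	for symbol in [0, 3, 5]:                        # a toy input string
	    x = np.zeros(n_symbols)
	    x[symbol] = 1.0
	    context, prediction, forced_in, forced_ctx = step(x, context)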

-------------------------------------------------------------------------------

A copy of the PostScript file has been placed in the neuroprose archive. The
file name is maskara.fsrn.ps.Z

The usual instructions can be followed to obtain the file from the
directory pub/neuroprose at the ftp site archive.cis.ohio-state.edu

Arun Maskara


