Reprint: First-Order vs. Second-Order Single Layer Recurrent NN
Lee Giles
giles at research.nj.nec.com
Fri Mar 5 16:09:31 EST 1993
The following reprint is available via the NEC Research
Institute ftp archive external.nj.nec.com. Instructions for
retrieval from the archive follow the summary.
----------------------------------------------------------------------------------
First-Order vs. Second-Order Single Layer Recurrent Neural Networks
Mark W. Goudreau (Princeton University and NEC Research Institute, Inc.)
C. Lee Giles (NEC Research Institute, Inc. and University of Maryland)
Srimat T. Chakradhar (C&CRL, NEC USA, Inc.)
D. Chen (University of Maryland)
ABSTRACT
We examine the representational capabilities of first-order and second-order
Single Layer Recurrent Neural Networks (SLRNNs) with hard-limiting neurons. We
show that a second-order SLRNN is strictly more powerful than a first-order SLRNN.
However, if the first-order SLRNN is augmented with output layers of feedforward
neurons, it can implement any finite-state recognizer, but only if state-splitting
is employed. When a state is split, it is divided into two equivalent states.
The judicious use of state-splitting allows for efficient implementation of
finite-state recognizers using augmented first-order SLRNNs.
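To make the second-order construction concrete, here is a minimal sketch (not taken from the paper) of a second-order SLRNN with hard-limiting neurons implementing a small finite-state recognizer, the parity of 1s in a bit string. With one-hot state and input vectors, the weight tensor W can encode a DFA's transition function directly: W[i][j][k] = 1 iff the automaton moves from state j to state i on input symbol k. The function names and the parity example are illustrative assumptions, not the authors' notation.

```python
def hard_limit(x):
    # Hard-limiting (threshold) activation: 1 if x > 0, else 0.
    return 1 if x > 0 else 0

def second_order_step(W, s, x):
    # Second-order update: s_i(t+1) = H( sum_{j,k} W[i][j][k] * s_j(t) * x_k(t) )
    return [hard_limit(sum(W[i][j][k] * s[j] * x[k]
                           for j in range(len(s))
                           for k in range(len(x))))
            for i in range(len(W))]

# Parity DFA: state 0 = even, state 1 = odd; input symbols 0 and 1.
# Transition function: delta(q, 0) = q, delta(q, 1) = 1 - q.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
W = [[[0, 0], [0, 0]] for _ in range(2)]
for (j, k), i in delta.items():
    W[i][j][k] = 1  # encode each DFA transition as a second-order weight

def recognize(bits):
    s = [1, 0]  # start in state 0 (even parity), one-hot
    for b in bits:
        x = [0, 0]
        x[b] = 1  # one-hot input symbol
        s = second_order_step(W, s, x)
    return s[1] == 1  # accept iff the automaton ends in the odd state

print(recognize([1, 0, 1, 1]))  # three 1s -> True
```

Because each product s_j(t) * x_k(t) singles out exactly one (state, symbol) pair, one second-order weight per DFA transition suffices; a first-order SLRNN has no such product terms, which is why it needs the augmented output layer and state-splitting described above.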
-------------------------------------------------------------------------------------
FTP INSTRUCTIONS
unix> ftp external.nj.nec.com
Name: anonymous
Password: (your_userid at your_site)
ftp> cd pub/giles/papers
ftp> binary
ftp> get SLRNN.ps.Z
ftp> quit
unix> uncompress SLRNN.ps.Z
--------------------------------------------------------------------------------
--
C. Lee Giles / NEC Research Institute / 4 Independence Way
Princeton, NJ 08540 / 609-951-2642 / Fax 2482
==
More information about the Connectionists mailing list