No subject

Giles L. giles at fuzzy.nec.com
Fri Mar 9 15:11:50 EST 1990


Paper available:


This 8-page paper will appear in Advances in Neural Information
Processing Systems 2, D. S. Touretzky (ed.), Morgan Kaufmann, San
Mateo, CA, 1990.

HIGHER ORDER RECURRENT NETWORKS & GRAMMATICAL INFERENCE

C. L. Giles*,  G. Z. Sun, H. H. Chen, Y. C. Lee,  D. Chen
Department of Physics and Astronomy and Institute for Advanced 
Computer Studies, University of Maryland, College Park, MD 20742.
*NEC Research Institute, 4 Independence Way, Princeton, N.J. 08540

ABSTRACT

We design a higher-order, single-layer recurrent neural network
which easily learns to simulate a deterministic finite state
machine and to infer simple regular grammars from small training
sets.  An enhanced version of this neural network state machine
is then constructed and connected, through a common error term,
to an external analog stack memory.  The resulting hybrid machine
can be interpreted as a type of neural network pushdown automaton.
The neural network finite state machine part is given the
primitives push and pop, and is able to read the top of the
stack.  Using a gradient descent learning rule derived from a
common error function, the hybrid network learns to use the stack
actions effectively, manipulating the stack memory so as to learn
simple context-free grammars.  If the neural network pushdown
automaton is reduced through a heuristic clustering of neuron
states and actions, it reduces to a correct pushdown automaton
that recognizes the learned context-free grammars.
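The "higher-order" state transition described above can be illustrated with a small sketch. In a second-order recurrent network, the next state depends on products of the current state vector and the current (one-hot) input symbol, weighted by a third-order tensor. The dimensions, random weights, and the start-state convention below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sizes and untrained random weights (assumptions).
rng = np.random.default_rng(0)
n_states, n_symbols = 4, 2
W = rng.normal(scale=0.5, size=(n_states, n_states, n_symbols))

def step(S, symbol):
    """One second-order transition:
    S_i(t+1) = sigmoid( sum_{j,k} W_ijk * S_j(t) * I_k(t) )."""
    I = np.zeros(n_symbols)
    I[symbol] = 1.0  # one-hot encoding of the current input symbol
    return sigmoid(np.einsum('ijk,j,k->i', W, S, I))

# Run the network over a short symbol string.
S = np.zeros(n_states)
S[0] = 1.0  # designated start state (a common convention)
for sym in [0, 1, 1, 0]:
    S = step(S, sym)
print(S.shape)  # (4,)
```

In the hybrid machine, additional output neurons of this state network would drive the continuous push/pop actions on the external analog stack, with the whole system trained by gradient descent on a common error function.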

---------------

For a hard copy of the above, please send a request to:
gloria at research.nec.com  or
Gloria Behrens
NEC Research Institute
4 Independence Way
Princeton, N.J. 08540
