TR available - fuzzy recurrent neural networks

omlinc@cs.rpi.edu
Fri Mar 29 17:00:21 EST 1996



The following Technical Report is available via the University of Maryland 
Department of Computer Science and the NEC Research Institute archives:

---------------------------------------------------------------------------

    Fuzzy Finite-state Automata Can Be Deterministically Encoded 
                into Recurrent Neural Networks

    Christian W. Omlin(a), Karvel K. Thornber(a), C. Lee Giles(a,b) 
           (a)NEC Research Institute, Princeton, NJ 08540
          (b)UMIACS, U. of Maryland, College Park, MD 20742


      U. of Maryland Technical Report CS-TR-3599 and UMIACS-96-12 


			ABSTRACT

 
There has been an increased interest in combining fuzzy systems with 
neural networks because fuzzy neural systems merge the advantages of 
both paradigms. On the one hand, parameters in fuzzy systems have clear
physical meanings and rule-based and linguistic information can be 
incorporated into adaptive fuzzy systems in a systematic way. On the 
other hand, there exist powerful algorithms for training various neural 
network models. However, most of the proposed combined architectures can 
only process static input-output relationships, i.e., they cannot process 
temporal input sequences of arbitrary length. Fuzzy finite-state automata 
(FFAs) can model dynamical processes whose current state depends on the 
current input and previous states. Unlike deterministic finite-state 
automata (DFAs), FFAs are not in one particular state; rather, each state 
is occupied to some degree defined by a membership function. Based on 
previous work on encoding DFAs in discrete-time, second-order recurrent 
neural networks, we propose an algorithm that constructs an augmented 
recurrent neural network that encodes an FFA and recognizes a given fuzzy 
regular language with arbitrary accuracy. We then empirically verify the 
encoding methodology by measuring the string recognition performance of 
recurrent neural networks that encode large, randomly generated FFAs. In 
particular, we examine how the networks' performance varies as a function 
of synaptic weight strength.

Keywords: Fuzzy logic, automata, fuzzy automata, recurrent neural networks, 
encoding, rules.
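For readers unfamiliar with the setting, the following is a minimal
illustrative sketch in Python, not the construction from the TR: a fuzzy
state update in which each automaton state is occupied to a degree in
[0,1], and one transition of a discrete-time, second-order recurrent
network with sigmoid discriminants. The function names, the max-min
composition in ffa_step, and the threshold readout in recognize are
assumptions made only for this illustration.

import numpy as np

# Illustrative sketch only -- not the encoding algorithm from the TR.
# (1) ffa_step: a fuzzy finite-state automaton update in which each state
#     is occupied to a degree in [0,1]; a max-min composition is assumed.
# (2) rnn_step: one transition of a discrete-time, second-order recurrent
#     network with sigmoid discriminants,
#         S_i(t+1) = g( sum_{j,k} W[i,j,k] * S_j(t) * I_k(t) + b_i ),
#     the kind of network into which such automata are encoded.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ffa_step(theta, membership, symbol):
    """theta[symbol][i, j]: degree to which reading 'symbol' in state j
    activates state i; 'membership' holds the current state memberships."""
    T = theta[symbol]
    # New membership of state i: max over j of min(T[i, j], membership[j]).
    return np.max(np.minimum(T, membership[np.newaxis, :]), axis=1)

def rnn_step(W, b, state, symbol_onehot):
    """Second-order update over products of state units and input units."""
    net = np.einsum('ijk,j,k->i', W, state, symbol_onehot) + b
    return sigmoid(net)

def recognize(W, b, s0, string, alphabet_size, accept_unit=0, threshold=0.5):
    """Run the network over a string of symbol indices and read acceptance
    from a designated output unit (an assumed readout convention)."""
    state = s0.copy()
    for sym in string:
        x = np.zeros(alphabet_size)
        x[sym] = 1.0
        state = rnn_step(W, b, state, x)
    return state[accept_unit] > threshold

In the fuzzy setting the readout would of course be a graded membership
rather than a crisp threshold; the threshold is used here only to keep the
sketch short.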

****************************************************************

I would like to add to my announcement of the TR that recurrent neural
networks with sigmoid discriminant functions, used here to represent
finite-state automata, are themselves an example of hybrid systems.
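
As a small, hypothetical illustration of why the synaptic weight strength
matters (this is not an experiment from the TR): if the programmed weights
scale with a strength H, a unit's net input grows with H, and the sigmoid
discriminant saturates toward 0 or 1, so the analog network behaves more
and more like the discrete automaton it encodes.

import numpy as np

# Hypothetical illustration: sigmoid outputs for 'low' and 'high' net
# inputs that are assumed to scale with the weight strength H.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for H in (1.0, 2.0, 4.0, 8.0, 16.0):
    low, high = sigmoid(-H / 2.0), sigmoid(H / 2.0)
    print(f"H = {H:4.1f}: 'off' unit ~ {low:.4f}, 'on' unit ~ {high:.4f}")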

Comments regarding the TR are welcome. Please send them to
omlinc@research.nj.nec.com.

Thanks

 -Christian

****************************************************************************

http://www.neci.nj.nec.com/homepages/giles.html
http://www.neci.nj.nec.com/homepages/omlin/omlin.html
http://www.cs.umd.edu/TRs/TR-no-abs.html

ftp://ftp.nj.nec.com/pub/giles/papers/
UMD-CS-TR-3599.fuzzy.automata.encoding.recurrent.nets.ps.Z

******************************************************************************


