continuous vs symbolic: a more concrete problem

Goldfarb GOLDFARB%UNB.CA at UNBMVS1.csd.unb.ca
Tue Jan 8 19:27:09 EST 1991


Over the last two decades it has become reasonably clear that at some
stage of information processing (even in vision) it is often convenient
to represent objects in a symbolic form. The simplest symbolic
representation is a string over a finite alphabet. An immediate
question then arises:

       How does one go about constructing a neural network that can
       recognize a reasonably large (infinite and nontrivial) *class*
       of formal languages?


For example, let us specify a language in the following way:
       fix  1) some string (e.g. dabbe) and  2) a finite set S of
       strings (e.g. S = {aa, da, cdc}); the language then consists
       of all strings that can be obtained from the single fixed
       string (dabbe) by inserting, at any positions and in any
       order, any number of strings from the set S.
Consider now the class of *all* languages that can be specified in this
way. It is a subclass of the class of context-free languages (it is not
contained in the regular languages: already S = {ab} with a one-letter
base string forces balanced, arbitrarily deep nesting).
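
To make the definition concrete, here is a minimal sketch in Python
(the names BASE, S, sample, and member are mine, as is the code; it is
not from the original posting). The generator builds strings of the
language by random insertions. The recognizer relies on the observation
that the string inserted last is still contiguous in the result, so
membership can be decided by deleting occurrences of strings from S
until the fixed base string is reached (or no deletion sequence works).
The search is exponential in the worst case and assumes the strings in
S are nonempty; it is meant to pin down the language class, not to be
an efficient recognizer.

import random
from functools import lru_cache

BASE = "dabbe"              # the fixed string from the example
S = ("aa", "da", "cdc")     # the finite insertion set from the example

def sample(n_insertions, seed=None):
    """Produce one string of the language by n random insertions."""
    rng = random.Random(seed)
    w = BASE
    for _ in range(n_insertions):
        piece = rng.choice(S)
        i = rng.randrange(len(w) + 1)   # any position, ends included
        w = w[:i] + piece + w[i:]
    return w

@lru_cache(maxsize=None)
def member(w):
    """True iff w reduces to BASE by deleting occurrences of strings
    from S (i.e., by undoing the insertions, last one first)."""
    if w == BASE:
        return True
    if len(w) <= len(BASE):
        return False
    for piece in S:
        start = w.find(piece)
        while start != -1:
            if member(w[:start] + w[start + len(piece):]):
                return True
            start = w.find(piece, start + 1)
    return False

w = sample(3, seed=1)
print(w, member(w))          # a positive example, labeled True
print(member("dabbec"))      # False: no deletion sequence reaches BASE

One could, for instance, use sample to produce positive training
examples and member to label arbitrary strings when testing a candidate
network on this problem.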

If the NN is "the right" model, then the above problem should have a
reasonably simple solution. Does anyone know such a solution? By the
way, the reason I have chosen the above class of languages is that
the new RLM model mentioned in several earlier postings solves the
problem in a very routine manner.


--Lev Goldfarb


