AI (discrete) model and NN (continuous) model

Lev Goldfarb GOLDFARB%UNB.CA at unbmvs1.csd.unb.ca
Fri Dec 21 23:57:29 EST 1990


D. E. Rumelhart:

    It seems to me that a number of issues are being confused in
    this discussion. One has to do [with] what AI is and another
    has to do with what "connectionism" is and why we might be
    interested [in] it.

To prevent greater confusion, let me stress again the point that
was expressed by J. von Neumann. If we want to do what so far has
been called science, we must evaluate progress not by what
"we might be interested in", but by the state of development
of the corresponding mathematical models. Therefore, today AI *is*
what its practitioners are "practicing", and it is not difficult
to find out what they say about the underlying mathematical model:
"artificial intelligence (AI) is primarily concerned with propositional
languages for representing knowledge and with techniques for
manipulating these representations" (Nils J. Nilsson, Artificial
Intelligence: Engineering, Science, or Slogan, AI Magazine, Winter
1981-82, p. 2).

It appears that there is even greater confusion in the NN community
about what the underlying mathematical model is. I believe that the
underlying mathematical model, i.e., the place where all the "events"
unfold, is the vector space model. This is because the NN can be
viewed as a "mechanism" for transforming input vectors into real
numbers (the transformation, or mapping, is composed of several
vector space transformations). What the NN practitioners often
forget is that the vector space they want to use is one that has
some geometry in it (there are "algebraic" vector spaces without
any geometry in them). The reasons why geometry is *necessary* for
the NN are many: to measure pattern closeness, to construct the
optimization function, etc. Let me say it again: in the present
setting, one cannot talk meaningfully about the NN without the
corresponding geometry.
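
For concreteness, here is a small illustrative sketch of this view
(in Python; the particular maps, sizes, and the squared-error
objective are my own arbitrary choices, not a claim about any
particular network):

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(4, 3))      # linear map R^3 -> R^4
    W2 = rng.normal(size=(1, 4))      # linear map R^4 -> R^1

    def network(x):
        # composition of vector space transformations (with a pointwise
        # nonlinearity) taking an input vector to a real number
        return float(W2 @ np.tanh(W1 @ x))

    def closeness(x, y):
        # pattern closeness supplied by the Euclidean geometry
        return float(np.linalg.norm(x - y))

    def loss(xs, targets):
        # the optimization function, defined via the same geometry
        return sum((network(x) - t) ** 2 for x, t in zip(xs, targets))

Both the notion of closeness and the objective function above exist
only because the vector space carries a (Euclidean) geometry.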

Thus, as I have alluded to in my last correspondence, we have a
"classical" opposition between the basic mathematical paradigms:
the discrete and the continuous.

Mathematicians have usually resolved the "opposition" by inducing the
continuous structure on top of the discrete one in such a way that the
corresponding algebraic operations become continuous, which results in
much richer mathematical models (inner product spaces, topological
groups, etc.).

In order to reconcile the present "AI" and the present "connectionism"
(as mathematical models), i.e., to pave the way for the "new" and
desirable AI, one has to construct essentially the same new model
that would reconcile, for example, syntactic pattern recognition
(based on Chomsky's generative grammars) with classical vector space
pattern recognition. The old mathematical "solutions" of "simply"
fusing the two structures do not work here, however, since the
induced geometry must not be fixed but should vary with the
structure of the classes that have to be learned. Thus, not only
must the continuous structure be fused with the discrete, which can
be accomplished by associating a weighting scheme with the set of
operations, but the system must also be able to *evolve structurally*
in an efficient manner, i.e., it must be able to learn *efficiently*
the new (macro)operations necessary for discriminating the class
being learned (all present AI learning algorithms do this *very*
inefficiently, since they do not use any geometry on the learning
space).
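
To make the "weighting scheme" idea concrete, here is a small sketch
(my own illustration in Python, not the actual construction from the
paper): a distance on strings generated by weighted insertion and
deletion operations, where the weights are free parameters, so that
different weightings induce different geometries on the same
discrete space of strings.

    def weighted_edit_distance(a, b, w_ins, w_del):
        # dynamic programming over weighted insertions and deletions only
        n, m = len(a), len(b)
        d = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            d[i][0] = d[i - 1][0] + w_del[a[i - 1]]
        for j in range(1, m + 1):
            d[0][j] = d[0][j - 1] + w_ins[b[j - 1]]
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                best = min(d[i - 1][j] + w_del[a[i - 1]],
                           d[i][j - 1] + w_ins[b[j - 1]])
                if a[i - 1] == b[j - 1]:
                    best = min(best, d[i - 1][j - 1])
                d[i][j] = best
        return d[n][m]

    uniform = {"a": 1.0, "b": 1.0}
    skewed  = {"a": 0.1, "b": 2.0}
    weighted_edit_distance("ab", "b", uniform, uniform)   # 1.0
    weighted_edit_distance("ab", "b", skewed, skewed)     # 0.1

The sketch shows only how the induced metric varies with the weights;
structural evolution would, in addition, change the operation set
itself by introducing new (macro)operations.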

An outline of my answer to the above "reconciliation" problem, as I
mentioned several months ago, can be found in the June issue of
Pattern Recognition, but the progress since then has been substantial.

F-i-n-a-l-l-y, one quick note on why the vector space model is not
likely to be of sufficient generality for many environments: the
distance functions that can be generated even by the simplest
insertion/deletion operations on string patterns cannot be
reconstructed in a Euclidean vector space of any dimension. This
fact is not really surprising, since the class of Euclidean spaces
forms a very small subclass of the class of all metric spaces.
Thus, it seems to me that the current NN framework must be
substantially modified.
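
A small numerical check of this claim (again my own illustration):
take the four strings a, b, ab, ba, compute their pairwise unit-cost
insertion/deletion distances, and apply the classical criterion that
a finite metric embeds isometrically in some Euclidean space only if
the doubly centered matrix -(1/2) J D^2 J is positive semidefinite;
here it has a negative eigenvalue, so no such embedding exists.

    import numpy as np

    def ins_del_distance(a, b):
        # unit-cost insertion/deletion distance: |a| + |b| - 2 * LCS(a, b)
        n, m = len(a), len(b)
        lcs = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                lcs[i][j] = (lcs[i - 1][j - 1] + 1 if a[i - 1] == b[j - 1]
                             else max(lcs[i - 1][j], lcs[i][j - 1]))
        return n + m - 2 * lcs[n][m]

    strings = ["a", "b", "ab", "ba"]
    D = np.array([[ins_del_distance(s, t) for t in strings]
                  for s in strings], dtype=float)
    n = len(strings)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    print(np.linalg.eigvalsh(B))   # smallest eigenvalue is -1.0 < 0:
                                   # no isometric Euclidean embedding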

(Biological implications should also be discussed.)

I'll be away for a week. Best wishes for the coming year.

-- Lev Goldfarb


