Connectionist symbol processing: any progress?

Mitsu Hadeishi mitsu at ministryofthought.com
Sat Aug 15 23:47:28 EDT 1998


Lev Goldfarb wrote:

> On Sat, 15 Aug 1998, Mitsu Hadeishi wrote:
>
> > Since you are using terms like "metric" extremely loosely, I was also doing
> > so.
>
> Please, note that although I'm not that precise, I have not used the
> "terms like 'metric' extremely loosely".

I am referring to this statement:

> How could a recurrent net learn without some metric and, as
> far as I know, some metric equivalent to the Euclidean metric?

Here you are talking about the input space as though the Euclidean metric on that
space is particularly key, when it is rather the structure of the whole network, the
feedback scheme, the definition of the error measure, the learning algorithm, and so
forth that actually create the relevant mathematical structure.  A sufficiently
complex network can realize essentially any map you like from the input space to the
output; the error measure is biased by the specific nature of the training set (for
example), and it is computed on the output of the network AFTER the input has passed
through what amounts to an arbitrary differentiable transformation.  By that point,
the "metric" on the original input space can be all but destroyed.  Add recurrence
and you lose even the fixed dimensionality of the input space.  In the quote above,
you appear to imply that there is some direct relationship between the metric on the
initial input space and the operation of the learning algorithm.  I do not see how
that is the case.
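
To make the point concrete, here is a minimal sketch in Python (the single
saturating tanh "network" and its gain are purely illustrative stand-ins, not
anything from our discussion) of how even a very simple differentiable map can
reverse the ordering of Euclidean distances between inputs:

import math

def net(x, gain=10.0):
    # A single saturating tanh unit: a stand-in for the far richer differentiable
    # transformation a trained network applies before the error is measured.
    return math.tanh(gain * x)

# Two inputs that are close in the Euclidean metric of the input space ...
a, b = 0.0, 0.2
# ... and two inputs that are far apart.
c, d = 1.0, 2.0

print(abs(a - b), abs(net(a) - net(b)))   # 0.2 -> about 0.964
print(abs(c - d), abs(net(c) - net(d)))   # 1.0 -> about 4e-09

# The nearby pair is pushed far apart and the distant pair is collapsed: the
# distance ordering on the inputs leaves no direct imprint on the outputs,
# which is where the error (and hence the learning signal) is measured.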

> The main reason we are developing the ETS model is precisely related to
> the fact that we believe it offers THE ONLY ONE POSSIBLE NATURAL (and
> fundamentally new) SYMBIOSIS of the discrete and the continuous FORMALISMS
> as opposed to the unnatural ones. I would definitely say (and you would
> probably agree) that (if, indeed, this is the case) it is the most
> important consideration.
>
> Moreover, it turns out that the concept of a fuzzy set, which was
> originally introduced in a rather artificial manner that didn't clarify
> the underlying source of fuzziness (and this has caused an understandable
> and substantial resistance to its introduction), emerges VERY naturally
> within the ETS model: the definition of the class via the corresponding
> distance function typically and naturally induces the fuzzy class boundary
> and also reveals the source of fuzziness, which includes the interplay
> between the corresponding weighted operations and (in the case of noise in
> the training set) a nonzero radius. Note that in the parity class problem,
> the parity class is not fuzzy, as reflected in the corresponding weighting
> scheme and the radius of 0.
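
For concreteness, here is one toy reading of that construction, in Python (the
prototype set, the weighted Hamming distance, the linear fall-off, and every name
below are my own illustrative choices, not the ETS definitions): a class is induced
by a distance to a set of prototypes, a radius of 0 yields a crisp class as in the
parity example, and a nonzero radius yields a graded, fuzzy boundary.

def weighted_hamming(s, t, weights):
    # Weighted symbol-by-symbol distance between two equal-length strings.
    return sum(w for x, y, w in zip(s, t, weights) if x != y)

def membership(x, prototypes, weights, radius):
    # Degree of membership in the class induced by the distance function:
    # radius == 0 gives a crisp {0, 1} class, radius > 0 a graded boundary.
    d = min(weighted_hamming(x, p, weights) for p in prototypes)
    if radius == 0:
        return 1.0 if d == 0 else 0.0
    return max(0.0, 1.0 - d / radius)

# Crisp case, loosely in the spirit of the parity example: with radius 0 a
# string is either in the class or out of it.
even_parity = ["0000", "0011", "0101", "0110", "1001", "1010", "1100", "1111"]
print(membership("0011", even_parity, [1, 1, 1, 1], radius=0))   # 1.0
print(membership("0001", even_parity, [1, 1, 1, 1], radius=0))   # 0.0

# A noisy training set would motivate a nonzero radius, and the very same
# definition then yields a fuzzy boundary: membership falls off with distance.
print(membership("0001", even_parity, [1, 1, 1, 1], radius=2))   # 0.5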

Well, what one mathematician calls natural and another calls artificial may be as
much a matter of taste as of rational argument.  At this point one gets into the
realm of mathematical aesthetics or philosophy rather than hard science.

