questions on kohonen's maps

chrisley.pa@Xerox.COM
Fri Mar 24 17:53:00 EST 1989


Lonce (lwyse at bucasb.BU.EDU) writes:

"What does "ordering" mean when you're projecting inputs to a lower
dimensional space? For example, with the "Peano" type curves that result
from a one-D neighborhood learning a 2-D input distribution, it is obviously
NOT true that nearby points in the input space maximally activate nearby
points on the neighborhood chain."

It is not true that nearby points in input space are always mapped to
nearby points in the output space when the mapping is dimensionality-
reducing, agreed.  But 'ordering' still makes sense.  The map is
topology-preserving if the dependency holds in the other direction, i.e., if
nearby points in output space are always activated by nearby points in
input space.
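This one-way dependency can be seen directly in a simulation. Below is a
minimal sketch (my own illustration, with hypothetical parameters, not code
from anyone in this thread) of a 1-D Kohonen chain learning a uniform 2-D
input distribution: after training, adjacent chain nodes end up with nearby
weight vectors, even though nearby input points need not map to adjacent
nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D chain of 50 nodes, weights initialized in the middle of the unit square
n_nodes = 50
weights = rng.uniform(0.4, 0.6, size=(n_nodes, 2))

def train(weights, n_steps=20000, lr0=0.5, sigma0=10.0):
    n = len(weights)
    for t in range(n_steps):
        x = rng.uniform(0.0, 1.0, size=2)           # sample from 2-D input space
        winner = np.argmin(np.sum((weights - x) ** 2, axis=1))
        lr = lr0 * (1.0 - t / n_steps)              # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / n_steps), 0.5)  # shrinking neighborhood
        # Gaussian neighborhood measured along the chain (output-space distance)
        h = np.exp(-((np.arange(n) - winner) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

weights = train(weights)

# Topology preservation in the one-way sense: neighboring chain nodes
# should sit close together in input space.
step = np.linalg.norm(np.diff(weights, axis=0), axis=1)
print("mean input-space distance between adjacent nodes:", step.mean())
```

With a shrinking neighborhood the chain unfolds into a Peano-like curve
through the square, so the mean distance between adjacent nodes is small;
two nodes picked at random from the chain would be much farther apart.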

Lonce goes on to say:

"In this case, it is not even clear that "untangling" the neighborhood is
of utmost importance, since a tangled chain can still do a very good job of
divvying up the space almost equally between its nodes."

I agree that topology preservation is not necessarily of utmost importance,
but it may be useful in some applications, such as the ones I mentioned a
few messages back (phoneme recognition, inverse kinematics, etc.).  Also,
there is 1) the interest in properties of self-organizing systems in
themselves, even when an application can't immediately be found; and 2)
the observation that for some reason the brain seems to use topology-
preserving maps (with the one-way dependency I mentioned above), which,
although they *could* be computationally unnecessary or even
disadvantageous, are probably, in fact, nature being what she is, good
solutions to tough real-time problems.

Ron Chrisley
After April 14th, please send personal email to Chrisley at vax.ox.ac.uk


More information about the Connectionists mailing list