weight spaces
Nolfi & Cecconi
STIVA%IRMKANT.BITNET at vma.CC.CMU.EDU
Fri Apr 20 10:18:43 EDT 1990
We would like to submit the following topic for discussion:
A good way to understand how neural networks function is to see the learning
process as a trajectory in weight space. More specifically, given a task, we
can picture learning as a movement over the error (fitness) surface defined on
the weight space. The concept of a local minimum, for example, which derives
from this idea, has been shown to be extremely useful.
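As a minimal illustration of this picture (our own sketch, not part of the original post): for a toy one-neuron network with two weights, the error over (w1, w2) defines a surface, and gradient descent traces a trajectory across it. The task, function names, and parameter values below are all assumptions chosen for illustration.

```python
import math

# Toy task: fit y = tanh(w1*x + w2). Targets are generated from a "true"
# weight pair (w1=2.0, w2=0.5), so the error surface has a known minimum.
DATA = [(-1.0, math.tanh(-2.0 + 0.5)),
        (0.5, math.tanh(1.0 + 0.5)),
        (1.5, math.tanh(3.0 + 0.5))]

def error(w1, w2):
    """Sum-of-squares error at one point (w1, w2) of the weight space."""
    return sum((math.tanh(w1 * x + w2) - t) ** 2 for x, t in DATA)

def grad(w1, w2, eps=1e-5):
    """Numerical gradient of the error surface (central differences)."""
    dw1 = (error(w1 + eps, w2) - error(w1 - eps, w2)) / (2 * eps)
    dw2 = (error(w1, w2 + eps) - error(w1, w2 - eps)) / (2 * eps)
    return dw1, dw2

def descend(w1, w2, lr=0.5, steps=200):
    """Gradient descent viewed as a trajectory (w1, w2) -> ... in weight space."""
    path = [(w1, w2)]
    for _ in range(steps):
        g1, g2 = grad(w1, w2)
        w1, w2 = w1 - lr * g1, w2 - lr * g2
        path.append((w1, w2))
    return path

path = descend(0.0, 0.0)
print("start error:", error(*path[0]))
print("final error:", error(*path[-1]))
```

Plotting error(w1, w2) over a grid together with the returned path makes the "movement on the error surface" metaphor concrete, and basins that trap such trajectories are exactly the local minima mentioned above.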
However, we know very little about weight spaces, certainly because they are
very complex to investigate. On the other hand, we think it would be useful to
try to answer questions such as: Are there regularities of some kind in the
error surface? If so, are these regularities task dependent, or are there also
regularities of a general type? How do learning algorithms differ from the
point of view of their trajectories in weight space?
We would appreciate comments, and possibly references, on these questions.
Nolfi & Cecconi
More information about the Connectionists mailing list