"Layers"

Geoffrey Hinton hinton at ai.toronto.edu
Tue Sep 13 13:52:16 EDT 1988


As Scott points out, the problem is that a net with one hidden layer has:

3 layers of units (including input and output)
2 layers of modifiable weights
1 layer of hidden units.
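A minimal sketch (hypothetical sizes, plain NumPy) makes the three counts concrete for such a net:

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer net: 4 input units -> 5 hidden units -> 3 output units.
W1 = rng.standard_normal((4, 5))   # input -> hidden weights (modifiable)
W2 = rng.standard_normal((5, 3))   # hidden -> output weights (modifiable)

def forward(x):
    h = np.tanh(x @ W1)            # the single layer of hidden units
    return h @ W2                  # the layer of output units

layers_of_units = 3    # input, hidden, output
layers_of_weights = 2  # W1, W2
hidden_layers = 1      # just h
```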

Widrow has objected (quite reasonably) to calling the input units "units",
since they have neither modifiable incoming weights nor a non-linear I/O
function.  So we will never agree on how to count the total number of layers.

The number of layers of modifiable weights is unambiguous, but it has two
problems: most people think of the "neurons" as forming the layers, and the
count gets complicated when connections skip layers (of units).
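For instance (a hypothetical sketch), adding a direct input-to-output connection gives three weight matrices but no obvious answer to how many "layers" of weights the net has:

```python
import numpy as np

rng = np.random.default_rng(0)

W1 = rng.standard_normal((4, 5))      # input -> hidden
W2 = rng.standard_normal((5, 3))      # hidden -> output
W_skip = rng.standard_normal((4, 3))  # input -> output, skipping the hidden units

def forward(x):
    h = np.tanh(x @ W1)
    # W_skip and W2 both feed the outputs: do they sit in the same
    # "layer" of weights, or does W_skip span two layers?
    return h @ W2 + x @ W_skip
```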

Terminology can be made unambiguous by referring to the number of hidden
layers.  This has a slight snag when the first layer of weights (counting from
the input) is unmodifiable, since the units in the next layer are then not
true hidden units (they don't learn representations), but we can safely leave
it to the purists and flamers to worry about that.

I strongly suggest that people NEVER use the term "layers" by itself.  Either
say "n hidden layers" or say "n+1 layers of modifiable weights".  I don't
think attempts to legislate in favor of one or the other of these alternatives
will work.

Geoff


More information about the Connectionists mailing list