Connectionists: New paper on classification with sign-constrained synapses

Legenstein Robert legi at igi.tugraz.at
Mon May 14 03:41:05 EDT 2007


Dear Colleagues,

We announce a new paper to appear in Neural Computation.

The paper addresses an often ignored discrepancy between the learning
capability of a biological neuron and that of a perceptron (or of any
other model commonly considered in learning theory): the sign of a
synaptic weight does not change during learning, since it is determined
by the type (excitatory or inhibitory) of the presynaptic neuron.

We analyze theoretically the classification capability of perceptrons
whose weight signs do not change during learning.  We show that this
constraint drastically reduces the learning capability of a
perceptron unless the distribution of input patterns has particular
properties, which are made explicit in the article.

Furthermore, we show by computer simulations that the
classification capability increases if the input patterns are sparse.
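
For readers who want to experiment, here is a minimal sketch of such a
simulation (written by us for illustration, not the authors' code; the
pattern dimension, the sparseness levels, and the projected perceptron
update with nonnegative clipping are our own choices):

    import numpy as np

    rng = np.random.default_rng(0)

    def learnable(X, y, epochs=500, lr=0.1):
        # Projected perceptron rule: standard update, then clip the
        # weights back to the nonnegative orthant; the bias stays free.
        # Heuristic only: failing within the epoch budget does not
        # prove that the dichotomy is unrealizable.
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            mistakes = 0
            for x, t in zip(X, y):
                if t * (w @ x + b) <= 0:
                    w = np.maximum(w + lr * t * x, 0.0)
                    b += lr * t
                    mistakes += 1
            if mistakes == 0:
                return True
        return False

    def fraction_learnable(p_active, n=100, m=20, trials=50):
        # Random binary patterns (each component is 1 with probability
        # p_active) with uniformly random +/-1 labels.
        hits = 0
        for _ in range(trials):
            X = (rng.random((m, n)) < p_active).astype(float)
            y = rng.choice([-1, 1], size=m)
            hits += learnable(X, y)
        return hits / trials

    print("dense  (p = 0.5):", fraction_learnable(0.5))
    print("sparse (p = 0.1):", fraction_learnable(0.1))

In line with the simulations reported in the paper, one would expect
the sparse setting to yield the larger fraction of learnable
dichotomies.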


Paper: On the classification capability of sign-constrained perceptrons

Authors:  Robert Legenstein and Wolfgang Maass

Preprint available at http://www.igi.tugraz.at/legi/psfiles/170.pdf

Full Abstract:

The perceptron (also referred to as a McCulloch-Pitts neuron or linear
threshold gate) is commonly used as a simplified model for the
discrimination and learning capability of a biological
neuron. Criteria that tell us when a perceptron can implement (or
learn to implement) all possible dichotomies over a given set of input
patterns are well known, but only for the idealized case where one
assumes that the sign of a synaptic weight can be switched during
learning. We present in this article an analysis of the classification
capability of the biologically more realistic model of a
sign-constrained perceptron, where the signs of synaptic weights
remain fixed during learning (which is the case for most types of
biological synapses). In particular, the VC-dimension of
sign-constrained perceptrons is determined, and a necessary and
sufficient criterion is provided that tells us when all 2^m
dichotomies over a given set of m patterns can be learned by
a sign-constrained perceptron. We also show that uniformity of the L1 norms
of input patterns is a sufficient condition for full representation
power in the case where all weights are required to be
nonnegative. Finally, we exhibit cases in which the sign constraint
of a perceptron drastically reduces its classification capability. Our
theoretical analysis is complemented by computer simulations, which
demonstrate in particular that sparse input patterns improve the
classification capability of sign-constrained perceptrons.
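
As a small illustration of the separability question studied in the
paper (again a sketch of ours, not code from the article; the free
threshold and the margin of 1 in the constraints are our modeling
choices), one can decide by linear programming whether a given
dichotomy is realizable with nonnegative weights, and then count the
realizable dichotomies for patterns normalized to a uniform L1 norm:

    import numpy as np
    from itertools import product
    from scipy.optimize import linprog

    def separable_nonneg(X, y):
        # Feasibility LP: find w >= 0 and a free bias b with
        # y_i * (w . x_i + b) >= 1 for every pattern x_i.
        m, n = X.shape
        A_ub = -(y[:, None] * np.hstack([X, np.ones((m, 1))]))
        res = linprog(c=np.zeros(n + 1), A_ub=A_ub, b_ub=-np.ones(m),
                      bounds=[(0, None)] * n + [(None, None)])
        return res.status == 0   # 0 = solved, 2 = infeasible

    rng = np.random.default_rng(1)
    m, n = 4, 6
    X = rng.random((m, n))
    X /= X.sum(axis=1, keepdims=True)   # nonnegative, uniform L1 norm
    count = sum(separable_nonneg(X, np.array(labels))
                for labels in product([-1, 1], repeat=m))
    print(count, "of", 2**m, "dichotomies realizable")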


Best regards,

Robert Legenstein
Institute for Theoretical Computer Science
TU Graz
Inffeldgasse 16b/I, 8010 Graz, Austria


