Neural net paper available by anonymous ftp

Mario Marchand mario at joule.physics.uottawa.ca
Tue Feb 14 16:01:03 EST 1995


The following paper, which has just been accepted for publication
in the journal "Neural Networks", is available by anonymous ftp at:

ftp://dirac.physics.uottawa.ca/pub/tr/marchand

FileName: NN95.ps.Z

Title: Learning $\mu$-Perceptron Networks On the Uniform Distribution

Authors: Golea M., Marchand M. and Hancock T.R.

Abstract: We investigate the learnability, under the uniform distribution,
of  neural concepts that can be represented as simple combinations
of {\em nonoverlapping\/} perceptrons (also called $\mu$ perceptrons)
with binary weights and arbitrary thresholds.
Two perceptrons are said to be nonoverlapping if they do not share
any input variables. Specifically, we investigate,
within the distribution-specific PAC model,
the learnability of $\mu$ {\em perceptron unions\/},
{\em decision lists\/},
and {\em generalized  decision lists\/}.
In contrast to most neural network learning
algorithms, we do not assume that the architecture of the network
is known in advance. Rather, it is the task of the algorithm
to find both the architecture of the net and the weight values
necessary to represent the function to be learned.
We give polynomial time algorithms for
learning these restricted classes of networks.
The algorithms work by estimating various statistical
quantities that yield enough information to infer, with
high probability, the target concept.
Because the algorithms are statistical in
nature,  they are  robust against large amounts of random
classification noise.
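As a toy illustration of the statistical flavor of such algorithms (this is
NOT the procedure from the paper, just a hedged sketch), one can estimate,
from uniform random examples, the correlation between each input bit and the
label of a union of two nonoverlapping binary-weight perceptrons: inputs that
feed a perceptron show a clearly nonzero correlation, while irrelevant inputs
do not. The target function, thresholds, and sample size below are all
invented for the example.

```python
import random

def perceptron(x, idx, theta=0):
    # A binary-weight (all +1) perceptron over its own block of +/-1 inputs.
    return sum(x[i] for i in idx) > theta

def target(x):
    # Union (OR) of two nonoverlapping perceptrons: one over x0..x2,
    # the other over x3..x5; inputs x6 and x7 are irrelevant.
    return 1 if perceptron(x, [0, 1, 2]) or perceptron(x, [3, 4, 5]) else 0

def estimate_correlations(n=8, m=20000, seed=0):
    # Draw m uniform +/-1 examples and estimate E[x_i * f(x)] for each i.
    rng = random.Random(seed)
    corr = [0.0] * n
    for _ in range(m):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        y = target(x)
        for i in range(n):
            corr[i] += x[i] * y
    return [c / m for c in corr]

corrs = estimate_correlations()
# Inputs with correlation well away from zero are (with high probability)
# exactly the ones wired into some perceptron of the target.
relevant = [i for i, c in enumerate(corrs) if abs(c) > 0.05]
print(relevant)
```

Because the estimate is an average over independent samples, it degrades
gracefully under random classification noise, which is the robustness
property the abstract refers to.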

ALSO: You will find other papers co-authored by Mario Marchand in
      this directory. The text file Abstracts-mm.txt contains a
      list of abstracts for all the papers.

PLEASE: Report any printing or transmission problems to me.
        Any comments concerning these papers are very welcome.




