Errata for Neural Net and PAC learning papers

Mario Marchand mario at physics.uottawa.ca
Thu Jun 16 08:52:40 EDT 1994


Dear Colleagues,

The following papers were announced 10 days ago
but have since been moved to another directory. To retrieve any one of them,
please follow these instructions:

unix> ftp Dirac.physics.uottawa.ca
Name: anonymous
Password: "your email address"
ftp> cd /pub/tr/marchand
ftp> binary
ftp> get filename.ps.Z
ftp> bye
unix> uncompress filename.ps.Z

... and print the PostScript file: filename.ps
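
For convenience, the same retrieval can also be scripted. Below is a
minimal sketch using Python's standard ftplib module (not part of the
original instructions; the host and directory are those given above,
and error handling is omitted):

import ftplib

# Anonymous FTP retrieval, mirroring the manual session above.
ftp = ftplib.FTP("Dirac.physics.uottawa.ca")
ftp.login("anonymous", "your@email.address")   # password = your email address
ftp.cwd("/pub/tr/marchand")
filename = "COLT93.ps.Z"                       # any of the files listed below
with open(filename, "wb") as f:
    ftp.retrbinary("RETR " + filename, f.write)  # binary-mode transfer
ftp.quit()
# Then, as above:  unix> uncompress COLT93.ps.Z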


----

****************************************************************
FILE: COLT93.ps.Z

TITLE: Average Case Analysis of the Clipped Hebb Rule
         for Nonoverlapping Perceptron Networks

AUTHORS: Mostefa Golea, Mario Marchand


             Abstract

We investigate the clipped Hebb rule for learning
different multilayer networks of nonoverlapping
perceptrons with binary weights and
zero thresholds
when the examples are generated according to
the uniform distribution.
Using the central limit theorem and very simple counting arguments,
we calculate exactly its learning curves
(\ie\ the generalization rates as a
function of the number of training examples)
in the limit of a large number of inputs.
We find that the learning curves converge {\em exponentially fast\/}
to perfect generalization.
These results are very encouraging given the
simplicity of the learning rule.
The analytic expressions of the
learning curves are in excellent agreement
with the numerical simulations,
even for moderate values of the number of inputs.
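
For readers unfamiliar with it, the clipped Hebb rule amounts to clipping
the Hebbian correlation sum to binary weights. The following is a minimal
sketch of our own (Python/NumPy, not the paper's code) for a single
perceptron with zero threshold:

import numpy as np

def clipped_hebb(X, y):
    """Clipped Hebb rule: set each binary weight to the sign of the
    Hebbian correlation between its input and the label.
    X: (m, n) array of +/-1 inputs; y: (m,) array of +/-1 labels."""
    w = np.sign(X.T @ y)      # Hebbian sum, clipped to {-1, 0, +1}
    w[w == 0] = 1             # break ties arbitrarily
    return w

def predict(w, X):
    """Zero-threshold perceptron output on each row of X."""
    return np.sign(X @ w)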
*******************************************************************

******************************************************************
FILE: EuroCOLT93.ps.Z

TITLE:   On Learning Simple Deterministic and Probabilistic
                      Neural Concepts

AUTHORS: Mostefa Golea, Mario Marchand


                  Abstract


We investigate the learnability, under the uniform distribution,
of deterministic and probabilistic
neural concepts that can be represented as {\em simple combinations}
of {\em nonoverlapping} perceptrons with binary weights.
Two perceptrons are said to be  nonoverlapping if they do not share
any input variables.
In the deterministic case, we investigate,
within the distribution-specific PAC model,
the learnability of {\em perceptron decision lists}
and {\em generalized  perceptron decision lists}.
In the probabilistic case, we adopt the approach of
{\em learning with a model of probability} introduced by Kearns and
Schapire~\cite{KS90} and Yamanishi~\cite{Y92},
and investigate a class
of concepts we call {\em probabilistic majorities}
of nonoverlapping perceptrons. We give polynomial time algorithms  for
learning these restricted classes of networks.
The algorithms work by estimating various statistical
quantities that yield enough information to infer, with
high probability, the target concept.
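
To make the deterministic concept class concrete, a perceptron decision
list can be stored as an ordered list of (perceptron, label) pairs. The
sketch below is a hypothetical illustration of ours, not code from the
paper:

import numpy as np

def evaluate_decision_list(dlist, default, x):
    """Evaluate a perceptron decision list on input x.
    dlist: list of ((w, theta), label) pairs, scanned in order;
    the first perceptron that fires (w.x >= theta) decides the label.
    default: label returned when no perceptron fires."""
    for (w, theta), label in dlist:
        if np.dot(w, x) >= theta:
            return label
    return default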
***********************************************************************

***********************************************************************
FILE: NeuralComp.ps.Z

TITLE:  On Learning Perceptrons with Binary Weights

AUTHORS: Mostefa Golea, Mario Marchand


                       Abstract

We present an algorithm that PAC learns  any perceptron with binary weights
and arbitrary threshold under the family of {\em product distributions\/}.
The sample complexity of this algorithm is
$O((n/\epsilon)^4 \ln(n/\delta))$,
and its running time increases only linearly with the number of
training examples. The algorithm does not try to find a
hypothesis that agrees
with all of the training examples; rather, it constructs a binary
perceptron based
on various probabilistic estimates obtained from the training examples.
We show that, under the restricted case of the {\em uniform distribution\/}
and zero threshold,
the algorithm reduces to the well known {\em clipped Hebb rule\/}.
We calculate exactly the average generalization rate
(\ie\ the learning curve) of the algorithm, under the uniform distribution,
in the limit of an infinite number of dimensions. We find that the error
rate decreases
{\em exponentially\/} as a function of the number of training examples.
Hence, the average-case
analysis gives a sample complexity of $O(n\ln(1/\epsilon))$,
a large improvement over
the PAC learning analysis.
The analytical expression of the learning curve is in excellent agreement
with extensive numerical simulations.
In addition, the algorithm is very robust with respect to classification noise.
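
The step from an exponential learning curve to the $O(n\ln(1/\epsilon))$
sample complexity is a one-line calculation. Assuming the curve takes the
form $\epsilon(m) \le c\,e^{-m/(a n)}$ for some constants $a$ and $c$ (the
exact constants are derived in the paper), requiring error at most
$\epsilon$ gives

$$ c\, e^{-m/(a n)} \le \epsilon
   \quad\Longleftrightarrow\quad
   m \ge a\, n \ln(c/\epsilon) = O\bigl(n \ln(1/\epsilon)\bigr). $$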
***********************************************************************

**********************************************************************
FILE: NIPS92.ps.Z

TITLE: On Learning $\mu$-Perceptron Networks  with Binary Weights

AUTHORS: Mostefa Golea, Mario Marchand, Thomas R. Hancock



           Abstract


Neural networks with binary weights are very important
from both the theoretical and practical points of view.
In this paper, we investigate the learnability of single binary
perceptrons and unions of $\mu$-binary-perceptron networks,
{\em i.e.\/} an ``OR'' of
binary perceptrons where each input unit is connected
to one and only one
perceptron. We give a polynomial time algorithm
that PAC learns these networks
under the uniform distribution.
The algorithm is able to identify both the
network connectivity and the weight values
necessary to represent the target function.
These results suggest that, under
reasonable distributions, $\mu$-perceptron
networks may be easier to learn than fully connected networks.
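
To fix ideas, such a network partitions the input variables among the
perceptrons (each input feeds exactly one perceptron) and outputs the
``OR'' of their outputs. The sketch below is a hypothetical illustration
of ours (Python/NumPy), not the paper's code:

import numpy as np

def mu_perceptron_or(partition, weights, thresholds, x):
    """Evaluate an OR of nonoverlapping binary perceptrons.
    partition:  list of disjoint index arrays, one per perceptron;
    weights:    list of +/-1 weight vectors, one per perceptron;
    thresholds: list of thresholds, one per perceptron.
    Returns +1 if any perceptron fires on its block of x, else -1."""
    for idx, w, theta in zip(partition, weights, thresholds):
        if np.dot(w, x[idx]) >= theta:
            return 1
    return -1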
**********************************************************************

**********************************************************************
FILE: Network93.ps.Z

TITLE: On Learning Simple Neural Concepts: From Halfspace
           Intersections to Neural Decision Lists


AUTHORS: Mario Marchand, Mostefa Golea


                       Abstract


In this paper, we take a close look at the problem of learning
simple neural concepts under the uniform distribution of examples.
By simple neural concepts we mean concepts that can be represented
as  simple combinations of perceptrons (halfspaces). One such
class of concepts is the class of halfspace intersections.
By formalizing the problem of learning halfspace intersections as a
{\em set covering problem}, we are led to consider the following
sub-problem: given a set of non-linearly
separable examples, find its largest linearly separable subset.
We give an approximation algorithm for this NP-hard sub-problem.
Simulations, on both linearly and non-linearly
separable functions, show that this approximation algorithm works
well under the uniform distribution, outperforming
the Pocket algorithm used by many constructive neural algorithms.
Based on this approximation algorithm, we present a greedy method
for learning halfspace intersections.
We also present extensive numerical results that strongly
suggest that this greedy method learns halfspace intersections
under the uniform distribution of examples.
Finally, we introduce a new  class of simple, yet very rich, neural
concepts that we call {\em neural decision
lists}. We show how the greedy method can be generalized to handle
this class of concepts. Both greedy methods for halfspace
intersections and neural decision lists were tried on real-world
data with very encouraging results. This shows that these concepts
are not only important from the theoretical point of view, but also
in practice.
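
The set-covering view suggests a natural greedy loop. The sketch below is
our own hypothetical illustration (Python/NumPy), not the paper's
algorithm: each candidate halfspace is assumed to accept every positive
example (in the paper, candidates would come from the approximation
sub-routine above), and each greedy step picks the candidate that rejects
the most still-uncovered negative examples; the intersection of the chosen
halfspaces is the hypothesis:

import numpy as np

def greedy_halfspace_intersection(candidates, negatives):
    """Greedy set cover for learning a halfspace intersection.
    candidates: list of (w, theta) halfspaces, each assumed to accept
                every positive example (w.x >= theta);
    negatives:  (m, n) array of negative examples, each of which must
                be rejected by at least one chosen halfspace.
    Returns the list of chosen halfspaces."""
    chosen = []
    uncovered = np.ones(len(negatives), dtype=bool)
    while uncovered.any():
        # Pick the candidate rejecting the most uncovered negatives.
        best, best_gain = None, 0
        for (w, theta) in candidates:
            rejected = negatives @ w < theta
            gain = int(np.sum(rejected & uncovered))
            if gain > best_gain:
                best, best_gain = (w, theta), gain
        if best is None:          # no candidate covers anything more
            break
        chosen.append(best)
        w, theta = best
        uncovered &= ~(negatives @ w < theta)
    return chosen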
***************************************************************************

Sorry for the inconvenience,   - Mario



   ----------------------------------------------------------------
  |  UUU            UUU            Mario Marchand                  |
  |  UUU            UUU            -----------------------------   |
  |  UUU       OOOOOOOOOOOOOOOO    Department of Physics           |
  |  UUU      OOO   UUU      OOO   University of Ottawa            |
  |   UUUUUUUUUUUUUUUU       OOO   150 Louis Pasteur street        |
  |           OOO            OOO   PO BOX 450 STN A                |
  |            OOOOOOOOOOOOOOOO    Ottawa (Ont) Canada K1N 6N5     |
  |                                                                |
  |   ***** Internet E-Mail: mario at physics.uottawa.ca **********   |
  |   ***** Tel: (613)564-9293 ------------- Fax: 564-6712 *****   |
   ----------------------------------------------------------------




