Technical Report Series in Neural and Computational Learning
john@dcs.rhbnc.ac.uk
Fri Jul 8 10:37:02 EDT 1994
The European Community ESPRIT Basic Research Action is funding a Working Group
in the area of Neural and Computational Learning Theory involving 10 European
sites. As part of its activities, the NeuroCOLT Working Group maintains a
Technical Report Series of the ongoing work of its members at the
coordinating site, Royal Holloway, University of London.
This message announces the first two reports in the series and describes
how they can be accessed.
--------------------------------------
NeuroCOLT Technical Report NC-TR-94-1:
--------------------------------------
Computing over the Reals with Addition and Order: Higher Complexity Classes
by Felipe Cucker and Pascal Koiran
Abstract:
This paper deals with issues of structural complexity in a linear
version of the Blum-Shub-Smale model of computation over the real
numbers. Real versions of $\pspace$ and of the polynomial time
hierarchy are defined, and their properties are investigated. Mainly
two types of results are presented:
\begin{itemize}
\item Equivalence between quantification over the real numbers and
over $\{0,1\}$;
\item Characterizations of recognizable subsets of $\{0,1\}^*$ in
terms of familiar discrete complexity classes.
\end{itemize}
The complexity of the decision and quantifier elimination problems in
the theory of the reals with addition and order is also studied.
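For orientation, the real polynomial hierarchy mentioned above is built,
as in the discrete case, from alternating quantifier blocks, here ranging
over real vectors of polynomial length; a sketch in standard notation
(which need not match the report's own) is
\[
  S \in \Sigma^k_{\mathbb{R}} \iff
  S = \{\, x : \exists y_1 \,\forall y_2 \cdots Q_k y_k \;
        (x, y_1, \ldots, y_k) \in L \,\},
\]
where each block $y_i$ ranges over $\mathbb{R}^{p(|x|)}$ for some
polynomial $p$ and $L$ is decidable in polynomial time in the additive
model. The first type of result above shows that, in this setting,
quantification over the reals can be replaced by quantification over
$\{0,1\}$.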
--------------------------------------
NeuroCOLT Technical Report NC-TR-94-3:
--------------------------------------
Probabilistic Analysis of Learning in Artificial Neural Networks: The
PAC Model and its Variants
by Martin Anthony
Abstract:
This report (72 pages) surveys the probably approximately correct (PAC) model
of machine learning, with emphasis on the sample complexity of learning.
Applications to the theory of learning in artificial neural networks
are discussed.
The survey should be accessible to those unfamiliar with computational
learning theory. It is assumed the reader has some familiarity with neural
networks, but otherwise the survey is largely self-contained.
The basic PAC model of concept learning is discussed and the key results
involving the Vapnik-Chervonenkis dimension are derived. Implications for
the theory of artificial neural networks are discussed through a
survey of known results on the VC-dimension of neural nets. A brief
discussion of the computational complexity of PAC learning follows.
We then discuss generalisations and extensions of the PAC model:
stochastic concepts, learning with respect to particular distributions,
and the learnability of functions and p-concepts. (We do not discuss
computational complexity in these contexts.)
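For readers new to the area, the central notions have roughly the
following shape (standard notation, not necessarily the report's): an
algorithm PAC-learns a concept class if, for every target concept,
distribution $P$, and parameters $\epsilon, \delta \in (0,1)$, the
hypothesis $h$ it outputs from $m(\epsilon,\delta)$ random examples
satisfies
\[
  \Pr\bigl[\,\mathrm{er}_P(h) > \epsilon\,\bigr] < \delta,
\]
and a key VC-dimension result is that, for a class of VC-dimension $d$,
a sample of size
\[
  m(\epsilon,\delta) = O\!\left(\frac{1}{\epsilon}
     \Bigl(d \log\frac{1}{\epsilon} + \log\frac{1}{\delta}\Bigr)\right)
\]
suffices for any consistent learner.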
Contents:
 1. Introduction
 2. The Basic PAC Model of Learning
 3. VC-Dimension and Growth Function
 4. VC-Dimension and Linear Dimension
 5. A Useful Probability Theorem
 6. PAC Learning and the VC-Dimension
 7. VC-Dimension of Binary-Output Networks
      introduction
      linearly weighted neural networks
      linear threshold networks
      other activation functions
      the effect of weight restrictions
 8. Computational Complexity of Learning
 9. Stochastic Concepts
10. Distribution-Specific Learning
11. Graph Dimension and Multiple-Output Nets
      the graph dimension
      multiple-output feedforward threshold networks
12. Pseudo-Dimension and Function Learning
      the pseudo-dimension
      learning real-valued functions
13. Capacity of a Function Space
      capacity and learning
      applications to sigmoid neural networks
14. Scale-Sensitive Dimensions
      learnability of p-concepts
      learnability of functions
15. Conclusions
-----------------------
Report NC-TR-94-1 can be retrieved and printed as follows:
% ftp cscx.cs.rhbnc.ac.uk (134.219.200.45)
Name: anonymous
Password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-94-1.ps.Z
ftp> bye
% zcat nc-tr-94-1.ps.Z | lpr -l
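(If zcat is not available, the file can be uncompressed first with the
standard uncompress tool and then printed, e.g.
% uncompress nc-tr-94-1.ps.Z
% lpr -l nc-tr-94-1.ps
though the uncompressed copies mentioned below make even this step
unnecessary.)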
The same procedure works for Report NC-TR-94-3.
Uncompressed versions of the PostScript files are also available for anyone
without an uncompress facility.
Best wishes
John