Technical Report Series in Neural and Computational Learning

John Shawe-Taylor john at dcs.rhbnc.ac.uk
Thu Oct 5 09:24:13 EDT 1995


The European Community ESPRIT Working Group in Neural and Computational 
       Learning Theory (NeuroCOLT): three new reports available

----------------------------------------
NeuroCOLT Technical Report NC-TR-95-051:
----------------------------------------
On the Computational Power of Continuous Time Neural Networks
by Pekka Orponen, University of Helsinki, Finland

Abstract:
We investigate the computational power of continuous-time neural
networks with Hopfield-type units. We prove that polynomial-size
networks with saturated-linear response functions are at least as
powerful as polynomially space-bounded Turing machines.
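The report itself is about computational power, but as background, one common formulation of a continuous-time Hopfield-type network with a saturated-linear response is the dynamics dx_i/dt = -x_i + sigma(sum_j w_ij x_j + theta_i), with sigma(x) = min(max(x, 0), 1). The following toy sketch (not from the report; the weights and thresholds are illustrative assumptions) integrates such a two-unit network by forward Euler:

```python
import math

def sat_linear(x):
    # Saturated-linear response: identity on [0, 1], clipped outside.
    return min(max(x, 0.0), 1.0)

def step(state, weights, theta, dt=0.01):
    # One forward-Euler step of dx_i/dt = -x_i + sigma(sum_j w_ij x_j + theta_i).
    return [s + dt * (-s + sat_linear(sum(w * v for w, v in zip(row, state)) + th))
            for s, row, th in zip(state, weights, theta)]

weights = [[0.0, 0.5], [0.5, 0.0]]   # symmetric coupling (illustrative)
theta = [0.1, 0.1]
state = [0.9, 0.2]
for _ in range(2000):                # integrate to t = 20
    state = step(state, weights, theta)
# With these parameters both units settle near the fixed point x = 0.2,
# the solution of x = 0.5 * x + 0.1 inside the linear region of sigma.
```

This is only a simulation sketch of the unit type; the report's result concerns what polynomial-size networks of such units can compute, not any particular trajectory.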

----------------------------------------
NeuroCOLT Technical Report NC-TR-95-052:
----------------------------------------
Computational Machine Learning in Theory and Praxis
by Ming Li, University of Waterloo, Canada
   Paul Vitanyi, CWI and Universiteit van Amsterdam, The Netherlands.

Abstract:
In the last few decades a computational approach to machine learning
has emerged based on paradigms from recursion theory and the theory of
computation. Such ideas include learning in the limit, learning by
enumeration, and probably approximately correct (pac) learning. These
models are usually not suitable in practical situations. In contrast,
statistics-based inference methods have enjoyed a long and
distinguished career. Currently, Bayesian reasoning in various forms,
minimum message length (MML), and minimum description length (MDL) are
widely applied approaches. They are the tools to use with particular
machine learning praxis such as simulated annealing, genetic
algorithms, genetic programming, artificial neural networks, and the
like. These statistical inference methods select the hypothesis which
minimizes the sum of the length of the description of the hypothesis
(also called `model') and the length of the description of the data
relative to the hypothesis. It appears to us that the future of
computational machine learning will include combinations of the
approaches above, coupled with guarantees on the time and memory
resources used. Computational learning theory will move closer to
practice, and the application of principles such as MDL requires
further justification. Here, we survey some of the actors in this
dichotomy between theory and praxis, justify MDL via the Bayesian
approach, and give a comparison between pac learning and MDL learning
of decision trees.
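The two-part code the abstract describes — hypothesis length plus the length of the data encoded relative to the hypothesis — can be illustrated with a toy example (not from the report; the data, the hypothesis class, and the bit-counting scheme below are illustrative assumptions). Here the hypotheses are candidate periods for a binary string, and the data term is an exception list for the mismatching positions:

```python
import math

def mdl_cost(data, period):
    # Two-part MDL code length, in bits:
    #   L(H)   = the repeating pattern itself (period bits), plus
    #   L(D|H) = one (position, corrected bit) pair per mismatch.
    pattern = data[:period]
    mismatches = sum(1 for i, b in enumerate(data) if b != pattern[i % period])
    pos_bits = math.ceil(math.log2(len(data)))
    return period + mismatches * (pos_bits + 1)

data = "0110" * 16                    # period-4 string of length 64
data = data[:20] + "1" + data[21:]    # corrupt one bit
# MDL selects the hypothesis minimizing total description length:
best = min(range(1, len(data) + 1), key=lambda p: mdl_cost(data, p))
# best == 4: a short pattern plus one exception beats both the literal
# encoding (period 64, no exceptions) and shorter, ill-fitting patterns.
```

The trade-off is exactly the one the abstract names: a more complex hypothesis shortens the data term but lengthens the model term, and the minimizing hypothesis balances the two.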

----------------------------------------
NeuroCOLT Technical Report NC-TR-95-053:
----------------------------------------
On the relations between distributive computability and the BSS model
by Sebastiano Vigna, University of Milan, Italy

Abstract:
This paper presents an equivalence result between computability in the
BSS model and in a suitable distributive category. It is proved that
the class of functions $R^m\to R^n$ (with $n,m$ finite and $R$ a
commutative, ordered ring) computable in the BSS model and the
functions distributively computable over a natural distributive graph
based on the operations of $R$ coincide. Using this result, a new
structural characterization, based on iteration, of the same functions
is given.
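As background (not from the report), a BSS machine computes over a ring R using only the ring operations, constants, and branching on order comparisons. A minimal sketch of two classic BSS-computable functions over the reals, written so that each step uses only those primitives:

```python
def bss_abs(x):
    # Branch node: order comparison, available in the BSS model.
    if x >= 0:
        return x          # computation node: identity
    return -1 * x         # computation node: ring multiplication by -1

def bss_floor(x):
    # For nonnegative real x: iterate ring additions of the constant -1,
    # branching on the comparison x >= 1; the loop count is floor(x).
    n = 0
    while x >= 1:
        x = x + (-1)
        n = n + 1
    return n
```

These are Python stand-ins for machine programs, of course; the report's contribution is an equivalence between this machine model and distributive computability, not any particular program.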


-----------------------
The Report NC-TR-95-051 can be accessed and printed as follows 

% ftp cscx.cs.rhbnc.ac.uk  (134.219.200.45)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-95-051.ps.Z
ftp> bye
% zcat nc-tr-95-051.ps.Z | lpr -l

Similarly for the other technical reports.

Uncompressed versions of the postscript files have also been
left for anyone not having an uncompress facility.

A full list of the currently available Technical Reports in the 
Series is held in a file `abstracts' in the same directory.

The files may also be accessed via WWW starting from the NeuroCOLT homepage:

http://www.dcs.rhbnc.ac.uk/neural/neurocolt.html


Best wishes
John Shawe-Taylor




