Technical Report Series in Neural and Computational Learning

john@dcs.rhbnc.ac.uk
Tue Oct 25 07:42:21 EDT 1994


The European Community ESPRIT Working Group in Neural and Computational 
       Learning Theory (NeuroCOLT): two new reports available

----------------------------------------
NeuroCOLT Technical Report NC-TR-94-009:
----------------------------------------
Complexity Issues in Discrete Hopfield Networks
by Patrik Floréen and Pekka Orponen
   University of Helsinki, Department of Computer Science 
   P. O. Box 26, FIN-00014 University of Helsinki, Finland

Abstract: We survey some aspects of the computational complexity theory of
discrete-time and discrete-state Hopfield networks. The emphasis is on topics
that are not adequately covered by the existing survey literature, most
significantly:
1.  the known upper and lower bounds for the convergence times
    of Hopfield nets (here we consider mainly worst-case results);
2.  the power of Hopfield nets as general computing devices (as opposed 
    to their applications to associative memory and optimization);
3.  the complexity of the synthesis ("learning") and analysis
    problems related to Hopfield nets as associative memories.
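As a concrete illustration of the dynamics the report analyses, here is a
minimal sketch (not from the report itself) of a discrete-time, discrete-state
Hopfield net run asynchronously to a fixed point, with Hebbian weights storing
a single pattern; all names and the example pattern are illustrative choices:

```python
import numpy as np

def hopfield_converge(W, theta, s, max_sweeps=1000):
    """Asynchronous discrete Hopfield updates until a fixed point.

    W: symmetric weight matrix with zero diagonal
    theta: threshold vector
    s: initial state vector with entries in {-1, +1}
    Returns (final state, number of full sweeps before convergence).
    """
    s = s.copy()
    for sweep in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            # Threshold update rule: sign of the local field
            new = 1 if W[i] @ s - theta[i] >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:          # fixed point reached
            return s, sweep
    return s, max_sweeps

# Hebbian weights storing one pattern; a corrupted input is restored
p = np.array([1, -1, 1, -1])
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0)
state, sweeps = hopfield_converge(W, np.zeros(4), np.array([1, -1, 1, 1]))
# state is now equal to the stored pattern p
```

Worst-case convergence-time bounds of the kind surveyed in the report concern
how many such update steps may be needed before a fixed point is reached.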


----------------------------------------
NeuroCOLT Technical Report NC-TR-94-013:
----------------------------------------
by Martin Anthony, Department of Mathematics, The London School of 
   Economics and Political Science, Houghton Street, London WC2A 2AE,
   United Kingdom
and Peter Bartlett, Department of Systems Engineering, Research School
   of Information Sciences and Engineering, The Australian National
   University, Canberra, 0200 Australia

Abstract:
In this paper, we study a statistical property of classes of real-valued functions
that we call approximation from interpolated examples.  We derive a
characterization of function classes that have this property, in terms of their
`fat-shattering function', a notion that has proven useful in computational
learning theory.  We  discuss the implications for function learning of
approximation from interpolated examples.  Specifically, it can be interpreted as a
problem of learning real-valued functions from random examples in which we require
satisfactory performance from every algorithm that returns a function which
approximately interpolates the training examples.
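The fat-shattering notion mentioned above can be made concrete. A set of
points is gamma-fat-shattered by a function class if there are witness values
such that every sign pattern can be realised with a margin of at least gamma.
The following brute-force check over a finite function class is an
illustrative sketch (function and variable names are my own, not the
paper's):

```python
from itertools import product

def fat_shatters(functions, points, witnesses, gamma):
    """Check whether a finite class of real-valued functions
    gamma-fat-shatters `points` relative to `witnesses`.

    For every sign pattern b in {-1, +1}^d there must be some f with
    f(x_i) >= r_i + gamma where b_i = +1, and f(x_i) <= r_i - gamma
    where b_i = -1.
    """
    for signs in product([1, -1], repeat=len(points)):
        realised = any(
            all(
                (f(x) >= r + gamma) if b == 1 else (f(x) <= r - gamma)
                for x, r, b in zip(points, witnesses, signs)
            )
            for f in functions
        )
        if not realised:
            return False
    return True

# Two linear functions on [0, 1]: a single point is 0.25-fat-shattered
fs = [lambda x, a=a: a * x for a in (0.0, 1.0)]
print(fat_shatters(fs, [1.0], [0.5], 0.25))  # True
```

The fat-shattering function of a class maps each gamma to the largest d for
which some d-point set is gamma-fat-shattered; the paper characterises
approximation from interpolated examples in terms of this function.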



-----------------------
The Report NC-TR-94-013 can be accessed and printed as follows:

% ftp cscx.cs.rhbnc.ac.uk  (134.219.200.45)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-94-013.ps.Z
ftp> bye
% zcat nc-tr-94-013.ps.Z | lpr -l


Likewise for NC-TR-94-009. Uncompressed versions of the PostScript files are
also available for anyone without an uncompress facility.

A full list of the currently available Technical Reports in the Series is held in a file
`abstracts' in the same directory.

Best wishes
John Shawe-Taylor




