Technical Report Series in Neural and Computational Learning

John Shawe-Taylor john at dcs.rhbnc.ac.uk
Tue Dec 23 08:31:39 EST 1997


The European Community ESPRIT Working Group in Neural and Computational 
Learning Theory (NeuroCOLT) has produced a set of new Technical Reports
available from the remote ftp site described below. They cover topics in
real-valued complexity theory, computational learning theory, and analysis
of the computational power of continuous neural networks.  Abstracts are
included for each title.

----------------------------------------
NeuroCOLT Technical Report NC-TR-97-040:
----------------------------------------
Saturation and Stability in the Theory of Computation over the Reals
by Olivier Chapuis, Universit\'e Lyon I, France
   Pascal Koiran, Ecole Normale Sup\'erieure de Lyon, France

Abstract:
This paper was motivated by the following two questions, which arise in
the theory of complexity for computation over ordered rings in the now
famous computational model introduced by Blum, Shub and Smale:
(i) is the answer to the question ``$\p = \np$?'' the same in every
real-closed field?
(ii) if $\p\neq\np$ over $\r$, does there exist a problem over $\r$ which
is in $\np$ but not $\np$-complete?
Some nonclassical complexity classes arise naturally in the study of
these questions. The questions themselves remain open, but we obtain
unconditional results of independent interest.
Michaux introduced $/\const$ complexity classes in an effort to attack
question~(i).  We show that $\a_{\r}/ \const = \a_{\r}$, answering a
question of his.  Here $\a$ is the class of real problems which are
algorithmic in bounded time.  We also prove the stronger result:
$\parl_{\r}/ \const =\parl_{\r}$, where $\parl$ stands for parallel
polynomial time.  In our terminology, we say that $\r$ is
$\a$-saturated and $\parl$-saturated.  We also prove, at the nonuniform
level, the above results for every real-closed field.  It is not known
whether $\r$ is $\p$-saturated.  In the case of the reals with addition
and order we obtain $\p$-saturation (and a positive answer to question
(ii)).  More generally, we show that an ordered $\q$-vector space is
$\p$-saturated at the nonuniform level (this almost implies a positive
answer to the analogue of question (i)).
We also study a stronger notion that we call $\p$-stability.  Blum,
Cucker, Shub and Smale have (essentially) shown that fields of
characteristic 0 are $\p$-stable. We show that the reals with addition
and order are $\p$-stable, but real-closed fields are not.
Questions (i) and (ii) and the $/\const$ complexity classes have a
model-theoretic flavor. This leads to the theory of complexity over
``arbitrary'' structures as introduced by Poizat. We give a summary of
this theory, with special emphasis on its connections with model
theory, and we study the $/\const$ complexity classes from this point
of view.  Note also that our proof of the $\parl$-saturation of $\r$
shows that an o-minimal structure which admits quantifier elimination
is $\a$-saturated at the nonuniform level.



----------------------------------------
NeuroCOLT Technical Report NC-TR-97-041:
----------------------------------------
A survey on the Grzegorczyk Hierarchy and its  extension through the
BSS Model of Computability
by Jean-Sylvestre Gakwaya, Universit\'e de Mons-Hainaut, Belgium

Abstract:
This paper concerns the Grzegorczyk classes, which are defined from a
particular sequence of computable functions. We survey their relevant
properties and the main problems concerning them in two settings of
computability. The first is the usual setting of recursiveness; the
second is the BSS model introduced at the end of the 1980s. The BSS
model of computability makes it possible to define a notion of
effectiveness over continuous domains such as the real numbers.


----------------------------------------
NeuroCOLT Technical Report NC-TR-97-042:
----------------------------------------
On the Effect of Analog Noise in Discrete-Time Analog Computations
by  Wolfgang Maass, Technische Universitaet Graz, Austria
    Pekka Orponen, University of Jyv\"askyl\"a, Finland
    
Abstract:
We introduce a model for analog computation with discrete time in the
presence of analog noise that is flexible enough to cover the most
important concrete cases, such as noisy analog neural nets and networks 
of spiking neurons.  This model subsumes the classical model for digital
computation in the presence of noise.  We show that the presence of
arbitrarily small amounts of analog noise reduces the power of analog
computational models to that of finite automata, and we also prove a new
type of upper bound for the VC-dimension of computational models with
analog noise.

----------------------------------------
NeuroCOLT Technical Report NC-TR-97-043:
----------------------------------------
Analog Neural Nets with Gaussian or other Common Noise Distributions 
cannot Recognise Arbitrary Regular Languages
by  Wolfgang Maass, Technische Universitaet Graz, Austria
    Eduardo D. Sontag, Rutgers University, USA

Abstract:
We consider recurrent analog neural nets where the output of each gate
is subject to Gaussian noise, or any other common noise distribution
that is nonzero on a large set.  We show that many regular languages
cannot be recognised by networks of this type, and we give a precise
characterisation of those languages which can be recognised. This result
implies severe constraints on the possibilities for constructing recurrent
analog neural nets that are robust against realistic types of analog
noise.  On the other hand, we present a method for constructing
feedforward analog neural nets that are robust with respect to analog
noise of this type.




--------------------------------------------------------------------

***************** ACCESS INSTRUCTIONS ******************

The Report NC-TR-97-001 can be accessed and printed as follows 

% ftp ftp.dcs.rhbnc.ac.uk  (134.219.96.1)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-97-001.ps.Z
ftp> bye
% zcat nc-tr-97-001.ps.Z | lpr -l

Similarly for the other technical reports.
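
If a non-interactive FTP client such as wget is available on your system
(an assumption about your local setup, not a requirement of the archive),
the same file can be fetched and printed without an interactive session,
for example:

% wget ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports/nc-tr-97-001.ps.Z
% zcat nc-tr-97-001.ps.Z | lpr -l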

Uncompressed versions of the postscript files have also been
left for anyone not having an uncompress facility. 

In some cases there are two files available, for example,
nc-tr-97-002-title.ps.Z
nc-tr-97-002-body.ps.Z
The first contains the title page while the second contains the body 
of the report. The single command,
ftp> mget nc-tr-97-002*
will prompt you for the files you require.
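
For example, after connecting and setting binary mode as above, a session
of the following form (a sketch assuming a standard command-line ftp
client, whose `prompt' command toggles off the per-file confirmation)
retrieves both parts and prints them in order:

ftp> prompt
ftp> mget nc-tr-97-002*
ftp> bye
% zcat nc-tr-97-002-title.ps.Z nc-tr-97-002-body.ps.Z | lpr -l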

A full list of the currently available Technical Reports in the 
Series is held in a file `abstracts' in the same directory.

The files may also be accessed via WWW starting from the NeuroCOLT 
homepage:

http://www.dcs.rhbnc.ac.uk/research/compint/neurocolt

or by going directly to the archive:
ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports


Best wishes
John Shawe-Taylor



