Technical Report Series in Neural and Computational Learning

John Shawe-Taylor <john@dcs.rhbnc.ac.uk>
Mon Mar 25 16:40:21 EST 1996


The European Community ESPRIT Working Group in Neural and Computational 
Learning Theory (NeuroCOLT) has produced a set of new Technical Reports
available from the remote ftp site described below. They cover topics in
real valued complexity theory, computational learning theory, and analysis
of the computational power of continuous neural networks.  Abstracts are
included for the titles.

*** Please note that the location of the files was changed at the beginning of
*** the year, so any copies you have of the previous instructions should be
*** discarded. The new location and instructions are given at the end of the list.


----------------------------------------
NeuroCOLT Technical Report NC-TR-96-038:
----------------------------------------
Active Noise Control with Dynamic Recurrent Neural Networks
by  Davor Pavisic, Facult\'{e} Polytechnique de Mons, Belgium
    Laurent Blondel, Facult\'{e} Polytechnique de Mons, Belgium
    Jean-Philippe Draye, Facult\'{e} Polytechnique de Mons, Belgium
    Ga\"{e}tan Libert,  Facult\'{e} Polytechnique de Mons, Belgium
    Pierre Chapelle,  Facult\'{e} Polytechnique de Mons, Belgium

Abstract:
We have developed a neural active noise controller which performs
better than existing techniques.  We used a dynamic recurrent neural
network to model the behaviour of an existing controller that uses a
Least Mean Squares algorithm to minimize an error signal.   The network
has two types of adaptive parameters, the weights between the units and
the time constants associated with each neuron.  Measured results show
a significant improvement of the neural controller when compared with
the existing system.
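
As a rough sketch of the kind of network described above (not taken from the
report: the formulation tau_i dy_i/dt = -y_i + tanh(W y + x)_i integrated by a
forward-Euler step, the network size, the nonlinearity and the step size are
all illustrative assumptions, and the LMS teacher signal used by the authors
is omitted), a recurrent unit with adaptive per-neuron time constants might
look as follows in Python:

import numpy as np

class DynamicRecurrentNet:
    """Leaky recurrent units with per-neuron time constants tau_i."""

    def __init__(self, n_units, dt=0.001, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_units, n_units))  # adaptive weights
        self.tau = np.full(n_units, 0.01)                        # adaptive time constants
        self.y = np.zeros(n_units)                               # unit states
        self.dt = dt

    def step(self, x):
        # Each unit relaxes toward its driven value at a rate set by its
        # own time constant:  y <- y + (dt/tau) * (tanh(W y + x) - y).
        drive = np.tanh(self.W @ self.y + x)
        self.y = self.y + (self.dt / self.tau) * (drive - self.y)
        return self.y

# Toy usage: feed a reference signal sample by sample and read the
# anti-noise estimate from the last unit.
net = DynamicRecurrentNet(n_units=8)
for sample in np.sin(np.linspace(0.0, 10.0, 1000)):
    anti_noise = net.step(np.full(8, sample))[-1]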


----------------------------------------
NeuroCOLT Technical Report NC-TR-96-039:
----------------------------------------
A Survey on Real Structural Complexity Theory
by  Klaus Meer, RWTH Aachen, Germany
    Christian Michaux, Universit\'e de Mons-Hainaut

Abstract:
In this tutorial paper we survey research being done in the field of
structural complexity and recursion theory over the real numbers and
other domains, following the approach of Blum, Shub and Smale.


----------------------------------------
NeuroCOLT Technical Report NC-TR-96-040:
----------------------------------------
The Computational Power of Spiking Neurons Depends on the Shape of
the Postsynaptic Potentials
by  Wolfgang Maass, Technische Universitaet Graz, Austria
    Berthold Ruf, Technische Universitaet Graz, Austria

Abstract:
Recently, researchers have started to investigate the computational power
of spiking neurons (also called ``integrate-and-fire neurons'').  These are
neuron models that are substantially more realistic from the biological
point of view than the ones which are traditionally employed in
artificial neural nets.  It has turned out that the computational power
of networks of spiking neurons is quite large.  In particular they have
the ability to communicate and manipulate analog variables in
spatio-temporal coding, i.e.~encoded in the time points when specific
neurons ``fire'' (and thus send a ``spike'' to other neurons).
These results have motivated the question of which details of
the firing mechanism of spiking neurons are essential for their
computational power, and which details are ``accidental'' aspects of
their realization in biological ``wetware''.  Obviously this question
becomes important if one wants to capture some of the advantages of
computing and learning with spatio-temporal coding in a new generation
of artificial neural nets, such as for example pulse stream VLSI.
The firing mechanism of spiking neurons is defined in terms of their
postsynaptic potentials or ``response functions'', which describe the
change in their electric membrane potential as a result of the firing
of another neuron.  We consider in this article the case where the
response functions of spiking neurons are assumed to be of the
mathematically most elementary type: they are assumed to be
step functions (i.e.\ piecewise constant functions). This happens to be
the functional form which has so far been adopted most frequently in
pulse stream VLSI as the form of the potential changes (``pulses'') that
mimic the role of postsynaptic potentials in biological neural
systems.  We prove the rather surprising result that in models without
noise the computational power of networks of spiking neurons with
arbitrary piecewise constant response functions is strictly weaker than
that of networks where the response functions of neurons also contain
short segments where they increase or decrease linearly (which is in
fact biologically more realistic).  More precisely, we show, for
example, that addition of analog numbers is impossible
for a network of spiking neurons with piecewise constant response
functions (with any bounded number of computation steps, i.e. spikes),
whereas addition of analog numbers is easy if the response functions
have linearly increasing segments.
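
As an informal illustration of this separation (a toy sketch, not taken from
the report; the unit weights, threshold and response slope below are
arbitrary choices), consider a single noise-free neuron receiving two input
spikes at times t1 and t2.  With linearly increasing responses its firing
time varies continuously with t1 + t2, whereas with step responses it can
only coincide with one of the input times:

def firing_time_linear(t1, t2, alpha=1.0, theta=1.0):
    # Response to a spike at s is alpha*(t - s) for t > s.  Assuming theta
    # is large enough that the threshold is reached only after both spikes,
    # the potential is alpha*((t - t1) + (t - t2)), so the firing time is
    #   T = theta/(2*alpha) + (t1 + t2)/2,
    # i.e. it encodes the sum t1 + t2 up to a fixed scale and offset.
    return theta / (2 * alpha) + (t1 + t2) / 2

def firing_time_step(t1, t2, alpha=1.0, theta=1.5):
    # Response to a spike at s jumps to alpha for t > s and stays there, so
    # the potential only changes at t1 and t2; the firing time (if any) is
    # always one of the input times and cannot track t1 + t2 continuously.
    potential = 0.0
    for t in sorted([t1, t2]):
        potential += alpha
        if potential >= theta:
            return t
    return None  # threshold never reached

print(firing_time_linear(0.2, 0.4))  # 0.8 = 0.5 + (0.2 + 0.4)/2
print(firing_time_step(0.2, 0.4))    # 0.4, simply the later spike time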


----------------------------------------
NeuroCOLT Technical Report NC-TR-96-041:
----------------------------------------
Finding Optimal Multi-Splits for Numerical Attributes in Decision Tree Learning
by  Tapio Elomaa, University of Helsinki, Finland
    Juho Rousu, University of Helsinki, Finland

Abstract:
Handling continuous attribute ranges remains a deficiency of top-down
induction of decision trees. They require special treatment and do not fit the
learning scheme as well as one could hope for. Nevertheless, they are
common in practical tasks and, therefore, need to be taken into
account.
This topic has attracted abundant attention in recent years. In
particular, Fayyad and Irani showed how optimal binary partitions can
be found efficiently. Later, they based a greedy heuristic
multipartitioning algorithm on these results. Recently, Fulton, Kasif,
and Salzberg attempted to develop algorithms for finding the optimal
multi-split for a numerical attribute in one phase.
We prove that, as in the case of binary partitioning, only boundary
points need to be inspected in order to find the optimal multipartition
of a numerical value range. We develop efficient algorithms for finding
the optimal splitting into more than two intervals. The resulting
partition is guaranteed to be optimal w.r.t.\ the function that is used
to evaluate the attributes' utility in class prediction.
We contrast our method with alternative approaches in initial empirical
experiments. They show that the new method consistently surpasses the
greedy heuristic approach of Fayyad and Irani in the goodness of the
produced multi-split but, on small data sets, cannot quite attain the
efficiency of the greedy approach. Furthermore, our experiments reveal
that one of the techniques proposed by Fulton, Kasif, and Salzberg is
of little use in practical tasks, since its running time fails to meet
practical demands. In addition, it categorically fails to find the
optimal multi-split because of an error in the rationale of the method.
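
As a small illustration of the boundary points mentioned above (a sketch of
the standard formulation, not the authors' algorithm; the test and the toy
data are not taken from the report), the candidate cut points of a numeric
attribute can be listed as follows: after sorting the examples by value, a
cut between two adjacent distinct values is a boundary point unless both
value groups contain exactly the same single class.

from collections import defaultdict

def boundary_points(values, labels):
    # Group class labels by attribute value, then walk the distinct values
    # in increasing order and keep only the cuts across which the classes
    # can differ.
    by_value = defaultdict(set)
    for v, c in zip(values, labels):
        by_value[v].add(c)
    distinct = sorted(by_value)
    cuts = []
    for lo, hi in zip(distinct, distinct[1:]):
        if by_value[lo] != by_value[hi] or len(by_value[lo]) > 1:
            cuts.append((lo + hi) / 2)
    return cuts

# Toy example: the class changes only between the values 2.0 and 3.0.
print(boundary_points([1.0, 2.0, 3.0, 4.0], ['a', 'a', 'b', 'b']))  # [2.5]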

----------------------------------------
NeuroCOLT Technical Report NC-TR-96-042:
----------------------------------------
Shattering All Sets of k Points in `General Position' Requires (k-1)/2
Parameters
by  Eduardo D. Sontag, Rutgers University, USA

Abstract:
For classes of concepts defined by certain classes of analytic functions
depending on n parameters, there are nonempty open sets of samples of 
length 2n+2 which cannot be shattered.
A slightly weaker result is also proved for piecewise-analytic functions.
The special case of neural networks is discussed.
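
To see how the abstract yields the figure in the title (under the usual
reading that `general position' is an open, dense condition on samples): if
a concept class determined by $n$ parameters shattered every set of
$k \ge 2n+2$ points in general position, it would in particular shatter some
sample lying in a nonempty open set of unshatterable samples of length
$2n+2$, a contradiction.  Hence shattering all sets of $k$ points in general
position forces $k \le 2n+1$, i.e.\ at least $(k-1)/2$ parameters are needed.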


 
--------------------------------------------------------------------

***************** ACCESS INSTRUCTIONS ******************

The Report NC-TR-96-001 can be accessed and printed as follows 

% ftp ftp.dcs.rhbnc.ac.uk  (134.219.96.1)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-96-001.ps.Z
ftp> bye
% zcat nc-tr-96-001.ps.Z | lpr -l

Similarly for the other technical reports.

Uncompressed versions of the postscript files have also been
left for anyone not having an uncompress facility. 

In some cases there are two files available, for example,
nc-tr-96-002-title.ps.Z
nc-tr-96-002-body.ps.Z
The first contains the title page while the second contains the body 
of the report. The single command,
ftp> mget nc-tr-96-002*
will prompt you for the files you require.

A full list of the currently available Technical Reports in the 
Series is held in a file `abstracts' in the same directory.

The files may also be accessed via WWW starting from the NeuroCOLT 
homepage (note that this is undergoing some corrections and may be 
temporarily inaccessible):

http://www.dcs.rhbnc.ac.uk/neural/neurocolt.html


Best wishes
John Shawe-Taylor




