NIPS'93 workshop on "Stability and Observability"
GARZONM@hermes.msci.memst.edu
Tue Sep 28 10:28:07 EDT 1993
C A L L F O R P A P E R S
A One-day Workshop on
* STABILITY AND OBSERVABILITY *
at NIPS'93
December 3, 1993
We are organizing a workshop at NIPS'93, the Neural Information
Processing Systems conference, to be held in the Denver/Vail area of
Colorado on December 3. The themes of the workshop are `Stability
and Observability'. A more detailed description is attached below.
There is still room for some contributed talks. If you are
interested in presenting a paper based on previous and/or current
research, send a short (one-page) abstract to one of the
organizers by October 8, via email or fax. The list of speakers will
be finalized by mid-October.
Fernanda Botelho Max Garzon
botelhof at hermes.msci.memst.edu garzonm at hermes.msci.memst.edu
FAX (901)678-2480 (preferred); 678-3299
Workshop cochairs
_____________________________ cut here _________________________
The purpose of this one-day workshop is to bring together neural
network practitioners, computer scientists and mathematicians
interested in `stability' and `observability' of neural networks of
various types (discrete/continuous time and/or activations).
These two properties concern the relationship between defining
parameters (weights, transfer functions, and training sets) and the
behavior of neural networks from the point of view of an outside
observer. This behavior is affected by noise, rounding, bounded
precision, sensitivity to initial conditions, etc. Roughly
speaking, *stability* (e.g. asymptotic, Lyapunov, structural)
refers to the ability of a network (or a family of networks) to
generate trajectories/orbits that remain reasonably close (resp.,
in structure, e.g. topological conjugacy) to the original
under small perturbations of the input/initial conditions (or the
defining parameters of the network). Of course, neural networks are
well known for their graceful degradation, but the issue is less
clear-cut under bounded precision, continuous time with local
interactions governed by differential equations, and learning
algorithms.
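As a toy illustration of the sensitivity-to-initial-conditions issue
mentioned above (using the classic logistic map as a minimal stand-in
for a recurrent net's state update, with arbitrary illustrative
numbers rather than anything from the workshop itself):

```python
# Iterate the chaotic logistic map x -> 4x(1-x) from two initial
# states that differ by only 1e-6, and track the gap between the
# two trajectories. The starting point 0.3 is arbitrary.

def orbit(x, n):
    xs = [x]
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        xs.append(x)
    return xs

a = orbit(0.3, 30)
b = orbit(0.3 + 1e-6, 30)
gaps = [abs(u - v) for u, v in zip(a, b)]

print(gaps[0])     # the tiny initial perturbation (~1e-6)
print(max(gaps))   # the gap grows by many orders of magnitude
```

A stable system would keep the two orbits close; here the perturbation
is amplified until the trajectories are macroscopically different,
which is exactly the kind of behavior that makes error control under
iteration delicate.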
Second, the issue of *observability*, roughly speaking, concerns
the problem of error control under iteration of recurrent nets. In
dynamical systems, observability is studied in terms of
shadowing. But observability can also be construed in other ways,
e.g., as our ability to identify a network by observing the abstract
i/o function that it realizes (which, at some level, reduces to the
essential uniqueness of an irreducible network implementing the i/o
function).
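A minimal sketch of why identification from the i/o function alone can
fail to pin down a unique network (a hypothetical two-parameter net
y = a * tanh(w * x), not taken from the announcement):

```python
import math

# Because tanh is odd, flipping the signs of both parameters gives a
# different parameter vector that realizes exactly the same i/o
# function; the net is identifiable only up to such symmetries.

def net(a, w, x):
    return a * math.tanh(w * x)

xs = [-2.0 + 0.1 * i for i in range(41)]   # sample inputs in [-2, 2]
same = all(abs(net(1.5, 0.7, x) - net(-1.5, -0.7, x)) < 1e-12
           for x in xs)
print(same)   # True: (a, w) and (-a, -w) are indistinguishable
```

This is the sense in which identification "reduces to essential
uniqueness": one can hope to recover the network only modulo
symmetries of this kind, and only for irreducible networks.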
Speakers will present their views in short (< 20 min.) talks. A
panel discussion coordinated by the cochairs will survey known
results and identify fundamental problems and questions of
interest for further research.
F. Botelho and M. Garzon, cochairs
botelhof at hermes.msci.memst.edu garzonm at hermes.msci.memst.edu
Mathematical Sciences Institute for Intelligent Systems
Memphis State University
Memphis, TN 38152 U.S.A.
Max Garzon (preferred) garzonm at hermes.msci.memst.edu
Math Sciences garzonm at memstvx1.memst.edu
Memphis State University Phone: (901) 678-3138/-2482
Memphis, TN 38152 USA Fax: (901) 678-3299