New papers on time series generation

Avner Priel priel at alon.cc.biu.ac.il
Mon Nov 17 08:05:23 EST 1997


Dear Connectionists,

This is to announce the availability of two new papers on the subject
of time series generation by feed-forward networks.
The first paper will appear in the Journal of Physics A
and the second in the NIPS-11 proceedings.
The papers are available from my home page:

	http://faculty.biu.ac.il/~priel/

Comments are welcome.


*************** NO HARD COPIES ******************


----------------------------------------------------------------------


Noisy time series generation by feed-forward networks
-----------------------------------------------------
A Priel, I Kanter and D A Kessler

Department of Physics, Bar Ilan University, 52900 Ramat Gan, Israel


ABSTRACT:  We study the properties of a noisy time series generated
by a continuous-valued feed-forward network in which the next input 
vector is determined from past output values. 
Numerical simulations of a perceptron-type network exhibit the 
expected broadening of the noise-free attractor, 
without changing the attractor dimension.
We show that the broadening of
the attractor due to the noise scales with the size of the
system, $N$, as $1/\sqrt{N}$.
We show both analytically and numerically that the diffusion constant 
for the phase along the attractor scales inversely with $N$.
Hence, phase coherence holds up to a time that scales linearly with 
the size of the system. We find that the mean first passage time, $t$,
to switch between attractors depends on $N$ and the reduced distance
from the bifurcation, $\tau$, as $t = a \frac{N}{\tau} \exp(b \tau N^{1/2})$,
where $b$ is a constant which depends on the
amplitude of the external noise. This result is obtained analytically
for small $\tau$ and confirmed by numerical simulations.
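For readers who want to experiment, here is a minimal sketch (Python/NumPy)
of the kind of generator the abstract describes: a continuous-valued
perceptron whose input window is fed from its own past outputs, with
additive noise. The particular weight vector, gain and noise level are
illustrative choices, not the ones studied in the paper.

import numpy as np

def generate_series(N=100, T=5000, noise=0.01, seed=0):
    # Illustrative weight vector; the paper's specific choices are not
    # reproduced here.
    rng = np.random.default_rng(seed)
    k = 3
    w = np.cos(2.0 * np.pi * k * np.arange(1, N + 1) / N)
    s = rng.normal(0.0, 0.1, size=N)   # input window = last N outputs
    out = np.empty(T)
    for t in range(T):
        h = w @ s / N                  # local field of the perceptron
        o = np.tanh(5.0 * h) + noise * rng.normal()  # noisy output
        s = np.roll(s, 1)
        s[0] = o                       # newest output enters the window
        out[t] = o
    return out

series = generate_series()

Running the same loop with noise=0 and comparing the two sequences is one
simple way to see the broadening of the noise-free attractor discussed above.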




Analytical study of the interplay between architecture and predictability
-------------------------------------------------------------------------

 Avner Priel, Ido Kanter, D. A. Kessler

 Minerva Center and Department of Physics, Bar Ilan University, 
        Ramat-Gan 52900, Israel. 


ABSTRACT:  We study model feed-forward networks as time series predictors
in the stationary limit. The focus is on complex, yet non-chaotic,
behavior. The main question we address is whether the asymptotic behavior
is governed by the architecture, regardless of the details of the weights.
We find hierarchies among classes of architectures with respect to
the attractor dimension of the long-term sequence they are capable of
generating; a larger number of hidden units can generate higher-dimensional
attractors.
In the case of a perceptron, we develop the stationary solution for a
general weight vector, and show that the flow is typically
one-dimensional.
The relaxation time from an arbitrary initial condition to the stationary
solution is found to scale linearly with the size of the network. 
In multilayer networks, the number of hidden
units gives bounds on the number and dimension of the possible attractors.
We conclude that long-term prediction (in the non-chaotic regime)
with such models is governed by attractor dynamics related
to the architecture.
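As an illustration of the feedback setup considered in this abstract, the
following sketch runs a small one-hidden-layer network as a sequence
generator, so the long-term sequence for different numbers of hidden units
can be inspected (e.g. through a delay plot). The weights and parameters
are illustrative assumptions, not those analysed in the paper.

import numpy as np

def mlp_generate(N=100, K=2, T=10000, seed=1):
    # K hidden units; illustrative input-to-hidden weights.
    rng = np.random.default_rng(seed)
    W = np.array([np.cos(2 * np.pi * (k + 1) * np.arange(1, N + 1) / N)
                  for k in range(K)])     # shape (K, N)
    v = np.ones(K) / K                    # hidden-to-output weights
    s = rng.normal(0.0, 0.1, size=N)      # window of last N outputs
    out = np.empty(T)
    for t in range(T):
        h = np.tanh(4.0 * (W @ s) / N)    # hidden activations
        o = np.tanh(v @ h)                # output becomes the next input
        s = np.roll(s, 1)
        s[0] = o
        out[t] = o
    return out

x = mlp_generate(K=2)[2000:]              # discard the transient

Comparing the stationary sequences obtained for K = 1, 2, 3 is a simple way
to probe how the number of hidden units bounds the attractor that the
network can generate.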


----------------------------------------------------

  Priel Avner  < priel at mail.cc.biu.ac.il >           
  	       < http://faculty.biu.ac.il/~priel >
  Department of Physics,  Bar-Ilan University.            
  Ramat-Gan, 52900.             
  Israel.                 



