Modelling nonlinear systems

Thanos Kehagias kehagias at eng.auth.gr
Thu Mar 18 02:18:56 EST 1993


Regarding J.R. Chen's paper:

I think it is important to define in what sense "modelling" is understood.
I have not read the Doya paper, but my guess is that it is an approximation
result (rather than an exact representation). If it is an approximation
result, the sense of approximation (the norm or metric used) is important.

For instance: in the stochastic context, there is a well-known statistical
theorem, the Wold theorem, which says that every continuous-valued,
finite-second-moment stochastic process can be approximated by ARMA models.
The models are (as one would expect) of increasing order (finite but
unbounded). The approximation is in the L2 sense (l.i.m., limit in the mean),
that is, E([X-X_n]^2) goes to 0, where X is the original process and X_n,
n=1,2,... is the approximating ARMA process. I expect this can also handle
stochastic input/output processes, if the input/output pair (X,U) is
considered as a joint process.
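To make the "increasing order, decreasing L2 error" idea concrete, here is a small numpy sketch: it simulates a nonlinear process (my own toy example, not from the Wold theorem), fits AR(p) models of growing order p by least squares, and measures the in-sample one-step mean squared prediction error, an empirical stand-in for E([X-X_n]^2). The AR-only simplification (no MA part) and all numbers are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary nonlinear process (hypothetical example):
# X_t = 0.8*sin(X_{t-1}) + noise -- not itself a finite-order ARMA process.
n = 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * np.sin(x[t - 1]) + 0.3 * rng.standard_normal()

def ar_mse(x, p):
    """Fit AR(p) by least squares; return in-sample mean squared
    one-step prediction error, an estimate of E([X - X_p]^2)."""
    n = len(x)
    y = x[p:]
    # Regression matrix of the p lagged values.
    z = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(z, y, rcond=None)
    resid = y - z @ coef
    return float(np.mean(resid ** 2))

# L2 error as the model order grows.
errors = [ar_mse(x, p) for p in (1, 2, 4, 8)]
```

Since the AR(p) models are (essentially) nested, the fitted L2 error does not increase with p; how fast it flattens out depends on how "nonlinear" the target process is.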

I have proved a similar result in my thesis about approximating finite-state
stochastic processes with Hidden Markov Models. The approximation is in two
senses: weak (approximation of measures) and cross entropy. Since for every
HMM it is easy to build an output-equivalent network of finite automata, this
gets really close to the notion of recurrent networks with sigmoid neurons.
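On the cross-entropy side, the quantity being compared is the per-symbol log likelihood the HMM assigns to sequences from the target process. A minimal sketch of that computation (the forward algorithm with scaling); the 2-state model and all its parameters are made-up numbers, not anything from the thesis:

```python
import numpy as np

# A small 2-state HMM over a binary alphabet (hypothetical parameters).
A = np.array([[0.9, 0.1],    # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],    # emission probabilities P(symbol | state)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])    # initial state distribution

def log_likelihood(obs):
    """Forward algorithm with scaling: log P(obs) under the HMM."""
    alpha = pi * B[:, obs[0]]
    s = alpha.sum()
    log_p = np.log(s)
    alpha = alpha / s
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        log_p += np.log(s)
        alpha = alpha / s
    return float(log_p)

def cross_entropy(obs):
    """Empirical cross entropy (nats per symbol) of obs under the HMM."""
    return -log_likelihood(obs) / len(obs)
```

Cross-entropy approximation then means choosing HMM parameters so that this per-symbol quantity approaches the entropy rate of the target process.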

Of course this is all for stochastic networks / probabilistic processes. In
the deterministic case one would probably be interested in a different sense
of approximation, e.g. L2 or L-infinity approximation.

Is the Doya paper in the ohio archive?



