Modelling nonlinear systems

Lester Ingber ingber at alumni.cco.caltech.edu
Mon Mar 22 08:01:04 EST 1993


In the context of the modeling discussed in the two postings referenced
below, it should be noted that multiplicative noise is often quite
robust in modeling stochastic systems that have hidden variables and/or
that would otherwise require much higher-order ARMA models.
"Multiplicative" noise means that the typical Gaussian-Markovian noise
terms added to introduce noise into sets of differential equations carry
additional factors, which can be quite general functions of the other
"deterministic" variables.  Some nice work illustrating this is in
	%A K. Kishida
	%T Physical Langevin model and the time-series model in systems
	far from equilibrium
	%J Phys. Rev. A
	%V 25
	%D 1982
	%P 496-507
and
	%A K. Kishida
	%T Equivalent random force and time-series model in systems
	far from equilibrium
	%J J. Math. Phys.
	%V 25
	%D 1984
	%P 1308-1313
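To make the distinction concrete, here is a minimal sketch (my illustration, not from the references above) of Euler-Maruyama integration of a one-dimensional SDE dx = f(x) dt + g(x) dW.  With g constant the noise is additive; with g a function of the state it is multiplicative.  The drift f(x) = -x and the noise factors are arbitrary illustrative choices.

```python
# Sketch: Euler-Maruyama integration of dx = f(x) dt + g(x) dW.
# f and g below are illustrative, not taken from the cited papers.
import math
import random

def euler_maruyama(f, g, x0, dt, n_steps, rng):
    """Integrate dx = f(x) dt + g(x) dW by the Euler-Maruyama scheme."""
    path = [x0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))   # Gaussian-Markovian increment
        path.append(path[-1] + f(path[-1]) * dt + g(path[-1]) * dw)
    return path

rng = random.Random(0)
f = lambda x: -x                                  # deterministic drift
# additive noise: g is a constant
additive = euler_maruyama(f, lambda x: 0.5, 1.0, 0.01, 1000, rng)
# multiplicative noise: g depends on the "deterministic" variable x
multiplicative = euler_maruyama(f, lambda x: 0.5 * x, 1.0, 0.01, 1000, rng)
```

Note that the multiplicative path, started positive, stays positive here, since each step rescales the state; the additive path has no such state-dependent structure.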

A very detailed reference that properly handles such systems is
	%A F. Langouche
	%A D. Roekaerts
	%A E. Tirapegui
	%T Functional Integration and Semiclassical Expansions
	%I Reidel
	%C Dordrecht, The Netherlands
	%D 1982

Modelers' preferences for simple systems aside, it should be noted
that most physical systems that can reasonably be assumed to possess
Gaussian-Markovian noise should also be assumed to possess at least
multiplicative noise.  Such arguments are given in
	%A N.G. van Kampen
	%T Stochastic Processes in Physics and Chemistry
	%I North-Holland
	%C Amsterdam
	%D 1981

In the context of neural systems, such multiplicative noise systems
arise quite naturally, as I have described in
	%A L. Ingber
	%T Statistical mechanics of neocortical interactions:
	A scaling paradigm applied to electroencephalography
	%J Phys. Rev. A
	%N 6
	%V 44
	%P 4017-4060
	%D 1991
and in
	%A L. Ingber
	%T Generic mesoscopic neural networks based on
	statistical mechanics of neocortical interactions
	%J Phys. Rev. A
	%V 45
	%N 4
	%P R2183-R2186
	%D 1992

}Article 2057 of mlist.connectionists:
}From: Thanos Kehagias <kehagias at eng.auth.gr>
}Subject: Modelling nonlinear systems
}Date: Mon, 22 Mar 93 07:03:12 GMT
}Approved: news at cco.caltech.edu
}
}
}Regarding J.R. Chen's paper:
}
}I think it is important to define in what sense "modelling" is understood.
}I have not read the Doya paper, but my guess is that it is an approximation
}result (rather than exact representation). If it is an approximation
}result, the sense of approximation (norm or metric used) is important.
}
}For instance: in the stochastic context, there is a well-known statistical
}theorem, the Wold theorem, which says that every continuous-valued,
}finite-second-moment stochastic process can be approximated by ARMA models.
}The models are (as one would expect) of increasing order (finite but
}unbounded).
}The approximation is in the L2 sense (l.i.m., limit in the mean), that is 
}E([X-X_n]^2) goes to 0, where X is the original process and X_n, n=1,2, ... is
}the approximating ARMA process. I expect this can also handle stochastic
}input/output processes, if the input-output pair (X,U) is considered as a
}joint process.
}
}I have proved a similar result in my thesis about approximating finite state
}stoch. processes with Hidden Markov Models. The approximation is in two senses:
}weak (approximation of measures) and cross entropy. Since for every HMM it
}is easy to build an output-equivalent network of finite automata, this gets
}really close to the notion of recurrent networks with sigmoid neurons.
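The increasing-order ARMA approximation Kehagias describes can be caricatured numerically.  The sketch below (my illustration, with an arbitrary nonlinear, multiplicative-noise test process) fits AR(p) models of increasing order p by ordinary least squares and records the in-sample L2 residual, which is non-increasing in p for nested fits.

```python
# Sketch: least-squares AR(p) fits of increasing order to a nonlinear
# series with multiplicative noise.  The process is an arbitrary example.
import math
import random

def solve(a, b):
    """Solve a x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x

def ar_mse(series, p, max_p):
    """In-sample mean squared residual of a least-squares AR(p) fit."""
    # lag rows [x[t-1], ..., x[t-p]], aligned from t = max_p for all p
    rows = [series[t - 1 :: -1][:p] for t in range(max_p, len(series))]
    y = series[max_p:]
    # normal equations (X^T X) beta = X^T y
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)]
           for i in range(p)]
    xty = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(p)]
    beta = solve(xtx, xty)
    resid = [yt - sum(b * ri for b, ri in zip(beta, r))
             for r, yt in zip(rows, y)]
    return sum(e * e for e in resid) / len(resid)

rng = random.Random(1)
x = [0.0]
for _ in range(1500):
    # nonlinear drift plus state-dependent (multiplicative) noise
    x.append(0.8 * math.tanh(x[-1])
             + (0.2 + 0.1 * abs(x[-1])) * rng.gauss(0.0, 1.0))

errors = [ar_mse(x, p, 8) for p in range(1, 9)]
```

Of course the Wold-type results concern the process itself, not a finite sample; this only illustrates the nesting, with higher orders never fitting worse.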

|| Prof. Lester Ingber                         [10ATT]0-700-L-INGBER ||
|| Lester Ingber Research                        Fax: 0-700-4-INGBER ||
|| P.O. Box 857                           Voice Mail: 1-800-VMAIL-LI ||
|| McLean, VA  22101                EMail: ingber at alumni.caltech.edu ||


