Papers available on Missing and Noisy Data in Nonlinear Time-Series Prediction

Volker Tresp tresp at traun.zfe.siemens.de
Sat Sep 23 14:03:01 EDT 1995






The file tresp.miss_time.ps.Z can now be copied from Neuroprose.

The paper is 11 pages long.
Hardcopies are not available.


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/tresp.miss_time.ps.Z



          Missing and Noisy Data in Nonlinear Time-Series Prediction

                   by Volker Tresp and Reimar Hofmann




We discuss the issue of missing and noisy data in nonlinear
time-series prediction. We derive fundamental equations both for
prediction and for training. Our discussion shows that if
measurements are noisy or missing, treating the time series as a
static input/output mapping problem (the usual time-delay neural
network approach) is suboptimal. We describe approximations of the
solutions based on stochastic simulations.
A special case is $K$-step prediction, in which a one-step predictor is
iterated $K$ times. Our solutions provide error bars for prediction
with missing or noisy data and for $K$-step prediction. Using the
$K$-step iterated logistic map as an example, we show that the proposed
solutions are a considerable improvement over simple heuristic
approaches. Using our formalism, we derive algorithms for training
recurrent networks, for control of stochastic systems, and for
reinforcement learning problems.
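
As a rough illustration of the kind of approach the abstract describes
(not the authors' implementation), the following Python sketch uses the
logistic map x_{t+1} = 4 x_t (1 - x_t) as the one-step predictor and a
simple Monte Carlo (stochastic simulation) approximation to propagate a
noisy initial measurement through $K$ iterations, yielding a predictive
mean and an error bar. The function names, the noise model, and the
choice of logistic-map parameter are illustrative assumptions, and a
naive iterated prediction is included only for comparison with a simple
heuristic.

    import numpy as np

    # Illustrative sketch, not the paper's code: Monte Carlo K-step
    # prediction with a noisy initial measurement, using the logistic
    # map x_{t+1} = 4 * x_t * (1 - x_t) as the one-step predictor.

    def logistic_map(x):
        return 4.0 * x * (1.0 - x)

    def k_step_monte_carlo(y_obs, noise_std, K, n_samples=10000, rng=None):
        """Propagate measurement noise through K iterations of the
        one-step predictor; return the predictive mean and standard
        deviation (an 'error bar') of the K-step forecast."""
        rng = np.random.default_rng() if rng is None else rng
        # Sample plausible true states consistent with the noisy observation.
        x = np.clip(y_obs + noise_std * rng.standard_normal(n_samples), 0.0, 1.0)
        for _ in range(K):
            x = logistic_map(x)
        return x.mean(), x.std()

    def k_step_naive(y_obs, K):
        # Simple heuristic: iterate the predictor on the raw observation
        # and ignore the measurement noise entirely.
        x = y_obs
        for _ in range(K):
            x = logistic_map(x)
        return x

    mean, std = k_step_monte_carlo(y_obs=0.3, noise_std=0.05, K=5)
    print(f"Monte Carlo K-step prediction: {mean:.3f} +/- {std:.3f}")
    print(f"Naive iterated prediction:     {k_step_naive(0.3, 5):.3f}")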


