Tech report on Dynamic Pattern Selection in Neuroprose

Axel Roebel roebel at cs.tu-berlin.de
Mon Mar 14 07:07:34 EST 1994


FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/roebel.dynada.ps.Z

With Terry and Isabelle we state:
              One man's outlier is OUR data point

The file roebel.dynada.ps.Z (22 pages) is now available via anonymous
ftp from the neuroprose archive.   Title and abstract are given below.
We regret that hardcopies are not available. 
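
If you prefer to script the retrieval rather than run an interactive ftp
session, something along these lines should work (a Python sketch using the
standard ftplib module, with the host and path given above; the local
filename is up to you):

    from ftplib import FTP

    # Anonymous FTP retrieval of the report (host and path as announced above).
    ftp = FTP("archive.cis.ohio-state.edu")
    ftp.login()  # anonymous login
    ftp.cwd("/pub/neuroprose")
    with open("roebel.dynada.ps.Z", "wb") as f:
        ftp.retrbinary("RETR roebel.dynada.ps.Z", f.write)
    ftp.quit()
    # Then uncompress the .Z file and print or view the PostScript.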

----------------------------------------------------------------------

              The Dynamic Pattern Selection Algorithm:
         Effective Training and Controlled Generalization
               of Backpropagation Neural Networks


                             A. Röbel

                   Technical University of Berlin
                   Department of Computer Science

                     (Technical Report 93/23)

   (Subsets of this report will appear in the conference proceedings
    of the International Conference on Neural Networks, Italy, 1994,
  and the European Symposium on Artificial Neural Networks, Belgium, 1994)

	                 -- ABSTRACT --
  
  In the following report the problem of selecting proper training
  sets for neural network time series prediction or function
  approximation is addressed. As a result of analyzing the relation
  between approximation and generalization, a new measure, the
  generalization factor, is introduced. Using this factor and cross
  validation, a new algorithm, the dynamic pattern selection, is
  developed.

  Dynamically selecting the training patterns during training makes it
  possible to control the generalization properties of the neural net.
  As a consequence of the proposed selection criterion, the
  generalization error is limited to the training error. As an
  additional benefit, the practical problem of selecting a concise
  training set out of the known data is likewise solved.

  Using two time series prediction tasks, the results for dynamic
  pattern selection training and for fixed training sets are compared.
  The favorable properties of the dynamic pattern selection, namely
  lower computational expense and control of generalization, are
  demonstrated.

  This report describes a revised version of the algorithm introduced
  in (Roebel, 1992).
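
To give a rough feel for the algorithm before you fetch the report, here is a
minimal sketch of the selection loop as the abstract describes it. It is not
the code from the report: the network object with its train_epoch/predict
methods, the threshold rho_max, and the use of the validation-to-training
error ratio as the generalization factor are illustrative assumptions.

    import numpy as np

    def generalization_factor(net, train_set, val_set):
        # One plausible reading of the "generalization factor": the ratio
        # of validation error to training error.
        e_train = np.mean([(net.predict(x) - y) ** 2 for x, y in train_set])
        e_val = np.mean([(net.predict(x) - y) ** 2 for x, y in val_set])
        return e_val / max(e_train, 1e-12)

    def dynamic_pattern_selection(net, pool, val_set, n_epochs=1000, rho_max=1.0):
        # Hypothetical sketch: start from a single pattern and grow the
        # training set only when the net stops generalizing to the
        # validation data, adding the pattern it currently predicts worst.
        train_set = [pool.pop(0)]
        for _ in range(n_epochs):
            net.train_epoch(train_set)  # one backpropagation pass
            rho = generalization_factor(net, train_set, val_set)
            if rho > rho_max and pool:
                # index of the not-yet-used pattern with the largest error
                errors = [float(np.sum((net.predict(x) - y) ** 2)) for x, y in pool]
                train_set.append(pool.pop(int(np.argmax(errors))))
        return train_set

Selecting only the patterns that are actually needed is also what keeps the
training set concise and the computational expense low, as claimed above.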

----------------------------------------------------------------------------
Axel Roebel                             ** E-Mail: roebel at cs.tu-berlin.de **
Technische Universitaet Berlin          ** Phone : +49 - 30 - 314 24892   **
Department of Applied Computer Science  ** Fax   : +49 - 30 - 314 24891   **


