Alopex: Pre-print available on FTP

Venugopal venu at pixel.mipg.upenn.edu
Fri Mar 4 15:40:20 EST 1994



   A pre-print of the following paper (to appear in NEURAL COMPUTATION,
   vol. 4, 1994, pp. 467-488) is available via anonymous ftp from the
   neuroprose archive:


	   ALOPEX: A CORRELATION BASED LEARNING ALGORITHM
            FOR FEEDFORWARD AND RECURRENT NEURAL NETWORKS


			K. P. Unnikrishnan
	      GM Research Laboratories, Warren, MI 
	     AI Laboratory, Univ. of Michigan, Ann Arbor, MI

			 K. P. Venugopal
	            Medical Image Processing Group
	      University of Pennsylvania, Philadelphia, PA

    
    Abstract:     

    We present a learning algorithm for neural networks, called Alopex.
    Instead of the error gradient, Alopex uses local correlations between
    changes in individual weights and changes in the global error measure.
    The algorithm does not make any assumptions about the transfer functions
    of individual neurons, and does not explicitly depend on the functional
    form of the error measure. Hence, it can be used in networks with
    arbitrary transfer functions and for minimizing a large class of error
    measures. The learning algorithm is the same for feed-forward and
    recurrent networks. All the weights in a network are updated
    simultaneously, using only local computations. This allows complete
    parallelization of the algorithm. The algorithm is stochastic and it
    uses a `temperature' parameter in a manner similar to that in simulated
    annealing. A heuristic `annealing schedule' is presented which is
    effective in finding global minima of error surfaces. In this paper, we
    report extensive simulation studies illustrating these advantages and
    show that learning times are comparable to those for standard gradient
    descent methods. Feed-forward networks trained with Alopex are used to
    solve the MONK's problems and symmetry problems. Recurrent networks
    trained with the same algorithm are used for solving temporal XOR
    problems. Scaling properties of the algorithm are demonstrated using
    encoder problems of different sizes, and the advantages of appropriate
    error measures are illustrated using a variety of problems.
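
    For readers who want a concrete picture of the update described in the
    abstract, the following is a minimal, illustrative Python/NumPy sketch
    of a correlation-based step in the spirit of Alopex. It is not the
    exact rule from the paper: the sign conventions, the step probability,
    and the annealing heuristic (tying the temperature to the recent mean
    absolute correlation) are assumptions made here for illustration only.

import numpy as np

def alopex_train(error_fn, w, delta=0.01, T0=1.0, n_iter=1000,
                 anneal_every=10, rng=None):
    """Minimize error_fn(w) without gradients, using only the correlation
    between each weight's previous change and the change in the error."""
    rng = np.random.default_rng() if rng is None else rng
    T = T0
    # Take a random first step so that correlations are defined.
    prev_step = delta * rng.choice([-1.0, 1.0], size=w.shape)
    prev_err = error_fn(w)
    w = w + prev_step
    corr_history = []

    for n in range(n_iter):
        err = error_fn(w)
        d_err = err - prev_err                 # change in the global error
        corr = prev_step * d_err               # local correlation, one per weight
        # If a weight's last move and the error moved together (corr > 0),
        # a step in the opposite direction is made more probable.
        z = np.clip(corr / T, -50.0, 50.0)
        p_down = 1.0 / (1.0 + np.exp(-z))      # probability of a -delta step
        step = np.where(rng.random(w.shape) < p_down, -delta, delta)

        prev_err, prev_step = err, step
        w = w + step                           # every weight updated simultaneously

        # Heuristic annealing: temperature follows the recent mean |correlation|
        # (an assumption standing in for the paper's annealing schedule).
        corr_history.append(float(np.mean(np.abs(corr))))
        if (n + 1) % anneal_every == 0:
            T = max(np.mean(corr_history[-anneal_every:]), 1e-8)

    return w

    Because each weight's update needs only that weight's previous step and
    the single scalar change in the global error, all weights can be updated
    in parallel, which is the property emphasized in the abstract.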
   
             	
	-----------------------------------------

	The file at archive.cis.ohio-state.edu is

	venugopal.alopex.ps.Z
	 
	(472K compressed)


	to ftp the file:

	unix> ftp archive.cis.ohio-state.edu

	Name (archive.cis.ohio-state.edu:xxxxx): anonymous
	Password: your e-mail address

	ftp> cd pub/neuroprose
	ftp> binary
	ftp> get venugopal.alopex.ps.Z


	Uncompress the file after transferring it to your machine,
	before printing.
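
	For example, on most Unix systems:

	unix> uncompress venugopal.alopex.ps.Z

	This produces venugopal.alopex.ps, which can then be sent to a
	PostScript printer.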
        
	
	-----------------------------------------------------------

	K. P. Venugopal
	Medical Image Processing Group
	University of Pennsylvania
	423 Blockley Hall
	Philadelphia, PA 19104		  (venu at pixel.mipg.upenn.edu)





