Tech Report Available

Marwan A. Jabri, Sydney Univ. Elec. Eng., marwan at ee.su.OZ.AU
Sat Mar 2 19:44:58 EST 1991


	***************** Technical Report Available *****************

     Weight Perturbation: An Optimal Architecture and Learning Technique
	for Analog VLSI Feedforward and Recurrent Multi-Layer Networks

		    Marwan Jabri & Barry Flower
	Systems Engineering and Design Automation Laboratory
		School of Electrical Engineering
		    University of Sydney

		        marwan at ee.su.oz.au
		   (SEDAL Tech Report 1991-1-5)

Abstract 

Previous work on analog VLSI implementation of multi-layer perceptrons
with on-chip learning has mainly targeted the implementation of algorithms
like back-propagation. Although back-propagation is 
efficient, its implementation in analog VLSI requires excessive
computational hardware. In this paper we show that gradient descent with
a direct approximation of the gradient, rather than back-propagation, is
the cheapest approach for parallel analog implementations. We also show
that this technique (which we call ``weight perturbation'') is suitable
for multi-layer recurrent networks as well. A discrete-level analog
implementation demonstrating the training of an XOR network is also
presented.

*** Also submitted

To ftp this report:
-------------------

ftp cheops.cis.ohio-state.edu (or ftp 128.146.8.62)
>name: anonymous
>password: neuron

>binary
>cd pub/neuroprose
>get jabri.wpert.ps.Z 
>quit

uncompress jabri.wpert.ps.Z
lpr -P<name of your laser printer> jabri.wpert.ps 

This file contains a large picture, so you may need to set the printer
time-out to a large value (we do this with lpr -i1000000 ...).

If for any reason you are unable to print the file, you can request a
hardcopy by writing to the following address (ask for SEDAL Tech Report
1991-1-5):

Marwan Jabri
Sydney University Electrical Engineering
NSW 2006 Australia

