"For neural networks, function determines form" in neuroprose

sontag@control.rutgers.edu
Thu Jun 11 11:35:34 EDT 1992


Title: "For neural networks, function determines form"
Authors: Francesca Albertini and Eduardo D. Sontag

Filename: albertini.ident.ps.Z

Abstract:

This paper shows that the weights of continuous-time feedback neural networks
are uniquely identifiable from input/output measurements.  Under very weak
genericity assumptions, the following is true: given two nets whose neurons
all have the same nonlinear activation function $\sigma$, if the two nets
have equal behaviors as ``black boxes'' then they must have the same number
of neurons and, except at most for sign reversals at each node, the same
weights.

(NOTE: this result is **not** a "learning" theorem.  It does not by itself
provide an algorithm for loading recurrent nets; it only shows "uniqueness of
solution".  However, work is in progress to apply the techniques developed in
the proof to the learning problem.)
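
(For a concrete picture, here is a minimal sketch, assuming the standard
continuous-time recurrent-net model; see the paper for the precise model
class and genericity conditions.  The nets in question are systems

    \dot{x} = \vec{\sigma}(Ax + Bu), \qquad y = Cx,

where the state $x$ has one coordinate per neuron, $u$ is the external input,
$y$ is the measured output, $\vec{\sigma}$ applies $\sigma$ coordinatewise,
and the weights are the entries of the matrices $A$, $B$, and $C$.  The sign
ambiguity arises because, for an odd $\sigma$ such as $\tanh$, flipping the
sign of the $i$-th state coordinate, i.e. replacing $(A,B,C)$ by
$(TAT, TB, CT)$ with $T$ a diagonal matrix of $\pm 1$ entries, leaves the
input/output behavior unchanged; the theorem says that, generically, this is
essentially the only way two such nets can be indistinguishable from
input/output measurements.)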

To obtain copies of this article:

unix> ftp archive.cis.ohio-state.edu (or 128.146.8.52)
Name: anonymous
Password: <your id>
ftp> cd pub/neuroprose
ftp> binary
ftp> get albertini.ident.ps.Z
ftp> quit
unix> uncompress albertini.ident.ps.Z
unix> lpr -Pps albertini.ident.ps (or however you print PostScript)

(With many thanks to Jordan Pollack for providing this valuable service!)

Please note: the file requires a fair amount of memory to print.
If you have problems with FTP, I can e-mail you the PostScript file; I cannot
provide hardcopy, however.
