Technical report

Peter Williams peterw at cogs.susx.ac.uk
Thu Mar 3 06:53:00 EST 1994


FTP-host: ftp.cogs.susx.ac.uk
FTP-filename: /pub/reports/csrp/csrp312.ps.Z

The following technical report is available by anonymous ftp.

------------------------------------------------------------------------

       BAYESIAN REGULARISATION AND PRUNING USING A LAPLACE PRIOR

                            Peter M Williams

              Cognitive Science Research Paper CSRP-312

               School of Cognitive and Computing Sciences
                          University of Sussex
                        Falmer, Brighton BN1 9QH
                               England

                     email: peterw at cogs.susx.ac.uk


                                Abstract

Standard techniques for improved generalisation from neural networks
include weight decay and pruning.  Weight decay has a Bayesian
interpretation with the decay function corresponding to a prior over
weights.  The method of transformation groups and maximum entropy
indicates a Laplace rather than a Gaussian prior.  After training, the
weights then arrange themselves into two classes: (1) those with a
common sensitivity to the data error, and (2) those failing to achieve
this sensitivity, which therefore vanish.  Since the critical value is
determined adaptively during training, pruning, in the sense of setting
weights exactly to zero, becomes a consequence of regularisation alone.
The number of free parameters is also reduced automatically as weights
are pruned.  A comparison is made with the results of MacKay using the
evidence framework and a Gaussian regulariser.
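
As a sketch of the mechanism (the notation below is chosen here for
illustration and is not quoted from the report): a Laplace prior
corresponds to an L1 penalty on the weights, and the subgradient
condition at a minimum of the regularised objective yields exactly the
two classes of weights described above.

\[
p(w) \;\propto\; \exp\!\Big(-\alpha \sum_i |w_i|\Big),
\qquad
M(w) \;=\; E_D(w) \,+\, \alpha \sum_i |w_i|
\]
\[
w_i \neq 0 \;\Rightarrow\; \Big|\frac{\partial E_D}{\partial w_i}\Big| = \alpha,
\qquad
w_i = 0 \;\Rightarrow\; \Big|\frac{\partial E_D}{\partial w_i}\Big| \le \alpha
\]

Here \(\alpha\) plays the role of the critical sensitivity: weights
whose data-error gradient cannot reach \(\alpha\) are driven to exact
zero, which is the pruning effect referred to in the abstract.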

------------------------------------------------------------------------
[113755 bytes, 25 pages]

unix> ftp ftp.cogs.susx.ac.uk
Name: anonymous
Password: (email address)
ftp> cd pub/reports/csrp
ftp> binary
ftp> get csrp312.ps.Z
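ftp> quit

Once retrieved, the compressed PostScript file can be unpacked with
the standard Unix uncompress utility:

unix> uncompress csrp312.ps.Z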


