No subject

Stephen J Hanson jose at tractatus.bellcore.com
Thu Mar 23 17:19:35 EST 1989



     Princeton Cognitive Science Lab Technical Report: CSL36, February, 1989.

                         COMPARING BIASES FOR MINIMAL NETWORK
                          CONSTRUCTION WITH BACK-PROPAGATION


                                 Stephen José Hanson

                                       Bellcore
                                          and
                        Princeton Cognitive Science Laboratory

                                         and

                                    Lorien Y. Pratt
                                  Rutgers University


                                       ABSTRACT

         Rumelhart (1987) has proposed a method for choosing minimal or
         "simple" representations during learning in Back-propagation
         networks.  This approach can be used to (a) dynamically select
         the number of hidden units, (b) construct a representation that
         is appropriate for the problem, and (c) thus improve the
         generalization ability of Back-propagation networks. The method
         Rumelhart suggests involves adding penalty terms to the usual
         error function. In this paper we introduce Rumelhart's minimal
         networks idea and compare two possible biases on the weight
         search space.  These biases are compared in both simple counting
         problems and a speech recognition problem.  In general, the
         constrained search does seem to minimize the number of hidden
         units required, with an expected increase in local minima.

To appear in Advances in Neural Information Processing Systems, D. Touretzky, Ed., 1989.
Research was jointly sponsored by Princeton CSL and Bellcore.
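
As an illustration of the penalty-term idea mentioned in the abstract, the sketch
below (Python with NumPy, not taken from the report) shows how a bias toward small
weights can be added to the usual sum-of-squares error and its gradient. The two
penalties shown, plain quadratic weight decay and a saturating w^2/(1+w^2) term,
are common examples of such biases; the particular bias functions compared in the
report may differ.

    import numpy as np

    def sse(y_pred, y_true):
        # usual back-propagation sum-of-squares error term
        return 0.5 * np.sum((y_pred - y_true) ** 2)

    def quadratic_penalty(w, lam=0.01):
        # standard weight decay: shrinks all weights toward zero
        return lam * np.sum(w ** 2)

    def saturating_penalty(w, lam=0.01):
        # saturates for large weights, so it mainly suppresses
        # small weights rather than shrinking large ones
        return lam * np.sum(w ** 2 / (1.0 + w ** 2))

    # Gradients of the penalties, added to the usual error gradient
    # during the back-propagation weight update.
    def quadratic_penalty_grad(w, lam=0.01):
        return 2.0 * lam * w

    def saturating_penalty_grad(w, lam=0.01):
        return 2.0 * lam * w / (1.0 + w ** 2) ** 2

    if __name__ == "__main__":
        w = np.array([0.1, -2.0, 3.0])
        y_pred, y_true = np.array([0.2, 0.8]), np.array([0.0, 1.0])
        print("error + decay     :", sse(y_pred, y_true) + quadratic_penalty(w))
        print("error + saturating:", sse(y_pred, y_true) + saturating_penalty(w))
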


REQUESTS FOR THIS TECHNICAL REPORT SHOULD BE SENT TO

    laura at clarity.princeton.edu


  Please do not reply to or forward this message. Thank you.

