postdoctoral thesis

Juergen Schmidhuber schmidhu@informatik.tu-muenchen.de
Tue Feb 15 04:06:19 EST 1994



        ---------------- postdoctoral thesis ----------------
                         Juergen Schmidhuber
                  Technische Universitaet Muenchen
	    (submitted April 1993, accepted October 1993)    
        -----------------------------------------------------

        NETZWERKARCHITEKTUREN, ZIELFUNKTIONEN UND KETTENREGEL
       (Network Architectures, Objective Functions, and Chain
        Rule)

       There is a relatively new class of recurrent artificial
       neural networks (ANNs) whose capabilities go far beyond
       simple pattern association. In principle, these ANNs
       permit the implementation of arbitrary functions
       computable by a conventional sequential digital computer.
       Unlike with conventional computers, however, the quality
       of the outputs (formally specified by a suitable
       objective function) can be differentiated mathematically
       with respect to the ``software'' (in the case of ANNs,
       the weight matrix). This permits applying the chain rule
       to derive gradient-based software-modification
       algorithms. The thesis illustrates this through the
       formal derivation of a number of novel learning
       algorithms from the following areas: (1) supervised
       learning of sequential input/output behavior with cyclic
       and acyclic architectures, (2) ``reinforcement learning''
       and subgoal generation without an informed teacher, and
       (3) unsupervised learning for extracting redundancy from
       inputs and input streams. Numerous experiments
       demonstrate the possibilities and limits of these
       learning algorithms. Finally, a ``self-referential''
       neural network is presented which, in theory, can learn
       to modify its own software-modification algorithm.

       -----------------------------------------------------
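The abstract's central point, that a differentiable objective function
lets the chain rule yield gradient-based "software" (weight) changes,
can be illustrated with a toy example. This is a generic sketch, not
code from the thesis:

```python
# Minimal sketch: one "neuron" y = tanh(w*x + b), objective
# L = (y - t)^2. The chain rule gives the gradient of L with
# respect to the "software" (the parameters w and b), which
# drives a gradient-descent weight change.
import math

def step(w, b, x, t, lr=0.1):
    y = math.tanh(w * x + b)       # forward pass
    dL_dy = 2.0 * (y - t)          # dL/dy
    dy_dz = 1.0 - y * y            # d tanh(z) / dz
    grad_w = dL_dy * dy_dz * x     # chain rule: dL/dw
    grad_b = dL_dy * dy_dz         # chain rule: dL/db
    return w - lr * grad_w, b - lr * grad_b

w, b = 0.0, 0.0
for _ in range(200):
    w, b = step(w, b, x=1.0, t=0.5)
print(abs(math.tanh(w * 1.0 + b) - 0.5) < 1e-2)  # output approaches t
```

The same mechanics, applied through time-unfolded recurrent
architectures, underlie the algorithms derived in the thesis.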



The postdoctoral thesis above is now available (in unrevised form) 
via ftp. To obtain a copy, follow the instructions at the end of 
this message.  

Here is additional information for those who are interested 
but don't understand German (or are unfamiliar with Germany's 
academic system): The postdoctoral thesis is  part of a process 
called ``Habilitation'' which is seen as a qualification for 
tenure. The thesis is about learning algorithms derived by the 
chain rule. It addresses supervised sequence learning, variants 
of reinforcement learning, and unsupervised learning (for 
redundancy reduction).  Unlike some previous papers of mine, 
it contains lots of experiments and lots of figures.  Here is 
a very brief summary based on pointers to recent English 
publications upon which the thesis elaborates:

Chapters 2 and 3 are on supervised sequence learning and extend 
publications [1] and [4]. Chapter 4 is on variants of learning 
with a ``distal teacher'' and extends publication [7] (robot 
experiments in chapter 4  were conducted by Eldracher and Baginski, 
see e.g. [9]). Chapters 5, 6 and 7 describe  unsupervised learning 
algorithms based on detection of redundant information in input 
patterns and pattern sequences: Chapter 5 elaborates on publication 
[5], and chapter 6 extends publication [3].  Chapter 6 includes a 
result by Peter Dayan, Richard Zemel and A. Pouget (Salk Institute) 
who demonstrated that equation (4.3) in [3] with $\beta = 0, \alpha = 
\gamma = 1$ is essentially equivalent to equation (5.1).  Chapter 
6 also includes experiments conducted by Stefanie Lindstaedt, who 
successfully applied the method in [3] to redundant images of 
letters presented according to the letter probabilities of the 
English language; see [10].  Chapter 7 extends publications [2] and [8]. 
Experiments show how sequence processing neural nets using algorithms 
for redundancy reduction can learn to bridge time lags (between 
correlated events) of more than 1000 discrete time steps. Other 
experiments use neural nets for text compression and compare them
to standard data compression algorithms. Finally, chapter 8 
elaborates on publication [6]. 
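The history compression principle of [2], which underlies the
time-lag experiments above, can be caricatured without any neural
network: a predictor guesses each next symbol, and only mispredicted
symbols (with their positions) are passed on to the next level, so
predictable stretches collapse. A minimal sketch, where a trivial
repeat-the-last-symbol rule stands in for the thesis's learned
predictor:

```python
# Toy sketch of history compression: keep only the "surprises",
# i.e. the symbols the predictor fails to predict, together with
# their positions. Predictable stretches cost almost nothing.
def compress(seq, predict):
    unexpected = []
    for i, sym in enumerate(seq):
        if predict(seq[:i]) != sym:      # prediction failed:
            unexpected.append((i, sym))  # keep position and symbol
    return unexpected

def decompress(length, unexpected, predict):
    out, table = [], dict(unexpected)
    for i in range(length):
        out.append(table.get(i, predict(out)))
    return out

# Stand-in predictor: "the next symbol repeats the last one".
pred = lambda hist: hist[-1] if hist else None
seq = list("aaaabbbbbbcc")
code = compress(seq, pred)
print(code)  # -> [(0, 'a'), (4, 'b'), (10, 'c')]
assert decompress(len(seq), code, pred) == seq
```

A twelve-symbol sequence shrinks to three surprises; a better
predictor (in [2], a neural net) compresses further, and this is what
lets higher levels bridge long time lags between correlated events.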

-------------------------- References -------------------------------

[1] J. H. Schmidhuber.  A fixed size storage O(n^3) time complexity 
learning algorithm for fully recurrent continually running networks.
Neural Computation, 4(2):243--248, 1992.

[2] J. H. Schmidhuber.  Learning complex, extended sequences using the 
principle of history compression.  Neural Computation, 4(2):234--242, 1992.

[3] J. H. Schmidhuber.  Learning factorial codes by predictability 
minimization.  Neural Computation, 4(6):863--879, 1992.

[4] J. H. Schmidhuber.  Learning to control fast-weight memories: An 
alternative to recurrent nets.  Neural Computation, 4(1):131--139, 1992.

[5] J. H. Schmidhuber and D. Prelinger.  Discovering predictable 
classifications.  Neural Computation, 5(4):625--635, 1993.

[6] J. H. Schmidhuber. A self-referential weight matrix. In Proc. of 
the Int. Conf. on Artificial Neural Networks, Amsterdam, pages 446--451. 
Springer, 1993.

[7] J. H. Schmidhuber and R. Wahnsiedler.  Planning simple trajectories 
using neural subgoal generators.  In J. A. Meyer, H. L. Roitblat, and S. W. 
Wilson, editors, Proc.  of the 2nd Int. Conf. on Simulation of Adaptive 
Behavior, pages 196--202. MIT Press, 1992.

[8] J. H. Schmidhuber, M. C. Mozer, and D. Prelinger.  Continuous history 
compression.  In H. Huening, S. Neuhauser, M. Raus, and W. Ritschel, 
editors,  Proc. of Intl. Workshop on Neural Networks, RWTH Aachen, 
pages 87--95.  Augustinus, 1993.

[9] M. Eldracher and B. Baginski.  Neural subgoal generation using 
backpropagation.  In G. G. Lendaris, S. Grossberg, and B. Kosko, 
editors, Proc. of WCNN'93, pages III-145--III-148.  Lawrence Erlbaum 
Associates, Hillsdale, NJ, 1993.

[10] S. Lindstaedt.  Comparison of unsupervised neural networks for 
redundancy reduction.  In M. C. Mozer, P. Smolensky, D. S. Touretzky, 
J. L. Elman, and A. S. Weigend, editors, Proc. of the 1993 Connectionist 
Models Summer School, pages 308--315.  Erlbaum Associates, Hillsdale, 
NJ, 1993.

----------------------------------------------------------------------	

The thesis comes in three parts. To obtain a copy, do:

	     unix>         ftp 131.159.8.35

	     Name:         anonymous
             Password:     (your email address, please) 
	     ftp>          binary
	     ftp>          cd pub/fki
             ftp>          get schmidhuber.habil.1.ps.Z
             ftp>          get schmidhuber.habil.2.ps.Z
             ftp>          get schmidhuber.habil.3.ps.Z
	     ftp>          bye

	     unix>         uncompress schmidhuber.habil.1.ps.Z
	     unix>         lpr  schmidhuber.habil.1.ps
	     .
	     .
	     .
    
    Note: The layout is designed for the conventional 
    European DIN A4 paper format. Expect 145 pages.


----------------------------------------------------------------------	
 
Dr. habil. J. H. Schmidhuber,  Fakultaet fuer Informatik, 
Technische Universitaet Muenchen, 80290 Muenchen, Germany
schmidhu@informatik.tu-muenchen.de



