BOOK: Neuronal Adaptation Theory

Hans-Otto Carmesin carmesin at schoner.physik.uni-bremen.de
Thu May 23 09:49:18 EDT 1996


The new book       NEURONAL ADAPTATION THEORY       is now available.
             ISBN 3-631-30039-5,                US-ISBN 0-8204-3172-9
AUTHOR:      Hans-Otto Carmesin, Institute for Theoretical Physics,
             University of Bremen, 28334 Bremen, Germany, Fax 0421 218 4869, 
             email: carmesin at theo.physik.uni-bremen.de,
             www: http://schoner.physik.uni-bremen.de/~carmesin/
PUBLISHER:   Peter Lang, Frankfurt/M., Berlin, Bern, New York, Paris, Wien; 
--->  --->   Please send your order to: 
             Peter Lang GmbH, Europäischer Verlag der Wissenschaften,
             Abteilung WB, Box 940225, 60460 Frankfurt/M., Germany
PRICE: DM 59; PAGES: 236 (23x16 cm), numerous figures.

FEATURES: The book includes 29 exercises with solutions, 43 essential
ideas, 108 partially coloured figures, explanations of experiments, and
general theorems.

ABSTRACT: The human genotype encodes at most ten billion bits of
binary information, whereas the human brain contains more than a
million times a billion synapses. A differentiated brain structure
must therefore arise from synaptic self-organization and adaptation.
The goal is to model the formation of observed global brain structures
and cognitive properties from local synaptic dynamics, sometimes
supervised by the limbic system. A general neuro-synaptic dynamics is
solved with a novel field theory in a comprehensible manner and in
quantitative agreement with many observations. Novel results concern,
for instance, thermal membrane fluctuations, fluctuation dissipation
theorems, cortical maps, topological charges, operant conditioning,
transitive inference, learning of hidden structures, behaviourism,
attention focus, the Wittgenstein paradox, infinite generalization,
schizophrenia dynamics, perception dynamics, non-equilibrium phase
transitions, and emergent valuation. The formation of advanced
cognitive properties is also modeled.
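The abstract's opening argument is a simple counting estimate. A
minimal sketch (using the rounded orders of magnitude the abstract
itself quotes, not exact biological figures) makes the gap explicit:

```python
# Back-of-the-envelope check of the abstract's counting argument:
# the genotype cannot specify every synapse individually, so the
# detailed brain structure must largely self-organize.

genome_bits = 10**10   # at most ~ten billion bits in the genotype
synapses    = 10**15   # more than a million times a billion synapses

# Even granting one full bit of genetic information per synapse,
# the genome falls short by five orders of magnitude.
shortfall = synapses // genome_bits
print(shortfall)
```

Running this prints 100000, i.e. on average at most one genetic bit
per hundred thousand synapses, which is the motivation for modeling
structure formation via local synaptic dynamics.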

CONTENTS:
 1       Introduction 13  
 1.1     The role of theory 13  

 2       Neuronal Association Patterns 17  
 2.1     Classical conditioning 17  
 2.2     Typical nerve cell 18  
 2.3     Neuronal dynamics 20  
 2.3.1   Two-valued neurons 20  
 2.3.2   Two alternative formulations 21  
 2.4     Coupling dynamics 23  
 2.4.1   Usage dependent couplings 25  
 2.4.2   Neuronal activity patterns 25  
 2.5     Network model for classical conditioning 29  
 2.6     Pattern recognition 32  
 2.6.1   Task 32  
 2.6.2   One pattern 32  
 2.6.3   Several patterns 34  
 2.7     Pattern retrieval with stochastic dynamics 39  
 2.7.1   Dynamical equilibrium for a single neuron 40  
 2.7.2   Dynamical equilibrium for configurations 40  
 2.8     A physiological basis of stochastic dynamics 44  
 2.8.1   Biophysics of action potentials 44  
 2.8.2   Spherical capacitor cell model 45  
 2.8.3   Nyquist formula 46  
 2.8.4   Thermodynamic membrane potential fluctuations 50  
 2.8.5   Resulting stochastic neuronal dynamics 51  
 2.8.6   Discussion 53  
 2.9     Pattern retrieval with effectively continuous time 53  
 2.9.1   Continuous spike response function 54  
 2.9.2   Network model 54  
 2.9.3   Model analysis 55  
 2.10    Discussion of chapter 2 58  

 3       Self-Organizing Networks 60  
 3.1     Basic principle 61  
 3.2     Retinotopy as model system 61  
 3.3     General two-valued neuron coupling rules 63  
 3.3.1   Locality principle 63  
 3.3.2   Additive membrane potential rule, AMPR 64  
 3.3.3   Coupling transfer rule, CTR 64  
 3.3.4   Local linear coupling dynamics, LLCD 65  
 3.3.5   Limited neuronal couplings, LNCR 65  
 3.4     A 1D self-organizing network with Hebb-rule 65  
 3.4.1   Network architecture 65  
 3.4.2   Coupling dynamics 67  
 3.4.3   Transformed couplings 68  
 3.4.4   Single stimulation potential 68  
 3.5     Field theory of neurosynaptic dynamics 69  
 3.5.1   A general solution method 69  
 3.5.2   Ergodicity 69  
 3.5.3   Neurosynaptic states and transitions 70  
 3.5.4   Averaged neurosynaptic change field 70  
 3.5.5   Differential equation for neurosynaptic change field 71  
 3.5.6   Adiabatic principle 71  
 3.5.7   Differential equation for synaptic change field 72  
 3.5.8   Change potential field 73  
 3.5.9   Fluctuation dissipation theorems 77  
 3.5.10  Discussion 81  
 3.6     Field theory of topology preservation 81  
 3.6.1   Emergence of an injective mapping 81  
 3.6.2   Single neuron separation 83  
 3.6.3   Coincidence stabilization 84  
 3.6.4   Emergence of 1D topology preservation 86  
 3.6.5   Emergence of clusters and topology preservation 87  
 3.6.6   Discussion 92  
 3.7     Field theory of orientation preference emergence 92  
 3.7.1   Network model 92  
 3.7.2   Change potentials 94  
 3.7.3   Potential minima 95  
 3.7.4   Discussion 97  
 3.8     Field theory of orientation pattern emergence 98  
 3.8.1   Phenomenon of pinwheel structures 98  
 3.8.2   Network model 98  
 3.8.3   Effective iso-orientation interaction 100  
 3.8.4   Continuous orientation interaction 101  
 3.8.5   Orientation fluctuations 102  
 3.8.6   Instability of the ground state 103  
 3.8.7   Topological singularities according to the Poisson equation 104  
 3.8.8   Greens function solution 106  
 3.8.9   Energy of a planar system of charges 108  
 3.8.10  Prediction: Plasma phase transition 109  
 3.9     Overview for formal temperatures 110  
 3.10    Discussion of chapter 3 111  

 4       Supervised & Self-Organized Adaptation 113  
 4.1     Forms of supervised adaptation 113  
 4.2     Operant conditioning 114  
 4.2.1   The phenomenon of transitive inference 114  
 4.2.2   Network model 116  
 4.2.3   Analysis of the network model 117  
 4.2.4   Transitive inference 119  
 4.2.5   Symbolic distance effect 119  
 4.2.6   Network parameters for various species 121  
 4.3     Generalized quantitative dynamical analysis 122  
 4.3.1   General valuation dynamics 122  
 4.3.2   Transitive inference with general valuation dynamics 123  
 4.3.3   Necessary and sufficient conditions for learning the Piaget task 123  
 4.3.4   Transitive inference as a consequence of successful learning 124  
 4.3.5   General set of tasks 125  
 4.3.6   Network model with minimization of complexity 126  
 4.3.7   Complete neurosynaptic dynamics and empirical data 127  
 4.3.8   Discussion of operant conditioning 130  
 4.4     Supervised Hebb-rule 131  
 4.4.1   Network model 131  
 4.4.2   Network analysis 131  
 4.4.3   Discussion on convergence with Hebb-rules 134  
 4.5     Perceptron 134  
 4.5.1   Network and task definition 134  
 4.5.2   Network architecture capabilities 135  
 4.5.3   Perceptron convergence theorem 136  
 4.6     Discussion of chapter 4 137  

 5       Advanced Adaptations 138  
 5.1     Learning of charges 139  
 5.1.1   An especially simple experiment 140  
 5.1.2   Necessary inner neurons 141  
 5.1.3   Definition of frameworks 141  
 5.1.4   Network model 142  
 5.1.5   Analysis of the network model 143  
 5.1.6   Discussion 146  
 5.2     Attention 147  
 5.2.1   Network model with attention 148  
 5.2.2   Potential field theorem 148  
 5.2.3   Attentional learning of charges 150  
 5.2.4   Attentional adaptation convergence theorem 151  
 5.2.5   Emergence of network architectures 153  
 5.2.6   Generalized perceptron 153  
 5.2.7   Neuronal dynamics with signum function 155  
 5.2.8   Discussion 156  
 5.3     Reversal 156  
 5.3.1   A reversal experiment 157  
 5.3.2   Network model 157  
 5.3.3   Discussion of reversal 159  
 5.4     Learning of counting 159  
 5.4.1   Generalization without limitation 159  
 5.4.2   Network architecture and dynamics 160  
 5.4.3   Analysis of the network 161  
 5.4.4   An instructive network model 162  
 5.4.5   Advanced network dynamics 164  
 5.4.6   Analysis of the advanced network model 165  
 5.4.7   A solution of Wittgenstein's paradox 166  
 5.4.8   Discussion 168  
 5.5     Convergence theorem for inner feedback 168  
 5.5.1   Idea of adaptation via short dimension increase 169  
 5.5.2   Specification of the learning situation 170  
 5.5.3   Learning algorithm for inner feedback 171  
 5.5.4   Convergence theorem 174  
 5.5.5   Optimal correspondence via short dimension increase 177  
 5.5.6   Generalizations 178  
 5.5.7   Discussion 179  
 5.6     Correspondence deficit compensation: Schizophrenia model? 180  
 5.6.1   Starting point 180  
 5.6.2   Network model 181  
 5.6.3   Network characteristics 182  
 5.6.4   Transfer to schizophrenia 185  
 5.6.5   Therapy 187  
 5.6.6   Empirical findings 188  
 5.6.7   Discussion 194  
 5.7     A mesoscopic perception model 195  
 5.7.1   External stimulations 195  
 5.7.2   Network model 197  
 5.7.3   Field theoretic solution of the network 201  
 5.7.4   Modeling phenomena 203  
 5.7.5   Discussion 210  
 5.8     Emergent valuation 211  
 5.8.1   Emergence of a valuating field 211  
 5.8.2   Effect of a valuating stimulation 213  
 5.9     General adaptation dynamics 214  
 5.9.1   Definition of microscopic dynamics 214  
 5.9.2   Resulting macroscopic dynamics 216  
 5.9.3   Some special cases 218  
 5.10    Discussion of chapter 5 219  
 5.11    No Laplace demon 220  

 6       Summary 221  
 6.1     Overview 221  
 6.2     Predictions 222  
 6.3     List of ideas 224  
 6.4     Open questions 225  


More information about the Connectionists mailing list