Mon Jun 5 16:42:55 EDT 2006
This thoroughly and thoughtfully revised edition of a very
successful textbook makes the principles and the details of neural
network modeling accessible to cognitive scientists of all
varieties as well as other scholars interested in these models.
Research since the publication of the first edition has been
systematically incorporated into a framework of proven pedagogical
value.
Features of the second edition include:
A new section on spatiotemporal pattern processing.
Coverage of ARTMAP networks (the supervised version of
adaptive resonance networks) and recurrent back-propagation
networks.
A vastly expanded section on models of specific brain
areas, such as the cerebellum, hippocampus, basal ganglia,
and visual and motor cortex.
Up-to-date coverage of applications of neural networks in
areas such as combinatorial optimization and knowledge
representation.
As in the first edition, the text includes extensive introductions
to neuroscience and to differential and difference equations as
appendices for students without the requisite background in these
areas. As graphically revealed in the flowchart in the front of
the book, the text begins with simpler processes and builds up to
more complex multilevel functional systems.
Table of contents:
Chapters 2 through 7 each include equations and exercises
(computational, mathematical, and qualitative) at the end of the
chapter. The text sections are as follows.
Flow Chart of the Book
Preface
Preface to the Second Edition
Chapter 1: Brain and Machine: The Same Principles?
What Are Neural Networks?
Is Biological Realism a Virtue?
What Are Some Principles of Neural Network Theory?
Methodological Considerations
Chapter 2: Historical Outline
2.1. Digital Approaches
The McCulloch-Pitts Network
Early Approaches to Modeling Learning: Hull and Hebb
Rosenblatt's Perceptrons
Some Experiments With Perceptrons
The Divergence of Artificial Intelligence and Neural Modeling
2.2. Continuous and Random Net Approaches
Rashevsky's Work
Early Random Net Models
Reconciling Randomness and Specificity
Chapter 3: Associative Learning and Synaptic Plasticity
3.1. Physiological Bases for Learning
3.2. Rules for Associative Learning
Outstars and Other Early Models of Grossberg
Anderson's Connection Matrices
Kohonen's Early Work
3.3. Learning Rules Related to Changes in Node Activities
Klopf's Hedonistic Neurons and the Sutton-Barto Learning Rule
Error Correction and Back Propagation
The Differential Hebbian Idea
Gated Dipole Theory
3.4. Associative Learning of Patterns
Kohonen's Recent Work: Autoassociation and Heteroassociation
Kosko's Bidirectional Associative Memory
Chapter 4: Competition, Lateral Inhibition, and Short-Term Memory
4.1. Contrast Enhancement, Competition, and Normalization
Hartline and Ratliff's Work, and Other Early Visual Models
Nonrecurrent Versus Recurrent Lateral Inhibition
4.2. Lateral Inhibition and Excitation Between Sensory Representations
Wilson and Cowan's Work
Work of Grossberg and Colleagues
Work of Amari and Colleagues
Energy Functions in the Cohen-Grossberg and Hopfield-Tank Models
The Implications of Approach to Equilibrium
Networks With Synchronized Oscillations
4.3. Visual Pattern Recognition Models
Visual Illusions
Boundary Detection Versus Feature Detection
Binocular and Stereoscopic Vision
Visual Motion
Comparison of Grossberg's and Marr's Approaches
4.4. Uses of Lateral Inhibition in Higher Level Processing
Chapter 5: Conditioning, Attention, and Reinforcement
5.1. Network Models of Classical Conditioning
Early Work: Brindley and Uttley
Rescorla and Wagner's Psychological Model
Grossberg: Drive Representations and Synchronization
Aversive Conditioning and Extinction
Differential Hebbian Theory Versus Gated Dipole Theory
5.2. Attention and Short-Term Memory in Conditioning Models
Grossberg's Approach to Attention
Sutton and Barto's Approach: Blocking and Interstimulus Interval Effects
Some Contrasts Between the Grossberg and Sutton-Barto Approaches
Further Connections With Invertebrate Neurophysiology
Further Connections With Vertebrate Neurophysiology
Gated Dipoles, Aversive Conditioning, and Timing
Chapter 6: Coding and Categorization
6.1. Interactions Between Short- and Long-Term Memory in Code Development
Malsburg's Model With Synaptic Conservation
Grossberg's Model With Pattern Normalization
Mathematical Results of Grossberg and Amari
Feature Detection Models With Stochastic Elements
From Feature Coding to Categorization
6.2. Supervised Classification Models
The Back Propagation Network and Its Variants
The RCE Model
6.3. Unsupervised Classification Models
The Rumelhart-Zipser Competitive Learning Algorithm
Adaptive Resonance Theory
Edelman and Neural Darwinism
6.4. Models That Combine Supervised and Unsupervised Parts
ARTMAP and Other Supervised Adaptive Resonance Networks
Brain-State-in-a-Box (BSB) Models
6.5. Translation and Scale Invariance
6.6. Processing Spatiotemporal Patterns
Chapter 7: Optimization, Control, Decision, and Knowledge Representation
7.1. Optimization and Control
Classical Optimization Problems
Simulated Annealing and Boltzmann Machines
Motor Control: The Example of Eye Movements
Motor Control: Arm Movements
Speech Recognition and Synthesis
Robotic and Other Industrial Control Problems
7.2. Decision Making and Knowledge Representation
What, If Anything, Do Biological Organisms Optimize?
Affect, Habit, and Novelty in Neural Network Theories
Knowledge Representation: Letters and Words
Knowledge Representation: Concepts and Inference
7.3. Neural Control Circuits, Mental Illness, and Brain Areas
Overarousal, Underarousal, Parkinsonism, and Depression
Frontal Lobe Function and Dysfunction
Disruption of Cognitive-Motivational Interactions
Impairment of Motor Task Sequencing
Disruption of Context Processing
Models of Specific Brain Areas
Models of the Cerebellum
Models of the Hippocampus
Models of the Basal Ganglia
Models of the Cerebral Cortex
Chapter 8: A Few Recent Technical Advances
8.1. Some "Toy" and Real World Computing Applications
Pattern Recognition
Knowledge Engineering
Financial Engineering
"Oddball" Applications
8.2. Some Neurobiological Discoveries
Appendix 1: Basic Facts of Neurobiology
The Neuron
Synapses, Transmitters, Messengers, and Modulators
Invertebrate and Vertebrate Nervous Systems
Functions of Vertebrate Subcortical Regions
Functions of the Mammalian Cerebral Cortex
Appendix 2: Difference and Differential Equations in Neural Networks
Example: The Sutton-Barto Difference Equations
Differential Versus Difference Equations
Outstar Equations: Network Interpretation and Numerical Implementation
The Chain Rule and Back Propagation
Dynamical Systems: Steady States, Limit Cycles, and Chaos
ABOUT THE AUTHOR: Daniel S. Levine is Professor of Psychology at
the University of Texas at Arlington. A former president of the
International Neural Network Society, he is the organizer of the
MIND conferences, which bring together leading neural network
researchers from academia and industry. Since 1975, he has written
nearly 100 books, articles, and chapters for various audiences
interested in neural networks.