New books in MIT Neural Network/Connectionism series

Jeff Elman elman at crl.ucsd.edu
Mon May 3 23:23:25 EDT 1993


The following books have now appeared as part of the Neural Network
Modeling and Connectionism Series and may be of interest to readers of
the connectionists mailing group.  Detailed descriptions of each book,
along with tables of contents, follow.

Jeff Elman
============================================================

Neural Network Modeling and Connectionism Series
  Jeffrey Elman, editor.  MIT Press/Bradford Books.

* Miikkulainen, R. "Subsymbolic Natural Language Processing:
    An Integrated Model of Scripts, Lexicon, and Memory"
* Mitchell, M.  "Analogy-Making as Perception: A Computer Model"
* Cleeremans, A.  "Mechanisms of Implicit Learning: Connectionist Models
    of Sequence Processing"
* Sereno, M.E. "Neural Computation of Pattern Motion: Modeling Stages of
    Motion Analysis in the Primate Visual Cortex"
* Miller, W.T., Sutton, R.S., & Werbos, P.J. (Eds.), "Neural Networks for
    Control"
* Hanson, S.J., & Olson, C.R. (Eds.) "Connectionist Modeling and Brain
    Function: The Developing Interface"
* Judd, S.J. "Neural Network Design and the Complexity of Learning"
* Mozer, M.C. "The Perception of Multiple Objects: A Connectionist
    Approach"

------------------------------------------------------------
New
Subsymbolic Natural Language Processing
An Integrated Model of Scripts, Lexicon, and Memory
Risto Miikkulainen

Aiming to bridge the gap between low-level connectionist models and 
high-level symbolic artificial intelligence, Miikkulainen describes 
DISCERN, a complete natural language processing system implemented 
entirely at the subsymbolic level. In DISCERN, distributed neural 
network models of parsing, generating, reasoning, lexical processing, 
and episodic memory are integrated into a single system that learns to 
read, paraphrase, and answer questions about stereotypical narratives. 

Using the DISCERN system as an example, Miikkulainen introduces a 
general approach to building high-level cognitive models from 
distributed neural networks, and shows how the special properties of 
such networks are useful in modeling human performance. In this approach 
connectionist networks are not only plausible models of isolated 
cognitive phenomena, but also sufficient constituents for complete 
artificial intelligence systems.

Risto Miikkulainen is an Assistant Professor in the Department of 
Computer Sciences at the University of Texas at Austin.

Contents: I. Overview. Introduction. Background. Overview of DISCERN. II. 
Processing Mechanisms. Backpropagation Networks. Developing 
Representations in FGREP Modules. Building from FGREP Modules. III. 
Memory Mechanisms. Self-Organizing Feature Maps. Episodic Memory 
Organization: Hierarchical Feature Maps. Episodic Memory Storage and 
Retrieval: Trace Feature Maps. Lexicon. IV. Evaluation. Behavior of the 
Complete Model. Discussion. Comparison to Related Work. Extensions and 
Future Work. Conclusions. Appendixes: Story Data. Implementation 
Details. Instructions for Obtaining the DISCERN Software.

A Bradford Book
May 1993 - 408 pp. - 129 illus. - $45.00
0-262-13290-7     MIISH

------------------------------------------------------------
New
Analogy-Making as Perception
A Computer Model
Melanie Mitchell

Analogy-Making as Perception is based on the premise that analogy-making 
is fundamentally a high-level perceptual process in which the 
interaction of perception and concepts gives rise to "conceptual 
slippages" which allow analogies to be made. It describes Copycat, a 
program developed by the author with Douglas Hofstadter that models the 
complex, subconscious interaction between perception and concepts that 
underlies the creation of analogies. 
  
In Copycat, both concepts and high-level perception are emergent 
phenomena, arising from large numbers of low-level, parallel, 
non-deterministic activities. In the spectrum of cognitive modeling 
approaches, Copycat occupies a unique intermediate position between 
symbolic systems and connectionist systems - a position that is at 
present the most useful one for understanding the fluidity of concepts 
and high-level perception.
  
On one level the work described here is about analogy-making, but on 
another level it is about cognition in general. It explores such issues 
as the nature of concepts and perception and the emergence of highly 
flexible concepts from a lower-level "subcognitive" substrate.
  
Melanie Mitchell, Assistant Professor in the Department of Electrical 
Engineering and Computer Science at the University of Michigan, is a 
Fellow of the Michigan Society of Fellows. She is also Director of the 
Adaptive Computation Program at the Santa Fe Institute.

Contents: Introduction. High-Level Perception, Conceptual Slippage, and 
Analogy-Making in a Microworld. The Architecture of Copycat. Copycat's 
Performance on the Five Target Problems. Copycat's Performance on 
Variants of the Five Target Problems. Summary of the Comparisons between 
Copycat and Human Subjects. Some Shortcomings of the Model. Results of 
Selected "Lesions" of Copycat. Comparisons with Related Work. 
Contributions of This Research. Afterword by Douglas R. Hofstadter. 
Appendixes. A Sampler of Letter-String Analogy Problems Beyond Copycat's 
Current Capabilities. Parameters and Formulas. More Detailed 
Descriptions of Codelet Types.

A Bradford Book
May 1993 - 382 pp. - 168 illus. - $45.00
0-262-13289-3     MITAH

------------------------------------------------------------
New
Mechanisms of Implicit Learning
Connectionist Models of Sequence Processing
Axel Cleeremans

What do people learn when they do not know that they are learning? Until 
recently all of the work in the area of implicit learning focused on 
empirical questions and methods. In this book, Axel Cleeremans explores 
unintentional learning from an information-processing perspective. He 
introduces a theoretical framework that unifies existing data and models 
on implicit learning, along with a detailed computational model of human 
performance in sequence-learning situations.
  
The model, based on a simple recurrent network (SRN), is able to predict 
the successive elements of sequences generated from finite-state 
grammars. Human subjects are shown to exhibit a similar sensitivity to 
the temporal structure in a series of choice reaction time experiments 
of increasing complexity; yet their explicit knowledge of the sequence 
remains limited. Simulation experiments indicate that the SRN model is 
able to account for these data in great detail.  Other architectures 
that process sequential material are considered. These are contrasted 
with the SRN model, which they sometimes outperform. Considered 
together, the models show how complex knowledge may emerge through the 
operation of elementary mechanisms - a key aspect of implicit learning 
performance.
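The SRN at the heart of the model can be sketched in a few lines of code. The sketch below is an illustrative reconstruction, not Cleeremans's actual simulations: the toy grammar (the repeated string "aab", where the symbol following an 'a' depends on temporal position), the network size, and the learning rate are all arbitrary choices made here for demonstration. As in Elman's original scheme, the copied-back context layer is treated as a fixed input during training rather than backpropagated through.

```python
# Illustrative Elman-style simple recurrent network (SRN).
# Toy task: predict the next symbol of the period-3 sequence "aab...",
# which requires the hidden state to carry temporal context.
import numpy as np

rng = np.random.default_rng(0)

symbols = "ab"
V, H = len(symbols), 8          # vocabulary size, hidden units

def onehot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# weights: input->hidden, context->hidden, hidden->output
Wx = rng.normal(0, 0.5, (H, V))
Wh = rng.normal(0, 0.5, (H, H))
bh = np.zeros(H)
Wo = rng.normal(0, 0.5, (V, H))
bo = np.zeros(V)

seq = [symbols.index(c) for c in "aab" * 200]
lr = 0.1

for epoch in range(30):
    h = np.zeros(H)                     # context units start at rest
    for t in range(len(seq) - 1):
        x, target = onehot(seq[t]), seq[t + 1]
        h_prev = h
        h = np.tanh(Wx @ x + Wh @ h_prev + bh)
        p = softmax(Wo @ h + bo)
        # cross-entropy gradient; the copied-back context is treated
        # as a fixed input (no backpropagation through time)
        dz = p - onehot(target)
        dh = Wo.T @ dz
        da = dh * (1.0 - h ** 2)
        Wo -= lr * np.outer(dz, h)
        bo -= lr * dz
        Wx -= lr * np.outer(da, x)
        Wh -= lr * np.outer(da, h_prev)
        bh -= lr * da

# evaluate next-symbol prediction accuracy
h, correct = np.zeros(H), 0
for t in range(len(seq) - 1):
    h = np.tanh(Wx @ onehot(seq[t]) + Wh @ h + bh)
    correct += int(np.argmax(Wo @ h + bo) == seq[t + 1])
acc = correct / (len(seq) - 1)
print(f"next-symbol accuracy: {acc:.2f}")
```

A context-free predictor tops out at 2/3 accuracy on this sequence, so scores well above that show the recurrent context layer has encoded where the network is within the repeating pattern.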
  
Axel Cleeremans is a Senior Research Assistant at the National Fund for 
Scientific Research, Belgium.

Contents: Implicit Learning: Explorations in Basic Cognition. The SRN 
Model: Computational Aspects of Sequence Processing. Sequence Learning 
as a Paradigm for Studying Implicit Learning. Sequence Learning: Further 
Explorations. Encoding Remote Context. Explicit Sequence Learning. 
General Discussion.

A Bradford Book
April 1993 - 227 pp. - 60 illus. - $30.00
0-262-03205-8     CLEMH  

------------------------------------------------------------
New
Neural Computation of Pattern Motion
Modeling Stages of Motion Analysis in the Primate Visual Cortex
Margaret Euphrasia Sereno

How does the visual system compute the global motion of an object from 
local views of its contours?  Although this important problem in 
computational vision (also called the aperture problem) is key to 
understanding how biological systems work, there has been surprisingly 
little neurobiologically plausible work done on it. This book describes 
a neurally based model, implemented as a connectionist network, of how 
the aperture problem is solved. It provides a structural account of the 
model's performance on a number of tasks and demonstrates that the 
details of implementation influence the nature of the computation as 
well as predict perceptual effects that are unique to the model. The 
basic approach described can be extended to a number of different 
sensory computations.

"This is an important book, discussing a significant and very general 
problem in sensory processing.  The model presented is simple, and it is 
elegant in that we can see, intuitively, exactly why and how it works.  
Simplicity, clarity and elegance are virtues in any field, but not often 
found in work in neural networks and sensory processing.  The model 
described in Sereno's book is an exception.  This book will have a 
sizeable impact on the field." - James Anderson, Professor, Department 
of Cognitive and Linguistic Sciences, Brown University

Contents: Introduction. Computational, Psychophysical, and 
Neurobiological Approaches to Motion Measurement. The Model. Simulation 
Results. Psychophysical Demonstrations. Summary and Conclusions. 
Appendix: Aperture Problem Linearity.
  
A Bradford Book
March 1993 - 181 pp. - 41 illus. - $24.95
0-262-19329-9     SERNH 

------------------------------------------------------------
Neural Networks for Control
edited by W. Thomas Miller, III, Richard S. Sutton,
and Paul J. Werbos

This book brings together examples of all of the most important 
paradigms in artificial neural networks (ANNs) for control, including 
evaluations of possible applications. An appendix provides complete 
descriptions of seven benchmark control problems for those who wish to 
explore new ideas for building automatic controllers.
  
Contents: I. General Principles. Connectionist Learning for Control: An 
Overview, Andrew G. Barto. Overview of Designs and Capabilities, Paul J. 
Werbos. A Menu of Designs for Reinforcement Learning Over Time, Paul J. 
Werbos. Adaptive State Representation and Estimation Using Recurrent 
Connectionist Networks, Ronald J. Williams. Adaptive Control using 
Neural Networks, Kumpati S. Narendra. A Summary Comparison of CMAC 
Neural Network and Traditional Adaptive Control Systems, L. Gordon 
Kraft, III, and David P. Campagna. Recent Advances in Numerical 
Techniques for Large Scale Optimization, David F. Shanno. First Results 
with Dyna, An Integrated Architecture for Learning, Planning and 
Reacting, Richard S. Sutton.

II. Motion Control. Computational Schemes and Neural Network Models for 
Formation and Control of Multijoint Arm Trajectory, Mitsuo Kawato. 
Vision-Based Robot Motion Planning, Bartlett W. Mel. Using Associative 
Content-Addressable Memories to Control Robots, Christopher G. Atkeson 
and David J. Reinkensmeyer. The Truck Backer-Upper: An Example of 
Self-Learning in Neural Networks, Derrick Nguyen and Bernard Widrow. An 
Adaptive Sensorimotor Network Inspired by the Anatomy and Physiology of 
the Cerebellum, James C. Houk, Satinder P. Singh, Charles Fisher, and 
Andrew G. Barto. Some New Directions for Adaptive Control Theory in 
Robotics, Judy A. Franklin and Oliver G. Selfridge.

III. Application Domains. Applications of Neural Networks in Robotics 
and Automation for Manufacturing, Arthur C. Sanderson. A Bioreactor 
Benchmark for Adaptive Network-based Process Control, Lyle H. Ungar. A 
Neural Network Baseline Problem for Control of Aircraft Flare and 
Touchdown, Charles C. Jorgensen and C. Schley. Intelligent Control for 
Multiple Autonomous Undersea Vehicles, Martin Herman, James S. Albus, 
and Tsai-Hong Hong. A Challenging Set of Control Problems, Charles W. 
Anderson and W. Thomas Miller.

A Bradford Book
1990 - 524 pp. - $52.50
0-262-13261-3  MILNH

------------------------------------------------------------
Connectionist Modeling and Brain Function
The Developing Interface
edited by Stephen Jose Hanson and Carl R. Olson

This tutorial on current research activity in connectionist-inspired 
biology-based modeling describes specific experimental approaches and 
also confronts general issues related to learning, associative memory, 
and sensorimotor development. 

"This volume makes a convincing case that data-rich brain scientists and 
model-rich cognitive psychologists can and should talk to one another. 
The topics they discuss together here - memory and perception - are of 
vital interest to both, and their collaboration promises continued 
excitement along this new scientific frontier." - George Miller, 
Princeton University

Contents: Part I: Overview. Introduction: Connectionism and 
Neuroscience, S. J. Hanson and C. R. Olson. Computational Neuroscience, 
T. J. Sejnowski, C. Koch, and P. S. Churchland. Part II: Associative 
Memory and Conditioning. The Behavioral Analysis of Associative Learning 
in the Terrestrial Mollusc Limax maximus: The Importance of Inter-event 
Relationships, C. L. Sahley. Neural Models of Classical Conditioning: A 
Theoretical Viewpoint, G. Tesauro. Unsupervised Perceptual Learning: A 
Paleocortical Model, R. Granger, J. Ambros-Ingerson, P. Anton, and G. 
Lynch. Part III. The Somatosensory System. Biological Constraints on a 
Dynamic Network: The Somatosensory Nervous System, T. Allard. A Model of 
Receptive Field Plasticity and Topographic Reorganization in the 
Somatosensory Cortex, L. H. Finkel. Spatial Representation of the Body, 
C. R. Olson and S. J. Hanson. Part IV: The Visual System. The 
Development of Ocular Dominance Columns: Mechanisms and Models. K. D. 
Miller and M. P. Stryker. Self-Organization in a Perceptual System: How 
Network Models and Information Theory May Shed Light on Neural 
Organization, R. Linsker. Solving the Brightness-From-Luminance Problem: 
A Neural Architecture for Invariant Brightness Perception, S. Grossberg 
and D. Todorovic.

A Bradford Book
1990 - 423 pp. - $44.00
0-262-08193-8  HANCH

------------------------------------------------------------
Neural Network Design and the Complexity of Learning
J. Stephen Judd

Using the tools of complexity theory, Stephen Judd develops a formal 
description of associative learning in connectionist networks. He 
rigorously exposes the computational difficulties in training neural 
networks and explores how certain design principles will or will not 
make the problems easier. 

"Judd . . . formalized the loading problem and proved it to be 
NP-complete. This formal work is clearly explained in his book in such a 
way that it will be accessible both to the expert and nonexpert." - Eric 
B. Baum, IEEE Transactions on Neural Networks

"Although this book is the true successor to Minsky and Papert's 
maligned masterpiece of 1969 (Perceptrons), Judd is not trying to 
demolish the field of neurocomputing. His purpose is to clarify the 
limitations of a wide class of network models and thereby suggest 
guidelines for practical applications." - Richard Forsyth, Artificial 
Intelligence & Behavioral Simulation

Contents: Neural Networks: Hopes, Problems, and Goals. The Loading 
Problem. Other Studies of Learning. The Intractability of Loading. 
Subcases. Shallow Architectures. Memorization and Generalization. 
Conclusions. Appendixes.

A Bradford Book
1990 - 150 pp. - $27.50
0-262-10045-2  JUDNH

------------------------------------------------------------
The Perception of Multiple Objects  
A Connectionist Approach
Michael C. Mozer

Building on the vision studies of David Marr and the connectionist 
modeling of the PDP group, this book describes a neurally inspired 
computational model of two-dimensional object recognition and spatial 
attention that can explain many characteristics of human visual 
perception. The model, called MORSEL, can actually recognize several 
two-dimensional objects at once (previous models have tended to blur 
multiple objects into one or to overload). Mozer's is a fully 
mechanistic account, not just a functional-level theory.   

"Mozer's work makes a major contribution to the study of visual 
information processing. He has developed a very creative and 
sophisticated new approach to the problem of visual object recognition. 
The combination of computational rigor with thorough and knowledgeable 
examination of psychological results is impressive and unique." - Harold 
Pashler, University of California at San Diego

Contents: Introduction. Multiple Word Recognition. The Pull-Out Network. 
The Attentional Mechanism. The Visual Short-Term Memory. Psychological 
Phenomena Explained by MORSEL. Evaluation of MORSEL. Appendixes: A 
Comparison of Hardware Requirements. Letter Cluster Frequency and 
Discriminability Within BLIRNET's Training Set.

A Bradford Book
1991 - 217 pp. - $27.50
0-262-13270-2  MOZPH

-------------------------------------------------------------
ORDER FORM

Please send me the following book(s):

Qty    Author      Bookcode  Price

___    Cleeremans  CLEMH     30.00
___    Hanson      HANCH     44.00
___    Judd        JUDNH     27.50
___    Miikkulainen MIISH    45.00
___    Miller      MILNH     52.50
___    Mitchell    MITAH     45.00 
___    Mozer       MOZPH     27.50
___    Sereno      SERNH     24.95

 
___ Payment Enclosed  ___  Purchase Order Attached

Charge to my  ___  Master Card   ___  Visa

Card# _______________________________

Exp.Date _______________

Signature _________________________________________________ 

_____ Total for book(s)
$2.75 Postage for first book
_____ Please add 50c postage for each additional book
_____ Canadian customers: add 7% GST
_____ TOTAL due MIT Press

Send To:

Name ______________________________________________________

Address ___________________________________________________

City ________________________ State ________ Zip __________

Daytime Phone ________________ Fax ________________________

Make checks payable and send order to:
The MIT Press  *  55 Hayward Street *  Cambridge, MA 02142  

For fastest service call (617) 625-8569 
or toll-free 1-800-356-0343 
        

The MIT Guarantee: If for any reason you are not completely satisfied, 
return your book(s) within ten days of receipt for a full refund or 
credit.

3ENET


