edited collection of ANN papers; discount

Pankaj Mehra mehra at ptolemy.arc.nasa.gov
Mon Jun 21 18:58:56 EDT 1993


Fellow Connectionists:

	Some of you may have already seen ``Artificial Neural Networks:
Concepts and Theory,'' edited by [yours truly] and Ben Wah. It was
published by IEEE Computer Society Press in August 1992. The table of
contents is attached at the end of this message. The book is hardbound,
with 667 pages, of which approximately 100 are chapter introductions
written by the editors. The list price is $70 [$55 for IEEE members].

	My intent in sending this message is not so much to announce the
availability of our book as to bring to your notice the following
discount offer: if I place an order, I get an author's discount of 40%
off the list price; if a school bookstore places the order, it gets a
32% discount. The IEEE order number for the book is 1997; orders can be
placed at 1-800-CS-BOOKS. If you are planning to teach a graduate course
on neural networks, you will probably find that our collection of papers,
as well as the up-to-date bibliography at the end of each chapter
introduction, provides an excellent starting point for independent
research.

-Pankaj Mehra

415/604-0165					mehra at ptolemy.arc.nasa.gov

				    NASA - Ames Research Center, M/S 269-3
					      Moffett Field, CA 94035-1000
								       USA

__________________________________________________________________________

TABLE OF CONTENTS:						page
-----------------

Chapter 1: INTRODUCTION

Introduction by editors						1-12

An Introduction to Computing with Neural Nets, Lippmann	       13-31

An Introduction to Neural Computing, Kohonen		       32-46


Chapter 2: CONNECTIONIST PRIMITIVES

Introduction by editors					       47-55

A General Framework for Parallel Distributed Processing,
Rumelhart, Hinton, & McClelland				       56-82

Multilayer Feedforward Potential Function Network, Lee & Kil   83-93

Learning, Invariance, and Generalization in High-Order
Networks, Giles & Maxwell				       94-100

The Subspace Learning Algorithm as a Formalism for Pattern
Recognition and Neural Networks, Oja & Kohonen		      101-108


Chapter 3: KNOWLEDGE REPRESENTATION

Introduction by editors					      109-116

BoltzCONS: Reconciling Connectionism with the Recursive
Nature of Stacks and Trees, Touretzky			      117-125

Holographic Reduced Representations: Convolution Algebra for
Compositional Distributed Representations, Plate	      126-131

Efficient Inference with Multi-Place Predicates and Variables
in a Connectionist System, Ajjanagadde and Shastri	      132-139

Integrated Architectures for Learning, Planning, and Reacting
Based on Approximating Dynamic Programming, Sutton	      140-148


Chapter 4: LEARNING ALGORITHMS I

Introduction by editors					      149-166

Connectionist Learning Procedures, Hinton		      167-216

30 Years of Adaptive Neural Networks: Perceptron, Madaline,
and Back-Propagation, Widrow and Lehr			      217-244

Supervised Learning and Systems with Excess Degrees of
Freedom, Jordan						      245-285

The Cascade-Correlation Learning Architecture, Fahlman	      286-294

Learning to Predict by the Methods of Temporal Differences,
Sutton							      295-330

A Theoretical Framework for Back-Propagation, le Cun	      331-338

Two Problems with Backpropagation and other Steepest-Descent
Learning Procedures for Networks, Sutton		      339-348


Chapter 5: LEARNING ALGORITHMS II

Introduction by editors					      349-358

The Self-Organizing Map, Kohonen			      359-375

The ART of Adaptive Pattern Recognition by a Self-Organizing
Neural Network, Grossberg				      376-387

Unsupervised Learning in Noise, Kosko			      388-401

A Learning Algorithm for Boltzmann Machines, Ackley, Hinton
& Sejnowski						      402-424

Learning Algorithms and Probability Distributions in
Feed-Forward and Feed-Back Networks, Hopfield		      425-429

A Mean Field Theory Learning Algorithm for Neural Networks,
Peterson & Anderson					      430-454

On the Use of Backpropagation in Associative Reinforcement
Learning, Williams					      455-462


Chapter 6: COMPUTATIONAL LEARNING THEORY

Introduction by editors					      463-473

Information Theory, Complexity, and Neural Networks,
Abu-Mostafa						      474-478

Geometrical and Statistical Properties of Systems of Linear
Inequalities with Applications in Pattern Recognition, Cover  479-487

Approximation by Superpositions of a Sigmoidal Function,
Cybenko							      488-499

Approximation and Estimation Bounds for Artificial Neural
Networks, Barron					      500-506

Generalizing the PAC Model: Sample Size Bounds From Metric
Dimension-based Uniform Convergence Results, Haussler	      507-512

Complete Representations for Learning from Examples, Baum     513-534

A Statistical Approach to Learning and Generalization in
Neural Networks, Levin, Tishby & Solla			      535-542


Chapter 7: STABILITY AND CONVERGENCE

Introduction by editors					      543-550

Convergence in Neural Nets, Hirsch			      551-561

Statistical Neurodynamics of Associative Memory, Amari &
Maginu							      562-572

Stability and Adaptation in Artificial Neural Systems,
Schurmann						      573-580

Dynamics and Architecture for Neural Computation, Pineda      581-610

Oscillations and Synchronizations in Neural Networks: An
Exploration of the Labeling Hypothesis, Atiya & Baldi	      611-632


Chapter 8: EMPIRICAL STUDIES

Introduction by editors                                       633-639

Scaling Relationships in Back-Propagation Learning: Dependence
on Training Set Size, Tesauro				      640-645

An Empirical Comparison of Pattern Recognition, Neural Nets, and
Machine Learning Classification Methods, Weiss & Kapouleas    646-652

Basins of Attraction of Neural Network Models, Keeler	      653-657

Parallel Distributed Approaches to Combinatorial Optimization:
Benchmark Studies on Traveling Salesman Problem, Peterson     658-666
