New Book: "Neural Smithing" (MIT Press, 1999)

Robert J. Marks II marks at maxwell.ee.washington.edu
Wed Jun 16 01:20:45 EDT 1999


NEW BOOK:

Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks
Russell D. Reed  & Robert J. Marks II (MIT Press, 1999).
 _____________________
REVIEW
"I have added a new book to the list of "The best elementary textbooks on
practical use of NNs" in the NN FAQ ..."
"Reed, R.D., and Marks, R.J, II (1999), Neural Smithing: Supervised
Learning in Feedforward Artificial Neural Networks, Cambridge, MA:
The MIT Press, ISBN 0-262-18190-8.
"After you have read Smith (1993) or Weiss and Kulikowski (1991),
Reed and Marks provide an excellent source of practical details for
training MLPs. They cover both backprop and conventional optimization
algorithms. Their coverage of initialization methods, constructive
networks, pruning, and regularization methods is unusually thorough.
Unlike the vast majority of books on NNs, this one has lots of
really informative graphs..."
Warren S. Sarle, SAS Institute Inc. on <comp.ai.neural-nets>.
______________________
Contents:
Artificial neural networks are nonlinear mapping systems whose structure is
loosely based on principles observed in the nervous systems of humans and
animals. The basic idea is that massive systems of simple units linked
together in appropriate ways can generate many complex and interesting
behaviors. This book focuses on the subset of feedforward artificial neural
networks called multilayer perceptrons (MLPs). These are the most widely
used neural networks, with applications as diverse as finance
(forecasting), manufacturing (process control), and science (speech and
image recognition). This book presents an extensive and practical overview
of almost every aspect of MLP methodology, progressing from an initial
discussion of what MLPs are and how they might be used to an in-depth
examination of technical factors affecting performance. The book can be
used as a tool kit by readers interested in applying networks to specific
problems, yet it also presents theory and references outlining the last ten
years of MLP research.
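
To give a concrete feel for the kind of network the book treats, here is a
minimal sketch (not taken from the book) of a one-hidden-layer MLP trained by
plain back-propagation with a fixed learning rate, written in Python/NumPy.
The XOR toy data, layer sizes, and learning rate below are illustrative
assumptions only; the book's chapters on back-propagation, learning rate and
momentum, and weight initialization treat these choices in depth.

# Minimal illustrative sketch: one-hidden-layer MLP trained by plain
# back-propagation on the XOR problem. Settings are assumptions, not
# the authors' code.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small random initial weights (better schemes are a chapter topic).
W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # fixed learning rate, no momentum
for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # network output

    # Backward pass: gradients of squared error through the sigmoids
    err = out - y
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 3))  # outputs should approach [0, 1, 1, 0]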
 

Table of Contents 
 Preface  
1 Introduction 1 
2 Supervised Learning 7 
3 Single-Layer Networks 15 
4 MLP Representational Capabilities 31 
5 Back-Propagation 49 
6 Learning Rate and Momentum 71 
7 Weight-Initialization Techniques 97 
8 The Error Surface 113 
9 Faster Variations of Back-Propagation 135 
10 Classical Optimization Techniques 155 
11 Genetic Algorithms and Neural Networks 185 
12 Constructive Methods 197 
13 Pruning Algorithms 219 
14 Factors Influencing Generalization 239 
15 Generalization Prediction and Assessment 257 
16 Heuristics for Improving Generalization 265 
17 Effects of Training with Noisy Inputs 277 
A Linear Regression 293 
B Principal Components Analysis 299 
C Jitter Calculations 311 
D Sigmoid-like Nonlinear Functions 315 
 References 319 
 Index 339 

Ordering information:
1. MIT Press
	http://mitpress.mit.edu/book-home.tcl?isbn=0262181908
2. amazon.com
	http://www.amazon.com/exec/obidos/ASIN/0262181908/qid%3D909520837/sr%3D1-21/002-3321940-3881246
3. Barnes & Nobel

http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=1KKG10OPZT&
mscssid=A7M4XXV5DNS12MEG00CGNDBFPT573NJS&pcount=&isbn=0262181908
4. buy.com
	http://www.buy.com/books/product.asp?sku=30360116



