new book

Marwan A. Jabri, Sydney Univ. Elec. Eng., Tel: +61-2 692 2240 marwan at sedal.usyd.edu.AU
Tue Jan 9 18:21:32 EST 1996


NEW BOOK

ADAPTIVE ANALOGUE VLSI NEURAL SYSTEMS

M.A. Jabri, R.J. Coggins, and B.G. Flower

This is the first practical book on neural network learning chips and
systems. It covers the entire process of implementing neural networks in
VLSI chips, beginning with the crucial issues of learning algorithms in an
analog framework and limited-precision effects, and presenting case studies
of working systems. The approach is systems- and applications-oriented
throughout, demonstrating its attractiveness for tasks such as adaptive
pattern recognition and optical character recognition. Prof. Jabri and his
co-authors from AT&T Bell Laboratories, Bellcore and the University of
Sydney provide a comprehensive introduction to VLSI neural networks
suitable for research and development staff and advanced students.
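
As a flavour of the perturbation-based, chip-in-the-loop training that
Chapters 6 and 8 address, the short sketch below shows one common variant,
weight perturbation, applied to a toy software model of a limited-precision
network. It is an illustration only, not code from the book; the network,
function names and parameter values are assumptions made for this example.

    import numpy as np

    # Toy model of an "analog" single-layer network with limited-precision
    # weights (hypothetical example, not from the book).
    LEVELS = 64                         # assume roughly 6-bit weight resolution
    rng = np.random.default_rng(0)
    W = rng.uniform(-1.0, 1.0, (3, 1))  # bias + 2 inputs -> 1 output

    def quantize(w):
        # round weights to the nearest representable level
        return np.round(w * (LEVELS / 2)) / (LEVELS / 2)

    def forward(x, w):
        return np.tanh(x @ quantize(w))

    def error(x, t, w):
        return float(np.mean((forward(x, w) - t) ** 2))

    # AND-like task; the first input column is a constant bias.
    X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
    T = np.array([[-1], [-1], [-1], [1]], dtype=float)

    eta = 0.2             # learning rate
    pert = 2.0 / LEVELS   # perturbation of one weight step
    for epoch in range(300):
        for i in range(W.shape[0]):
            e0 = error(X, T, W)          # measure the error
            W[i, 0] += pert              # perturb one weight
            e1 = error(X, T, W)          # measure the error again
            W[i, 0] -= pert              # restore the weight
            grad = (e1 - e0) / pert      # finite-difference gradient estimate
            W[i, 0] -= eta * grad        # gradient-descent style update

    print("final error:", error(X, T, W))

The appeal of this family of algorithms for analog VLSI is that the update
needs only error measurements from forward passes, so no exact gradients or
backpropagation circuitry are required on chip.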

Key benefits to reader:

o	covers system aspects
o	examines on-chip learning
o	deals with the effects of the limited precision of VLSI techniques
o	covers low-power implementation of chips with learning synapses

Book ordering info:

December 1995: 234 x 156 mm: 272 pp, 135 line illustrations, 7 halftone
illustrations: Paperback: ISBN 0-412-61630-0: £29.95

CHAPMAN & HALL   
2-6 Boundary Row, London, SE1 8HN, U.K. 
Telephone: +44-171-865 0066 
Fax: +44-171-522 9623


Contents

1  Overview
2  Architectures and Learning Algorithms
      2.1    Introduction
      2.2    Framework
      2.3    Learning
      2.4    Perceptrons
      2.5    The Multi-Layer Perceptron
      2.6    The Backpropagation Algorithm
      2.7    Comments

3  MOS  Devices and Circuits

      3.1   Introduction
      3.2   Basic Properties of MOS Devices
      3.3   Conduction in MOSFETs
      3.4   Complementary MOSFETs
      3.5   Noise in MOSFETs
      3.6   Circuit models of MOSFETs
      3.7   Simple CMOS Amplifiers
      3.8   Multistage Op Amps
      3.9   Choice of Amplifiers
      3.10  Data Converters


4  Analog VLSI Building Blocks

      4.1    Functional Designs to Architectures
      4.2    Neurons and Synapses
      4.3    Layout Strategies
      4.4    Simulation Strategies

    
5  Kakadu - A Low Power Analog VLSI MLP

      5.1    Advantages of Analog Implementation
      5.2    Architecture
      5.3    Implementation
      5.4    Chip Testing
      5.5    Discussion


6 Analog VLSI Supervised Learning

      6.1    Introduction
      6.2    Learning in an Analog Framework
      6.3    Notation 
      6.4    Weight Update Strategies
      6.5    Learning Algorithms
      6.6    Credit Assignment Efficiency
      6.7    Parallelisation Heuristics
      6.8    Experimental Methods
      6.9    ICEG Experimental Results
      6.10   Parity 4 Experimental Results
      6.11   Discussion
      6.12   Conclusion


7  A Micropower Neural Network

      7.1   Introduction
      7.2   Architecture
      7.3   Training System
      7.4   Classification Performance and Power Consumption
      7.5   Discussion
      7.6   Conclusion


8  On-Chip Perturbation Based Learning

      8.1   Introduction
      8.2   On-Chip Learning Multi-Layer Perceptron
      8.3   On-Chip Learning Recurrent Neural Network
      8.4   Conclusion


9  Analog Memory Techniques

      9.1   Introduction
      9.2   Self-Refreshing Storage Cells
      9.3   Multiplying DACs
      9.4   A/D-D/A Static Storage Cell
      9.5   Basic Principle of the Storage Cell
      9.6   Circuit Limitations
      9.7   Layout Considerations
      9.8   Simulation Results
      9.9   Discussion


10  Switched Capacitor Techniques

     10.1   A Charge-Based Network
     10.2   Variable Gain, Linear, Switched Capacitor Neurons
     

11  NET32K High Speed Image Understanding System

     11.1   Introduction
     11.2   The NET32K Chip
     11.3   The NET32K Board System
     11.4   Applications
     11.5   Summary and Conclusions


12  Boltzmann Machine Learning System

     12.1   Introduction
     12.2   The Boltzmann Machine
     12.3   Deterministic Learning by Error Propagation
     12.4   Mean-field Version of Boltzmann Machine
     12.5   Electronic Implementation of a Boltzmann Machine
     12.6   Building a System using the Learning Chips
     12.7   Other Applications


References

Index




