Neural Computation 4:2

Terry Sejnowski terry at jeeves.UCSD.EDU
Wed Mar 25 15:14:36 EST 1992


Neural Computation Volume 4, Issue 2, March 1992

Review

First- and Second-Order Methods for Learning:
Between Steepest Descent and Newton's Method
	Roberto Battiti

Article

Efficient Simplex-Like Methods for Equilibria of 
Nonsymmetric Analog Networks
	Douglas A. Miller and Steven W. Zucker

Note

A Volatility Measure for Annealing in Feedback Neural Networks
	Joshua Alspector, Torsten Zeppenfeld and Stephan Luna

Letters

What Does the Retina Know about Natural Scenes?
	Joseph J. Atick and A. Norman Redlich

A Simple Network Showing Burst Synchronization
without Frequency-Locking
	Christof Koch and Heinz Schuster

On a Magnitude Preserving Iterative MAXnet Algorithm
	Bruce W. Suter and Matthew Kabrisky

A Fixed Size Storage O(n^3) Time Complexity Learning
Algorithm for Fully Recurrent Continually Running
Networks
	Jürgen Schmidhuber

Learning Complex, Extended Sequences Using the 
Principle of History Compression
	Jürgen Schmidhuber

How Tight are the Vapnik-Chervonenkis Bounds?
	David Cohn and Gerald Tesauro

Working Memory Networks for Learning Temporal Order with 
Application to 3-D Visual Object Recognition
	Gary Bradski, Gail A. Carpenter, and Stephen Grossberg

-----

SUBSCRIPTIONS - VOLUME 4 - BIMONTHLY (6 issues)

______ $40     Student
______ $65     Individual
______ $150    Institution

Add $12 for postage and handling outside USA (+7% for Canada).

(Back issues from Volumes 1-3 are regularly available for $28 each.)
*****  Special Offer -- Back Issues for $17 each *****

MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142.
        (617) 253-2889.

-----



