A NEURAL COMPUTATION course reading list

K.P.Unnikrishnan unni at neuro.cs.gmr.com
Sat Feb 20 14:57:13 EST 1993


Folks:
	Here is the reading list for a course I offered last semester at Univ. of
Michigan. 

Unnikrishnan
---------------------------------------------------------------

READING LIST FOR THE COURSE "NEURAL COMPUTATION"
EECS-598-6 (FALL 1992), UNIVERSITY OF MICHIGAN

INSTRUCTOR: K. P. UNNIKRISHNAN
-----------------------------------------------

A. COMPUTATION AND CODING IN THE NERVOUS SYSTEM

1. Hodgkin, A.L., and Huxley, A.F. A quantitative description of membrane
current and its application to conduction and excitation in nerve. J. Physiol.
117, 500-544 (1952).

2a. Del Castillo, J., and Katz, B. Quantal components of the end-plate
potential. J. Physiol. 124, 560-573 (1954).

2b. Del Castillo, J., and Katz, B. Statistical factors involved in neuromuscular
facilitation and depression. J. Physiol. 124, 574-585 (1954).

3. Rall, W. Cable theory for dendritic neurons. In: Methods in neural
modeling (Koch and Segev, eds.) pp. 9-62 (1989).

4. Koch, C., and Poggio, T. Biophysics of computation: neurons, synapses and
membranes. In: Synaptic function (Edelman, Gall, and Cowan, eds.) pp.
637-698 (1987).

B. SENSORY PROCESSING IN VISUAL AND AUDITORY SYSTEMS

1. Werblin, F.S., and Dowling, J.E. Organization of the retina of the mudpuppy,
Necturus maculosus: II. Intracellular recording. J. Neurophysiol. 32, 339-355
(1969).

2a. Barlow, H.B., and Levick, W.R. The mechanism of directionally selective
units in rabbit's retina. J. Physiol. 178, 477-504 (1965).

2b. Lettvin, J.Y., Maturana, H.R., McCulloch, W.S., and Pitts, W.H. What the
frog's eye tells the frog's brain. Proc. IRE 47, 1940-1951 (1959).

3. Hubel, D.H., and Wiesel, T.N. Receptive fields, binocular interaction and
functional architecture in the cat's visual cortex. J. Physiol. 160, 106-154
(1962).

4a. Suga, N. Cortical computational maps for auditory imaging. Neural Networks
3, 3-21 (1990).

4b. Simmons, J.A. A view of the world through the bat's ear: the formation
of acoustic images in echolocation. Cognition 33, 155-199 (1989).


C. MODELS OF SENSORY SYSTEMS

1. Hecht, S., Shlaer, S., and Pirenne, M.H. Energy, quanta, and vision. J. Gen.
Physiol. 25, 819-840 (1942).

2. Julesz, B., and Bergen, J.R. Textons, the fundamental elements in 
preattentive vision and perception of textures. Bell Sys. Tech. J. 62, 
1619-1645 (1983).

3a. Harth, E., Unnikrishnan, K.P., and Pandya, A.S. The inversion of sensory
processing by feedback pathways: a model of visual cognitive functions. 
Science 237, 184-187 (1987).

3b. Harth, E., Pandya, A.S., and Unnikrishnan, K.P. Optimization of cortical
responses by feedback modification and synthesis of sensory afferents. A model
of perception and REM sleep. Concepts Neurosci. 1, 53-68 (1990).

3c. Koch, C. The action of the corticofugal pathway on sensory thalamic
nuclei: A hypothesis. Neurosci. 23, 399-406 (1987).

4a. Singer, W. et al., Formation of cortical cell assemblies. In: CSH Symposia 
on Quant. Biol. 55, pp. 939-952 (1990). 

4b. Eckhorn, R., Reitboeck, H.J., Arndt, M., and Dicke, P. Feature linking via
synchronization among distributed assemblies: Simulations of results from 
cat visual cortex. Neural Comp. 293-307 (1990). 

5. Reichardt, W., and Poggio, T. Visual control of orientation behavior in
the fly. Part I. A quantitative analysis. Q. Rev. Biophys. 9, 311-375 (1976).


D. ARTIFICIAL NEURAL NETWORKS

1a. Block, H.D. The perceptron: a model for brain functioning. Rev. Mod. Phy.
34, 123-135 (1962).

1b. Minsky, M.L., and Papert, S.A. Perceptrons. pp. 62-68 (1988).

2a. Hornik, K., Stinchcombe, M., and White, H. Multilayer feedforward 
networks are universal approximators. Neural Networks 2, 359-366 (1989).

2b. Lapedes, A., and Farber, R. How neural nets work. In: Neural Info. Proc.
Sys. (Anderson, ed.) pp. 442-456 (1987).

3a. Ackley, D.H., Hinton, G.E., and Sejnowski, T.J. A learning algorithm for
Boltzmann machines. Cog. Sci. 9, 147-169 (1985).

3b. Hopfield, J.J. Learning algorithms and probability distributions in
feed-forward and feed-back networks. PNAS, USA. 84, 8429-8433 (1987).

4. Tank, D.W., and Hopfield, J.J. Simple neural optimization networks:
An A/D converter, signal decision circuit, and linear programming circuit.
IEEE Tr. Cir. Sys. 33, 533-541 (1986).

E. NEURAL NETWORK APPLICATIONS

1. LeCun, Y., et al., Backpropagation applied to handwritten zip code 
recognition. Neural Comp. 1, 541-551 (1990).

2. Lapedes, A., and Farber, R. Nonlinear signal processing using neural
networks. LA-UR-87-2662, Los Alamos Natl. Lab. (1987).

3. Unnikrishnan, K.P., Hopfield, J.J., and Tank, D.W. Connected-digit 
speaker-dependent speech recognition using a neural network with time-delayed
connections. IEEE Tr. ASSP. 39, 698-713 (1991).

4a. De Vries, B., and Principe, J.C. The gamma model - a new neural model for
temporal processing. Neural Networks 5, 565-576 (1992).

4b. Poddar, P., and Unnikrishnan, K.P. Memory neuron networks: a prolegomenon.
GMR-7493, GM Res. Labs. (1991).

5. Narendra, K.S., and Parthasarathy, K. Gradient methods for the optimization
of dynamical systems containing neural networks. IEEE Tr. NN 2, 252-262 (1991).


F. HARDWARE IMPLEMENTATIONS

1a. Mahowald, M.A., and Mead, C. Silicon retina. In: Analog VLSI and neural
systems (Mead). pp. 257-278 (1989).

1b. Mahowald, M.A., and Douglas, R. A silicon neuron. Nature 354, 515-518 
(1991).

2. Mueller, P. et al. Design and fabrication of VLSI components for a 
general purpose analog computer. In: Proc. IEEE workshop VLSI neural sys.
(Mead, ed.) pp. xx-xx (1989).

3. Graf, H.P., Jackel, L.D., and Hubbard, W.E. VLSI implementation of
a neural network model. Computer 2, 41-49 (1988).


G. ISSUES ON LEARNING

1. Geman, S., Bienenstock, E., and Doursat, R. Neural networks and the
bias/variance dilemma. Neural Comp. 4, 1-58 (1992).

2. Brown, T.H., Kairiss, E.W., and Keenan, C.L. Hebbian synapses: Biophysical
mechanisms and algorithms. Ann. Rev. Neurosci. 13, 475-511 (1990).

3. Haussler, D. Quantifying inductive bias: AI learning algorithms and 
Valiant's learning framework. AI 36, 177-221 (1988).

4. Reeke, G.N. Jr., and Edelman, G.M. Real brains and artificial intelligence.
Daedalus 117, 143-173 (1988). 

5. White, H. Learning in artificial neural networks: a statistical
perspective. Neural Comp. 1, 425-464 (1989).

----------------------------------------------------------------------
SUPPLEMENTAL READING

Neher, E., and Sakmann, B. Single channel currents recorded from membrane
of denervated frog muscle fibers. Nature 260, 779-781 (1976).

Rall, W. Core conductor theory and cable properties of neurons. In: Handbook
Physiol. (Brookhart, Mountcastle, and Kandel, eds.) pp. 39-97 (1977).

Shepherd, G.M., and Koch, C. Introduction to synaptic circuits. In: The 
synaptic organization of the brain (Shepherd, ed.) pp. 3-31 (1990).

Junge, D. Synaptic transmission. In: Nerve and muscle excitation (Junge)
pp. 149-178 (1981).

Scott, A.C. The electrophysics of a nerve fiber. Rev. Mod. Phy. 47, 487-533
(1975).

Enroth-Cugell, C., and Robson, J.G. The contrast sensitivity of retinal
ganglion cells of the cat. J. Physiol. 187, 517-552 (1966).

Felleman, D.J., and Van Essen, D.C. Distributed hierarchical processing in the
primate cerebral cortex. Cerebral Cortex, 1, 1-47 (1991).

Julesz, B. Early vision and focal attention. Rev. Mod. Phy. 63, 735-772 (1991).

Sejnowski, T.J., Koch, C., and Churchland, P.S. Computational neuroscience.
Science 241, 1299-1302 (1988).

Churchland, P.S., and Sejnowski, T.J. Perspectives on cognitive neuroscience.
Science 242, 741-745 (1988).

McCulloch, W.S., and Pitts, W. A logical calculus of ideas immanent in
nervous activity. Bull. Math. Biophy. 5, 115-133 (1943).

Hopfield, J.J. Neural networks and physical systems with emergent
collective computational abilities. PNAS, USA. 79, 2554-2558 (1982).
 
Hopfield, J.J. Neurons with graded responses have collective computational
properties like those of two-state neurons. PNAS, USA. 81, 3088-3092 (1984).
 
Hinton, G.E., and Sejnowski, T.J. Optimal perceptual inference. Proc. IEEE
CVPR. 448-453 (1983).

Rumelhart, D.E., Hinton, G.E., and Williams, R.J. Learning representations
by back-propagating errors. Nature 323, 533-536 (1986).
 
Unnikrishnan, K.P., and Venugopal, K.P. Learning in connectionist networks
using the Alopex algorithm. Proc. IEEE IJCNN. I-926 - I-931 (1992).
 
Cowan, J.D., and Sharp, D.H. Neural nets. Quart. Rev. Biophys. 21, 365-427
(1988).

Lippmann, R.P. An introduction to computing with neural nets. IEEE ASSP
Mag. 4, 4-22 (1987).

Sompolinsky, H. Statistical mechanics of neural networks. Phy. Today 41, 70-80
(1988).

Hinton, G.E. Connectionist learning procedures. Art. Intel. 40, 185-234 (1989).
