NN distance learning course announcement

sayegh@CVAX.IPFW.INDIANA.EDU
Sun Nov 26 19:53:57 EST 1995


         FOUNDATIONS AND APPLICATIONS OF NEURAL NETWORKS
                        Course announcement   

This course is to be offered in the Spring of 1996.  Students at remote sites 
will receive and view lecture tapes at their convenience.  Handouts, homework 
and other assignments will be handled via a web site.

This is a 500-level course open to both seniors and graduate students in the
Sciences, Mathematics, Engineering, Computer Science, and Psychology, as well
as to professionals interested in the topic, provided they meet the
prerequisites or obtain the instructor's permission.  The course is listed as
PHYS 525 at Purdue.

Please contact the instructor if you are interested.

Instructor:

Dr. Samir Sayegh
sayegh at cvax.ipfw.indiana.edu
Phone: (219) 481-6157
FAX: (219) 481-6800


Description:

In the last ten years Neural Networks have become both a powerful
practical tool for difficult classification, optimization and signal
processing problems and a serious paradigm for computation in
parallel machines and biological networks.
This course is an introduction to the main concepts and algorithms
of neural networks.  Lengthy derivations and formal proofs are
avoided or minimized and an attempt is made to emphasize the
connection between the "artificial" network approaches and their
neurobiological counterparts.  In order to help achieve that latter
goal, the text "A Vision of the Brain" by Semir Zeki is required
reading, in addition to the main text "Introduction to the Theory
of Neural Computation" by Hertz, Krogh and Palmer, and the
instructor's (valuable) notes.  Zeki's book recounts the
fascinating (hi)story of the discovery of the color center in the
human visual cortex and emphasizes very general organizational
principles of neuroanatomy and neurophysiology, which are highly
relevant to any serious computational approach. 

The following classic topics are covered:

- Introduction to the brain and its simpler representations
- Neural Computational Elements and Algorithms
- Perceptron
- Adaptive Linear Element
- Backpropagation
- Hopfield Model, Associative Memory and Optimization
- Kohonen networks
- Unsupervised Hebbian learning and principal component analysis
- Applications in signals, speech, robotics and forecasting
- Introduction to Computational Neuroscience
- Introduction to functional neuroanatomy and functional imaging
- Introduction to the visual pathways and computation in retina
and visual cortex. 


Prerequisites:

Calculus, matrix algebra and familiarity with a computer language

Texts:

"A Vision of the Brain" by Semir Zeki  (Blackwell, 1993)
"Introduction to the Theory of Neural Computation" by Hertz, Krogh
and Palmer (Addison Wesley, 1991)
Instructor's (valuable) notes.


Testing:

Each lecture comes with a handout that includes a list of
objectives and a set of multiple choice questions.  The two take-home midterm
exams and the final exam will be mostly multiple choice with the questions
reflecting the lecture objectives.  In addition each student will be expected 
to complete an individual project in the area of her/his interest.  The 
project may or may not be part of the final grade depending on the project's
progress.

Software:

While students are welcome to use the language of their choice, the high 
level language MATLAB and the associated toolbox for Neural Networks will 
be provided for the duration of the course at no additional charge.


Cost (US $)

                          Indiana Resident     Non-resident

Undergraduate                 249.45               644.45

Graduate                      315.60               751.05


Appendix (brief intro):

Neural Networks provide a fruitful approach to a variety of
engineering and scientific problems that have been traditionally 
considered difficult.  While an exact definition remains elusive
and different practitioners would emphasize one or another of the
characteristics of NN, it is possible to list the most common and
some of the most fundamental features of neural network solutions:

1) Adaptive
2) Parallel
3) Neurally inspired
4) Ability to handle non-linear problems in a transparent way

Let us look at these in some detail:

1) Adaptive solutions are desirable in a number of situations.  They
offer the advantages of stability as well as the ability to deal with
huge data sets with minimal memory requirements, since the patterns
are presented "one at a time."  The same property makes it possible
to develop real-time, on-line solutions in which the totality of the
data set is not available at the outset.
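The "one at a time" idea can be sketched in a few lines.  The snippet below is
an illustrative LMS-style update in the spirit of the ADALINE covered in the
course, not actual course code; the function names, learning rate, and toy
data stream are the author's assumptions for the sketch.

```python
# Sketch of on-line (adaptive) learning: an LMS-style weight update that
# sees each pattern once, so memory stays constant however long the
# stream of data is.  All names and values here are illustrative.

def lms_update(w, x, target, lr=0.1):
    """One on-line LMS step: adjust the weights to reduce the error
    on the single pattern (x, target) just presented."""
    y = sum(wi * xi for wi, xi in zip(w, x))    # linear output
    err = target - y
    return [wi + lr * err * xi for wi, xi in zip(w, x)]

# A stream of (input, target) pairs presented "one at a time";
# the weights adapt without ever storing the whole data set.
stream = [([1.0, 0.0], 1.0), ([0.0, 1.0], -1.0)] * 50
w = [0.0, 0.0]
for x, t in stream:
    w = lms_update(w, x, t)
# w drifts toward [1, -1], the weights that fit both patterns.
```

Note that the loop holds only the current pattern and the weight vector in
memory, which is exactly what permits real-time use on data that arrives
incrementally.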

2) The formulation of neural network solutions is inherently
parallel.  A large number of nodes share the burden of a
computation and can often operate independently of information made
available by other nodes.  This clearly speeds up computation and
allows implementation on highly efficient parallel hardware.

3) Though the extent is somewhat debated, there are clear
similarities between current artificial neural algorithms and
biological systems capable of intelligence.  The fact that such
biological systems still display pattern recognition capabilities
far beyond those of our algorithms is a continuing incentive to
maintain and further explore the neurobiological connection.

4) The ability to handle nonlinearity is a fundamental requirement
of modern scientific and engineering approaches.  In a number of
fields, nonlinear approaches are developed on a case-by-case
basis and often have little connection to the better established
linear techniques.  On the other hand, with the general approach of
formulating a neural network and endowing it with increasingly
complex processing capabilities, it is possible to define a unified
spectrum extending from linear networks (say, a one weight-layer
ADALINE) to highly nonlinear ones with powerful processing
capabilities (say, a multilayer backpropagation network).
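The step from the linear to the nonlinear end of that spectrum can be made
concrete.  The sketch below (illustrative only; the matrices and the choice
of tanh are the author's assumptions, not course material) shows that
stacking purely linear layers collapses to a single linear map, i.e. stays
at the ADALINE end of the spectrum, while inserting a nonlinearity between
the layers does not.

```python
import math

def linear(W, x):
    """Apply a weight matrix W (list of rows) to input vector x."""
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

W1 = [[1.0, 2.0], [0.5, -1.0]]
W2 = [[2.0, 0.0], [1.0, 1.0]]
x = [0.3, -0.7]

# Two linear layers collapse into one: W2(W1 x) == (W2 W1) x,
# so the two-layer network is still just a linear (ADALINE-like) map.
two_layer = linear(W2, linear(W1, x))
W21 = [[sum(W2[i][k] * W1[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
collapsed = linear(W21, x)

# Inserting a nonlinearity (here tanh) between the layers breaks this
# equivalence; the multilayer network now computes something no single
# linear layer can.
nonlinear = linear(W2, [math.tanh(v) for v in linear(W1, x)])
```

Here `two_layer` and `collapsed` agree to rounding error, while `nonlinear`
differs from both, which is the sense in which the hidden nonlinearity buys
genuinely new processing power.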

The combination of the properties outlined above, coupled with the
near-universal applicability of the neural network model and the
availability of software and hardware tools, makes NN one of the most
attractive instruments for signal processing and pattern recognition
available today.
   

