responses to textbook survey

S. Becker becker at meitner.psychology.mcmaster.ca
Mon Mar 25 15:14:12 EST 2002


Dear connectionists, thanks to all who replied to my request to exchange
info on textbooks for neural network-related courses. If you are
teaching such a course, I think you will find the comments quite useful.
There are also several pointers to web notes, etc.
Rather than attempt to summarize, I've included the comments verbatim,
except where the sender requested that they remain confidential.
cheers,
Sue

_____________________________________________________________________________
Date: Wed, 6 Mar 2002 21:11:26 -0500 (EST)
From: "S. Becker" <becker at meitner.psychology.mcmaster.ca>

Dear connectionists,
This message is directed to those of you who teach a course in neural
networks & related topics, or computational neuroscience or cognitive
science, and would like to exchange opinions on the textbooks you are using.

I'd like to know the name of your course, who takes it (undergrad vs grad,
comp sci/eng vs psych/bio), what textbook you are using and what you
consider to be the pros and cons of this book.

I teach a course in neural computation to 3rd year undergrads, mostly CS
majors and some psych/biopsych, and have used James Anderson's book An
Introduction to Neural Networks a number of times. I like this book a lot
-- it is the only one I know of that is truly interdisciplinary and
suitable for undergraduates -- but it is in need of updating.

I will post a summary of the replies I receive to connectionists.
_____________________________________________________________________________
From: "Olly Downs" <t-odowns at microsoft.com>

The Hopfield Group (formerly at Caltech, and for the past 4 years at
Princeton) has taught the course 'Computational Neurobiology and
Computing Networks' for many years.  The principal text for the course
has always been 'Introduction to the Theory of Neural Computation' by
Hertz, Krogh and Palmer.  It is very much oriented towards the
Physics/Applied Math people in the audience, and thus more recently we
have also used Dana Ballard's "An Introduction to Natural Computation",
which we found to be more pedagogical in its approach, and somewhat
less mathematical - which has helped us span the broad audience our
course has enjoyed - including 3rd-year undergrads through faculty in
Biology, Psychology, Applied Math, CS and Physics.
_____________________________________________________________________________
Sender: Dave_Touretzky at ammon.boltz.cs.cmu.edu

Hi Sue.  I teach my neural nets course using two books:

  Hertz, Krogh and Palmer, 1991, Introduction to the Theory of Neural
  Computation.

  Bishop, 1995, Neural Networks for Pattern Recognition.

This started out as a graduate course (in fact, it was originally
taught by Geoff Hinton) but is now dual-listed as an undergrad/grad
class.  Those who take the grad version have to do a project in
addition to the homeworks and two exams.

The syllabus for my course, along with a bunch of MATLAB demos, can be
found here:

  http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15782-s02/
_____________________________________________________________________________
I teach a course Principles of Neural Computation to 3rd year
undergrads and I am using Simon Haykin's book Neural Networks: A
Comprehensive Foundation. In my course only chapters 1-5 and 9 are
used. Their contents are: Introduction, Learning processes,
Single layer perceptrons, Multilayer perceptrons, Radial basis
function networks and Self-organizing maps.

In general the book is quite good, but the first chapter, and maybe
parts of the second, are too loose and general to start undergrads
with; it would be better to begin with more concrete examples. But I
am using the book because the previous lecturer of the course used the
same book and gave me his slides, and because neither he nor I have
found a better book.

We also have an advanced course on neural computing for graduate
students. In that course the material is selected from the chapters of
the same book that were not used in the undergraduate course.

* Kimmo Raivio, Dr. of Science in Technology | email: Kimmo.Raivio at hut.fi
* Lab. of Computer and Information Science   | http://www.cis.hut.fi/kimmo/
* Helsinki University of Technology          | phone +358 9 4515295
* P.O.BOX 5400, (Konemiehentie 2, Espoo)     | gsm   +358 50 3058427
* FIN-02015 HUT, FINLAND                     | fax   +358 9 4513277
_____________________________________________________________________________
From: Gregor Wenning <grewe at cs.tu-berlin.de>

I am a teaching assistant in the lab of Klaus Obermayer.

We offer a one year course in artificial neural networks,
supervised and unsupervised methods, mainly for
computer scientists.

The main books we use are:

1) Bishop, "Neural Networks for Pattern Recognition":
   a good book for basic methods; unfortunately the students
   do not like it.
2) Haykin, "Neural Networks": good overview, many topics and aspects;
   includes problems, and one can get a solutions manual as well!
_____________________________________________________________________________
From: Nicol Schraudolph <schraudo at inf.ethz.ch>

I've just taught Machine Learning to upper-level, mostly CS undergrads
(corresponding to U.S. master's students), using Cherkassky/Mulier's
"Learning from Data".  That book gives a very nice presentation of the
basic concepts and problems in machine learning, but has two drawbacks:
1) it is too expensive for students, and 2) it is full of errors in the
details, especially in the equations.  Their two-page description of EM
contains no less than 5 errors!  I can recommend using this book to
design a machine learning course, as long as you don't try to teach
directly from it.

    Dr. Nicol N. Schraudolph               http://www.inf.ethz.ch/~schraudo/
    Chair of Computational Science                 mobile:  +41-76-585-3877
    ETH Zentrum, WET D3                            office:      -1-632-7942
    CH-8092 Zuerich, Switzerland                      fax:            -1374
_____________________________________________________________________________
From: Roseli Aparecida Francelin Romero <rafrance at icmc.sc.usp.br>

To teach a course in neural networks for graduate students, I have
used Haykin's book "Neural Networks - A Comprehensive Foundation".
This book presents many exercises in each chapter, and I think it is a
good book for acquiring the necessary basis in NNs & related topics.
_____________________________________________________________________________
We used Bechtel and Abrahamsen's 'Connectionism and the Mind' to teach
3rd- and 4th-year students in psychology and cognitive science (Univ.
of Amsterdam and earlier also Univ. of Utrecht). The book needs
updating; a 2nd edition was promised several years ago but has never
come out. So now we are back to using a reader, and I have started to
write a book myself.

Prof.dr. Jaap Murre
Department of Psychology
University of Amsterdam
Roetersstraat 15
1018 WB Amsterdam
_____________________________________________________________________________
I've been mostly using Bishop's book on NNs for PR (1995), for
graduate students in CS, some engineers and mathematicians (and few or
no psych/bio people). It is very well written, and most students
appreciate it very much, despite the arduous introductory chapters
(learning theory and non-parametric statistics). One of my colleagues
(Kegl) has used the new edition of the Duda & Hart book for the same
course this year; it is more up-to-date, but has many minor yet
annoying errors.

Yoshua Bengio
Associate professor / Professeur agrégé
Canada Research Chair in Statistical Learning Algorithms /
titulaire de la chaire de recherche du Canada en algorithmes d'apprentissage statistique
Département d'Informatique et Recherche Opérationnelle
Université de Montréal,
_____________________________________________________________________________
From: "Tascillo, Anya (A.L.)" <atascill at ford.com>

I'm not teaching large classes, just all my coworkers, and I still hand
them Maureen Caudill's "Understanding Neural Networks" spiral-bound
books, volumes 1 and 2.  People do the exercises, and one friend turned
around and taught the neural network section of his summer class after
I lent him these books.
_____________________________________________________________________________

From: Peter Dayan <dayan at gatsby.ucl.ac.uk>
> I'd like to know the name of your course,
theoretical neuroscience
> who takes it (undergrad vs grad,
grad students
> comp sci/eng vs psych/bio),
a mix of those two, more the former
> what textbook you are using
you guess :-)

> consider to be the pros and cons of this book.
The only con is the prose....

More seriously, our students take this course, plus one on machine
learning, which doesn't really have an ideal book yet. I expect that a
mixture of David MacKay and Mike Jordan's forthcoming books will
ultimately be used.
_____________________________________________________________________________
Your message was forwarded to me...  I use Principles of Neurocomputing
for Science and Engineering by Ham and Kostanic.  The course, taught in
the ECE dept. of UNH, is attended mostly by first-year grad students,
but I also have two undergrads.  The book is well written and comes
with a large set of solved Matlab assignments, which I use extensively.

Andrew L. Kun
Assistant Professor
University of New Hampshire
ECE Department, Kingsbury Hall
Durham, NH 03824
voice: 603 862 4175
fax: 603 862 1832

www.ece.unh.edu/bios/andrewkun.html
_____________________________________________________________________________
From: "Randall C. O'Reilly" <oreilly at grey.colorado.edu>

We have compiled a list of courses using our textbook:

http://psych.colorado.edu/~oreilly/cecn_teaching.html

We and several others use our book for undergrad and grad level
courses, and it works pretty well.  Hopefully some of these other
people will email you with a more objective 3rd-party perspective on
the pros and cons of the book.  Here's our general sense:

Pros:

- covers neuroscience, computation & cognition in a truly integrated
  fashion: units are point neurons w/ ion channel & membrane potential
  eq's, implement powerful learning algorithms, and are applied to a
  wide range of cognitive phenomena.

- includes many "research grade" cognitive models -- not just a bunch
  of toy models.

- integrated software & use of one framework throughout makes it easy
  on the students

Cons:

- not encyclopedic: generally presents one coherent view instead of a
  bunch of alternatives (alternatives are only briefly discussed).

- not as computationally focused as other books: emphasizes biology
  and cognition, but does not cover many of the more recent machine
  learning advances.
_____________________________________________________________________________
From: Andy Barto <barto at cs.umass.edu>

I teach a course entitled "Computing with Artificial Neural Networks" to
advanced undergraduates and a few graduate students. Students come
mostly from CS, but some from engineering, psychology, and
neuroscience. I have used almost all the books out there. I have used
Anderson's book, finding it unique, but at too low a level of
technical detail for my students. Currently I am using Haykin, second
edition.

I have used Bishop's book for independent studies, but it
is too difficult for my undergrads. I refer a lot to Reed and Marks
"Neural Smithing" but it is too narrow for the whole course. I have
not found the perfect book. Still waiting for Geoffrey to write one.
I would be most interested to hear what you are able to find out.

On another matter, as we revise our overall curriculum, I am not sure
that the NN course will survive in present form. It may happen that
we will cover major NN algorithms as part of a broader machine
learning course sequence. We like the Duda, Hart, and Stork book, and
I really like the book "The Elements of Statistical Learning" by
Hastie, Tibshirani, and Friedman, although it is not completely
satisfactory for a course either, being too difficult and also
lacking the algorithmic point of view that I think is important for
CS students.
______________________________________________________________________
I teach a graduate course out of my own book (now in the 2nd edition, D. S.
Levine, Intro. to Neural and Cognitive Modeling, Erlbaum, 2000).  My current
class is predominantly psychology students, but I have also taught a class
out of the first edition that was predominantly computer science students
with a substantial minority from EE.  (The second edition is not too changed
from the first in Chs. 1-5 except for updating, but Chapters 6 and 7 have
several new or expanded sections).

In my biased author's view, I see some pros of my book as its discussions of
general modeling principles, of the history of the field, of the relations
of network architectures to psychological functions, and of the approaches
of Grossberg and his colleagues.  (I remember you noted some of these when
reviewing the first edition for an AI journal.)  But perhaps more important
than any of these, it is the only NN book I know of that proceeds
systematically from building blocks (such as lateral inhibition and
associative learning) to modeling of complex cognitive/behavioral functions
such as categorization, conditioning, and decision making.  This is
reflected in its chapters which are organized not primarily by "schools" or
"paradigms" as are many other NN introductions, but primarily by functions
and secondarily by paradigms.  The flow chart which illustrates this
organization has been moved to the front of the book to be more apparent.
The book is friendly to multiple disciplines because the equations are
mostly clustered at the ends of chapters, and there are neuroscience
and math appendices for those who lack background in those areas.
Also, it is relatively affordable (the last I checked, $36 in paperback).
More description of the book's distinctive features can be obtained from my
web site (www.uta.edu/psychology/faculty/levine), under Books, or the
Erlbaum web site (www.erlbaum.com).

When I teach out of the book I supplement it with copies of the original
papers being discussed (by Grossberg, Sutton-Barto, Klopf, Anderson,
Rumelhart et al., et cetera).  That is probably a good idea for any textbook
in the field, particularly for grad courses.

The cons of my book are that it is probably not the best one for those
interested in detailed engineering or computational implementations or
mathematical algorithms.  My experience is that many CS or EE students found
a course based on the book a useful complement to another course based on a
book, or professor's notes, that have those other strengths -- although I
know one EE professor doing research on vision and robotics, Haluk Ogmen of
the U of Houston, who has taught from my book extensively with his students.
Also, while the book does not demand specific technical prerequisites in any
one area, it may or may not be the best book for undergraduates.  Some
undergrads have found it stylistically difficult because the discussion cuts
across traditional boundaries, though many undergrads already involved in
beginning research have taken my course or a roughly equivalent independent
study course and felt comfortable with the book.

My book has a variety of exercises, some computational and some open-ended.
I have planned to write a web-based instructor's manual but that has
remained on the back burner: the idea is still alive and audience interest
might help push it forward.  But the following exercises have been
particularly valuable for the students in my classes: Ch. 2, Exercises 2
(McCulloch-Pitts), 3 (Rosenblatt perceptron: very detailed!), 8 (ADALINE);
Ch. 3, 2 (outstar), 4 (Grossberg gated dipole), 6 (BAM); Ch. 4, 1 and 2
(shunting lateral inhibition); Ch. 5, 2 (Klopf), 3 (Sutton-Barto); Ch. 6, 4
(backprop T vs. C), 6 (BSB).
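
For readers unfamiliar with the perceptron exercise mentioned above,
here is a minimal sketch of the classic Rosenblatt learning rule in
Python; the toy data and parameters are invented for illustration and
are not taken from Levine's book:

    import numpy as np

    def train_perceptron(X, y, lr=0.1, epochs=20):
        """Rosenblatt perceptron: nudge the weights whenever a sample
        is misclassified, until the data are linearly separated."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (np.dot(w, xi) + b) <= 0:   # misclassified
                    w += lr * yi * xi               # move the boundary
                    b += lr * yi
        return w, b

    # Toy linearly separable data: logical OR, labels in {-1, +1}
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, 1, 1, 1])
    w, b = train_perceptron(X, y)
    print(np.sign(X @ w + b))   # reproduces y once converged

The exercises in the book itself are, of course, far more detailed.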

Best,
Dan Levine
levine at uta.edu
______________________________________________________________________
From: Alex Smola <smola at axiom.anu.edu.au>

You're welcome to have a look at my slides (linked on my homepage) for
the past two courses I gave. There will be slides on Bayesian kernel
methods online really soon, too (they're, in fact, at
http://mlg.anu.edu.au/~smola/summer2002). Most of that is taken from
Bernhard's and my book:
	   http://www.learning-with-kernels.org
For a free sample of the first 1/3, just go to the contents and
download it. And, of course, I like that book a lot ;). But you won't
find a word on neural networks in it (except for the perceptron).

                                     ''~``
                                    ( o o )
+------------------------------.oooO--(_)--Oooo.----------------------------+
| Alexander J. Smola                           http://mlg.anu.edu.au/~smola |
| Australian National University               Alex.Smola at anu.edu.au        |
| Research School for Information              Tel: (+61) 2 6125-8652       |
| Sciences and Engineering       .oooO         Fax: (+61) 2 6125-8651       |
| Canberra, ACT 0200 (#00120C)   (   )   Oooo. Cel: (+61) 410 457 686       |
+---------------------------------\ (----(   )------------------------------+
                                   \_)    ) /
                                         (_/

______________________________________________________________________
I use the 2nd edition of Simon Haykin's Neural Networks
(it doesn't cover ART). For density estimation and the
Bayesian framework I use Chris Bishop's Neural Networks
for Pattern Recognition (which I see as an updated version
of Duda and Hart).
My students are final-year CS or math and stats undergraduates.

Wael El-Deredy, PhD
Cognitive Neuroscience Group          Visiting Lecturer
Unilever Research Port Sunlight        School of Comp. and Math Sci.
Bebington CH63 3JW    - UK            Liverpool John Moores Univ.
______________________________________________________________________
I am using in my 3rd- and 4th-year courses:

"Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering",
N. Kasabov, MIT Press, 1996, with a WWW page that is updated annually:
http://divcom.otago.ac.nz/infosci/kel/courses/

Nik Kasabov
Professor of Information Science
University of Otago
New Zealand
______________________________________________________________________
This is a brief response to your broadcast query for information about
neural network classes.  I guess I have three experiences to
contribute:

At Carnegie Mellon University I taught a class in the Department of
Psychology on Parallel Distributed Processing.  The focus of the
course was on cognitive modeling.  The students were a mix of
undergraduates, graduate students, and even a few visiting postdocs
and faculty members.  The undergraduates included computer science
students, psychology majors, and participants in an interdisciplinary
cognitive science program.  I used the original PDP volumes, including
the handbook of exercises, but assignments were completed using the
PDP++ software.  Selected papers were also used as assigned readings.
In terms of course credit, this class counted as a class and a half,
so the students were worked pretty hard.  I found the PDP volumes to
still be an excellent source of readings, but I think that most
students depended on lectures heavily to clarify concepts.  Materials
from this course are collecting dust in a corner of the web at:
"http://www.cnbc.cmu.edu/~noelle/classes/PDP/".

I am currently teaching two relevant classes at Vanderbilt University.
The first is a graduate level seminar on Computational Modeling
Methods for Cognitive Neuroscience, offered through the Department of
Electrical Engineering and Computer Science.  Between a third and a
half of the participants are engineering students, most of whom are
interested in robotics.  The remainder are psychology students, with a
couple of neuroscientists among them.  I am following the O'Reilly and
Munakata text, _Computational Explorations in Cognitive Neuroscience_,
fairly closely.  I am really enjoying teaching from this book, though
I sense that the seminar participants feel a bit like they're trying
to drink from a fire hose -- the pages are densely packed with
material presented at a fairly rapid clip.  Most lecture time has been
spent providing guidance through this wealth of material, but some
class time has been used to provide balance and counterpoint to the
biases expressed by the authors.  The book does not provide the usual
taxonomic review of techniques, opting instead to focus on a specific
set of methods which may be fruitfully integrated into a unified
modeling framework.  This means that I've had to put in a bit of extra
work to make sure that the students are at least briefly exposed to
the range of techniques that they will encounter in the literature.
Despite these issues, the book has been a very useful tool.  The
text's integrated computer exercises using PDP++, packaged so as to
require only minimal computer skills, have proven to be especially
helpful.  The materials that I have prepared for this seminar are at:
"http://www.vuse.vanderbilt.edu/~noelledc/classes/S02/cs396/".

I am also teaching an undergraduate computer science class entitled
"Project in Artificial Intelligence".  My version of this course is
subtitled "Fabricating Intelligent Systems With Artificial Neural
Networks".  The class is intended to provide seniors in computer
science with an opportunity to develop and evaluate a substantial
software project.  These students are being introduced to neural
network technology with the help of _Elements of Artificial Neural
Networks_ by Mehrotra, Mohan, and Ranka.  I selected this text because
it appeared to present the basics of neural networks in a concise
manner, it focused on engineering issues rather than cognitive
modeling issues, and it included some discussions of applications.
Unfortunately, my students have, for the most part, found the text to
be too concise, and they have had difficulty with some of the
mathematical notation.  I have had to supplement this material
extensively in order to communicate central concepts to the students.
I have fabricated a web site for this course, which may be found at:
"http://www.vuse.vanderbilt.edu/~noelledc/classes/S02/cs269/".
-- David Noelle ---- Vanderbilt University, Computer Science ------

-- noelle at acm.org -- http://people.vanderbilt.edu/~david.noelle/ --
________________________________________________________________________
From: Chris Williams <ckiw at dai.ed.ac.uk>

I no longer teach a NN course.

For machine learning I use Tom Mitchell's book. This includes some NN
material (I also add material on unsupervised learning).

For probabilistic modelling (belief nets) I use the upcoming Jordan &
Bishop text.
_____________________________________________________________________________
I am teaching at UQAM (Montreal) and have the same problem. Please let me
know if you get some useful answers.

  Alex Friedmann                      Neural Networks Group
  LEIBNIZ-IMAG                   Tel:  (33) 4 76 57 46 58
  46, avenue Felix Viallet       Fax:  (33) 4 76 57 46 02
  38000 Grenoble (France)       Email: Alex.Friedmann at imag.fr
_____________________________________________________________________________
I have taught a comp neuro course several times.  I have used the Koch &
Segev book supplemented with lots of guest lectures. Recently I used Dayan
& Abbott and will probably do so again since it is a pretty comprehensive
book.  My syllabus can be found in various formats on my web page
http://www.pitt.edu/~phase

Bard Ermentrout
_____________________________________________________________________________
With two colleagues I teach a course called "Connectionist Modeling in
Psychology" at the University of Amsterdam. It is a course at the
undergraduate level, but Dutch degrees do not translate neatly to
American ones. When we switch to an Anglo-Saxon bachelor-master system
in three years, it will probably be part of the master's curriculum.

As the name suggests, it is a course within the field of psychology.
Besides psychology students, some biology and AI majors also take our
course. We have used a book by Bechtel and Abrahamsen called
"Connectionism and the Mind". This book, from the early nineties, is
mainly targeted at cognitive science: cognitive psychology and
philosophy. It is not a good book. Students complain that it is
disorganized, and it misses many important topics while rambling on
about symbolism vs. connectionism, a debate that was important in the
eighties but is not very lively anymore. Moreover, it has gone out of
print (the second edition promised years ago still isn't there), so we
had to give students the opportunity to photocopy the book.

For all these reasons, we decided not to use the book this year, but
instead to assemble a reader. We are now collecting articles to
include in it (the introductory chapters will be written by one of us,
Jaap Murre).

To give our students hands-on experience, we have also created
exercises. These are available at
www.neuromod.org/connectionism2002/opdrachten/, but alas most of the
explanations are in Dutch.

I hope that you get a lot of reactions. If you do get many, would it
perhaps be possible for you to share the information somehow, perhaps
by putting the reactions on the web or sending a mail to
Connectionists? I would be very interested in what books others have
used. Thank you in advance.

Sincerely,
Martijn Meeter
_____________________________________________________________________________
I give a course simply called "Neural Networks" for undergraduate
computer science students (3rd year). The book is S. Haykin, Neural
Networks: A Comprehensive Foundation, Second edition, Prentice Hall,
1999. ISBN 0-13-273350-1.

The book is indeed comprehensive, but in my opinion not an ideal book
for computer science students. More so for engineering students, I
think. The signal-theoretic viewpoint on neural networks is lost on
computer science students, who would prefer a more computational view.

But I still use the book, since it is so comprehensive. The book is
both deeper and wider than the course content, which makes it possible
for the students to use the same book as a reference book when they
want to dig deeper or broaden their view. This is useful since the
course ends with a project-like assignment, defined by the students
themselves. Different students dig in different places, but most of
them can use the same source.

----- If God is real, why did he create discontinuous functions? -----
Olle Gällmo, Dept. of Computer Systems, Uppsala University
Snail Mail: Box 325, SE-751 05 Uppsala, Sweden.   Tel: +46 18 471 10 09
URL: http://www.docs.uu.se/~crwth                 Fax: +46 18 55 02 25
Email: crwth at DoCS.UU.SE
_____________________________________________________________________________
 Germán Mato and I are giving a course on biologically oriented
 neural networks for advanced undergraduate students in physics
 and engineering. This is the material, and the books we include:

 1. An introduction to the nervous system, neurons and synapses:
    a snapshot from "Essentials of Neural Science and Behavior" by
    Kandel, Schwartz and Jessell.
    2 classes.

 2. Single cell dynamics:
    3 classes. We take the theory of dynamical systems and excitable
    media in "Nonlinear Dynamics and Chaos: With Applications to
    Physics, Biology, Chemistry, and Engineering" by S. Strogatz,
    and then adapt it to several neuronal models (Hodgkin-Huxley,
    integrate-and-fire, FitzHugh-Nagumo). We study the stable states,
    their stability, phase transitions, and the emergence of cycles.
    (A minimal simulation sketch of one of these models appears after
    this list.)

 3. Dynamic behavior of a large ensemble of neurons:
    We use chapters 3 and 5 of Kuramoto's book "Chemical Oscillations,
    Waves and Turbulence" to introduce the phase models and study the
    conditions for synchronization. We exemplify with the
    integrate-and-fire and Hodgkin-Huxley models. 3 classes.

 4. Associative memories:
    "Introduction to the Theory of Neural Computation", Hertz, Krogh
    and Palmer.
    8 classes.

 5. Information theoretical analysis of neural coding and decoding:
    We use Cover and Thomas for the basic definitions. Two books,
    "Neural Networks and Brain Function" by Rolls and Treves, and
    "Spikes" by Rieke et al., provide us with examples in the nervous
    system. We study the problem of decoding the neural signal, how
    to overcome limited sampling, rate coding, the role of correlations
    and time-dependent codes, and how to calculate the information
    transmitted in very short time windows.

    This unit is not actually one where we analyze how to build or
    model a neural network, but rather, how to extract information
    from it.

    10 classes

 6. Modeling cortical maps:
    Hertz's "Introduction to the Theory of Neural Computation" gives
    us the basics of unsupervised learning. In Haykin's "Neural
    Networks" we find an information-theoretical description of the
    Linsker effect and of self-organized feature maps. In "Neural
    Networks and Brain Function", Rolls and Treves give examples of
    this in the brain.
    6 classes.
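
 A minimal sketch of one of the single-cell models from unit 2, a
 leaky integrate-and-fire neuron, written in Python; the parameter
 values are invented for illustration and are not taken from the
 course materials:

    # Leaky integrate-and-fire: dV/dt = (-(V - V_rest) + R*I) / tau.
    # Spike and reset whenever V crosses V_thresh. Values illustrative.
    tau, R = 10.0, 1.0                # time constant (ms), resistance
    V_rest, V_thresh, V_reset = -65.0, -50.0, -65.0   # potentials (mV)
    dt, T, I = 0.1, 100.0, 20.0       # step (ms), duration (ms), input

    V, spike_times = V_rest, []
    for step in range(int(T / dt)):   # forward-Euler integration
        V += dt * (-(V - V_rest) + R * I) / tau
        if V >= V_thresh:             # threshold crossing: spike, reset
            spike_times.append(step * dt)
            V = V_reset

    print(len(spike_times), "spikes; first at", spike_times[0], "ms")

 With these settings the steady-state potential (-45 mV) sits above
 threshold, so the model fires periodically; lowering I below 15
 silences it, which is the basic excitability phenomenon the unit
 studies.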

 Ines Samengo
 samengo at cab.cnea.gov.ar
 http://www.cab.cnea.gov.ar/users/samengo/samengo.html
 tel: +54 2944 445100 - fax: +54 2944 445299
 Centro Atomico Bariloche
 (8.400) San Carlos de Bariloche
 Rio Negro, Argentina
_______________________________________________________________________
From: Geoff Goodhill <geoff at cns.georgetown.edu>

What a great question! I have twice so far at Georgetown taught a
"computational neuroscience" course, as an elective course for
graduate students in the Interdisciplinary Program in Neuroscience
(backgrounds: bio/psych).  The class has been around 4-5 students each
time, including one physics undergrad in each case. I used mostly
papers, plus a few chapters from Churchland and Sejnowski. Overall I
found the experience very unsatisfying, since none of the students
could deal with any math (not even what I would regard as high school
math).

This year I am instead organizing an informal (no credit) reading
group to work through the new Dayan and Abbott book. I think this is
excellent: broad, sophisticated, though certainly not for anyone
without a very solid math background. I think next year I will offer a
formal course again (fortunately it's entirely up to me if and when I
teach it) based around this book, and basically tell prospective
students that if they can't understand the math then don't bother
coming.
_______________________________________________________________________
From: Stevo Bozinovski <sbozinovski at sets.scsu.edu>

I am teaching Artificial Intelligence at the Computer Science
department, South Carolina State University. It is an
undergraduate-level (CS 323) course. I use Pfeifer's book
"Understanding Intelligence", in which a whole chapter is dedicated to
neural networks. The book covers multilayer perceptron type networks
sufficiently. The adaptive neural arrays are not covered, so I use my
book "Consequence Driven Systems" to cover them.
_______________________________________________________________________
From: Lalit Gupta <gupta at engr.siu.edu>

At SIU-C, we offer a graduate level Neural Networks course in the Dept of
Electrical & Computer Engineering.

The text that is currently being used is:

Neural Networks, Second Edition, Simon Haykin, Prentice Hall, 1999.

It covers many aspects of neural networks.  It is a relatively difficult
text for students to read directly without instruction.  It is suitable for
both engineers and scientists.


Information about the text is available at:
http://www.amazon.com/exec/obidos/tg/stores/detail/-/books/0132733501/glance
/ref=pm_dp_ln_b_1/103-6340812-8312600
_______________________________________________________________________
From: Todd Troyer <ttroyer at glue.umd.edu>

I have been teaching a one semester graduate level survey course entitled
Computational Neuroscience for the last two years. The course is part of an
optional core sequence of required courses in Maryland's Neuroscience and
Cognitive Science interdisciplinary Ph.D. program.  The course is designed to
be taken by students spanning the range of neuroscience and cognitive science
- everyone from engineers, and computer scientists to biologists and
psychologists who have only a rudimentary math background. I too was
frustrated with the lack of available textbooks and so have spent a lot of
time writing extensive "class notes" in semi-book form.  At this point things
are obviously incomplete.  I'm also spending some time developing computer
simulations and I hope to have a "lab" component of the course next fall.
Both things take huge amounts of time.

The topic areas I focus on have extensive overlap with the topics presented in
the new Dayan and Abbott book, but I spend more time on the linear
algebra/network stuff.  Overall the course is at a more basic level than Dayan
and Abbott.  I feel that their book ramps up the math too fast for many of my
students.  Next fall I will also require Churchland and Sejnowski's The
Computational Brain.  I found that, in focusing on the presentation of the
technical material, I was having a hard time organizing the presentation of
the biological examples.  The Computational Brain, while getting a bit dated,
does a wonderful job at this (with the usual minor qualms).  I also recommend
the Rieke et al. Spikes book for my more advanced students.

I've looked at some other books as well.  I like the Anderson book, but it
doesn't focus on the material that I'd like.  I think Randy O'Reilly's new
book presents an interesting attempt to present a connectionist perspective on
cognitive neuroscience, although I haven't had a chance to look through it in
detail.

Overall, I've found that the key hinge point in thinking about such a class is
the students' computational skills.  It is much easier either to assume a reasonably
sophisticated math background or not to delve into the math much at all.  In
the first case you can use formal language to talk about how the equations
relate to the underlying ideas, but the course excludes students with
noncomputational backgrounds.  Without assuming such a background you can talk
about some of the underlying concepts, but such a presentation tends to be
vague and often either overstates what computation adds or seems obvious and
not interesting.  While I've had mixed success so far, I believe that you can
have a unified class where you teach enough math to bring the students that
need it up to speed, at least to the degree that all students can begin to
explore the *relationship* between the math and the neuroscience.  I've been
trying to set things up so that the students with a stronger background can
work on self-directed outside reading and/or projects during some of the basic
math sessions.  It's a stretch.  Finally, I'm thinking about developing the
lab component into a joint lab/lecture course that I could teach to advanced
undergraduates, again from a range of backgrounds.


Todd W. Troyer                          Ph: 301-405-9971
Dept. of Psychology                     FAX: 301-314-9566
Program in Neuroscience                 ttroyer at glue.umd.edu
   and Cognitive Science                www.glue.umd.edu/~ttroyer
University of Maryland
College Park, MD 20742
____________________________________________________________________________
I teach a course Computational Neurobiology, to graduate students in the
school of computational sciences.  Most students are computational
neuroscience students, though some engineers, math majors, and
psychologists have taken it.  I use Johnston and Wu, Foundations of
Cellular Neurophysiology.  It is fabulous, because is has all the
equations describing  cell properties.  I.e. cable equation,
Hodgkin-Huxley equations, etc.  It is probably a bit advanced for
undergrads, and even the psych majors have trouble if they don't have a
strong math background (which they don't at GMU).  The only thing against
the book is the lack of good figures, such as those found in From Neuron
to Brain by Nicholls et al.  But I supplement my lectures from that book,
since the students have not had a basic non-computational neuroscience
course.

I also teach a course, Computational Neuroscience Systems.  However, I
haven't found a good textbook, so I use Kandel and Schwartz, or
Fundamental Neuroscience, for the basic neuroscience systems, and
supplement each lecture with one or two modeling papers on the system
we discuss.  I'm still hoping to find a book that presents different
methods of modeling systems of neurons, e.g. a chapter or two on (1)
using GENESIS and NEURON to develop large-scale detailed models, (2) a
linear systems approach, (3) a neural network approach, (4)
integrate-and-fire neuron networks, (5) spike train analysis, and (6)
higher-level abstract models, such as the Rescorla-Wagner model.  If
you know of any book like this, please let me know.
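
As an aside, the Rescorla-Wagner model mentioned in (6) is simple
enough to sketch in a few lines of Python; the parameter values below
are invented for illustration and not drawn from any of the texts
named above:

    # Rescorla-Wagner rule: the change in associative strength V on
    # each trial is alpha*beta*(lambda - V), i.e. learning is driven
    # by the prediction error. Parameter values illustrative only.
    alpha, beta, lam = 0.3, 1.0, 1.0   # CS salience, US rate, US level
    V = 0.0                            # strength of a single CS
    for trial in range(10):            # repeated CS-US pairings
        V += alpha * beta * (lam - V)  # error-driven update
        print("trial", trial + 1, ": V =", round(V, 3))

The strength V climbs toward lam along the familiar negatively
accelerated learning curve, which is what makes this a natural
"higher-level abstract model" for a systems course.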

Avrama Blackwell
____________________________________________________________________________
I teach an undergrad course on "Information processing models."
It's mainly psych and neuro students, and a handful of graduate students.
Most have little or no math or programming background.  I have not
found any one book that's good for this level.  I have written up some
of my own lecture notes and mini-tutorials which you can get at

  http://redwood.ucdavis.edu/bruno/psc128

I am also teaching a graduate course this spring on computational
neuroscience in which I will use Peter and Larry's book for the first
time.  I offer this course only once every two years though.

Bruno A. Olshausen              Phone: (530) 757-8749
Center for Neuroscience         Fax:   (530) 757-8827
UC Davis                        Email: baolshausen at ucdavis.edu
1544 Newton Ct.                 WWW:   http://redwood.ucdavis.edu/bruno
Davis, CA 95616
_____________________________________________________________________________




