From tibs at stat.Stanford.EDU Wed Aug 1 01:14:22 2001
From: tibs at stat.Stanford.EDU (Rob Tibshirani)
Date: Tue, 31 Jul 2001 22:14:22 -0700 (PDT)
Subject: book announcement
Message-ID: <200108010514.WAA744941@rgmiller.Stanford.EDU>
Book announcement:
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
536 pages (in full color)
Trevor Hastie, Robert Tibshirani, and Jerome Friedman
Springer-Verlag, 2001
For full details see
http://www-stat.stanford.edu/ElemStatLearn
Here is a brief description:
During the past decade there has been an explosion in computation and
information technology. With it has come vast amounts of data in a
variety of fields such as medicine, biology, finance, and marketing.
The challenge of understanding these data has led to the development of
new tools in the field of Statistics, and spawned new areas such as
data mining, machine learning and bioinformatics.
Many of these tools have common underpinnings but are often expressed
with different terminology. This book describes the important ideas
in these areas in a common conceptual framework. While the approach
is statistical, the emphasis is on concepts rather than mathematics.
Many examples are given, with a liberal use of color graphics. It
should be a valuable resource for statisticians and anyone interested in
data-mining in science or industry.
The book's coverage is broad, from supervised learning (prediction) to
unsupervised learning. The many topics include neural networks,
support vector machines, classification trees, and boosting (the
first comprehensive treatment of this topic in any book).
Jerome Friedman, Trevor Hastie, and Robert Tibshirani are Professors
of Statistics at Stanford University. They are prominent researchers
in this area: Friedman is the (co-)inventor of many data-mining tools
including CART, MARS, and projection pursuit. Hastie and Tibshirani
developed generalized additive models and wrote a popular book of that
title. Hastie wrote much of the statistical modelling software in
S-PLUS, and invented principal curves and surfaces. Tibshirani proposed
the Lasso and co-wrote the best-selling book "An Introduction to the
Bootstrap".
**********************************************
Rob Tibshirani, Dept of Health Research & Policy
and Dept of Statistics
HRP Redwood Bldg
Stanford University
Stanford, California 94305-5405
phone: HRP: 650-723-7264 (Voice mail), Statistics 650-723-1185
FAX 650-725-8977
tibs at stat.stanford.edu
http://www-stat.stanford.edu/~tibs
From ingber at ingber.com Thu Aug 2 17:59:56 2001
From: ingber at ingber.com (Lester Ingber)
Date: Thu, 2 Aug 2001 16:59:56 -0500
Subject: Paper: Probability tree algorithm for general diffusion processes
Message-ID: <20010802165956.A13979@ingber.com>
The following preprint is available:
%A L. Ingber
%A C. Chen
%A R.P. Mondescu
%A D. Muzzall
%A M. Renedo
%T Probability tree algorithm for general diffusion processes
%J Physical Review E
%P (to be published)
%D 2001
%O URL http://www.ingber.com/path01_pathtree.ps.gz
ABSTRACT
Motivated by path-integral numerical solutions of diffusion
processes (PATHINT), we present a new tree algorithm, PATHTREE,
which permits extremely fast and accurate computation of probability
distributions of a large class of general nonlinear diffusion
processes.
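The flavor of such a tree algorithm can be sketched with a much simpler relative: a recombining binomial tree for a one-dimensional diffusion dX = mu(X) dt + sigma(X) dW, with the local up-probability chosen to match the drift. This is only an illustrative toy under simplifying assumptions (a uniform spatial step fixed by sigma at the starting point), not the PATHTREE algorithm of the paper:

```python
import math

def binomial_tree_distribution(x0, mu, sigma, T, n_steps):
    """Approximate the time-T distribution of dX = mu(X) dt + sigma(X) dW
    on a recombining binomial tree. Toy sketch only: the spatial step is
    fixed from sigma at x0, unlike the adaptive construction in the paper."""
    dt = T / n_steps
    dx = sigma(x0) * math.sqrt(dt)            # uniform spatial step
    probs = {0: 1.0}                          # integer offset -> probability mass
    for _ in range(n_steps):
        new = {}
        for k, p in probs.items():
            x = x0 + k * dx
            # match the local drift: E[step] = dx * (2*p_up - 1) = mu(x) * dt
            p_up = min(max(0.5 * (1.0 + mu(x) * dt / dx), 0.0), 1.0)
            new[k + 1] = new.get(k + 1, 0.0) + p * p_up
            new[k - 1] = new.get(k - 1, 0.0) + p * (1.0 - p_up)
        probs = new
    return {x0 + k * dx: p for k, p in probs.items()}

# constant-coefficient check: the mean of the tree distribution equals mu*T
dist = binomial_tree_distribution(0.0, lambda x: 0.1, lambda x: 0.3, T=1.0, n_steps=100)
mean = sum(x * p for x, p in dist.items())
```

For constant coefficients this converges to the Gaussian solution; the paper's contribution is treating general nonlinear mu(x) and sigma(x) fast and accurately.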
--
Prof. Lester Ingber ingber at ingber.com www.ingber.com
ingber at alumni.caltech.edu www.alumni.caltech.edu/~ingber
From allan at biomedica.org Thu Aug 2 18:24:25 2001
From: allan at biomedica.org (Allan Kardec Barros)
Date: Thu, 02 Aug 2001 19:24:25 -0300
Subject: Extraction of Specific Signals with Temporal Structure
Message-ID: <3B69D319.562840E6@biomedica.org>
Apologies if you receive multiple copies of this message.
Dear Everyone,
I would like to announce the following paper, recently published in
Neural Computation. For those familiar with ICA, the difference in this
algorithm is basically that, given some simple assumptions, we prove
that the permutation problem can be avoided. The algorithm is
quite simple, based on second-order statistics, and does not
require the usual ICA assumption that at most one signal be Gaussian.
Please feel free to mail me requesting either PS or PDF copies of
our work.
Best Regards,
ak.
TITLE: Extraction of Specific Signals with Temporal Structure.
AUTHORS: A. K. Barros and A. Cichocki.
ABSTRACT:
In this work we develop a very simple batch learning algorithm for
semi-blind extraction of a desired source signal with temporal
structure from linear mixtures. Although we use the concept of
sequential blind extraction of sources and independent component
analysis (ICA), we neither carry out the extraction in a completely
blind manner nor assume that the sources are statistically
independent. In fact, we show that a priori information
about the auto-correlation function of primary sources can be used to
extract the desired signals (sources of interest) from their linear
mixtures. Extensive computer simulations and real data application
experiments confirm the validity and high performance of the proposed
algorithm.
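The second-order, temporal-structure idea can be illustrated with a simplified AMUSE-style sketch (not the authors' exact algorithm): a source with a strong autocorrelation at a known lag tau can be pulled out of a linear mixture by whitening and then taking the dominant eigenvector of the symmetrized lag-tau covariance:

```python
import numpy as np

def extract_by_lag(X, tau):
    """Extract one source from linear mixtures X (channels x samples) using
    only second-order statistics: whiten, then take the dominant eigenvector
    of the symmetrized lag-tau covariance. This selects the source with the
    strongest autocorrelation at lag tau. A simplified AMUSE-style sketch,
    not the exact Barros-Cichocki algorithm."""
    X = X - X.mean(axis=1, keepdims=True)
    # whitening transform V such that cov(V @ X) = I
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = V @ X
    # symmetrized lag-tau covariance of the whitened data
    Ct = Z[:, tau:] @ Z[:, :-tau].T / (Z.shape[1] - tau)
    Ct = 0.5 * (Ct + Ct.T)
    vals, vecs = np.linalg.eigh(Ct)
    w = vecs[:, np.argmax(vals)]   # direction of maximal lag-tau autocorrelation
    return w @ Z                   # estimated source (up to sign and scale)

# toy demo: a sinusoid (autocorrelated at its period) mixed with white noise
rng = np.random.default_rng(0)
t = np.arange(2000)
s1 = np.sin(2 * np.pi * t / 50)
s2 = rng.standard_normal(2000)
A = rng.standard_normal((2, 2))
X = A @ np.vstack([s1, s2])
y = extract_by_lag(X, tau=50)
```

The white-noise component has near-zero autocorrelation at lag 50, so the dominant eigenvector isolates the sinusoid without any independence or non-Gaussianity assumption.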
From ted.carnevale at yale.edu Fri Aug 3 12:07:49 2001
From: ted.carnevale at yale.edu (Ted Carnevale)
Date: Fri, 03 Aug 2001 12:07:49 -0400
Subject: NEURON course at SFN 2001 meeting
Message-ID: <3B6ACC55.93EBBAD1@yale.edu>
Short Course Announcement
USING THE NEURON SIMULATION ENVIRONMENT
Satellite Symposium, Society for Neuroscience Meeting
9 AM - 5 PM on Saturday, Nov. 10, 2001
Speakers: N.T. Carnevale, M.L. Hines,
J.W. Moore, and G.M. Shepherd
This one-day course with lectures and live demonstrations will
present information essential for teaching and research
applications of NEURON, an advanced simulation environment
that handles realistic models of biophysical mechanisms,
individual neurons, and networks of cells. The emphasis is
on practical issues that are key to the most productive use
of this powerful and convenient modeling tool.
Features that will be covered include:
  - constructing and managing models with the CellBuilder,
    Network Builder, and Linear Circuit Builder
  - importing detailed morphometric data
  - using the Multiple Run Fitter to optimize models
    with high-dimensional parameter spaces
  - database resources for empirically based modeling
Each registrant will receive a comprehensive set of notes, which
include material that has not appeared elsewhere in print.
For more information see the course's WWW pages at
http://www.neuron.yale.edu/sd2001.html
--Ted
Supported in part by the National Science Foundation.
Opinions expressed are those of the authors
and not necessarily those of the Foundation.
From Alton.Ford at tmp.com Fri Aug 3 18:14:06 2001
From: Alton.Ford at tmp.com (Ford, Alton)
Date: Fri, 3 Aug 2001 17:14:06 -0500
Subject: Postdoc positions at Los Alamos National Laboratory
Message-ID:
Postdoctoral Positions in Experimental and Computational Neuroscience
The Biophysics Group (http://www.biophysics.lanl.gov/) in the Physics
Division at Los Alamos National Laboratory seeks several postdoctoral
candidates in the areas of experimental and computational neuroscience.
Existing projects include recording of fast optical transients from neural
tissue and the development of associated high speed data acquisition
systems, imaging devices and optical technology, analysis of evoked MEG and
fMRI signals, computational modeling of information processing within
biological neural systems, and collaborative work on the development of a
retinal prosthetic device. Successful candidates could combine work in
several of these areas. For further technical information, contact Dr. John
George at jsg at lanl.gov.
A Ph.D. in Physics, Electrical Engineering, Biology, or a related discipline
completed within the last three years or soon to be completed is required.
Current starting salaries range from $54,100 to $58,300. Further details
about the Postdoctoral Program may be found at:
http://www.hr.lanl.gov/postdoc/. For consideration, submit a resume and
publications list with a cover letter outlining current research interests,
including contact information for three references, to postdoc-jobs at lanl.gov
(reference PD017639), or submit two copies to:
Postdoc Program Office, PD017639, MS-P290, Los Alamos National Laboratory,
Los Alamos, NM 87545.
Los Alamos National Laboratory is operated by the University of California
for the U.S. Department of Energy. AA/EOE
Alton Ford
Account Executive
tmp.worldwide
Advertising & Communications
3032 Bunker Hill Lane, Suite 207
Santa Clara, CA 95054
* 408.844.0150
* 408.496.6704 fax
* alton.ford at tmp.com
www.tmp.com
Complement your recruitment advertising with Web Dragon! ...A service
provided by TMP Worldwide, in which a team of our professionals mines
the millions of resumes on the Internet to find qualified resumes that
meet your specified recruitment needs. There are more resumes online than
ever. Fill your pipeline with quality resumes and let TMP do the work for
you! Please contact me for more information.
From juergen at idsia.ch Mon Aug 6 12:20:15 2001
From: juergen at idsia.ch (juergen@idsia.ch)
Date: Mon, 6 Aug 2001 18:20:15 +0200
Subject: PhD fellowship
Message-ID: <200108061620.SAA08768@ruebe.idsia.ch>
I am seeking a PhD student for research on state-of-the-art recurrent
neural networks. Please see http://www.idsia.ch/~juergen/phd2001.html
Interviews also possible at ICANN 2001 (Aug 21-25) in Vienna or at the
ICANN recurrent net workshop: http://www.idsia.ch/~doug/icann/index.html
-------------------------------------------------
Juergen Schmidhuber director
IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland
juergen at idsia.ch www.idsia.ch/~juergen
From CogSci at psyvax.psy.utexas.edu Tue Aug 7 13:34:51 2001
From: CogSci at psyvax.psy.utexas.edu (Cognitive Science Society)
Date: Tue, 07 Aug 2001 12:34:51 -0500
Subject: Richard M. Shiffrin awarded the Rumelhart Prize
Message-ID: <5.0.0.25.2.20010807123359.00b05848@psy.utexas.edu>
Richard M. Shiffrin Chosen to Receive the David E. Rumelhart Prize
for Contributions to the Formal Analysis of Human Cognition
The Glushko-Samuelson Foundation and the Cognitive Science Society are
pleased to announce that Richard M. Shiffrin has been chosen as the
second recipient of the $100,000 David E. Rumelhart Prize, awarded
annually for an outstanding contribution to the formal analysis of
human cognition. Shiffrin will receive this prize and give the Prize
Lecture at the 2002 Meeting of the Cognitive Science Society, at
George Mason University, August 7-11, 2002.
Shiffrin has made many contributions to the modeling of human
cognition in areas ranging from perception to attention to learning,
but is best known for his long-standing efforts to develop explicit
models of human memory. His most recent models use Bayesian, adaptive
approaches, building on previous work but extending it in a critical
new manner, and carrying his theory beyond explicit memory to implicit
learning and memory processes. The theory has been evolving for about
35 years, and as a result represents a progression similar to the best
theories seen in any branch of science.
Shiffrin's major effort began in 1968, in a chapter with Atkinson [1]
that laid out a model of the components of short- and long-term memory
and described the processes that control the operations of memory.
The Atkinson-Shiffrin model encapsulated empirical and theoretical
results from a very large number of publications that modeled
quantitatively the relation of short- to long-term memory. It
achieved its greatest success by showing the critical importance---and
the possibility---of modeling the control processes of cognition.
This chapter remains one of the most cited works in the entire field
of psychology.
Shiffrin's formal theory was taken forward in a quantum leap in 1980
[2] and 1981 [3] with the SAM (Search of Associative Memory) model.
This was a joint effort with Jeroen Raaijmakers, then a graduate
student. The SAM model quantified the nature of retrieval from
long-term memory, and characterized recall as a memory search with
cycles of sampling and recovery. The SAM theory precisely
incorporates the notions of interactive cue combination that are now
seen to lie at the heart of memory retrieval. Another major quantum
step occurred in 1984 [4] when the theory was extended to recognition
memory. With another former student, Gary Gillund, Shiffrin initiated
what has become the standard approach to recognition memory, in which
a decision is based on summed activation of related memory traces. It
was a major accomplishment that the same retrieval activations that
had been used in the recall model could be carried forward and used to
predict a wide range of recognition phenomena. The next major step
occurred in 1990, when Shiffrin published two articles on the
list-length effect with his student Steve Clark and his colleague,
Roger Ratcliff [5, 6]. This research was of critical importance in
that it established clearly that experience leads to the
differentiation, rather than the mere strengthening, of the
representations of items in memory.
In 1997, the theory evolved in a radical direction in an important
paper with another former student, Mark Steyvers [7]. Although the
changes were fundamental, the new model retained the best concepts of
its predecessors, so that the previous successful predictions were
also a part of the new theory. REM (Retrieving Effectively from
Memory) added featural representations to
capture similarity relations among items in memory. Building on
earlier ideas by John Anderson, and related ideas developed in
parallel by McClelland and Chappell, Shiffrin used Bayesian principles
of adaptive and optimal decision making under constraints to guide the
selection of the quantitative form of the activation functions. In
addition, storage principles were set forth that provided mechanisms
by which episodic experience could coalesce over development and
experience into permanent non-contextualized knowledge. This latter
development allowed the modeling of implicit memory phenomena, in work
that is just now starting to appear in many journals, including a
theory of long-term priming [with Schooler and Raaijmakers, 8] and a
theory of short-term priming [with his student David Huber and others,
9]. The short-term priming research showed that the direction of
priming can be reversed by extra study given to particular primes,
leading to another conceptual breakthrough. A new version of the REM
model explains this and other findings by assuming that some prime
features are confused with test item features, and that the system
attempts to deal with this situation optimally by appropriate
discounting of evidence from certain features.
Biographical Information
Shiffrin received his Ph. D. from the Mathematical Psychology Program
in the Department of Psychology at Stanford University in 1968, the
year after Rumelhart received his degree from the same program. Since
1968 he has been on the faculty of the Department of Psychology at
Indiana University, where he is now the Luther Dana Waterman Professor
of Psychology and Director of the Cognitive Science Program. Shiffrin
has accumulated many honors, including membership in the National
Academy of Sciences, the American Academy of Arts and Sciences, the
Howard Crosby Warren Award of the Society of Experimental
Psychologists, and a MERIT Award from the National Institute of Mental
Health. Shiffrin has served the field as editor of the Journal of
Experimental Psychology: Learning Memory and Cognition, and as a
member of the governing boards of several scientific societies.
Cited Articles By Richard M. Shiffrin
[1] Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A
proposed system and its control processes. In K. W. Spence and
J. T. Spence (Eds.), The Psychology of Learning and Motivation:
Advances in Research and Theory (Vol. 2, pp. 89-195). New York:
Academic Press.
[2] Raaijmakers, J. G. W., & Shiffrin, R. M. (1980). SAM: A theory of
probabilistic search of associative memory. In G. H. Bower (Ed.),
The Psychology of Learning and Motivation (Vol. 14, pp. 207-262). New
York: Academic Press.
[3] Raaijmakers, J. G. W., & Shiffrin, R. M. (1981). Search of
associative memory. Psychological Review, 88, 93-134.
[4] Gillund, G., & Shiffrin, R. M. (1984). A retrieval model for both
recognition and recall. Psychological Review, 91, 1-67.
[5] Ratcliff, R., Clark, S., & Shiffrin, R. M. (1990). The
list-strength effect: I. Data and discussion. Journal of
Experimental Psychology: Learning, Memory, and Cognition, 16, 163-178.
[6] Shiffrin, R. M., Ratcliff, R., & Clark, S. (1990). The
list-strength effect: II. Theoretical mechanisms. Journal of
Experimental Psychology: Learning, Memory, and Cognition, 16, 179-195.
[7] Shiffrin, R. M., & Steyvers, M. (1997). A model for recognition
memory: REM: Retrieving effectively from memory. Psychonomic Bulletin
and Review, 4 (2), 145-166.
[8] Schooler, L., Shiffrin, R. M., & Raaijmakers, J. G. W. (2001). A
model for implicit effects in perceptual identification. Psychological
Review, 108, 257-272.
[9] Huber, D. E., Shiffrin, R. M., Lyle, K. B., & Ruys, K. I. (2001).
Perception and preference in short-term word priming. Psychological
Review, 108, 149-182.
================================================================
Geoffrey E. Hinton Named First Recipient of the David E. Rumelhart Prize
May 3, 2001
Today the Glushko-Samuelson foundation and the Cognitive Science
Society jointly announced that Geoffrey E. Hinton has been named the first
recipient of the David E. Rumelhart Prize for contemporary
contributions to the formal analysis of human cognition. Hinton, the
Director of the Gatsby Computational Neuroscience Unit at University
College, London, was chosen from a large field of outstanding nominees
because of his seminal contributions to the understanding of neural
networks.
"Hinton's insights into the analysis of neural networks played a
central role in launching the field in the mid-1980's" said Professor
James McClelland of Carnegie Mellon University, Chair of the Prize
Selection Committee, "Geoff also played a major role in conveying the
relevance of neural networks to higher-level cognition." Professor
Lawrence Barsalou of Emory University, President of the Cognitive
Science Society, agreed with this assessment. "Hinton's contributions
to Cognitive Science have been pivotal", said Barsalou. "As the first
recipient he sets a great example for future awards." Hinton will
receive the prize, which includes a monetary award of $100,000, at the
annual meeting of the Society in Edinburgh, Scotland, in early August,
2001.
The Rumelhart prize acknowledges intellectual generosity and effective
mentoring as well as scientific insight. "Dave Rumelhart gave away
many scientific ideas, and made important contributions to the work of
many of his students and co-workers" said Robert J. Glushko, President of
the Glushko-Samuelson foundation. He added "Hinton stands out not
only for his own contributions but for his exemplary record in
mentoring young scientists." A total of eighteen graduate students
have received their Ph. D.'s under Hinton's supervision.
In conjunction with naming Hinton as the first recipient of the David
E. Rumelhart Prize, the Glushko-Samuelson foundation announced that
the prize will be awarded on an annual basis, instead of biennially.
"This change reflects the number of outstanding scientists who were
nominated for the award" noted Glushko. "I am pleased that my
foundation can play a role in honoring their contributions to
cognitive science." The second recipient of the Prize will be
announced at the Edinburgh meeting of the society, and will give the
prize lecture at the next annual meeting, which will be at George
Mason University in August, 2002.
For further information, please visit the David E. Rumelhart Prize
web site:
http://www.cnbc.cmu.edu/derprize/DerPrize2001.html
or contact:
Robert J. Glushko, 415-644-8731
James L. McClelland, 412-268-3157
----------
Cognitive Science Society
c/o Tanikqua Young
Department of Psychology
University of Texas
Austin, TX 78712
Phone: (512) 471-2030
Fax: (512) 471-3053
----------
From amari at brain.riken.go.jp Thu Aug 9 00:57:01 2001
From: amari at brain.riken.go.jp (Shun-ichi Amari)
Date: Thu, 9 Aug 2001 13:57:01 +0900
Subject: FW: new book on Information Geometry
Message-ID:
********************
Dear Connectionists
I have announced the publication of the book
"Methods of Information Geometry"
but heard complaints that the book is out of stock.
It has now been reprinted, and you can order it from the
AMS or Oxford University Press through bookshops.
*************
It is my pleasure to announce the publication of
a book on Information Geometry. I have often been
asked whether there is a good book from which to learn the
general perspectives of information geometry. Here it is.
S.Amari and H.Nagaoka, Methods of Information Geometry,
AMS Translations of Mathematical Monographs, vol 191
(translated by Daishi Harada)
American Mathematical Society (AMS) and Oxford University Press,
206 + x pages, 2000. (See http://www.ams.org/)
********************
Shun-ichi Amari
Vice Director, RIKEN Brain Science Institute
Laboratory for Mathematical Neuroscience
Research Group on Brain-Style Information Systems
tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687
amari at brain.riken.go.jp
http://www.bsis.brain.riken.go.jp/
From orhan at ait-tech.com Wed Aug 8 14:16:02 2001
From: orhan at ait-tech.com (Orhan Karaali)
Date: Wed, 8 Aug 2001 14:16:02 -0400
Subject: Research Scientist position at Advanced Investment Technology
Message-ID:
ADVANCED INVESTMENT TECHNOLOGY, INC.
www.ait-tech.com
Advanced Investment Technology, Inc. (AIT) is a registered investment
advisor based in Clearwater, Florida focusing on institutional
domestic equity asset management. Our partners include Boston-based
State Street Global Advisors, a global leader in institutional
financial services, and Amsterdam-based Stichting Pensioenfonds ABP,
one of the world's largest pension plans. AIT's reputation as an
innovative entrepreneur within the asset management community is built
upon the research and development of nontraditional quantitative stock
valuation techniques (neural networks and genetic algorithms) for
which a patent was issued in 1998.
POSITION: RESEARCH SCIENTIST
The position will involve developing software and valuation algorithms
for stock selection and portfolio management. Job responsibilities
include database development, running weekly production jobs, working
with financial data vendor feeds, contributing to financial research
projects, and developing applications in the areas of multifactor
stock models.
AIT uses Windows 2000; Visual Studio 6 and Visual Studio Net; MS SQL 2000;
C++ STL; OLE DB; XML; SOAP; and OLAP technologies.
Minimum Qualifications:
Bachelor's degree in Computer Science or a related field
Master's degree in Computer Science or an MBA
Very strong C++ and STL background
Working knowledge of SQL
Bonus Qualifications:
Familiarity with financial data and asset management
Experience developing object oriented software with C++ and STL
Familiarity with Microsoft Visual Studio
Knowledge of machine learning algorithms (NN, GA, GP, SVM)
To apply, please send your resume to:
E-mail: orhan at ait-tech.com
Fax: (727) 799-1232 (Attn: Orhan Karaali)
From norman at psych.colorado.edu Sat Aug 11 00:03:12 2001
From: norman at psych.colorado.edu (Ken Norman)
Date: Fri, 10 Aug 2001 22:03:12 -0600 (MDT)
Subject: new paper: modeling hippocampal and neocortical contributions to
recognition memory
Message-ID:
Dear Connectionists,
The following technical report is now available for downloading as:
ftp://grey.colorado.edu/pub/oreilly/papers/NormanOReilly01_recmem.pdf
webpage: http://psych.colorado.edu/~oreilly/pubs-abstr.html#01_recmem
Modeling Hippocampal and Neocortical Contributions to Recognition
Memory: A Complementary Learning Systems Approach
Kenneth A. Norman and Randall C. O'Reilly
Department of Psychology
University of Colorado
Boulder, CO 80309
ICS Technical Report 01-02
Abstract:
We present a computational neural network model of recognition memory
based on the biological structures of the hippocampus and medial
temporal lobe cortex (MTLC), which perform complementary learning
functions. The hippocampal component of the model contributes to
recognition by recalling specific studied details. MTLC cannot
support recall, but it is possible to extract a scalar familiarity
signal from MTLC that tracks how well the test item matches studied
items. We present simulations that establish key qualitative
differences in the operating characteristics of the hippocampal recall
and MTLC familiarity signals, and we identify several manipulations
(e.g., target-lure similarity, interference) that differentially
affect the two signals. We also use the model to address the
stochastic relationship between recall and familiarity (i.e., are they
independent?), and the effects of partial vs. complete hippocampal
lesions on recognition.
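The scalar familiarity idea can be caricatured outside the network model: a global-matching signal that sums sharpened similarities between a test item and all studied traces behaves qualitatively like the MTLC signal described above. This is purely a toy sketch with made-up random vectors, not the trained cortical model of the report:

```python
import numpy as np

def familiarity(test_item, study_list):
    """Toy scalar familiarity: summed (cubed) cosine similarity of the test
    item to every studied trace. Cubing sharpens strong matches, as in
    classic global-matching models; the report's MTLC signal is instead
    read out from a learned cortical network."""
    sims = study_list @ test_item / (
        np.linalg.norm(study_list, axis=1) * np.linalg.norm(test_item))
    return float(np.sum(sims ** 3))

rng = np.random.default_rng(1)
studied = rng.standard_normal((20, 50))           # 20 studied traces, 50 features
target = studied[0] + 0.1 * rng.standard_normal(50)  # noisy copy of a studied item
lure = rng.standard_normal(50)                    # novel item
```

A target closely matching one studied trace yields a higher familiarity value than a novel lure, even though no single trace is explicitly recalled, which is the qualitative distinction the simulations exploit.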
From yokoy at brain.riken.go.jp Tue Aug 14 02:12:01 2001
From: yokoy at brain.riken.go.jp (Yoko Yamaguchi)
Date: Tue, 14 Aug 2001 15:12:01 +0900
Subject: Postdoctoral / technical staff positions in cognitive science and
computational neuroscience
Message-ID:
Please post:
POSTDOCTORAL SCIENTIST/ TECHNICAL STAFF POSITIONS at RIKEN BSI
Laboratory for Dynamics of Emergent Intelligence, Brain-Style Intelligence
Research Group, RIKEN Brain Science Institute (BSI) invites applicants for
postdoctoral and technical staff scientists in the fields of cognitive
science and computational neuroscience.
Our objective is to clarify the neural principle for the dynamics of
emergent intelligence in novel situations. Particular emphasis is given to
synchronization of oscillations in hierarchical neural networks.
For further information see http://www.dei.brain.riken.go.jp/
Applicants for postdoctoral positions must have a PhD.
Technical staff are expected to have a bachelor's or master's degree.
Applicants should submit a full CV detailing education and experience,
together with a photograph and a complete list of publications.
The names, addresses, and email addresses of two referees must
also be supplied. Send all applications to:
Contact:
Dr. Yoko Yamaguchi (Laboratory Head)
Lab. for Dynamics of Emergent Intelligence
Brain Science Institute, RIKEN
2-1 Hirosawa, Wako, Saitama, 351-0198 Japan
FAX: +81-48-467-6938
E-mail : yokoy at brain.riken.go.jp
-------------------------------------------------------------------
Yoko Yamaguchi
Lab. for Dynamics of Emergent Intelligence
RIKEN Brain Science Institute (BSI)
From bbs at bbsonline.org Mon Aug 13 16:27:04 2001
From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor))
Date: Mon, 13 Aug 2001 16:27:04 -0400
Subject: Norman: Two Visual Systems -- BBS Call for Commentators
Message-ID:
Dear Dr. Connectionists List User,
Below is the abstract of a forthcoming BBS target article
Two Visual Systems and Two Theories of Perception:
An Attempt to Reconcile the Constructivist and Ecological Approaches
by
Joel Norman
http://www.bbsonline.org/Preprints/Norman/
http://psy.haifa.ac.il/~maga/tvs&ttp.pdf
This article has been accepted for publication in Behavioral and Brain
Sciences (BBS), an international, interdisciplinary journal providing
Open Peer Commentary on important and controversial current research in
the biobehavioral and cognitive sciences.
Commentators must be BBS Associates or nominated by a BBS Associate. To
be considered as a commentator for this article, to suggest other
appropriate commentators, or for information about how to become a BBS
Associate, please reply by EMAIL within three (3) weeks to:
calls at bbsonline.org
The Calls are sent to 8000 BBS Associates, so there is no expectation
(indeed, it would be calamitous) that each recipient should comment
on every occasion! Hence there is no need to reply except if you wish
to comment, or to nominate someone to comment.
If you are not a BBS Associate, please approach a current BBS
Associate (there are currently over 10,000 worldwide) who is familiar
with your work to nominate you. All past BBS authors, referees and
commentators are eligible to become BBS Associates. A full electronic
list of current BBS Associates is available at this location to help
you select a name:
http://www.bbsonline.org/Instructions/assoclist.html
If no current BBS Associate knows your work, please send us your
Curriculum Vitae and BBS will circulate it to appropriate Associates to
ask whether they would be prepared to nominate you. (In the meantime,
your name, address and email address will be entered into our database
as an unaffiliated investigator.)
To help us put together a balanced list of commentators, please give
some indication of the aspects of the topic on which you would bring
your areas of expertise to bear if you were selected as a commentator.
To help you decide whether you would be an appropriate commentator for
this article, an electronic draft is retrievable from the online
BBSPrints Archive, at the URL that follows the abstract below.
_____________________________________________________________
Two Visual Systems and Two Theories of Perception:
An Attempt to Reconcile the Constructivist and Ecological Approaches
Joel Norman
Department of Psychology
University of Haifa
Haifa, Israel
jnorman at psy.haifa.ac.il
KEYWORDS: Visual perception theories, ecological, constructivist,
two visual systems, space perception, size perception,
dual-process approach
ABSTRACT: The two contrasting theoretical approaches to visual
perception, the constructivist and the ecological, are briefly
presented and illustrated through their analyses of space perception
and size perception. Earlier calls for their reconciliation and
unification are reviewed. Neurophysiological, neuropsychological, and
psychophysical evidence for the existence of two quite distinct visual
systems, the ventral and the dorsal, is presented. These two
perceptual systems differ in their functions: the ventral system's
central function is identification, while the dorsal system is
mainly engaged in the visual control of motor behavior. The strong
parallels between the ecological approach and the functioning of the
dorsal system and between the constructivist approach and the
functioning of the ventral system are noted. It is also shown that the
experimental paradigms used by the proponents of these two approaches
match the functions of the respective visual systems. A dual-process
approach to visual perception emerges from this analysis, with the
ecological-dorsal process transpiring mainly without conscious
awareness, while the constructivist-ventral process is normally
conscious. Some implications of this dual-process approach to
visual-perceptual phenomena are presented, with emphasis on space
perception.
http://www.bbsonline.org/Preprints/Norman/
http://psy.haifa.ac.il/~maga/tvs&ttp.pdf
___________________________________________________________
Please do not prepare a commentary yet. Just let us know, after having
inspected it, what relevant expertise you feel you would bring to bear
on what aspect of the article. We will then let you know whether it was
possible to include your name on the final formal list of invitees.
_______________________________________________________________________
*** SUPPLEMENTARY ANNOUNCEMENTS ***
(1) The authors of scientific articles are not paid money for their
refereed research papers; they give them away. What they want is to
reach all interested researchers worldwide, so as to maximize the
potential research impact of their findings.
Subscription/Site-License/Pay-Per-View costs are accordingly
access-barriers, and hence impact-barriers for this give-away
research literature.
There is now a way to free the entire refereed journal literature,
for everyone, everywhere, immediately, by mounting interoperable
university eprint archives, and self-archiving all refereed research
papers in them.
Please see: http://www.eprints.org
http://www.openarchives.org/
http://www.dlib.org/dlib/december99/12harnad.html
---------------------------------------------------------------------
(2) All authors in the biobehavioral and cognitive sciences are
strongly encouraged to self-archive all their papers in their own
institution's Eprint Archives or in CogPrints, the Eprint Archive
for the biobehavioral and cognitive sciences:
http://cogprints.soton.ac.uk/
It is extremely simple to self-archive and will make all of our
papers available to all of us everywhere, at no cost to anyone,
forever.
Authors of BBS papers wishing to archive their already published
BBS Target Articles should submit them to the BBSPrints Archive.
Information about the archiving of BBS' entire backcatalogue will
be sent to you in the near future. Meantime please see:
http://www.bbsonline.org/help/
and
http://www.bbsonline.org/Instructions/
---------------------------------------------------------------------
(3) Call for Book Nominations for BBS Multiple Book Review
In the past, Behavioral and Brain Sciences (BBS) had only been able
to do 1-2 BBS multiple book treatments per year, because of our
limited annual page quota. BBS's new expanded page quota will make
it possible for us to increase the number of books we treat per
year, so this is an excellent time for BBS Associates and
biobehavioral/cognitive scientists in general to nominate books you
would like to see accorded BBS multiple book review.
(Authors may self-nominate, but books can only be selected on the
basis of multiple nominations.) It would be very helpful if you
indicated in what way a BBS Multiple Book Review of the book(s) you
nominate would be useful to the field (and of course a rich list of
potential reviewers would be the best evidence of its potential
impact!).
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Please note: Your email address has been added to our user database for
Calls for Commentators, which is why you received this email. If you do
not wish to receive further Calls, please feel free to change your
mailshot status through your User Login link on the BBSPrints homepage.
Check the helpfiles for details of how to obtain your username and
password.
http://www.bbsonline.org/
For information about the mailshot, please see the help file at:
http://www.bbsonline.org/help/node5.html#mailshot
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
From murphyk at cs.berkeley.edu Tue Aug 14 20:42:21 2001
From: murphyk at cs.berkeley.edu (Kevin Murphy)
Date: Tue, 14 Aug 2001 17:42:21 -0700
Subject: OpenBayes
Message-ID: <3B79C56D.43E951A5@cs.berkeley.edu>
Richard Dybowski formed the OpenBayes discussion group/email list
on 17 January 2001. The goal is to discuss the development of an open
source library for probabilistic graphical models. We had our first
meeting at the recent UAI conference in Seattle. The only concrete
decision reached was that we should advertise the existence of this
group more widely - hence this email.
For more details on the OpenBayes project, please see
http://HTTP.CS.Berkeley.EDU/~murphyk/OpenBayes/index.html
This page includes a list of people who attended the meeting, more
details on the project's goals, achievements to date, ways you can
subscribe to the list and/or contribute code, etc.
Kevin Murphy
P.S. If you have problems subscribing to the list, please send email
to openbayes-owner at egroups.com, not to me! I am not the moderator.
From neted at anc.ed.ac.uk Wed Aug 15 05:54:39 2001
From: neted at anc.ed.ac.uk (Network Editor)
Date: Wed, 15 Aug 2001 10:54:39 +0100
Subject: NETWORK: Computation in Neural Systems
Message-ID: <15226.18143.81476.664143@gargle.gargle.HOWL>
Here is the contents page for the current issue of NETWORK:
Computation in Neural Systems. NETWORK publishes original research
work on theoretical and computational aspects of the development and
functioning of the nervous system, at all levels of analysis,
particularly at the network, cellular and subcellular levels.
Professor David Willshaw
Editor-in-Chief
NETWORK: Computation in Neural Systems
Institute for Adaptive & Neural Computation
Division of Informatics
University of Edinburgh
5 Forrest Hill
Edinburgh EH1 2QL
UK
Tel: +44-(0)131-650 4404
Fax: +44-(0)131-650 4406
Email: neted at anc.ed.ac.uk
========================================================================
NETWORK: COMPUTATION IN NEURAL SYSTEMS - VOLUME 12, ISSUE 3, AUGUST 2001
Special issue featuring selected papers from the Natural Stimulus
Statistics Workshop, October 2000, Cold Spring Harbor, USA
EDITORIALS
Publishing papers in Network: Special Issues
D J Willshaw (p 235)
Natural stimulus statistics
P Reinagel and S Laughlin (pp 237-240)
PAPERS
Redundancy reduction revisited
H Barlow (pp 241-253)
Characterizing the sparseness of neural codes
B Willmore and D J Tolhurst (pp 255-270)
Beats, kurtosis and visual coding
M G A Thomson (pp 271-287)
Estimating spatio-temporal receptive fields of auditory and visual
neurons from their responses to natural stimuli
F E Theunissen, S V David, N C Singh, A Hsu, W E Vinje and J L Gallant
(pp 289-316)
Neural coding of naturalistic motion stimuli
G D Lewen, W Bialek and R R de Ruyter van Steveninck (pp 317-329)
Nonlinear and extra-classical receptive field properties and the
statistics of natural scenes
C Zetzsche and F Röhrbein (pp 331-350)
Neuronal processing of behaviourally generated optic flow: experiments
and model simulations
R Kern, M Lutterklas, C Petereit, J P Lindemann and M Egelhaaf (pp 351-369)
Can recent innovations in harmonic analysis `explain' key findings in
natural image statistics?
D L Donoho and A G Flesia (pp 371-393)
Optimal nonlinear codes for the perception of natural colours
T von der Twer and D I A MacLeod (pp 395-407)
From colette.faucher at wanadoo.fr Thu Aug 16 05:40:32 2001
From: colette.faucher at wanadoo.fr (colette faucher)
Date: Thu, 16 Aug 2001 02:40:32 -0700
Subject: cfp for FLAIRS special track : Categorization and Concept
Representation : Models and Implications
Message-ID: <3b7b16df3cc49870@amyris.wanadoo.fr> (added by amyris.wanadoo.fr)
===========================================================================
FLAIRS 2002
15th International Florida Artificial Intelligence Research Society
Conference
Pensacola, Florida
May 16-18, 2002
Special Track : "Categorization and Concept Representation :
Models and Implications"
===========================================================================
This track seeks to bring together researchers working on issues related to
categorization and concept representation in the areas of Artificial
Intelligence and Cognitive Psychology.
Topic Description
------------------
Categorization is the process by which distinct entities are treated as
equivalent. It is one of the most fundamental and pervasive cognitive
activities. It is fundamental because categorization allows us to understand
and make predictions about objects and events in our world. The problem of
understanding what criteria are used to group entities together in the same
category is indeed central to categorization. Though most work on this
topic has proposed that perceptual or structural similarity is the "glue"
that binds objects of the same category, some psychologists have claimed that
similarity is insufficient to account for the acquisition and use of
categories and have proposed more abstract forms of criteria that make
categories coherent and give them a kind of homogeneity in terms of the
entities that belong to them.
Psychologists have proposed several new alternatives: objects may be
grouped together because they serve a common goal or the same function,
and some categories may be viewed as coherent because they rest on a
theory which explains the commonalities of their elements. Similarity and
goals, on the one hand, and theories, on the other, have not received the
same attention in computational models of categorization. Similarity-based
models abound, and the notion of categorization goals has also been exploited
in computational models. In contrast, the notion of an underlying
theory that makes a category coherent is only beginning to be analyzed and
specified. New computational models of categorization reflecting this
tendency are thus expected.
The representation of concepts that a categorization system generates is of
course intimately tied to the criteria this system uses to group entities
into categories, so along with new models of categorization, we expect to
see the emergence of new models of concept representation apart from the
classical ones deriving from the Aristotelian, the Prototypical and the
Exemplar Views. The representation of the entities to categorize also
plays an important part in the categorization process. In particular, the
context in which the entities occur may influence the way they are classified.
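Of the classical views mentioned above, the exemplar view has a particularly direct computational reading. The following fragment is an illustrative sketch in the spirit of exemplar models such as the Generalized Context Model (it is not code endorsed by the track; the function names, data, and parameter values are assumptions for illustration): similarity decays exponentially with distance to each stored exemplar, and summed per-category similarity feeds a choice rule.

```python
import numpy as np

def exemplar_categorize(item, exemplars, labels, c=1.0):
    """Exemplar-view categorization sketch: similarity to each stored
    exemplar decays exponentially with distance (rate c), and the summed
    similarity per category determines the response via a Luce choice rule."""
    sims = np.exp(-c * np.linalg.norm(exemplars - item, axis=1))
    cats = np.unique(labels)
    totals = np.array([sims[labels == k].sum() for k in cats])
    probs = totals / totals.sum()
    return cats[np.argmax(probs)], probs

# Two tight clusters of stored exemplars, one per category.
exemplars = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
labels = np.array([0, 0, 1, 1])
cat, probs = exemplar_categorize(np.array([0.2, 0.5]), exemplars, labels)
```

An item near the first cluster is assigned to category 0 with high probability; changing the sensitivity parameter c changes how sharply similarity falls off with distance, one of the knobs such models use to fit human data.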
The purpose of this track is to bring fresh insights concerning a perhaps
revisited notion of similarity, the way goals of categorization influence
this process, how the notion of the theory of a concept can be formalized
and implemented in computational models of categorization and the
implications those elements may have on the representation of concepts.
The contributions to this track may take either the symbolic approach to
categorization or the connectionist one.
Contributions in the following sub-topics would be welcomed :
- Computational models of similarity,
- Computational models of theory-based categorization,
- Computational models of similarity-based categorization,
- Computational models of human categorization,
- Models of concept representation which are relevant to the
process of categorization,
- Models of concept representation and elicitation,
- Formalization of the notion of theory which underlies a category,
- Formalization of the context of occurrence of the entities to categorize
and its influence on the categorization process.
This list is not exhaustive; other contributions are welcome provided they
are relevant to the definition of the track specified above.
Paper Review and Publication
------------------------------
Only full papers will be considered for the track. Submitted papers will be
reviewed by two program committee members. An author of an accepted paper
is expected to present the paper in the track. Papers accepted for the track
will be published in the FLAIRS 2002 Conference Proceedings.
The best papers will be invited for modification, extension and submission
to a special issue of an international AI journal.
Important dates
----------------
Paper Submission Deadline : November 15, 2001
Notification of Acceptance-Rejection : January 10, 2002
Camera Ready Copy Due : March 4, 2002
Journal Invitation : February 10, 2002
Journal Paper Due : May 10, 2002
Conference Dates : May 16-18, 2002
Program Committee
------------------
David W. Aha, Navy Center for Applied Research in AI, Washington, USA
Ralph Bergmann, University of Kaiserslautern, Germany
Max Bramer, University of Portsmouth, UK
Colette Faucher (Chair), University of Aix-Marseille III, France
Paolo Frasconi, University of Florence, Italy
Robert L. Goldstone, Indiana University, USA
James Hampton, City University, London, UK
David Leake, Indiana University, USA
Bradley C. Love, University of Texas, USA
Paul Mc Kevitt, University of Ulster, Northern Ireland
Ryszard S. Michalski, George Mason University, USA
Philip Resnik, University of Maryland, USA
Lance J. Rips, Northwestern University, USA
Steven A. Sloman, Brown University, USA
Paper Submission Information
-----------------------------
Authors must submit an electronic copy of their complete manuscript of no
more than 5 pages. All submissions must be original work.
The review will be blind. Author names and affiliations are to appear ONLY
on a separate cover page. The presenter (if different from the first author)
must be specified on that cover page. All appropriate contact information
must be mentioned for each author (e-mail, phone, fax, etc.).
Papers must be written using MS Word, RTF or PDF formats according to AAAI's
standard format for authors.
All submissions must be sent in electronic form to :
colette.faucher at iuspim.u-3mrs.fr and colette.faucher at wanadoo.fr
For any problem or question, please contact the track chair, Colette
Faucher, at : colette.faucher at iuspim.u-3mrs.fr or
colette.faucher at wanadoo.fr.
Track Website
--------------
http://perso.wanadoo.fr/colette.faucher/categorization.html
FLAIRS 2002 Website
--------------------
http://altair.coginst.uwf.edu/~jkolen/Flairs2002/intro.php3
From wolfskil at MIT.EDU Fri Aug 17 13:17:26 2001
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Fri, 17 Aug 2001 13:17:26 -0400
Subject: book announcement--Leen
Message-ID: <5.0.2.1.2.20010817115831.00ae3e28@hesiod>
Hello,
I thought readers of the Connectionists List might be interested in this
book. For more information please visit
http://mitpress.mit.edu/catalog/item/default.asp?sid=59E8DBE7-4980-48C7-A87F-F0917571FB1E&ttype=2&tid=8662
Best,
Jud
Advances in Neural Information Processing Systems 13
edited by Todd K. Leen, Thomas G. Dietterich, and Volker Tresp
The annual conference on Neural Information Processing Systems (NIPS) is
the flagship conference on neural computation. The conference is
interdisciplinary, with contributions in algorithms, learning theory,
cognitive science, neuroscience, vision, speech and signal processing,
reinforcement learning and control, implementations, and diverse
applications. Only about 30 percent of the papers submitted are accepted
for presentation at NIPS, so the quality is exceptionally high. These
proceedings contain all of the papers that were presented at the 2000
conference.
Todd K. Leen is Professor of Computer Science and Engineering, and of
Electrical and Computer Engineering, at Oregon Graduate Institute of
Science and Technology. Thomas G. Dietterich is Professor of Computer
Science at Oregon State University. Volker Tresp heads a research group at
Siemens Corporate Technology in Munich.
7 x 10, 1100 pp., cloth ISBN 0-262-12241-3
Neural Information Processing series
A Bradford Book
Jud Wolfskill
Associate Publicist
MIT Press
5 Cambridge Center, 4th Floor
Cambridge, MA 02142
617.253.2079
617.253.1709 fax
wolfskil at mit.edu
From cindy at cns.bu.edu Fri Aug 17 10:04:17 2001
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Fri, 17 Aug 2001 10:04:17 -0400
Subject: Neural Networks 14(6/7): 2001 Special Issue
Message-ID: <200108171404.KAA06299@retina.bu.edu>
NEURAL NETWORKS 14(6/7)
Contents - Volume 14, Numbers 6/7 - 2001
2001 Special Issue
"Spiking Neurons in Neuroscience and Technology"
Stephen Grossberg, Wolfgang Maass, and Henry Markram, co-editors
------------------------------------------------------------------
Neural assemblies: Technical issues, analysis, and modeling
George L. Gerstein and Lyle L. Kirkland
Coding properties of spiking neurons:
Reverse and cross-correlations
Wulfram Gerstner
ON-OFF retinal ganglion cells temporally encode OFF/ON sequence
Hiroyuki Uchiyama, Koichi Goto, and Hiroyuki Matsunobu
Building blocks for electronic spiking neural networks
Andre van Schaik
Orientation-selective aVLSI spiking neurons
Shih-Chii Liu, Jorg Kramer, Giacomo Indiveri, Tobias Delbruck,
Thomas Burg, and Rodney Douglas
Space-rate coding in an adaptive silicon neuron
Kai Hynna and Kwabena Boahen
Propagation of cortical synfire activity:
Survival probability in single trials and stability in the mean
Marc-Oliver Gewaltig, Markus Diesmann, and Ad Aertsen
Fokker-Planck approach to the pulse packet propagation in
synfire chain
H. Cateau and T. Fukai
Connection topology dependence of synchronization of neural
assemblies on class 1 and 2 excitability
Luis F. Lago-Fernandez, Fernando J. Corbacho, and Ramon Huerta
Deterministic dynamics emerging from a cortical functional
architecture
Ralph M. Siegel and Heather L. Read
Spike-based strategies for rapid processing
Simon Thorpe, Arnaud Delorme, and Rufin van Rullen
Zero-lag synchronous dynamics in triplets of interconnected
cortical areas
D. Chawla, K.J. Friston, and E.D. Lumer
Neural timing nets
P.A. Cariani
Spike-based VLSI modeling of the ILD system in the echolocating bat
Timothy Horiuchi and Kai Hynna
Pattern separation and synchronization in spiking associative
memories and visual areas
Andreas Knoblauch and Gunther Palm
Probabilistic synaptic weighting in a reconfigurable network of
VLSI integrate-and-fire neurons
David H. Goldberg, Gert Cauwenberghs, and Andreas G. Andreou
Face identification using one spike per neuron:
Resistance to image degradations
A. Delorme and S.J. Thorpe
Temporal receptive fields, spikes, and Hebbian delay selection
Christian Leibold and J. Leo van Hemmen
Distributed synchrony in a cell assembly of spiking neurons
Nir Levy, David Horn, Isaac Meilijson, and Eytan Ruppin
Associative memory in networks of spiking neurons
Friedrich T. Sommer and Thomas Wennekers
Trajectory estimation from place cell data
Nanayaa Twum-Danso and Roger Brockett
A pulsed neural network model of bursting in the basal ganglia
Mark D. Humphries and Kevin N. Gurney
Regularization mechanisms of spiking-bursting neurons
P. Varona, J.J. Torres, R. Huerta, H.D.I. Abarbanel, and
M.I. Rabinovich
Optimal firing rate estimation
Michael G. Paulin and Larry F. Hoffman
Resonate-and-fire neurons
Eugene M. Izhikevich
Coherence resonance and discharge time reliability in neurons
and neuronal models
K. Pakdaman, Seiji Tanabe, and Tetsuya Shimokawa
Adaptation in single spiking neurons based on a noise shaping
neural coding hypothesis
Jonghan Shin
The double queue method:
A numerical method for integrate-and-fire neuron networks
Geehyuk Lee and Nabil H. Farhat
A spiking neural network architecture for nonlinear function
approximation
Nicolangelo Iannella and Andrew D. Back
From kenm at uwo.ca Sun Aug 19 16:02:44 2001
From: kenm at uwo.ca (Ken McRae)
Date: Sun, 19 Aug 2001 16:02:44 -0400
Subject: Postdoctoral Position
Message-ID:
Postdoctoral Fellowship in Psycholinguistics & Computational Modeling
I have funding for a two-year Postdoctoral Fellowship in my Cognitive
Science laboratory at the University of Western Ontario in London, Ontario,
Canada. The stipend is $35,000 per year plus $2,500 per year for conference
travel. There are no citizenship restrictions.
Our research focuses on the interrelated issues of noun meaning, verb
meaning, and sentence processing. Our research integrates theories and
methodologies from a number of areas, including: word recognition, semantic
memory, concepts and categorization, sentence processing, connectionist
modeling, and cognitive neuropsychology. Central to our research program is
connectionist modeling of the computation of noun and verb meaning, as well
as competition-integration modeling of on-line sentence reading time. Thus,
a postdoctoral fellow in my lab will have the opportunity to participate in
projects in a number of areas of Cognitive Science.
Our department has a number of Cognition faculty, all of whom conduct
research related to language processing. Thus, our faculty and graduate
students provide a rich research environment. I am also involved in a number
of collaborations with researchers from other universities. My lab is
well-equipped for both human experimentation and computational modeling. UWO
also has a 4T magnet that is used for research only.
London is a pleasant city of approximately 350,000, and is located 2 hours
drive from either Toronto or Detroit. Note that a reasonable one-bedroom
apartment in London costs approximately $500 per month.
For further information about our lab, and Cognition at UWO, see:
http://www.sscl.uwo.ca/psychology/cognitive/faculty.html
If you are interested in this position, please send a cv, a statement of
research interests, and 3 letters of reference to me at the address below.
Sending all information electronically is preferable. The start-date for
this position is flexible. If you would like more information about this
position, please contact me directly.
***********************************************************
Ken McRae
Associate Professor
Department of Psychology & Neuroscience Program
Social Science Centre
University of Western Ontario
London, Ontario CANADA N6A 5C2
email: mcrae at uwo.ca
http://www.sscl.uwo.ca/psychology/cognitive/mcrae/mcrae.html
phone: (519) 661-2111 ext. 84688 fax: (519) 661-3961
***********************************************************
From cohn+jmlr at cs.cmu.edu Mon Aug 20 14:01:50 2001
From: cohn+jmlr at cs.cmu.edu (JMLR)
Date: Mon, 20 Aug 2001 14:01:50 -0400
Subject: New paper in the Journal of Machine Learning Research: Bayes Point Machines
Message-ID:
The Journal of Machine Learning Research (www.jmlr.org) is pleased to
announce the availability of a new paper in electronic form.
----------------------------------------
Bayes Point Machines
Ralf Herbrich, Thore Graepel and Colin Campbell. Journal of Machine Learning
Research 1 (August 2001), pp. 245-279.
Abstract
Kernel-classifiers comprise a powerful class of non-linear decision
functions for binary classification. The support vector machine is an
example of a learning algorithm for kernel classifiers that singles out the
consistent classifier with the largest margin, i.e. minimal real-valued
output on the training sample, within the set of consistent hypotheses, the
so-called version space. We suggest the Bayes point machine as a
well-founded improvement which approximates the Bayes-optimal decision by
the centre of mass of version space. We present two algorithms to
stochastically approximate the centre of mass of version space: a billiard
sampling algorithm and a sampling algorithm based on the well known
perceptron algorithm. It is shown how both algorithms can be extended to
allow for soft-boundaries in order to admit training errors. Experimentally,
we find that - for the zero training error case - Bayes point machines
consistently outperform support vector machines on both surrogate data and
real-world benchmark data sets. In the soft-boundary/soft-margin case, the
improvement over support vector machines is shown to be reduced. Finally, we
demonstrate that the real-valued output of single Bayes points on novel test
points is a valid confidence measure and leads to a steady decrease in
generalisation error when used as a rejection criterion.
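The perceptron-based sampler mentioned in the abstract can be sketched as follows (a rough illustration only, not the authors' code; the function names and toy data are assumptions). Because version space, the set of weight vectors consistent with the training data, is convex, averaging consistent perceptron solutions obtained from random permutations of the training data yields another consistent classifier that approximates its centre of mass.

```python
import numpy as np

def perceptron(X, y, order, max_epochs=100):
    """Classical perceptron, visiting training samples in `order`.
    On separable data it returns a unit-norm consistent weight vector."""
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for i in order:
            if y[i] * (w @ X[i]) <= 0:  # misclassified (or on the boundary)
                w += y[i] * X[i]
                mistakes += 1
        if mistakes == 0:  # consistent with all training points
            break
    return w / np.linalg.norm(w)

def bayes_point(X, y, n_samples=50, seed=0):
    """Approximate the Bayes point as the centre of mass of version-space
    samples, one per random permutation of the training data."""
    rng = np.random.default_rng(seed)
    ws = [perceptron(X, y, rng.permutation(len(y))) for _ in range(n_samples)]
    w = np.mean(ws, axis=0)
    return w / np.linalg.norm(w)

# A small linearly separable toy problem.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w_bp = bayes_point(X, y)  # classifies all training points correctly
```

The paper's billiard sampling algorithm and the soft-boundary extension for noisy data go beyond this zero-training-error sketch.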
This paper and earlier papers in Volume 1 are available electronically at
http://www.jmlr.org in PostScript, PDF and HTML formats; a bound, hardcopy
edition of Volume 1 will be available later this year.
-David Cohn,
Managing Editor, Journal of Machine Learning Research
-------
This message has been sent to the mailing list "jmlr-announce at ai.mit.edu",
which is maintained automatically by majordomo. To subscribe to the list,
send mail to listserv at ai.mit.edu with the line "subscribe jmlr-announce" in
the body; to unsubscribe send email to listserv at ai.mit.edu with the line
"unsubscribe jmlr-announce" in the body.
From jf218 at hermes.cam.ac.uk Mon Aug 20 17:15:19 2001
From: jf218 at hermes.cam.ac.uk (Dr J. Feng)
Date: Mon, 20 Aug 2001 22:15:19 +0100 (BST)
Subject: five years post at cambridge
In-Reply-To: <200108171404.KAA06299@retina.bu.edu>
Message-ID:
The Babraham Institute, Cambridge
Computational Neuroscientist/Electrophysiologist (Ref. KK/CNE)
Applications are invited for a postdoctoral scientist to join a group of
systems neuroscientists within the Laboratory of Cognitive and
Developmental Neuroscience investigating how the brain encodes visual and
olfactory cues associated with recognition of both social and non-social
objects using novel multi-array electrophysiological recording techniques
in both rodent and sheep models. This post is available initially for 5
years. It would either suit an individual with primary expertise in
computational analysis and modelling of sensory system functioning or an
in vivo electrophysiologist with good expertise in computational analysis
of complex single-unit data. In both cases there would be significant
involvement in carrying out multi-array electrophysiological recording
experiments and subsequent data analysis and representation. The
individual would also be expected to work closely with
electrophysiologists both within the group and the USA and to co-ordinate
with other UK-based Computational Neuroscientists involved with the
projects. The group already has excellent computational facilities to deal
with the large amounts of data associated with multi-array recording
experiments.
Informal enquiries on these Neuroscience vacancies should be directed to
Dr. Keith Kendrick, Head of Neurobiology Programme: tel: 44(0) 1223
496385, fax. 44(0)1223 496028, e-mail keith.kendrick at bbsrc.ac.uk
Starting salary in the range £19,500 - £23,000 per annum. Benefits include
a non-contributory pension scheme, 25 days leave and 10½ public holidays a
year. On site Refectory, Nursery and Sports & Social Club as well as free
car parking.
Further details and an application form available from the Personnel
Office, The Babraham Institute, Babraham, Cambridge CB2 4AT. Tel. 01223
496000, e-mail babraham.personnel at bbsrc.ac.uk. The closing date for these
positions is 28th September 2001.
AN EQUAL OPPORTUNITIES EMPLOYER
An Institute supported by the Biotechnology and Biological Sciences
Research Council
Jianfeng Feng
The Babraham Institute
Cambridge CB2 4AT
UK
http://www.cosg.susx.ac.uk/users/jianfeng
http://www.cus.cam.ac.uk/~jf218
From wolfskil at MIT.EDU Mon Aug 20 10:25:54 2001
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Mon, 20 Aug 2001 10:25:54 -0400
Subject: book announcement--O'Reilly
Message-ID: <5.0.2.1.2.20010820102443.00a82000@hesiod>
I thought readers of the Connectionists List might be interested in this
book. For more information please visit
http://mitpress.mit.edu/catalog/item/default.asp?sid=16CDFF8A-3F4A-4FB5-B713-D8725D0A6969&ttype=2&tid=3345
Best,
Jud
Computational Explorations in Cognitive Neuroscience
Understanding the Mind by Simulating the Brain
Randall C. O'Reilly and Yuko Munakata
foreword by James L. McClelland
The goal of computational cognitive neuroscience is to understand how the
brain embodies the mind by using biologically based computational models
comprising networks of neuronlike units. This text, based on a course
taught by Randall O'Reilly and Yuko Munakata over the past several years,
provides an in-depth introduction to the main ideas in the field. The
neural units in the simulations use equations based directly on the ion
channels that govern the behavior of real neurons, and the neural networks
incorporate anatomical and physiological properties of the neocortex. Thus
the text provides the student with knowledge of the basic biology of the
brain as well as the computational skills needed to simulate large-scale
cognitive phenomena.
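The point-neuron equations described above can be loosely illustrated as follows (the specific conductance values and normalised units below are assumptions for illustration, not the book's actual parameters): each channel's conductance pulls the membrane potential toward that channel's reversal potential, so V settles at a conductance-weighted balance of the excitatory, inhibitory, and leak reversal potentials.

```python
def update_membrane(v, g_e, g_i=0.0, g_l=0.1, dt=0.1,
                    e_e=1.0, e_i=0.25, e_l=0.3):
    """One Euler step of a point-neuron membrane equation: each channel's
    conductance drives V toward that channel's reversal potential
    (all quantities in normalised 0-1 units)."""
    dv = g_e * (e_e - v) + g_i * (e_i - v) + g_l * (e_l - v)
    return v + dt * dv

v = 0.3  # start at the leak reversal potential (rest)
for _ in range(400):
    v = update_membrane(v, g_e=0.2)
# v settles at the conductance-weighted average of the reversal potentials:
# (g_e*e_e + g_l*e_l) / (g_e + g_l)
```

Stronger excitatory input raises the equilibrium potential toward e_e; inhibition pulls it back toward e_i, which is the basic mechanism such simulations build on.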
The text consists of two parts. The first part covers basic neural
computation mechanisms: individual neurons, neural networks, and learning
mechanisms. The second part covers large-scale brain area organization and
cognitive phenomena: perception and attention, memory, language, and
higher-level cognition. The second part is relatively self-contained and
can be used separately for mechanistically oriented cognitive neuroscience
courses. Integrated throughout the text are more than forty different
simulation models, many of them full-scale research-grade models, with
friendly interfaces and accompanying exercises. The simulation software
(PDP++, available for all major platforms) and simulations can be
downloaded free of charge from the Web. Exercise solutions are available,
and the text includes full information on the software.
Randall C. O'Reilly is Assistant Professor in the Department of Psychology
and at the Institute for Cognitive Science at the University of Colorado,
Boulder. Yuko Munakata is Assistant Professor in Developmental Cognitive
Neuroscience at the University of Denver.
8 x 9, 512 pp., 213 illus., paper ISBN 0-262-65054-1
A Bradford Book
Jud Wolfskill
Associate Publicist
MIT Press
5 Cambridge Center, 4th Floor
Cambridge, MA 02142
617.253.2079
617.253.1709 fax
wolfskil at mit.edu
From ps629 at columbia.edu Tue Aug 21 15:54:59 2001
From: ps629 at columbia.edu (Paul Sajda)
Date: Tue, 21 Aug 2001 15:54:59 -0400
Subject: Postdoctoral Position in Computational Neural Modeling
Message-ID: <3B82BC93.34337935@columbia.edu>
Postdoctoral Position in Computational Neural Modeling--a two year
position is available immediately for conducting research in modeling
of neural mechanisms for visual scene analysis, with particular
applications to spatio-temporal and hyperspectral imagery. A
mathematical and computational background is desired, particularly in
probabilistic modeling and optimization. This position will be part of
a multi-university research team (UPenn, Columbia and MIT)
investigating biomimetic methods for analysis of literal and
non-literal imagery through a combination of experimental physiology,
neuromorphic design and simulation, computational modeling and visual
psychophysics.
Applicants should send a CV, three representative papers and the names
of three references to Prof. Paul Sajda, Department of Biomedical
Engineering, Columbia University, 530 W 120th Street, NY, NY 10027. Or
email to ps629 at columbia.edu.
--
Paul Sajda, Ph.D.
Associate Professor
Department of Biomedical Engineering
530 W 120th Street
Columbia University
New York, NY 10027
tel: (212) 854-5279
fax: (212) 854-8725
email: ps629 at columbia.edu
http://www.columbia.edu/~ps629
From wolfskil at MIT.EDU Wed Aug 22 14:17:30 2001
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Wed, 22 Aug 2001 14:17:30 -0400
Subject: book announcement--Opper
Message-ID: <5.0.2.1.2.20010822140915.00b083c0@hesiod>
I thought readers of the Connectionists List might be interested in this
book. For more information please visit
http://mitpress.mit.edu/catalog/item/default.asp?sid=5CEC3656-296C-4C48-B6E3-6BDFAC7EBADD&ttype=2&tid=3847
Best,
Jud
Advanced Mean Field Methods
Theory and Practice
edited by Manfred Opper and David Saad
A major problem in modern probabilistic modeling is the huge computational
complexity involved in typical calculations with multivariate probability
distributions when the number of random variables is large. Because exact
computations are infeasible in such cases and Monte Carlo sampling
techniques may reach their limits, there is a need for methods that allow
for efficient approximate computations. One of the simplest approximations
is based on the mean field method, which has a long history in statistical
physics. The method is widely used, particularly in the growing field of
graphical models.
Researchers from disciplines such as statistical physics, computer science,
and mathematical statistics are studying ways to improve this and related
methods and are exploring novel application areas. Leading approaches
include the variational approach, which goes beyond factorizable
distributions to achieve systematic improvements; the TAP
(Thouless-Anderson-Palmer) approach, which incorporates correlations by
including effective reaction terms in the mean field theory; and the more
general methods of graphical models.
Bringing together ideas and techniques from these diverse disciplines, this
book covers the theoretical foundations of advanced mean field methods,
explores the relation between the different approaches, examines the
quality of the approximation obtained, and demonstrates their application
to various areas of probabilistic modeling.
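The basic mean field method mentioned above admits a very compact illustration (this is not code from the book; the model and parameter values are assumed for illustration). For an Ising-type model, the factorized (naive mean field) approximation leads to the self-consistent updates m_i = tanh(h_i + sum_j J_ij m_j), iterated to a fixed point:

```python
import numpy as np

def naive_mean_field(J, h, n_sweeps=200):
    """Coordinate-wise naive mean-field updates for an Ising-type model
    p(s) ~ exp(0.5 * s.J.s + h.s) with s_i in {-1, +1} and zero-diagonal J.
    Returns approximate magnetisations m_i = <s_i>."""
    m = np.zeros(len(h))
    for _ in range(n_sweeps):
        for i in range(len(h)):
            # Replace neighbours by their mean values ("mean field").
            m[i] = np.tanh(h[i] + J[i] @ m)
    return m

# Two ferromagnetically coupled spins, with a weak field on the first.
J = np.array([[0.0, 0.5], [0.5, 0.0]])
h = np.array([0.2, 0.0])
m = naive_mean_field(J, h)  # both magnetisations positive, below 1
```

The TAP approach discussed in the book adds correlation corrections (Onsager reaction terms) to exactly this kind of update.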
Manfred Opper is a Reader and David Saad is Professor, the Neural Computing
Research Group, School of Engineering and Applied Science, Aston
University, UK.
7 x 10, 300 pp.
cloth ISBN 0-262-15054-9
Neural Information Processing series
Jud Wolfskill
Associate Publicist
MIT Press
5 Cambridge Center, 4th Floor
Cambridge, MA 02142
617.253.2079
617.253.1709 fax
wolfskil at mit.edu
From abrowne at lgu.ac.uk Thu Aug 23 07:50:50 2001
From: abrowne at lgu.ac.uk (Tony Browne)
Date: Thu, 23 Aug 2001 12:50:50 +0100 (GMT Daylight Time)
Subject: Connectionist Inference Preprint
Message-ID:
Apologies if you receive this posting more than once.
A preprint is available for download, of the paper
'Connectionist Inference Models' by Antony Browne and Ron
Sun (to appear in `Neural Networks'). 62 Pages, 155
References.
Abstract: The performance of symbolic inference tasks has
long been a challenge to connectionists. In this paper, we
present an extended survey of this area. Existing
connectionist inference systems are reviewed, with
particular reference to how they perform variable binding
and rule-based reasoning, and whether they involve
distributed or localist representations. The benefits and
disadvantages of different representations and systems are
outlined, and conclusions drawn regarding the capabilities
of connectionist inference systems when compared with
symbolic inference systems or when used for cognitive
modeling.
Keywords: Symbolic inference, resolution, variable binding,
localist representations, distributed representations.
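To give a flavour of what "variable binding" means in a connectionist setting: one classic approach surveyed in this literature is tensor-product binding, where a filler vector is bound to a role vector via an outer product and recovered by projection. The sketch below is illustrative only (not from the paper; the role and filler names are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_unit(n):
    """Random unit vector serving as a distributed representation."""
    v = rng.standard_normal(n)
    return v / np.linalg.norm(v)

n = 64
agent, patient = rand_unit(n), rand_unit(n)   # role vectors
john, mary = rand_unit(n), rand_unit(n)       # filler vectors

# bind each filler to its role with an outer product, then superpose
binding = np.outer(agent, john) + np.outer(patient, mary)

# unbind by projecting onto a role vector (exact if roles are orthonormal;
# approximate, with crosstalk, for random roles as here)
retrieved = agent @ binding

# the retrieved vector should match 'john' far better than 'mary'
print(retrieved @ john, retrieved @ mary)
```

Localist schemes avoid this crosstalk by dedicating one unit per binding, at the cost of scalability — one of the trade-offs the survey discusses.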
Download Instructions: Go to
http://www.lgu.ac.uk/~abrowne/abrowne.htm and scroll down
to the section 'Downloadable Technical Reports and
Preprints'. Click on the file to download (in zipped
Postscript [190K] or Zipped PDF [228K] format).
Comments Welcome
If you have problems downloading, please e-mail me.
Tony Browne
=======================================================
Dr. Antony Browne abrowne at lgu.ac.uk
http://www.lgu.ac.uk/~abrowne/abrowne.htm
Reader in Intelligent Systems
School of Computing, Information Systems & Mathematics
London Guildhall University
100 Minories
London EC3 1JY, UK
Tel: (+44) 0207 320 1307
Fax: (+44) 0207 320 1717
=======================================================
From stefan.wermter at sunderland.ac.uk Thu Aug 23 13:02:13 2001
From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter)
Date: Thu, 23 Aug 2001 18:02:13 +0100
Subject: EmerNet book: Emergent Neural Computational Architectures
Message-ID: <3B853714.6E58E4CA@sunderland.ac.uk>
Emergent Neural Computational Architectures
based on Neuroscience
Stefan Wermter, Jim Austin, David Willshaw
2001, Springer, Heidelberg, 577p
For more detailed information, table of contents, abstracts
and chapters see:
http://www.his.sunderland.ac.uk/emernet/newbook.html
Summary:
This book is the result of a series of International
Workshops organised by the EmerNet project on
Emergent Neural Computational Architectures based
on Neuroscience sponsored by the Engineering and
Physical Sciences Research Council (EPSRC). The
overall aim of the book is to present a broad spectrum
of current research into biologically inspired
computational systems and hence encourage the
emergence of new computational approaches based
on neuroscience. It is generally understood that the
present approaches for computing do not have the
performance, flexibility and reliability of biological
information processing systems. Although there is a
massive body of knowledge regarding how processing
occurs in the brain and central nervous system, this has
had little impact on mainstream computing so far.
The process of developing biologically inspired
computerised systems involves the examination of the
functionality and architecture of the brain with an
emphasis on the information processing activities.
Biologically inspired computerised systems address
neural computation from the position of both
neuroscience and computing, using experimental
evidence to create general neuroscience-inspired
systems.
The book focuses on the main research areas of
modular organisation and robustness, timing and
synchronisation, and learning and memory storage.
The issues considered as part of these include: How
can the modularity in the brain be used to produce
large scale computational architectures? How does
human memory manage to continue operating despite
the failure of its components? How does the brain
synchronise its processing? How does the brain
compute with relatively slow computing elements but
still achieve rapid and real-time performance? How
can we build computational models of these processes
and architectures? How can we design incremental
learning algorithms and dynamic memory
architectures? How can the natural information
processing systems be exploited for artificial
computational methods?
Emergent Neural Computational Architectures based on
Neuroscience can be ordered from Springer-Verlag using the
booking form and accessed on-line using the appropriate login
and password from Springer.
http://www.his.sunderland.ac.uk/emernet/newbook.html
http://www.springer.de/cgi-bin/search_book.pl?isbn=3-540-42363-X
--------------------------------------
***************************************
Professor Stefan Wermter
Chair for Intelligent Systems
University of Sunderland
Centre of Informatics, SCET
St Peters Way
Sunderland SR6 0DD
United Kingdom
phone: +44 191 515 3279
fax: +44 191 515 3553
email: stefan.wermter at sunderland.ac.uk
http://www.his.sunderland.ac.uk/~cs0stw/
http://www.his.sunderland.ac.uk/
****************************************
From rid at ecs.soton.ac.uk Fri Aug 24 05:56:32 2001
From: rid at ecs.soton.ac.uk (Bob Damper)
Date: Fri, 24 Aug 2001 10:56:32 +0100 (BST)
Subject: Source of a famous quotation ...
Message-ID:
Dear connectionists,
does anyone know the exact source of the famous quotation:
``neural networks are the second best way of solving every
problem'' ?
I'd be eternally grateful for an answer.
Bob.
***************************************************************
* R I Damper PhD *
* Reader and Head: *
* Image, Speech and Intelligent Systems (ISIS) *
* Research Group *
* Building 1 *
* Department of Electronics and Computer Science *
* University of Southampton *
* Southampton SO17 1BJ *
* England *
* *
* Tel: +44 (0) 23 8059 4577 (direct) *
* FAX: +44 (0) 23 8059 4498 *
* Email: rid at ecs.soton.ac.uk *
* WWW: http://www.ecs.soton.ac.uk/~rid *
* *
***************************************************************
From gabr-ci0 at wpmail.paisley.ac.uk Fri Aug 24 12:40:10 2001
From: gabr-ci0 at wpmail.paisley.ac.uk (Bogdan Gabrys)
Date: Fri, 24 Aug 2001 17:40:10 +0100
Subject: PhD studentship available
Message-ID:
PhD Studentship
Applied Computational Intelligence Research Unit (ACIRU)
School of Information and Communication Technologies,
University of Paisley, Scotland, UK
Applications are invited for a 3 year PhD research studentship which
can start from October 2001 and is jointly funded by the University of
Paisley (http://www.cis.paisley.ac.uk) and the Lufthansa Systems
Berlin GmbH (http://www.lsb.de).
The proposed research project will investigate and develop various
approaches for combining predictions (forecasts). There is a large
potential market for applications offering accurate and reliable
predictions ranging from stock market exchange to estimating the
demand for sales of goods and services. One such example, which will
be looked at in more detail in this project, is an accurate estimation
of the demand for various types of airplane tickets. Combination,
aggregation and fusion of information are major problems for all kinds
of knowledge-based systems, from image processing to decision making,
from pattern recognition to automatic learning. Various machine
learning and hybrid intelligent techniques will be used for processing
and modelling of imperfect data and information utilizing the
methodologies like probability, fuzzy, evidence and possibility
theories.
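A standard baseline for the forecast-combination problem described above is to weight each predictor by its inverse historical error. The following sketch is purely illustrative (function names and data are invented, not part of the project):

```python
import numpy as np

def combine_forecasts(past_preds, past_actual, new_preds):
    """Combine point forecasts by inverse mean-squared-error weighting.
    past_preds: (n_models, n_past) historical predictions
    past_actual: (n_past,) realized values
    new_preds: (n_models,) current predictions to combine."""
    mse = np.mean((past_preds - past_actual) ** 2, axis=1)
    w = (1.0 / mse) / np.sum(1.0 / mse)  # better models get larger weight
    return w @ new_preds

# toy example: model 0 is accurate, model 1 is biased upward
actual = np.array([10.0, 12.0, 11.0])
preds = np.array([[10.1, 11.9, 11.0],    # small errors
                  [13.0, 15.0, 14.0]])   # large errors
combined = combine_forecasts(preds, actual, np.array([11.0, 14.0]))
print(combined)  # close to 11.0 because model 0 dominates the weights
```

More sophisticated fusion schemes of the kind the project mentions (fuzzy, evidential, possibilistic) generalize this by modelling the uncertainty of each source rather than just its average error.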
The student will be joining an enthusiastic and vibrant research group
and will be primarily based in the ACIRU in Paisley (near Glasgow),
Scotland but two extended visits to the Lufthansa Systems Berlin site
in Berlin, Germany are planned in the second and third year of the
project.
The studentship carries a remuneration of £7,500 tax-free (increasing
to £8,000 and £9,000 in the second and third years respectively), with
tuition fees paid at the Home/EU rate. The stipend may be
augmented by a limited amount of teaching.
Applicants should have a strong mathematical background and hold a
first or upper second class honours degree or equivalent in
mathematics, physics, engineering, statistics, computer science or a
similar discipline. Additionally, the candidate should have strong
programming experience in any of, or a combination of, C, C++, Matlab or
Java. Knowledge of Oracle will be an advantage.
For further details please contact Dr. Bogdan Gabrys, e-mail:
gabr-ci0 at paisley.ac.uk.
Interested candidates should send a detailed CV and a letter of
application with the names and addresses of two referees to:
Dr. Bogdan Gabrys, School of Information and Communication
Technologies, Div. of Computing and Information Systems, University of
Paisley, Paisley PA1 2BE, Scotland, UK. Applications can also be
sent by e-mail.
******************************************************
Dr Bogdan Gabrys
Applied Computational Intelligence Research Unit
Division of Computing and Information Systems
University of Paisley
High Street, Paisley PA1 2BE
Scotland, United Kingdom
Tel: +44 (0) 141 848 3752
Fax: +44 (0) 141 848 3542
E-mail: gabr-ci0 at paisley.ac.uk
******************************************************
From cindy at cns.bu.edu Fri Aug 24 16:34:07 2001
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Fri, 24 Aug 2001 16:34:07 -0400
Subject: Call for Papers: 6th ICCNS
Message-ID: <200108242034.QAA17997@retina.bu.edu>
Apologies if you receive this more than once.
***** CALL FOR PAPERS *****
SIXTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS
Tutorials: May 29, 2002
Meeting: May 30 - June 1, 2002
Boston University
677 Beacon Street
Boston, Massachusetts 02215
http://www.cns.bu.edu/meetings/
Sponsored by Boston University's
Center for Adaptive Systems
and
Department of Cognitive and Neural Systems
with financial support from the
National Science Foundation
and the
Office of Naval Research
This interdisciplinary conference has drawn about 300 people from around
the world each time that it has been offered. Last year's conference was
attended by scientists from 31 countries. The conference is structured to
facilitate intense communication between its participants, both in the
formal sessions and during its other activities. As during previous years,
the conference will focus on solutions to the fundamental questions:
How Does the Brain Control Behavior?
How Can Technology Emulate Biological Intelligence?
The conference will include invited tutorials and lectures, and
contributed lectures and posters by experts on the biology and
technology of how the brain and other intelligent systems adapt to
a changing world. The conference is aimed at researchers and students
of computational neuroscience, connectionist cognitive science,
artificial neural networks, neuromorphic engineering, and artificial
intelligence.
A single oral or poster session enables all presented work to be
highly visible.
The abstract-only submission format encourages reporting of the latest results.
Costs are kept at a minimum without compromising the quality of
meeting handouts and social events.
CALL FOR ABSTRACTS
Session Topics:
* vision * spatial mapping and navigation
* object recognition * neural circuit models
* image understanding * neural system models
* audition * mathematics of neural systems
* speech and language * robotics
* unsupervised learning * hybrid systems (fuzzy, evolutionary, digital)
* supervised learning * neuromorphic VLSI
* reinforcement and emotion * industrial applications
* sensory-motor control * cognition, planning, and attention
* other
Contributed abstracts must be received, in English, by January 31,
2002. Notification of acceptance will be provided by email by February
28, 2002. A meeting registration fee must accompany each Abstract. See
Registration Information below for details. The fee will be returned if
the Abstract is not accepted for presentation and publication in the
meeting proceedings. Registration fees of accepted Abstracts will be
returned on request only until April 19, 2002.
Each Abstract should fit on one 8.5" x 11" white page with 1" margins
on all sides, single-column format, single-spaced, Times Roman or
similar font of 10 points or larger, printed on one side of the page
only. Fax submissions will not be accepted. Abstract title, author
name(s), affiliation(s), mailing, and email address(es) should begin
each Abstract. An accompanying cover letter should include: Full title
of Abstract; corresponding author and presenting author name, address,
telephone, fax, and email address; requested preference for oral or
poster presentation; and a first and second choice from the topics
above, including whether it is biological (B) or technological (T)
work. Example: first choice: vision (T); second choice: neural system
models (B). (Talks will be 15 minutes long. Posters will be up for a
full day. Overhead, slide, VCR, and LCD projector facilities will be
available for talks.) Abstracts which do not meet these requirements
or which are submitted with insufficient funds will be returned. Accepted
Abstracts will be printed in the conference proceedings volume. No
full-length paper will be required. The original and 3 copies of each Abstract should
be sent to: Cynthia Bradford, Boston University, Department of Cognitive
and Neural Systems, 677 Beacon Street, Boston, MA 02215.
REGISTRATION INFORMATION: Early registration is recommended. To
register, please fill out the registration form below. Student
registrations must be accompanied by a letter of verification from a
department chairperson or faculty/research advisor. If accompanied by
an Abstract or if paying by check, mail to the address above. If
paying by credit card, mail as above, or fax to (617) 353-7755, or
email to cindy at cns.bu.edu. The registration fee will help to pay for a
reception, 6 coffee breaks, and the meeting proceedings.
STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral
fellows are available to help cover meeting travel and living costs. The
deadline to apply for fellowship support is January 31, 2002. Applicants
will be notified by email by February 28, 2002. Each application should
include the applicant's CV, including name; mailing address; email
address; current student status; faculty or PhD research advisor's name,
address, and email address; relevant courses and other educational data;
and a list of research articles. A letter from the listed faculty or PhD
advisor on official institutional stationery should accompany the
application and summarize how the candidate may benefit from the meeting.
Fellowship applicants who also submit an Abstract need to include the
registration fee with their Abstract submission. Those who are awarded
fellowships are required to register for and attend both the conference
and the day of tutorials. Fellowship checks will be distributed after
the meeting.
REGISTRATION FORM
Sixth International Conference on Cognitive and Neural Systems
Department of Cognitive and Neural Systems
Boston University
677 Beacon Street
Boston, Massachusetts 02215
Tutorials: May 29, 2002
Meeting: May 30 - June 1, 2002
FAX: (617) 353-7755
http://www.cns.bu.edu/meetings/
(Please Type or Print)
Mr/Ms/Dr/Prof: _____________________________________________________
Name: ______________________________________________________________
Affiliation: _______________________________________________________
Address: ___________________________________________________________
City, State, Postal Code: __________________________________________
Phone and Fax: _____________________________________________________
Email: _____________________________________________________________
The conference registration fee includes the meeting program,
reception, two coffee breaks each day, and meeting proceedings.
The tutorial registration fee includes tutorial notes and two
coffee breaks.
CHECK ONE:
( ) $85 Conference plus Tutorial (Regular)
( ) $55 Conference plus Tutorial (Student)
( ) $60 Conference Only (Regular)
( ) $40 Conference Only (Student)
( ) $25 Tutorial Only (Regular)
( ) $15 Tutorial Only (Student)
METHOD OF PAYMENT (please fax or mail):
[ ] Enclosed is a check made payable to "Boston University".
Checks must be made payable in US dollars and issued by
a US correspondent bank. Each registrant is responsible
for any and all bank charges.
[ ] I wish to pay my fees by credit card
(MasterCard, Visa, or Discover Card only).
Name as it appears on the card: _____________________________________
Type of card: _______________________________________________________
Account number: _____________________________________________________
Expiration date: ____________________________________________________
Signature: __________________________________________________________
From jzhu at stanford.edu Fri Aug 24 12:35:24 2001
From: jzhu at stanford.edu (Ji Zhu)
Date: Fri, 24 Aug 2001 09:35:24 -0700 (PDT)
Subject: No subject
Message-ID:
Dear all,
This is a repost of our paper "Kernel Logistic Regression and the
Import Vector Machine".
We want to apologize that we missed several important references in
our previous draft. The revised version is available at
http://www.stanford.edu/~jzhu/research/nips01.ps
Thank you!
Best regards,
-Ji Zhu
From skremer at q.cis.uoguelph.ca Mon Aug 27 16:29:04 2001
From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer)
Date: Mon, 27 Aug 2001 16:29:04 -0400 (EDT)
Subject: Announce: New Unlabeled Data Competition and Workshop
Message-ID:
Apologies if you receive multiple copies of this mailing.
ANNOUNCEMENT: The Second Annual NIPS Unlabeled Data Competition and Workshop
It's time to put up or shut up!
Synopsis:
We are pleased to announce the NIPS*2001 Unlabeled Data Competition and
Workshop, to be held in Whistler, British Columbia, Canada, Dec 7 or 8, 2001.
This competition is a challenge to the machine learning community to develop
and demonstrate methods to use unlabeled data to improve supervised
learning. We have created a web-site where participants can download
and submit problem sets and compete head to head with other contestants
in a series of challenging unlabeled-data, supervised-learning problems.
Recently, there has been much interest in applying techniques that
incorporate knowledge from unlabeled data into systems performing
supervised learning. The potential advantages of such techniques are
obvious in domains where labeled data is expensive and unlabeled data
is cheap. Many such techniques have been proposed, but only recently has
any effort been made to compare the effectiveness of different approaches
on real world problems.
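One of the simplest techniques in this family is self-training: fit a model on the labeled data, pseudo-label the unlabeled points it classifies most confidently, fold them into the training set, and repeat. A minimal sketch, using a nearest-centroid classifier purely as a stand-in for a real learner (all names and data here are illustrative):

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, rounds=5):
    """Self-training with a nearest-centroid classifier, labels in {0,1}.
    Each round, the most confidently classified half of the remaining
    unlabeled pool is pseudo-labeled and added to the training set."""
    X, y = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    for _ in range(rounds):
        if len(pool) == 0:
            break
        c0 = X[y == 0].mean(axis=0)
        c1 = X[y == 1].mean(axis=0)
        d0 = np.linalg.norm(pool - c0, axis=1)
        d1 = np.linalg.norm(pool - c1, axis=1)
        conf = np.abs(d0 - d1)  # distance margin as a confidence proxy
        take = np.argsort(conf)[-max(1, len(pool) // 2):]
        pseudo = (d1[take] < d0[take]).astype(int)
        X = np.vstack([X, pool[take]])
        y = np.concatenate([y, pseudo])
        pool = np.delete(pool, take, axis=0)
    return X, y

# toy data: two well-separated clusters, one labeled point each
X_lab = np.array([[0.0, 0.0], [5.0, 5.0]])
y_lab = np.array([0, 1])
X_unlab = np.array([[0.2, -0.1], [0.1, 0.3], [4.8, 5.1], [5.2, 4.9]])
X, y = self_train(X_lab, y_lab, X_unlab)
print(len(X), y)  # all unlabeled points absorbed with cluster-consistent labels
```

Whether such schemes actually help on real problems — rather than amplifying early mistakes — is exactly the question the competition is designed to test.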
Our contest presents a challenge to the proponents of methods to
incorporate unlabeled data into supervised learning. Can you really
use unlabeled data to help train a supervised classification (or
regression) system? Do recent (and not so recent) theories stand up to
the data test?
On the contest web-site you can find challenge problems where you can try
out your methods head-to-head against anyone brave enough to face
you. Then, at the end of the contest we will release the results and find
out who really knows something about using unlabeled data, and if
unlabeled data are really useful or we are all just wasting our time. So
ask yourself, are you (and your theory) up to the challenge?? Feeling
lucky???
For more details on the competition or the workshop and to sign up for
the Unlabeled Data Mailing List, please visit our
web-page at "http://q.cis.uoguelph.ca/~skremer/NIPS2001/".
Stefan
--
Dr. Stefan C. Kremer, Assistant Prof.,
Dept. of Computing and Information Science
University of Guelph, Guelph, Ontario N1G 2W1
WWW: http://hebb.cis.uoguelph.ca/~skremer
Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323
E-mail: skremer at snowhite.cis.uoguelph.ca
From bbs at bbsonline.org Tue Aug 28 16:55:50 2001
From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor))
Date: Tue, 28 Aug 2001 16:55:50 -0400
Subject: BBS Call for Commentators--Preston & De Waal: Empathy: Its ultimate and proximate bases
Message-ID:
Dear Dr. Connectionists List User,
Below is the abstract of a forthcoming BBS target article
Empathy: Its ultimate and proximate bases
by
Stephanie D. Preston & Frans B. M. de Waal
http://www.bbsonline.org/Preprints/Preston/
or
http://www.bbsonline.org/Preprints/Preston/Preston.pdf
This article has been accepted for publication in Behavioral and Brain
Sciences (BBS), an international, interdisciplinary journal providing
Open Peer Commentary on important and controversial current research in
the biobehavioral and cognitive sciences.
Commentators must be BBS Associates or nominated by a BBS Associate. To
be considered as a commentator for this article, to suggest other
appropriate commentators, or for information about how to become a BBS
Associate, please reply by EMAIL within three (3) weeks to:
calls at bbsonline.org
The Calls are sent to 10,000 BBS Associates, so there is no expectation
(indeed, it would be calamitous) that each recipient should comment
on every occasion! Hence there is no need to reply except if you wish
to comment, or to nominate someone to comment.
If you are not a BBS Associate, please approach a current BBS
Associate (there are currently over 10,000 worldwide) who is familiar
with your work to nominate you. All past BBS authors, referees and
commentators are eligible to become BBS Associates. A full electronic
list of current BBS Associates is available at this location to help
you select a name:
http://www.bbsonline.org/Instructions/assoclist.html
If no current BBS Associate knows your work, please send us your
Curriculum Vitae and BBS will circulate it to appropriate Associates to
ask whether they would be prepared to nominate you. (In the meantime,
your name, address and email address will be entered into our database
as an unaffiliated investigator.)
To help us put together a balanced list of commentators, please give
some indication of the aspects of the topic on which you would bring
your areas of expertise to bear if you were selected as a commentator.
To help you decide whether you would be an appropriate commentator for
this article, an electronic draft is retrievable from the online
BBSPrints Archive, at the URL that follows the abstract below.
_____________________________________________________________
Empathy: Its ultimate and proximate bases
Stephanie D. Preston
Department of Psychology
3210 Tolman Hall #1650
University of California at Berkeley
Berkeley, CA 94720-1650
USA
spreston at socrates.berkeley.edu
http://socrates.berkeley.edu/~spreston
Frans B. M. de Waal
Living Links,
Yerkes Primate Center and Psychology Department,
Emory University,
Atlanta, GA 30322
USA
dewaal at rmy.emory.edu
http://www.emory.edu/LIVING_LINKS/
KEYWORDS:
altruism; cognitive empathy; comparative; emotion;
emotional contagion; empathy; evolution; human; perception-action;
perspective taking;
ABSTRACT:
There is disagreement in the literature about the exact nature of the
phenomenon of empathy. There are emotional, cognitive, and conditioning
views, applying in varying degrees across species. An adequate description
of the ultimate and proximate mechanism can integrate these views.
Proximately, the perception of an object's state activates the subject's
corresponding representations, which in turn activate somatic and
autonomic responses. This mechanism supports basic behaviors (e.g., alarm,
social facilitation, vicariousness of emotions, mother-infant
responsiveness, and the modeling of competitors and predators) that are
crucial for the reproductive success of animals living in groups. The
"Perception-Action Model" (PAM) together with an understanding of how
representations change with experience can explain the major empirical
effects in the literature (similarity, familiarity, past experience,
explicit teaching and salience). It can also predict a variety of empathy
disorders. The interaction between the PAM and prefrontal functioning can
also explain different levels of empathy across species and age groups.
This view can advance our evolutionary understanding of empathy beyond
inclusive fitness and reciprocal altruism and can explain different levels
of empathy across individuals, species, stages of development, and
situations.
http://www.bbsonline.org/Preprints/Preston/
or
http://www.bbsonline.org/Preprints/Preston/Preston.pdf
___________________________________________________________
Please do not prepare a commentary yet. Just let us know, after having
inspected it, what relevant expertise you feel you would bring to bear
on what aspect of the article. We will then let you know whether it was
possible to include your name on the final formal list of invitees.
_______________________________________________________________________
*** SUPPLEMENTARY ANNOUNCEMENTS ***
(1) The authors of scientific articles are not paid money for their
refereed research papers; they give them away. What they want is to
reach all interested researchers worldwide, so as to maximize the
potential research impact of their findings.
Subscription/Site-License/Pay-Per-View costs are accordingly
access-barriers, and hence impact-barriers for this give-away
research literature.
There is now a way to free the entire refereed journal literature,
for everyone, everywhere, immediately, by mounting interoperable
university eprint archives, and self-archiving all refereed research
papers in them.
Please see: http://www.eprints.org
http://www.openarchives.org/
http://www.dlib.org/dlib/december99/12harnad.html
---------------------------------------------------------------------
(2) All authors in the biobehavioral and cognitive sciences are
strongly encouraged to self-archive all their papers in their own
institution's Eprint Archives or in CogPrints, the Eprint Archive
for the biobehavioral and cognitive sciences:
http://cogprints.soton.ac.uk/
It is extremely simple to self-archive and will make all of our
papers available to all of us everywhere, at no cost to anyone,
forever.
Authors of BBS papers wishing to archive their already published
BBS Target Articles should submit it to BBSPrints Archive.
Information about the archiving of BBS' entire backcatalogue will
be sent to you in the near future. Meantime please see:
http://www.bbsonline.org/help/
and
http://www.bbsonline.org/Instructions/
---------------------------------------------------------------------
(3) Call for Book Nominations for BBS Multiple Book Review
In the past, Behavioral and Brain Sciences (BBS) had only been able
to do 1-2 BBS multiple book treatments per year, because of our
limited annual page quota. BBS's new expanded page quota will make
it possible for us to increase the number of books we treat per
year, so this is an excellent time for BBS Associates and
biobehavioral/cognitive scientists in general to nominate books you
would like to see accorded BBS multiple book review.
(Authors may self-nominate, but books can only be selected on the
basis of multiple nominations.) It would be very helpful if you
indicated in what way a BBS Multiple Book Review of the book(s) you
nominate would be useful to the field (and of course a rich list of
potential reviewers would be the best evidence of its potential
impact!).
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Please note: Your email address has been added to our user database for
Calls for Commentators, the reason you received this email. If you do
not wish to receive further Calls, please feel free to change your
mailshot status through your User Login link on the BBSPrints homepage,
using your username and password above:
http://www.bbsonline.org/
For information about the mailshot, please see the help file at:
http://www.bbsonline.org/help/node5.html#mailshot
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
From brody at cshl.org Tue Aug 28 18:35:49 2001
From: brody at cshl.org (Carlos Brody)
Date: Tue, 28 Aug 2001 18:35:49 -0400 (EDT)
Subject: Postdoctoral positions in computational neuroscience
Message-ID: <15244.7365.990066.563845@sonnabend.cshl.org>
-- PLEASE POST --
POSTDOCTORAL OPPORTUNITIES IN COMPUTATIONAL NEUROSCIENCE
Postdoctoral positions for computational neuroscientists and
psychophysicists are available in Carlos Brody's research group at
Cold Spring Harbor Laboratory. (see
http://www.cns.caltech.edu/~carlos/temporary/Lab). Applicants should
have an interest in quantitative approaches to neuroscience, and
should have, or be near completing, a Ph.D. in Neuroscience,
Experimental Psychology, or in a quantitative field (e.g. Physics,
Math, Engineering).
Successful applicants will be expected, after appropriate guidance
and/or any necessary self-education, to lead the group's research
efforts in one or more of the projects listed below. For more
information on each of these projects, visit the lab's web page. In
addition, those who wish to develop and pursue their own, independent,
self-originated, line(s) of research will be very much encouraged to
do so: the lab seeks an atmosphere of vigorous discussion and creative
independence. Applications from self-guided, motivated, and
independent-minded scientists are particularly welcome.
Applicants should send a CV, the names of three references, and a
summary of research interests and experience to: Carlos Brody, 1
Bungtown Road, Freeman Building, Cold Spring Harbor, NY 11724,
USA. The positions are open immediately; salaries are on the NIH pay
scale.
----------
Lab interest areas (in order of descending current emphasis in the lab):
1) Psychophysics and neurocomputational modeling of working memory.
2) Encoding and representation of time.
3) Computation with spiking neurons.
4) Automated mapping of complex receptive fields.
From engp9286 at nus.edu.sg Wed Aug 29 03:39:47 2001
From: engp9286 at nus.edu.sg (Duan Kaibo)
Date: Wed, 29 Aug 2001 15:39:47 +0800
Subject: a technical report
Message-ID: <9C4C56CDF89E0440A6BD571E76D2387FB7559B@exs23.ex.nus.edu.sg>
Dear Connectionists:
We have recently completed a technical report that evaluates some simple
performance measures for tuning hyperparameters of Support Vector Machines.
A pdf file containing this report can be downloaded from:
http://guppy.mpe.nus.edu.sg/~mpessk/comparison.shtml
Here are the details of the report...
__________________________________________________________________
Title: Evaluation of Simple Performance Measures for Tuning SVM
Hyperparameters
Authors:
Kaibo Duan (engp9286 at nus.edu.sg)
S. Sathiya Keerthi (mpessk at nus.edu.sg)
Aun Neow Poo (mpepooan at nus.edu.sg)
Abstract:
Choosing optimal hyperparameter values for support vector machines is an
important step in SVM design. This is usually done by minimizing either an
estimate of generalization error or some other related performance measure.
In this paper, we empirically study the usefulness of several simple
performance measures that are inexpensive to compute (in the sense that they
do not require expensive matrix operations involving the kernel matrix). The
results point out which of these measures are adequate functionals for
tuning SVM hyperparameters. For SVMs with L1 soft margin formulation, none
of the simple measures yields a performance uniformly as good as k-fold
cross validation; Joachims' Xi-Alpha bound and Wahba et al's GACV come next
and perform reasonably well. For SVMs with L2 soft margin formulation, the
radius margin bound gives a very good prediction of optimal hyperparameter
values.
__________________________________________________________________
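As an illustrative sketch of the k-fold cross-validation baseline the abstract refers to, the following code tunes the RBF-kernel hyperparameters (C, gamma) by grid search scored with 5-fold CV. This is not the authors' code (the report predates modern toolkits); scikit-learn, the synthetic data set, and the particular grid are assumptions chosen only for illustration.

```python
# Sketch: tuning SVM hyperparameters (C, gamma) by k-fold cross
# validation, the baseline performance measure discussed in the report.
# scikit-learn and the toy data are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Score each (C, gamma) pair on a small log-spaced grid by
# 5-fold cross-validated accuracy, then keep the best pair.
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

The simple measures studied in the report aim to approximate the ranking such a CV search produces, without paying for k refits per grid point.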
We are interested in hearing about the comparative performance of the
measures we have considered on other data sets that we haven't tried.
Best regards,
Kaibo
From mike at stats.gla.ac.uk Wed Aug 29 10:17:12 2001
From: mike at stats.gla.ac.uk (Mike Titterington)
Date: Wed, 29 Aug 2001 15:17:12 +0100 (BST)
Subject: Postdoctoral post in Glasgow
Message-ID:
(Re-advertisement)
UNIVERSITY OF GLASGOW
DEPARTMENT OF STATISTICS
POSTDOCTORAL RESEARCH ASSISTANT
Applications are invited for a Postdoctoral Research Assistantship
(IA) post in the Department of Statistics, University of Glasgow, to
work with Professor D.M. Titterington for a period of up to 3 years,
starting as soon as possible. The post is funded by the UK Engineering
and Physical Sciences Research Council.
The research topic is 'Approximate Approaches to Likelihood and
Bayesian Statistical Inference in Incomplete-data problems'.
Applications, supported by full curriculum vitae and the names of
three referees, should be sent, to arrive no later than September 21, 2001,
to Professor D. M. Titterington, Department of Statistics, University of
Glasgow, Glasgow G12 8QQ, Scotland, from whom further particulars are
available. Informal enquiries by electronic mail (mike at stats.gla.ac.uk)
are welcomed.
From juergen at idsia.ch Thu Aug 30 10:57:29 2001
From: juergen at idsia.ch (juergen@idsia.ch)
Date: Thu, 30 Aug 2001 16:57:29 +0200
Subject: metalearner
Message-ID: <200108301457.QAA01240@ruebe.idsia.ch>
I would like to draw your attention to Sepp Hochreiter's astonishing
recent result on "learning to learn."
He trains gradient-based "Long Short-Term Memory" (LSTM) recurrent
networks with roughly 5000 weights to _metalearn_ fast online learning
algorithms for nontrivial classes of functions, such as all quadratic
functions of two variables. LSTM is necessary because metalearning
typically involves huge time lags between important events, and standard
gradient-based recurrent nets cannot deal with these. After a month
of metalearning on a PC he freezes all weights, then uses the frozen
net as follows: He selects some new function f, and feeds a sequence of
random training exemplars of the form ...data/target/data/target/data...
into the input units, one sequence element at a time. After about 30
exemplars the frozen recurrent net correctly predicts target inputs before
it sees them. No weight changes! How is this possible? After metalearning
the frozen net implements a sequential learning algorithm which apparently
computes something like error signals from data inputs and target inputs
and translates them into changes of internal estimates of f. Parameters
of f, errors, temporary variables, counters, computations of f and of
parameter updates are all somehow represented in form of circulating
activations. Remarkably, the new - and quite opaque - online learning
algorithm running on the frozen network is much faster than standard
backprop with optimal learning rate. This indicates that one can use
gradient descent to metalearn learning algorithms that outperform gradient
descent. Furthermore, the metalearning procedure automatically avoids
overfitting in a principled way, since it punishes overfitting online
learners just like it punishes slow ones, simply because overfitters
and slow learners cause more cumulative errors during metalearning.
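The interleaved ...data/target/data/target... encoding described above can be sketched as follows. This is my own toy illustration, not Hochreiter's code: the function class (random quadratics of two variables), the channel layout, and all names are assumptions. Each time step carries the current inputs together with the previous step's target, so the net can compute an error-like signal internally.

```python
# Toy sketch (not the authors' code) of the exemplar-sequence encoding:
# random exemplars of a quadratic function of two variables, interleaved
# so each step delivers the *previous* step's target alongside new inputs.
import numpy as np

rng = np.random.default_rng(0)

def random_quadratic():
    """Draw random coefficients for f(x1,x2) = a*x1^2 + b*x2^2 + c*x1*x2 + d*x1 + e*x2 + f0."""
    a, b, c, d, e, f0 = rng.uniform(-1, 1, size=6)
    return lambda x1, x2: a*x1**2 + b*x2**2 + c*x1*x2 + d*x1 + e*x2 + f0

def exemplar_sequence(f, n_exemplars=30):
    """Build the sequence fed to the frozen net: each step has the
    current inputs plus the previous step's target value."""
    seq = []
    prev_target = 0.0
    for _ in range(n_exemplars):
        x1, x2 = rng.uniform(-1, 1, size=2)
        seq.append((x1, x2, prev_target))   # net must predict f(x1, x2)
        prev_target = f(x1, x2)
    return seq

f = random_quadratic()
seq = exemplar_sequence(f)
print(len(seq), len(seq[0]))
```

After metalearning, the claim is that a frozen recurrent net fed such a sequence starts predicting each target before seeing it, within about 30 exemplars.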
Hochreiter himself admits the paper is not well-written. But the results
are quite amazing: http://www.cs.colorado.edu/~hochreit
@inproceedings{Hochreiter:01meta,
author = "S. Hochreiter and A. S. Younger and P. R. Conwell",
title = "Learning to learn using gradient descent",
booktitle= "Lecture Notes on Comp. Sci. 2130,
Proc. Intl. Conf. on Artificial Neural Networks (ICANN-2001)",
editor = "G. Dorffner and H. Bischof and K. Hornik",
publisher= "Springer: Berlin, Heidelberg",
pages = "87-94",
year = "2001"}
-------------------------------------------------
Juergen Schmidhuber director
IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland
juergen at idsia.ch www.idsia.ch/~juergen
From ingber at ingber.com Thu Aug 2 17:59:56 2001
From: ingber at ingber.com (Lester Ingber)
Date: Thu, 2 Aug 2001 16:59:56 -0500
Subject: Paper: Probability tree algorithm for general diffusion processes
Message-ID: <20010802165956.A13979@ingber.com>
The following preprint is available:
%A L. Ingber
%A C. Chen
%A R.P. Mondescu
%A D. Muzzall
%A M. Renedo
%T Probability tree algorithm for general diffusion processes
%J Physical Review E
%P (to be published)
%D 2001
%O URL http://www.ingber.com/path01_pathtree.ps.gz
ABSTRACT
Motivated by path-integral numerical solutions of diffusion
processes, PATHINT, we present a new tree algorithm, PATHTREE,
which permits extremely fast, accurate computation of probability
distributions of a large class of general nonlinear diffusion
processes.
--
Prof. Lester Ingber ingber at ingber.com www.ingber.com
ingber at alumni.caltech.edu www.alumni.caltech.edu/~ingber
From allan at biomedica.org Thu Aug 2 18:24:25 2001
From: allan at biomedica.org (Allan Kardec Barros)
Date: Thu, 02 Aug 2001 19:24:25 -0300
Subject: Extraction of Specific Signals with Temporal Structure
Message-ID: <3B69D319.562840E6@biomedica.org>
Apologies if you receive multiple copies of this message.
Dear Everyone,
I would like to announce the following paper, recently published in
Neural Computation. For those familiar with ICA, the difference in this
algorithm is basically that, given some simple assumptions, we prove
that the permutation problem can be avoided. The algorithm is quite
simple, is based on second-order statistics, and does not require that
at most one signal be Gaussian.
Please feel free to mail me requesting either PS or PDF copies of
our work.
Best Regards,
ak.
TITLE: Extraction of Specific Signals with Temporal Structure.
AUTHORS: A. K. Barros and A. Cichocki.
ABSTRACT:
In this work we develop a very simple batch learning algorithm for
semi-blind extraction of a desired source signal with temporal
structure from linear mixtures. Although we use the concept of
sequential blind extraction of sources and independent component
analysis (ICA), we do not carry out the extraction in a completely
blind manner, nor do we assume that the sources are statistically
independent. In fact, we show that the {\it a priori} information
about the auto-correlation function of primary sources can be used to
extract the desired signals (sources of interest) from their linear
mixtures. Extensive computer simulations and real data application
experiments confirm the validity and high performance of the proposed
algorithm.
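The flavor of second-order, temporal-structure-based extraction can be sketched with a toy example. This is not the authors' exact algorithm: here a desired periodic source is pulled out of a two-channel mixture by taking the dominant generalized eigenvector of a symmetrized lagged covariance at a lag p known a priori, in the spirit of delayed-decorrelation methods. All signals, the mixing matrix, and the lag are illustrative assumptions.

```python
# Toy sketch (not the paper's exact algorithm): extract a source with
# known temporal structure from a linear mixture using only second-order
# statistics -- no restriction on how many sources are Gaussian.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
t = np.arange(2000)
s1 = np.sin(2 * np.pi * t / 25)           # desired source: period 25
s2 = rng.standard_normal(t.size)          # interfering white source
X = np.array([[1.0, 0.7], [0.4, 1.0]]) @ np.vstack([s1, s2])  # mixtures

p = 25  # a priori knowledge: the desired source is autocorrelated at lag 25
C0 = X @ X.T / t.size                     # zero-lag covariance of mixtures
Cp = X[:, p:] @ X[:, :-p].T / (t.size - p)
Cp = (Cp + Cp.T) / 2                      # symmetrize the lagged covariance

# Dominant generalized eigenvector of (Cp, C0) favors the component
# with the strongest autocorrelation at lag p.
vals, vecs = eigh(Cp, C0)
w = vecs[:, -1]
y = w @ X                                 # recovered signal

corr = abs(np.corrcoef(y, s1)[0, 1])
print(round(corr, 2))
```

Because only covariances at two lags are used, the white interferer is suppressed regardless of its distribution, which is the key contrast with higher-order ICA.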
From ted.carnevale at yale.edu Fri Aug 3 12:07:49 2001
From: ted.carnevale at yale.edu (Ted Carnevale)
Date: Fri, 03 Aug 2001 12:07:49 -0400
Subject: NEURON course at SFN 2001 meeting
Message-ID: <3B6ACC55.93EBBAD1@yale.edu>
Short Course Announcement
USING THE NEURON SIMULATION ENVIRONMENT
Satellite Symposium, Society for Neuroscience Meeting
9 AM - 5 PM on Saturday, Nov. 10, 2001
Speakers: N.T. Carnevale, M.L. Hines,
J.W. Moore, and G.M. Shepherd
This 1 day course with lectures and live demonstrations will
present information essential for teaching and research
applications of NEURON, an advanced simulation environment
that handles realistic models of biophysical mechanisms,
individual neurons, and networks of cells. The emphasis is
on practical issues that are key to the most productive use
of this powerful and convenient modeling tool.
Features that will be covered include:
constructing and managing models with the
CellBuilder, Network Builder,
and Linear Circuit Builder
importing detailed morphometric data
using the Multiple Run Fitter to optimize models
with high-dimensional parameter spaces
database resources for empirically-based modeling
Each registrant will receive a comprehensive set of notes, which
include material that has not appeared elsewhere in print.
For more information see the course's WWW pages at
http://www.neuron.yale.edu/sd2001.html
--Ted
Supported in part by the National Science Foundation.
Opinions expressed are those of the authors
and not necessarily those of the Foundation.
From Alton.Ford at tmp.com Fri Aug 3 18:14:06 2001
From: Alton.Ford at tmp.com (Ford, Alton)
Date: Fri, 3 Aug 2001 17:14:06 -0500
Subject: Postdoc positions at Los Alamos National Laboratory
Message-ID:
Postdoctoral Positions in Experimental and Computational Neuroscience
The Biophysics Group (http://www.biophysics.lanl.gov/) in the Physics
Division at Los Alamos National Laboratory seeks several postdoctoral
candidates in the areas of experimental and computational neuroscience.
Existing projects include recording of fast optical transients from neural
tissue and the development of associated high speed data acquisition
systems, imaging devices and optical technology, analysis of evoked MEG and
fMRI signals, computational modeling of information processing within
biological neural systems, and collaborative work on the development of a
retinal prosthetic device. Successful candidates could combine work in
several of these areas. For further technical information, contact Dr. John
George at jsg at lanl.gov.
A Ph.D. in Physics, Electrical Engineering, Biology, or a related discipline
completed within the last three years or soon to be completed is required.
Current starting salaries range from $54,100 - $58,300. Further details
about the Postdoctoral Program may be found at:
http://www.hr.lanl.gov/postdoc/. For consideration, submit a resume and
publications list with a cover letter outlining current research interests,
including contact information for three references, to postdoc-jobs at lanl.gov
(reference PD017639), or submit two copies to:
Postdoc Program Office, PD017639, MS-P290, Los Alamos National Laboratory,
Los Alamos, NM 87545.
Los Alamos National Laboratory is operated by the University of California
for the U.S. Department of Energy. AA/EOE
Alton Ford
Account Executive
tmp.worldwide
Advertising & Communications
3032 Bunker Hill Lane, Suite 207
Santa Clara, CA 95054
* 408.844.0150
* 408.496.6704 fax
* alton.ford at tmp.com
www.tmp.com
Complement your recruitment advertising with Web Dragon! ...A service
provided by TMP Worldwide, in which a team of our live professionals mine
through the millions of resumes on the Internet to find qualified resumes to
meet your specified recruitment needs. There are more resumes online than
ever. Fill your pipeline with quality resumes and let TMP do the work for
you! Please contact me for more information.
From juergen at idsia.ch Mon Aug 6 12:20:15 2001
From: juergen at idsia.ch (juergen@idsia.ch)
Date: Mon, 6 Aug 2001 18:20:15 +0200
Subject: PhD fellowship
Message-ID: <200108061620.SAA08768@ruebe.idsia.ch>
I am seeking a PhD student for research on state-of-the-art recurrent
neural networks. Please see http://www.idsia.ch/~juergen/phd2001.html
Interviews also possible at ICANN 2001 (Aug 21-25) in Vienna or at the
ICANN recurrent net workshop: http://www.idsia.ch/~doug/icann/index.html
-------------------------------------------------
Juergen Schmidhuber director
IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland
juergen at idsia.ch www.idsia.ch/~juergen
From CogSci at psyvax.psy.utexas.edu Tue Aug 7 13:34:51 2001
From: CogSci at psyvax.psy.utexas.edu (Cognitive Science Society)
Date: Tue, 07 Aug 2001 12:34:51 -0500
Subject: Richard M. Shiffrin awarded the Rumelhart Prize
Message-ID: <5.0.0.25.2.20010807123359.00b05848@psy.utexas.edu>
Richard M. Shiffrin Chosen to Receive the David E. Rumelhart Prize
for Contributions to the Formal Analysis of Human Cognition
The Glushko-Samuelson Foundation and the Cognitive Science Society are
pleased to announce that Richard M. Shiffrin has been chosen as the
second recipient of the $100,000 David E. Rumelhart Prize, awarded
annually for an outstanding contribution to the formal analysis of
human cognition. Shiffrin will receive this prize and give the Prize
Lecture at the 2002 Meeting of the Cognitive Science Society, at
George Mason University, August 7-11, 2002.
Shiffrin has made many contributions to the modeling of human
cognition in areas ranging from perception to attention to learning,
but is best known for his long-standing efforts to develop explicit
models of human memory. His most recent models use Bayesian, adaptive
approaches, building on previous work but extending it in a critical
new manner, and carrying his theory beyond explicit memory to implicit
learning and memory processes. The theory has been evolving for about
35 years, and as a result represents a progression similar to the best
theories seen in any branch of science.
Shiffrin's major effort began in 1968, in a chapter with Atkinson [1]
that laid out a model of the components of short- and long-term memory
and described the processes that control the operations of memory.
The Atkinson-Shiffrin model encapsulated empirical and theoretical
results from a very large number of publications that modeled
quantitatively the relation of short- to long-term memory. It
achieved its greatest success by showing the critical importance---and
the possibility---of modeling the control processes of cognition.
This chapter remains one of the most cited works in the entire field
of psychology.
Shiffrin's formal theory was taken forward in a quantum leap in 1980
[2] and 1981 [3] with the SAM (Search of Associative Memory) model.
This was a joint effort with Jeroen Raaijmakers, then a graduate
student. The SAM model quantified the nature of retrieval from
long-term memory, and characterized recall as a memory search with
cycles of sampling and recovery. The SAM theory precisely
incorporates the notions of interactive cue combination that are now
seen to lie at the heart of memory retrieval. Another major quantum
step occurred in 1984 [4] when the theory was extended to recognition
memory. With another former student, Gary Gillund, Shiffrin initiated
what has become the standard approach to recognition memory, in which
a decision is based on summed activation of related memory traces. It
was a major accomplishment that the same retrieval activations that
had been used in the recall model could be carried forward and used to
predict a wide range of recognition phenomena. The next major step
occurred in 1990, when Shiffrin published two articles on the
list-length effect with his student Steve Clark and his colleague,
Roger Ratcliff [5, 6]. This research was of critical importance in
that it established clearly that experience leads to the
differentiation, rather than the mere strengthening, of the
representations of items in memory.
In 1997, the theory evolved in a radical direction in an important
paper with another former student, Mark Steyvers [7]. Although the
changes were fundamental, the new model retained the best concepts of
its predecessors, so that the previous successful predictions were
also a part of the new theory. REM added featural representations, to
capture similarity relations among items in memory. Building on
earlier ideas by John Anderson, and related ideas developed in
parallel by McClelland and Chappell, Shiffrin used Bayesian principles
of adaptive and optimal decision making under constraints to guide the
selection of the quantitative form of the activation functions. In
addition, storage principles were set forth that provided mechanisms
by which episodic experience could coalesce over development and
experience into permanent non-contextualized knowledge. This latter
development allowed the modeling of implicit memory phenomena, in work
that is just now starting to appear in many journals, including a
theory of long-term priming [with Schooler and Raaijmakers, 8] and a
theory of short-term priming [with his student David Huber and others,
9]. The short-term priming research showed that the direction of
priming can be reversed by extra study given to particular primes,
leading to another conceptual breakthrough. A new version of the REM
model explains this and other findings by assuming that some prime
features are confused with test item features, and that the system
attempts to deal with this situation optimally by appropriate
discounting of evidence from certain features.
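The "summed activation of related memory traces" idea behind this line of recognition models can be sketched schematically. This toy code is my own illustration, not the actual SAM or REM equations; the binary feature representation, the exponential similarity function, and its sharpness are arbitrary choices made only to show the global-matching principle.

```python
# Schematic illustration (not the actual SAM/REM equations) of a
# global matching model: familiarity of a probe is the summed
# similarity between its feature vector and every stored trace,
# and a recognition decision compares that sum to a criterion.
import numpy as np

rng = np.random.default_rng(2)
n_features = 20

studied = rng.choice([0, 1], size=(10, n_features))   # 10 stored traces

def familiarity(probe, traces):
    """Sum an exponential similarity between the probe and each trace.
    The 0.5 sharpness factor is an arbitrary illustrative choice."""
    matches = (traces == probe).sum(axis=1)           # features in common
    return np.exp(0.5 * matches).sum()

target = studied[0]                                   # an old (studied) item
lure = rng.choice([0, 1], size=n_features)            # a new item

print(familiarity(target, studied), familiarity(lure, studied))
```

A target's near-perfect match with its own trace dominates the sum, so old items yield larger familiarity than lures, which is the basic signal-detection account of recognition these models formalize.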
Biographical Information
Shiffrin received his Ph. D. from the Mathematical Psychology Program
in the Department of Psychology at Stanford University in 1968, the
year after Rumelhart received his degree from the same program. Since
1968 he has been on the faculty of the Department of Psychology at
Indiana University, where he is now the Luther Dana Waterman Professor
of Psychology and Director of the Cognitive Science Program. Shiffrin
has accumulated many honors, including membership in the National
Academy of Sciences, the American Academy of Arts and Sciences, the
Howard Crosby Warren Award of the Society of Experimental
Psychologists, and a MERIT Award from the National Institute of Mental
Health. Shiffrin has served the field as editor of the Journal of
Experimental Psychology: Learning Memory and Cognition, and as a
member of the governing boards of several scientific societies.
Cited Articles By Richard M. Shiffrin
[1] Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A
proposed system and its control processes. In K. W. Spence and
J. T. Spence (Eds.), The Psychology of Learning and Motivation:
Advances in Research and Theory (Vol. 2, pp. 89-195). New York:
Academic Press.
[2] Raaijmakers, J. G. W., & Shiffrin, R. M. (1980). SAM: A theory of
probabilistic search of associative memory. In Bower, G. H. (Ed.),
The Psychology of Learning and Motivation, Vol. 14, 207-262. New
York: Academic Press.
[3] Raaijmakers, J. G. W., & Shiffrin, R. M. (1981). Search of
associative memory. Psychological Review, 88, 93-134.
[4] Gillund, G., & Shiffrin, R. M. (1984). A retrieval model for both
recognition and recall. Psychological Review, 91, 1-67.
[5] Ratcliff, R., Clark, S., & Shiffrin, R. M. (1990). The
list-strength effect: I. Data and discussion. Journal of
Experimental Psychology: Learning, Memory, and Cognition, 16, 163-178.
[6] Shiffrin, R. M., Ratcliff, R., & Clark, S. (1990). The
list-strength effect: II. Theoretical mechanisms. Journal of
Experimental Psychology: Learning, Memory, and Cognition, 16, 179-195.
[7] Shiffrin, R. M., & Steyvers, M. (1997). A model for recognition
memory: REM: Retrieving effectively from memory. Psychonomic Bulletin
and Review, 4 (2), 145-166.
[8] Schooler, L., Shiffrin, R. M., & Raaijmakers, J. G. W. (2001). A
model for implicit effects in perceptual identification. Psychological
Review, 108, 257-272.
[9] Huber, D. E., Shiffrin, R. M., Lyle, K. B., & Ruys, K. I. (2001).
Perception and preference in short-term word priming. Psychological
Review, 108, 149-182.
================================================================
Geoffrey E. Hinton Named First Recipient of the David E. Rumelhart Prize
May 3, 2001
Today the Glushko-Samuelson foundation and the Cognitive Science
Society jointly announced that Geoffrey E. Hinton has been named the first
recipient of the David E. Rumelhart Prize for contemporary
contributions to the formal analysis of human cognition. Hinton, the
Director of the Gatsby Computational Neuroscience Unit at University
College, London, was chosen from a large field of outstanding nominees
because of his seminal contributions to the understanding of neural
networks.
"Hinton's insights into the analysis of neural networks played a
central role in launching the field in the mid-1980's" said Professor
James McClelland of Carnegie Mellon University, Chair of the Prize
Selection Committee, "Geoff also played a major role in conveying the
relevance of neural networks to higher-level cognition." Professor
Lawrence Barsalou of Emory University, President of the Cognitive
Science Society, agreed with this assessment. "Hinton's contributions
to Cognitive Science have been pivotal", said Barsalou. "As the first
recipient he sets a great example for future awards." Hinton will
receive the prize, which includes a monetary award of $100,000, at the
annual meeting of the Society in Edinburgh, Scotland, in early August,
2001.
The Rumelhart prize acknowledges intellectual generosity and effective
mentoring as well as scientific insight. "Dave Rumelhart gave away
many scientific ideas, and made important contributions to the work of
many of his students and co-workers" said Robert J. Glushko, President of
the Glushko-Samuelson foundation. He added "Hinton stands out not
only for his own contributions but for his exemplary record in
mentoring young scientists." A total of eighteen graduate students
have received their Ph. D.'s under Hinton's supervision.
In conjunction with naming Hinton as the first recipient of the David
E. Rumelhart Prize, the Glushko-Samuelson foundation announced that
the prize will be awarded on an annual basis, instead of biennially.
"This change reflects the number of outstanding scientists who were
nominated for the award" noted Glushko. "I am pleased that my
foundation can play a role in honoring their contributions to
cognitive science." The second recipient of the Prize will be
announced at the Edinburgh meeting of the society, and will give the
prize lecture at the next annual meeting, which will be at George
Mason University in August, 2002.
For further information, please visit the David E. Rumelhart Prize
web site:
http://www.cnbc.cmu.edu/derprize/DerPrize2001.html
or contact:
Robert J. Glushko, 415-644-8731
James L. McClelland, 412-268-3157
----------
Cognitive Science Society
c/o Tanikqua Young
Department of Psychology
University of Texas
Austin, TX 78712
Phone: (512) 471-2030
Fax: (512) 471-3053
----------
From amari at brain.riken.go.jp Thu Aug 9 00:57:01 2001
From: amari at brain.riken.go.jp (Shun-ichi Amari)
Date: Thu, 9 Aug 2001 13:57:01 +0900
Subject: FW: new book on Information Geometry
Message-ID:
********************
Dear Connectionists
I previously announced the publication of the book
"Methods of Information Geometry"
but heard complaints that the book was out of stock.
It has now been reprinted, and you can order it from the
AMS or Oxford University Press through bookshops.
*************
It is my pleasure to announce the publication of
a book on Information Geometry. I have been
often asked if there is a good book on information
geometry to know its general perspectives. Here it is.
S.Amari and H.Nagaoka, Methods of Information Geometry,
AMS Translations of Mathematical Monographs, vol 191
(translated by Daishi Harada)
American Mathematical Society (AMS) and Oxford University Press,
206 + x pages, 2000. (See http://www.ams.org/)
********************
Shun-ichi Amari
Vice Director, RIKEN Brain Science Institute
Laboratory for Mathematical Neuroscience
Research Group on Brain-Style Information Systems
tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687
amari at brain.riken.go.jp
http://www.bsis.brain.riken.go.jp/
From orhan at ait-tech.com Wed Aug 8 14:16:02 2001
From: orhan at ait-tech.com (Orhan Karaali)
Date: Wed, 8 Aug 2001 14:16:02 -0400
Subject: Research Scientist position at Advanced Investment Technology
Message-ID:
ADVANCED INVESTMENT TECHNOLOGY, INC.
www.ait-tech.com
Advanced Investment Technology, Inc. (AIT) is a registered investment
advisor based in Clearwater, Florida focusing on institutional
domestic equity asset management. Our partners include Boston-based
State Street Global Advisors, a global leader in institutional
financial services, and Amsterdam-based Stichting Pensioenfonds ABP,
one of the world's largest pension plans. AIT's reputation as an
innovative entrepreneur within the asset management community is built
upon the research and development of nontraditional quantitative stock
valuation techniques (neural networks and genetic algorithms) for
which a patent was issued in 1998.
POSITION: RESEARCH SCIENTIST
The position will involve developing software and valuation algorithms
for stock selection and portfolio management. Job responsibilities
include database development, running weekly production jobs, working
with financial data vendor feeds, contributing to financial research
projects, and developing applications in the areas of multifactor
stock models.
AIT uses Windows 2000; Visual Studio 6 and Visual Studio Net; MS SQL 2000;
C++ STL; OLE DB; XML; SOAP; and OLAP technologies.
Minimum Qualifications:
Bachelors Degree in Computer Science or a related field
Masters Degree in Computer Science or MBA
Very strong C++ and STL background
Working knowledge of SQL
Bonus Qualifications:
Familiarity with financial data and asset management
Experience developing object oriented software with C++ and STL
Familiarity with Microsoft Visual Studio
Knowledge of machine learning algorithms (NN, GA, GP, SVM)
To apply, please send your resume to:
E-mail: orhan at ait-tech.com
Fax: (727) 799-1232 (Attn: Orhan Karaali)
From norman at psych.colorado.edu Sat Aug 11 00:03:12 2001
From: norman at psych.colorado.edu (Ken Norman)
Date: Fri, 10 Aug 2001 22:03:12 -0600 (MDT)
Subject: new paper: modeling hippocampal and neocortical contributions to
recognition memory
Message-ID:
Dear Connectionists,
The following technical report is now available for downloading as:
ftp://grey.colorado.edu/pub/oreilly/papers/NormanOReilly01_recmem.pdf
webpage: http://psych.colorado.edu/~oreilly/pubs-abstr.html#01_recmem
Modeling Hippocampal and Neocortical Contributions to Recognition
Memory: A Complementary Learning Systems Approach
Kenneth A. Norman and Randall C. O'Reilly
Department of Psychology
University of Colorado
Boulder, CO 80309
ICS Technical Report 01-02
Abstract:
We present a computational neural network model of recognition memory
based on the biological structures of the hippocampus and medial
temporal lobe cortex (MTLC), which perform complementary learning
functions. The hippocampal component of the model contributes to
recognition by recalling specific studied details. MTLC cannot
support recall, but it is possible to extract a scalar familiarity
signal from MTLC that tracks how well the test item matches studied
items. We present simulations that establish key qualitative
differences in the operating characteristics of the hippocampal recall
and MTLC familiarity signals, and we identify several manipulations
(e.g., target-lure similarity, interference) that differentially
affect the two signals. We also use the model to address the
stochastic relationship between recall and familiarity (i.e., are they
independent), and the effects of partial vs. complete hippocampal
lesions on recognition.
From yokoy at brain.riken.go.jp Tue Aug 14 02:12:01 2001
From: yokoy at brain.riken.go.jp (Yoko Yamaguchi)
Date: Tue, 14 Aug 2001 15:12:01 +0900
Subject: Postdoctoral/technical staff positions in cognitive science and
computational neuroscience
Message-ID:
Please post:
POSTDOCTORAL SCIENTIST/ TECHNICAL STAFF POSITIONS at RIKEN BSI
Laboratory for Dynamics of Emergent Intelligence, Brain-Style Intelligence
Research Group, RIKEN Brain Science Institute (BSI) invites applicants for
postdoctoral and technical staff scientists in the fields of cognitive
science and computational neuroscience.
Our objective is to clarify the neural principle for the dynamics of
emergent intelligence in novel situations. Particular emphasis is given to
synchronization of oscillations in hierarchical neural networks.
For further information see http://www.dei.brain.riken.go.jp/
Applicants for postdoctoral positions must have a PhD.
Technical staff are expected to have a bachelors or masters degree.
Applicants should submit a full CV detailing education and experience,
with a photograph attached, in addition to a complete bibliography of
publications. The names, addresses, and email addresses of two referees
must also be supplied. Send all applications to:
Contact:
Dr. Yoko Yamaguchi (Laboratory Head)
Lab. for Dynamics of Emergent Intelligence
Brain Science Institute, RIKEN
2-1 Hirosawa, Wako, Saitama, 351-0198 Japan
FAX: +81-48-467-6938
E-mail : yokoy at brain.riken.go.jp
-------------------------------------------------------------------
Yoko Yamaguchi
Lab. for Dynamics of Emergent Intelligence
RIKEN Brain Science Institute(BSI)
From bbs at bbsonline.org Mon Aug 13 16:27:04 2001
From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor))
Date: Mon, 13 Aug 2001 16:27:04 -0400
Subject: Norman: Two Visual Systems -- BBS Call for Commentators
Message-ID:
Dear Dr. Connectionists List User,
Below is the abstract of a forthcoming BBS target article
Two Visual Systems and Two Theories of Perception:
An Attempt to Reconcile the Constructivist and Ecological Approaches
by
Joel Norman
http://www.bbsonline.org/Preprints/Norman/
http://psy.haifa.ac.il/~maga/tvs&ttp.pdf
This article has been accepted for publication in Behavioral and Brain
Sciences (BBS), an international, interdisciplinary journal providing
Open Peer Commentary on important and controversial current research in
the biobehavioral and cognitive sciences.
Commentators must be BBS Associates or nominated by a BBS Associate. To
be considered as a commentator for this article, to suggest other
appropriate commentators, or for information about how to become a BBS
Associate, please reply by EMAIL within three (3) weeks to:
calls at bbsonline.org
The Calls are sent to 8000 BBS Associates, so there is no expectation
(indeed, it would be calamitous) that each recipient should comment
on every occasion! Hence there is no need to reply except if you wish
to comment, or to nominate someone to comment.
If you are not a BBS Associate, please approach a current BBS
Associate (there are currently over 10,000 worldwide) who is familiar
with your work to nominate you. All past BBS authors, referees and
commentators are eligible to become BBS Associates. A full electronic
list of current BBS Associates is available at this location to help
you select a name:
http://www.bbsonline.org/Instructions/assoclist.html
If no current BBS Associate knows your work, please send us your
Curriculum Vitae and BBS will circulate it to appropriate Associates to
ask whether they would be prepared to nominate you. (In the meantime,
your name, address and email address will be entered into our database
as an unaffiliated investigator.)
To help us put together a balanced list of commentators, please give
some indication of the aspects of the topic on which you would bring
your areas of expertise to bear if you were selected as a commentator.
To help you decide whether you would be an appropriate commentator for
this article, an electronic draft is retrievable from the online
BBSPrints Archive, at the URL that follows the abstract below.
_____________________________________________________________
Two Visual Systems and Two Theories of Perception:
An Attempt to Reconcile the Constructivist and Ecological Approaches
Joel Norman
Department of Psychology
University of Haifa
Haifa, Israel
jnorman at psy.haifa.ac.il
KEYWORDS: Visual perception theories, ecological, constructivist,
two visual systems, space perception, size perception,
dual-process approach
ABSTRACT: The two contrasting theoretical approaches to visual
perception, the constructivist and the ecological, are briefly
presented and illustrated through their analyses of space perception
and size perception. Earlier calls for their reconciliation and
unification are reviewed. Neurophysiological, neuropsychological, and
psychophysical evidence for the existence of two quite distinct visual
systems, the ventral and the dorsal, is presented. These two
perceptual systems differ in their functions; the ventral system's
central function is that of identification, while the dorsal system is
mainly engaged in the visual control of motor behavior. The strong
parallels between the ecological approach and the functioning of the
dorsal system and between the constructivist approach and the
functioning of the ventral system are noted. It is also shown that the
experimental paradigms used by the proponents of these two approaches
match the functions of the respective visual systems. A dual-process
approach to visual perception emerges from this analysis, with the
ecological-dorsal process transpiring mainly without conscious
awareness, while the constructivist-ventral process is normally
conscious. Some implications of this dual-process approach to
visual-perceptual phenomena are presented, with emphasis on space
perception.
http://www.bbsonline.org/Preprints/Norman/
http://psy.haifa.ac.il/~maga/tvs&ttp.pdf
___________________________________________________________
Please do not prepare a commentary yet. Just let us know, after having
inspected it, what relevant expertise you feel you would bring to bear
on what aspect of the article. We will then let you know whether it was
possible to include your name on the final formal list of invitees.
_______________________________________________________________________
*** SUPPLEMENTARY ANNOUNCEMENTS ***
(1) The authors of scientific articles are not paid money for their
refereed research papers; they give them away. What they want is to
reach all interested researchers worldwide, so as to maximize the
potential research impact of their findings.
Subscription/Site-License/Pay-Per-View costs are accordingly
access-barriers, and hence impact-barriers for this give-away
research literature.
There is now a way to free the entire refereed journal literature,
for everyone, everywhere, immediately, by mounting interoperable
university eprint archives, and self-archiving all refereed research
papers in them.
Please see: http://www.eprints.org
http://www.openarchives.org/
http://www.dlib.org/dlib/december99/12harnad.html
---------------------------------------------------------------------
(2) All authors in the biobehavioral and cognitive sciences are
strongly encouraged to self-archive all their papers in their own
institution's Eprint Archives or in CogPrints, the Eprint Archive
for the biobehavioral and cognitive sciences:
http://cogprints.soton.ac.uk/
It is extremely simple to self-archive and will make all of our
papers available to all of us everywhere, at no cost to anyone,
forever.
Authors of BBS papers wishing to archive their already published
BBS Target Articles should submit them to the BBSPrints Archive.
Information about the archiving of BBS' entire backcatalogue will
be sent to you in the near future. Meantime please see:
http://www.bbsonline.org/help/
and
http://www.bbsonline.org/Instructions/
---------------------------------------------------------------------
(3) Call for Book Nominations for BBS Multiple Book Review
In the past, Behavioral and Brain Sciences (BBS) had only been able
to do 1-2 BBS multiple book treatments per year, because of our
limited annual page quota. BBS's new expanded page quota will make
it possible for us to increase the number of books we treat per
year, so this is an excellent time for BBS Associates and
biobehavioral/cognitive scientists in general to nominate books you
would like to see accorded BBS multiple book review.
(Authors may self-nominate, but books can only be selected on the
basis of multiple nominations.) It would be very helpful if you
indicated in what way a BBS Multiple Book Review of the book(s) you
nominate would be useful to the field (and of course a rich list of
potential reviewers would be the best evidence of its potential
impact!).
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Please note: Your email address has been added to our user database for
Calls for Commentators, the reason you received this email. If you do
not wish to receive further Calls, please feel free to change your
mailshot status through your User Login link on the BBSPrints homepage.
Check the helpfiles for details of how to obtain your username and
password.
http://www.bbsonline.org/
For information about the mailshot, please see the help file at:
http://www.bbsonline.org/help/node5.html#mailshot
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
From murphyk at cs.berkeley.edu Tue Aug 14 20:42:21 2001
From: murphyk at cs.berkeley.edu (Kevin Murphy)
Date: Tue, 14 Aug 2001 17:42:21 -0700
Subject: OpenBayes
Message-ID: <3B79C56D.43E951A5@cs.berkeley.edu>
Richard Dybowski formed the OpenBayes discussion group/email list
on 17 January 2001. The goal is to discuss the development of an open
source library for probabilistic graphical models. We had our first
meeting at the recent UAI conference in Seattle. The only concrete
decision reached was that we should advertise the existence of this
group more widely - hence this email.
For more details on the OpenBayes project, please see
http://HTTP.CS.Berkeley.EDU/~murphyk/OpenBayes/index.html
This page includes a list of people who attended the meeting, more
details on the project's goals, achievements to date, ways you can
subscribe to the list and/or contribute code, etc.
Kevin Murphy
P.S. If you have problems subscribing to the list, please send email
to openbayes-owner at egroups.com, not to me! I am not the moderator.
From neted at anc.ed.ac.uk Wed Aug 15 05:54:39 2001
From: neted at anc.ed.ac.uk (Network Editor)
Date: Wed, 15 Aug 2001 10:54:39 +0100
Subject: NETWORK: Computation in Neural Systems
Message-ID: <15226.18143.81476.664143@gargle.gargle.HOWL>
Here is the contents page for the current issue of NETWORK:
Computation in Neural Systems. NETWORK publishes original research
work on theoretical and computational aspects of the development and
functioning of the nervous system, at all levels of analysis,
particularly at the network, cellular and subcellular levels.
Professor David Willshaw
Editor-in-Chief
NETWORK: Computation in Neural Systems
Institute for Adaptive & Neural Computation
Division of Informatics
University of Edinburgh
5 Forrest Hill
Edinburgh EH1 2QL
UK
Tel: +44-(0)131-650 4404
Fax: +44-(0)131-650 4406
Email: neted at anc.ed.ac.uk
========================================================================
NETWORK: COMPUTATION IN NEURAL SYSTEMS - VOLUME 12, ISSUE 3, AUGUST 2001
Special issue featuring selected papers from the Natural Stimulus
Statistics Workshop, October 2000, Cold Spring Harbor, USA
EDITORIALS
Publishing papers in Network: Special Issues
D J Willshaw (p 235)
Natural stimulus statistics
P Reinagel and S Laughlin (pp 237-240)
PAPERS
Redundancy reduction revisited
H Barlow (pp 241-253)
Characterizing the sparseness of neural codes
B Willmore and D J Tolhurst (pp 255-270)
Beats, kurtosis and visual coding
M G A Thomson (pp 271-287)
Estimating spatio-temporal receptive fields of auditory and visual
neurons from their responses to natural stimuli
F E Theunissen, S V David, N C Singh, A Hsu, W E Vinje and J L Gallant
(pp 289-316)
Neural coding of naturalistic motion stimuli
G D Lewen, W Bialek and R R de Ruyter van Steveninck (pp 317-329)
Nonlinear and extra-classical receptive field properties and the
statistics of natural scenes
C Zetzsche and F Röhrbein (pp 331-350)
Neuronal processing of behaviourally generated optic flow: experiments
and model simulations
R Kern, M Lutterklas, C Petereit, J P Lindemann and M Egelhaaf (pp 351-369)
Can recent innovations in harmonic analysis `explain' key findings in
natural image statistics?
D L Donoho and A G Flesia (pp 371-393)
Optimal nonlinear codes for the perception of natural colours
T von der Twer and D I A MacLeod (pp 395-407)
From colette.faucher at wanadoo.fr Thu Aug 16 05:40:32 2001
From: colette.faucher at wanadoo.fr (colette faucher)
Date: Thu, 16 Aug 2001 02:40:32 -0700
Subject: cfp for FLAIRS special track : Categorization and Concept
Representation : Models and Implications
Message-ID: <3b7b16df3cc49870@amyris.wanadoo.fr> (added by amyris.wanadoo.fr)
===========================================================================
FLAIRS 2002
15th International Florida Artificial Intelligence Research Society
Conference
Pensacola, Florida
May 16-18, 2002
Special Track : "Categorization and Concept Representation :
Models and Implications"
===========================================================================
This track seeks to bring together researchers working on issues related to
categorization and concept representation in the areas of Artificial
Intelligence and Cognitive Psychology.
Topic Description
------------------
Categorization is the process by which distinct entities are treated as
equivalent. It is one of the most fundamental and pervasive cognitive
activities. It is fundamental because categorization allows us to understand
and make predictions about objects and events in our world. The problem of
understanding what criteria are used to group entities into the same
category is indeed central to categorization. Though most work on this
topic has proposed that perceptual or structural similarity is the "glue"
that binds objects of the same category, some psychologists have claimed that
similarity is insufficient to account for the acquisition and use of
categories and have proposed more abstract forms of criteria that make
categories coherent and give them a kind of homogeneity in terms of the
entities that belong to them.
Among the new proposals psychologists have put forward are that objects
are grouped together because they serve a common goal or the same
function, and that some categories cohere because they rest on a
theory which explains the commonalities of their elements. Similarity and
goals, on the one hand, and theories, on the other, have not received the
same attention in computational models of categorization. Similarity-based
models abound, and the notion of categorization goals has also been exploited
in computational models. By contrast, the notion of an underlying
theory that makes a category coherent is only beginning to be analyzed and
specified. New computational models of categorization reflecting this
tendency are thus expected.
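To make the contrast concrete, the similarity-based family of models mentioned above can be sketched as a minimal prototype categorizer (a purely illustrative toy, not a model from the literature; the feature vectors and category names are hypothetical):

```python
import numpy as np

def learn_prototypes(X, labels):
    """Summarize each category by the mean (prototype) of its members."""
    return {c: X[labels == c].mean(axis=0) for c in np.unique(labels)}

def categorize(x, prototypes):
    """Assign x to the category whose prototype is most similar,
    with similarity taken as negative Euclidean distance."""
    return min(prototypes, key=lambda c: np.linalg.norm(x - prototypes[c]))

# Hypothetical feature vectors for two categories.
X = np.array([[1.0, 1.0], [1.2, 0.9], [5.0, 5.0], [4.8, 5.2]])
labels = np.array(["bird", "bird", "fish", "fish"])
protos = learn_prototypes(X, labels)
print(categorize(np.array([1.1, 1.0]), protos))   # prints: bird
```

A theory-based model, by contrast, would replace the distance computation with inference over background knowledge about why the members belong together, which is the kind of mechanism the track hopes to see formalized.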
The representation of concepts that a categorization system generates is of
course intimately tied to the criteria this system uses to group entities
into categories, so along with new models of categorization, we expect to
see the emergence of new models of concept representation apart from the
classical ones deriving from the Aristotelian, the Prototype, and the
Exemplar views. The representation of the entities to be categorized also
plays an important part in the categorization process. In particular, the context
in which the entities occur may influence the way they are classified.
The purpose of this track is to bring fresh insights concerning a perhaps
revisited notion of similarity, the way goals of categorization influence
this process, how the notion of the theory of a concept can be formalized
and implemented in computational models of categorization and the
implications those elements may have on the representation of concepts.
The contributions to this track may be situated in either the symbolic or
the connectionist approach to categorization.
Contributions in the following sub-topics are welcome:
- Computational models of similarity,
- Computational models of theory-based categorization,
- Computational models of similarity-based categorization,
- Computational models of human categorization,
- Models of concept representation relevant to the process of
categorization,
- Models of concept representation and elicitation,
- Formalization of the notion of theory which underlies a category,
- Formalization of the context of occurrence of the entities to categorize
and its influence on the categorization process.
This list is not exhaustive; other contributions are welcome provided they
are relevant to the track definition above.
Paper Review and Publication
------------------------------
Only full papers will be considered for the track. Submitted papers will be
reviewed by two program committee members. An author of each accepted paper
is expected to present it in the track. Papers accepted for the track
will be published in the FLAIRS 2002 Conference Proceedings.
The best papers will be invited for modification, extension and submission
to a special issue of an international AI journal.
Important dates
----------------
Paper Submission Deadline: November 15, 2001
Notification of Acceptance/Rejection: January 10, 2002
Camera-Ready Copy Due: March 4, 2002
Journal Invitation: February 10, 2002
Journal Paper Due: May 10, 2002
Conference Dates: May 16-18, 2002
Program Committee
------------------
David W. Aha, Navy Center for Applied Research in AI, Washington, USA
Ralph Bergmann, University of Kaiserslautern, Germany
Max Bramer, University of Portsmouth, UK
Colette Faucher (Chair), University of Aix-Marseille III, France
Paolo Frasconi, University of Florence, Italy
Robert L. Goldstone, Indiana University, USA
James Hampton, City University, London, UK
David Leake, Indiana University, USA
Bradley C. Love, University of Texas, USA
Paul Mc Kevitt, University of Ulster, Northern Ireland
Ryszard S. Michalski, George Mason University, USA
Philip Resnik, University of Maryland, USA
Lance J. Rips, Northwestern University, USA
Steven A. Sloman, Brown University, USA
Paper Submission Information
-----------------------------
Authors must submit an electronic copy of their complete manuscript of no
more than 5 pages. All submissions must be original work.
The review will be blind. Author names and affiliations are to appear ONLY
on a separate cover page. The presenter (if different from the first author)
must be specified on that cover page. All appropriate contact information
must be mentioned for each author (e-mail, phone, fax, etc.).
Papers must be submitted in MS Word, RTF, or PDF format, following AAAI's
standard format for authors.
All submissions must be sent in electronic form to:
colette.faucher at iuspim.u-3mrs.fr and colette.faucher at wanadoo.fr
For any problem or question, please contact the track chair, Colette
Faucher, at: colette.faucher at iuspim.u-3mrs.fr or
colette.faucher at wanadoo.fr.
Track Website
--------------
http://perso.wanadoo.fr/colette.faucher/categorization.html
FLAIRS 2002 Website
--------------------
http://altair.coginst.uwf.edu/~jkolen/Flairs2002/intro.php3
From wolfskil at MIT.EDU Fri Aug 17 13:17:26 2001
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Fri, 17 Aug 2001 13:17:26 -0400
Subject: book announcement--Leen
Message-ID: <5.0.2.1.2.20010817115831.00ae3e28@hesiod>
Hello,
I thought readers of the Connectionists List might be interested in this
book. For more information please visit
http://mitpress.mit.edu/catalog/item/default.asp?sid=59E8DBE7-4980-48C7-A87F-F0917571FB1E&ttype=2&tid=8662
Best,
Jud
Advances in Neural Information Processing Systems 13
edited by Todd K. Leen, Thomas G. Dietterich, and Volker Tresp
The annual conference on Neural Information Processing Systems (NIPS) is
the flagship conference on neural computation. The conference is
interdisciplinary, with contributions in algorithms, learning theory,
cognitive science, neuroscience, vision, speech and signal processing,
reinforcement learning and control, implementations, and diverse
applications. Only about 30 percent of the papers submitted are accepted
for presentation at NIPS, so the quality is exceptionally high. These
proceedings contain all of the papers that were presented at the 2000
conference.
Todd K. Leen is Professor of Computer Science and Engineering, and of
Electrical and Computer Engineering, at Oregon Graduate Institute of
Science and Technology. Thomas G. Dietterich is Professor of Computer
Science at Oregon State University. Volker Tresp heads a research group at
Siemens Corporate Technology in Munich.
7 x 10, 1100 pp., cloth ISBN 0-262-12241-3
Neural Information Processing series
A Bradford Book
Jud Wolfskill
Associate Publicist
MIT Press
5 Cambridge Center, 4th Floor
Cambridge, MA 02142
617.253.2079
617.253.1709 fax
wolfskil at mit.edu
From cindy at cns.bu.edu Fri Aug 17 10:04:17 2001
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Fri, 17 Aug 2001 10:04:17 -0400
Subject: Neural Networks 14(6/7): 2001 Special Issue
Message-ID: <200108171404.KAA06299@retina.bu.edu>
NEURAL NETWORKS 14(6/7)
Contents - Volume 14, Numbers 6/7 - 2001
2001 Special Issue
"Spiking Neurons in Neuroscience and Technology"
Stephen Grossberg, Wolfgang Maass, and Henry Markram, co-editors
------------------------------------------------------------------
Neural assemblies: Technical issues, analysis, and modeling
George L. Gerstein and Lyle L. Kirkland
Coding properties of spiking neurons:
Reverse and cross-correlations
Wulfram Gerstner
ON-OFF retinal ganglion cells temporally encode OFF/ON sequence
Hiroyuki Uchiyama, Koichi Goto, and Hiroyuki Matsunobu
Building blocks for electronic spiking neural networks
Andre van Schaik
Orientation-selective aVLSI spiking neurons
Shih-Chii Liu, Jorg Kramer, Giacomo Indiveri, Tobias Delbruck,
Thomas Burg, and Rodney Douglas
Space-rate coding in an adaptive silicon neuron
Kai Hynna and Kwabena Boahen
Propagation of cortical synfire activity:
Survival probability in single trials and stability in the mean
Marc-Oliver Gewaltig, Markus Diesmann, and Ad Aertsen
Fokker-Planck approach to the pulse packet propagation in
synfire chain
H. Cateau and T. Fukai
Connection topology dependence of synchronization of neural
assemblies on class 1 and 2 excitability
Luis F. Lago-Fernandez, Fernando J. Corbacho, and Ramon Huerta
Deterministic dynamics emerging from a cortical functional
architecture
Ralph M. Siegel and Heather L. Read
Spike-based strategies for rapid processing
Simon Thorpe, Arnaud Delorme, and Rufin van Rullen
Zero-lag synchronous dynamics in triplets of interconnected
cortical areas
D. Chawla, K.J. Friston, and E.D. Lumer
Neural timing nets
P.A. Cariani
Spike-based VLSI modeling of the ILD system in the echolocating bat
Timothy Horiuchi and Kai Hynna
Pattern separation and synchronization in spiking associative
memories and visual areas
Andreas Knoblauch and Gunther Palm
Probabilistic synaptic weighting in a reconfigurable network of
VLSI integrate-and-fire neurons
David H. Goldberg, Gert Cauwenberghs, and Andreas G. Andreou
Face identification using one spike per neuron:
Resistance to image degradations
A. Delorme and S.J. Thorpe
Temporal receptive fields, spikes, and Hebbian delay selection
Christian Leibold and J. Leo van Hemmen
Distributed synchrony in a cell assembly of spiking neurons
Nir Levy, David Horn, Isaac Meilijson, and Eytan Ruppin
Associative memory in networks of spiking neurons
Friedrich T. Sommer and Thomas Wennekers
Trajectory estimation from place cell data
Nanayaa Twum-Danso and Roger Brockett
A pulsed neural network model of bursting in the basal ganglia
Mark D. Humphries and Kevin N. Gurney
Regularization mechanisms of spiking-bursting neurons
P. Varona, J.J. Torres, R. Huerta, H.D.I. Abarbanel, and
M.I. Rabinovich
Optimal firing rate estimation
Michael G. Paulin and Larry F. Hoffman
Resonate-and-fire neurons
Eugene M. Izhikevich
Coherence resonance and discharge time reliability in neurons
and neuronal models
K. Pakdaman, Seiji Tanabe, and Tetsuya Shimokawa
Adaptation in single spiking neurons based on a noise shaping
neural coding hypothesis
Jonghan Shin
The double queue method:
A numerical method for integrate-and-fire neuron networks
Geehyuk Lee and Nabil H. Farhat
A spiking neural network architecture for nonlinear function
approximation
Nicolangelo Iannella and Andrew D. Back
From kenm at uwo.ca Sun Aug 19 16:02:44 2001
From: kenm at uwo.ca (Ken McRae)
Date: Sun, 19 Aug 2001 16:02:44 -0400
Subject: Postdoctoral Position
Message-ID:
Postdoctoral Fellowship in Psycholinguistics & Computational Modeling
I have funding for a two-year Postdoctoral Fellowship in my Cognitive
Science laboratory at the University of Western Ontario in London, Ontario,
Canada. The stipend is $35,000 per year plus $2,500 per year for conference
travel. There are no citizenship restrictions.
Our research focuses on the interrelated issues of noun meaning, verb
meaning, and sentence processing. Our research integrates theories and
methodologies from a number of areas, including: word recognition, semantic
memory, concepts and categorization, sentence processing, connectionist
modeling, and cognitive neuropsychology. Central to our research program is
connectionist modeling of the computation of noun and verb meaning, as well
as competition-integration modeling of on-line sentence reading time. Thus,
a postdoctoral fellow in my lab will have the opportunity to participate in
projects in a number of areas of Cognitive Science.
Our department has a number of Cognition faculty, all of whom conduct
research related to language processing. Thus, our faculty and graduate
students provide a rich research environment. I am also involved in a number
of collaborations with researchers from other universities. My lab is
well-equipped for both human experimentation and computational modeling. UWO
also has a 4T magnet that is used for research only.
London is a pleasant city of approximately 350,000, located a two-hour
drive from either Toronto or Detroit. Note that a reasonable one-bedroom
apartment in London costs approximately $500 per month.
For further information about our lab, and Cognition at UWO, see:
http://www.sscl.uwo.ca/psychology/cognitive/faculty.html
If you are interested in this position, please send a cv, a statement of
research interests, and 3 letters of reference to me at the address below.
Sending all information electronically is preferable. The start-date for
this position is flexible. If you would like more information about this
position, please contact me directly.
***********************************************************
Ken McRae
Associate Professor
Department of Psychology & Neuroscience Program
Social Science Centre
University of Western Ontario
London, Ontario CANADA N6A 5C2
email: mcrae at uwo.ca
http://www.sscl.uwo.ca/psychology/cognitive/mcrae/mcrae.html
phone: (519) 661-2111 ext. 84688 fax: (519) 661-3961
***********************************************************
From cohn+jmlr at cs.cmu.edu Mon Aug 20 14:01:50 2001
From: cohn+jmlr at cs.cmu.edu (JMLR)
Date: Mon, 20 Aug 2001 14:01:50 -0400
Subject: New paper in the Journal of Machine Learning Research: Bayes Point Machines
Message-ID:
The Journal of Machine Learning Research (www.jmlr.org) is pleased to
announce the availability of a new paper in electronic form.
----------------------------------------
Bayes Point Machines
Ralf Herbrich, Thore Graepel and Colin Campbell. Journal of Machine Learning
Research 1 (August 2001), pp. 245-279.
Abstract
Kernel-classifiers comprise a powerful class of non-linear decision
functions for binary classification. The support vector machine is an
example of a learning algorithm for kernel classifiers that singles out the
consistent classifier with the largest margin, i.e. minimal real-valued
output on the training sample, within the set of consistent hypotheses, the
so-called version space. We suggest the Bayes point machine as a
well-founded improvement which approximates the Bayes-optimal decision by
the centre of mass of version space. We present two algorithms to
stochastically approximate the centre of mass of version space: a billiard
sampling algorithm and a sampling algorithm based on the well known
perceptron algorithm. It is shown how both algorithms can be extended to
allow for soft-boundaries in order to admit training errors. Experimentally,
we find that - for the zero training error case - Bayes point machines
consistently outperform support vector machines on both surrogate data and
real-world benchmark data sets. In the soft-boundary/soft-margin case, the
improvement over support vector machines is shown to be reduced. Finally, we
demonstrate that the real-valued output of single Bayes points on novel test
points is a valid confidence measure and leads to a steady decrease in
generalisation error when used as a rejection criterion.
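The perceptron-based sampling idea described in the abstract can be sketched in a few lines: train many perceptrons on random orderings of the data and average the resulting consistent classifiers. This is a minimal illustration of the principle, not the authors' implementation; the toy data and function names are purely illustrative.

```python
import numpy as np

def train_perceptron(X, y, rng, max_epochs=100):
    """Run the perceptron on random orderings of the data; return a
    unit weight vector consistent with (X, y), i.e. a point in version space."""
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for i in rng.permutation(len(y)):
            if y[i] * (w @ X[i]) <= 0:      # misclassified (or on the boundary)
                w += y[i] * X[i]
                mistakes += 1
        if mistakes == 0:                   # consistent with all examples
            return w / np.linalg.norm(w)
    return None

def bayes_point(X, y, n_samples=50, seed=0):
    """Approximate the Bayes point as the centre of mass of version space,
    estimated by averaging many sampled consistent perceptrons."""
    rng = np.random.default_rng(seed)
    ws = [train_perceptron(X, y, rng) for _ in range(n_samples)]
    return np.mean([w for w in ws if w is not None], axis=0)

# Toy linearly separable problem (hypothetical data).
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w_bp = bayes_point(X, y)
assert (np.sign(X @ w_bp) == y).all()
```

Because the constraints y_i (w . x_i) > 0 defining version space are linear, any convex combination of consistent weight vectors is itself consistent, which is why the averaged classifier above still separates the training set.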
This paper and earlier papers in Volume 1 are available electronically at
http://www.jmlr.org in PostScript, PDF and HTML formats; a bound, hardcopy
edition of Volume 1 will be available later this year.
-David Cohn,
Managing Editor, Journal of Machine Learning Research
-------
This message has been sent to the mailing list "jmlr-announce at ai.mit.edu",
which is maintained automatically by majordomo. To subscribe to the list,
send mail to listserv at ai.mit.edu with the line "subscribe jmlr-announce" in
the body; to unsubscribe send email to listserv at ai.mit.edu with the line
"unsubscribe jmlr-announce" in the body.
From jf218 at hermes.cam.ac.uk Mon Aug 20 17:15:19 2001
From: jf218 at hermes.cam.ac.uk (Dr J. Feng)
Date: Mon, 20 Aug 2001 22:15:19 +0100 (BST)
Subject: five years post at cambridge
In-Reply-To: <200108171404.KAA06299@retina.bu.edu>
Message-ID:
The Babraham Institute, Cambridge
Computational Neuroscientist/Electrophysiologist (Ref. KK/CNE)
Applications are invited for a postdoctoral scientist to join a group of
systems neuroscientists within the Laboratory of Cognitive and
Developmental Neuroscience investigating how the brain encodes visual and
olfactory cues associated with recognition of both social and non-social
objects using novel multi-array electrophysiological recording techniques
in both rodent and sheep models. This post is available initially for 5
years. It would either suit an individual with primary expertise in
computational analysis and modelling of sensory system functioning or an
in vivo electrophysiologist with good expertise in computational analysis
of complex single-unit data. In both cases there would be significant
involvement in carrying out multi-array electrophysiological recording
experiments and subsequent data analysis and representation. The
individual would also be expected to work closely with
electrophysiologists both within the group and the USA and to co-ordinate
with other UK-based Computational Neuroscientists involved with the
projects. The group already has excellent computational facilities to deal
with the large amounts of data associated with multi-array recording
experiments.
Informal enquiries on these Neuroscience vacancies should be directed to
Dr. Keith Kendrick, Head of Neurobiology Programme: tel: 44(0) 1223
496385, fax. 44(0)1223 496028, e-mail keith.kendrick at bbsrc.ac.uk
Starting salary in the range £19,500 - £23,000 per annum. Benefits include
a non-contributory pension scheme, 25 days leave and 10½ public holidays a
year. On-site refectory, nursery, and sports & social club, as well as free
car parking.
Further details and an application form available from the Personnel
Office, The Babraham Institute, Babraham, Cambridge CB2 4AT. Tel. 01223
496000, e-mail babraham.personnel at bbsrc.ac.uk. The closing date for these
positions is 28th September 2001.
AN EQUAL OPPORTUNITIES EMPLOYER
An Institute supported by the Biotechnology and Biological Sciences
Research Council
Jianfeng Feng
The Babraham Institute
Cambridge CB2 4AT
UK
http://www.cosg.susx.ac.uk/users/jianfeng
http://www.cus.cam.ac.uk/~jf218
From wolfskil at MIT.EDU Mon Aug 20 10:25:54 2001
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Mon, 20 Aug 2001 10:25:54 -0400
Subject: book announcement--O'Reilly
Message-ID: <5.0.2.1.2.20010820102443.00a82000@hesiod>
I thought readers of the Connectionists List might be interested in this
book. For more information please visit
http://mitpress.mit.edu/catalog/item/default.asp?sid=16CDFF8A-3F4A-4FB5-B713-D8725D0A6969&ttype=2&tid=3345
Best,
Jud
Computational Explorations in Cognitive Neuroscience
Understanding the Mind by Simulating the Brain
Randall C. O'Reilly and Yuko Munakata
foreword by James L. McClelland
The goal of computational cognitive neuroscience is to understand how the
brain embodies the mind by using biologically based computational models
comprising networks of neuronlike units. This text, based on a course
taught by Randall O'Reilly and Yuko Munakata over the past several years,
provides an in-depth introduction to the main ideas in the field. The
neural units in the simulations use equations based directly on the ion
channels that govern the behavior of real neurons, and the neural networks
incorporate anatomical and physiological properties of the neocortex. Thus
the text provides the student with knowledge of the basic biology of the
brain as well as the computational skills needed to simulate large-scale
cognitive phenomena.
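As a rough illustration of the kind of point-neuron equation the description refers to, a conductance-based membrane update can be sketched as below. The constants, names, and parameter values are illustrative assumptions, not taken from the book or its software.

```python
def step_vm(vm, g_e, g_i, dt=0.1, g_l=0.1, e_e=1.0, e_l=0.3, e_i=0.25):
    """One Euler step of dVm/dt = sum_c g_c * (E_c - Vm): each channel's
    conductance pulls the membrane potential toward its reversal potential."""
    i_net = g_e * (e_e - vm) + g_l * (e_l - vm) + g_i * (e_i - vm)
    return vm + dt * i_net

vm = 0.3                        # start at rest (the leak reversal potential)
for _ in range(100):
    vm = step_vm(vm, g_e=0.4, g_i=0.1)
# vm settles at the conductance-weighted mean of the reversal potentials.
```

With constant inputs the potential converges to (g_e*E_e + g_l*E_l + g_i*E_i) / (g_e + g_l + g_i), so excitation and inhibition trade off as a weighted average rather than a simple sum.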
The text consists of two parts. The first part covers basic neural
computation mechanisms: individual neurons, neural networks, and learning
mechanisms. The second part covers large-scale brain area organization and
cognitive phenomena: perception and attention, memory, language, and
higher-level cognition. The second part is relatively self-contained and
can be used separately for mechanistically oriented cognitive neuroscience
courses. Integrated throughout the text are more than forty different
simulation models, many of them full-scale research-grade models, with
friendly interfaces and accompanying exercises. The simulation software
(PDP++, available for all major platforms) and simulations can be
downloaded free of charge from the Web. Exercise solutions are available,
and the text includes full information on the software.
Randall C. O'Reilly is Assistant Professor in the Department of Psychology
and at the Institute for Cognitive Science at the University of Colorado,
Boulder. Yuko Munakata is Assistant Professor in Developmental Cognitive
Neuroscience at the University of Denver.
8 x 9, 512 pp., 213 illus., paper ISBN 0-262-65054-1
A Bradford Book
Jud Wolfskill
Associate Publicist
MIT Press
5 Cambridge Center, 4th Floor
Cambridge, MA 02142
617.253.2079
617.253.1709 fax
wolfskil at mit.edu
From ps629 at columbia.edu Tue Aug 21 15:54:59 2001
From: ps629 at columbia.edu (Paul Sajda)
Date: Tue, 21 Aug 2001 15:54:59 -0400
Subject: Postdoctoral Position in Computational Neural Modeling
Message-ID: <3B82BC93.34337935@columbia.edu>
Postdoctoral Position in Computational Neural Modeling--a two year
position is available immediately for conducting research in modeling
of neural mechanisms for visual scene analysis, with particular
applications to spatio-temporal and hyperspectral imagery. A
mathematical and computational background is desired, particularly in
probabilistic modeling and optimization. This position will be part of
a multi-university research team (UPenn, Columbia and MIT)
investigating biomimetic methods for analysis of literal and
non-literal imagery through a combination of experimental physiology,
neuromorphic design and simulation, computational modeling and visual
psychophysics.
Applicants should send a CV, three representative papers and the names
of three references to Prof. Paul Sajda, Department of Biomedical
Engineering, Columbia University, 530 W 120th Street, NY, NY 10027. Or
email to ps629 at columbia.edu.
--
Paul Sajda, Ph.D.
Associate Professor
Department of Biomedical Engineering
530 W 120th Street
Columbia University
New York, NY 10027
tel: (212) 854-5279
fax: (212) 854-8725
email: ps629 at columbia.edu
http://www.columbia.edu/~ps629
From wolfskil at MIT.EDU Wed Aug 22 14:17:30 2001
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Wed, 22 Aug 2001 14:17:30 -0400
Subject: book announcement--Opper
Message-ID: <5.0.2.1.2.20010822140915.00b083c0@hesiod>
I thought readers of the Connectionists List might be interested in this
book. For more information please visit
http://mitpress.mit.edu/catalog/item/default.asp?sid=5CEC3656-296C-4C48-B6E3-6BDFAC7EBADD&ttype=2&tid=3847
Best,
Jud
Advanced Mean Field Methods
Theory and Practice
edited by Manfred Opper and David Saad
A major problem in modern probabilistic modeling is the huge computational
complexity involved in typical calculations with multivariate probability
distributions when the number of random variables is large. Because exact
computations are infeasible in such cases and Monte Carlo sampling
techniques may reach their limits, there is a need for methods that allow
for efficient approximate computations. One of the simplest approximations
is based on the mean field method, which has a long history in statistical
physics. The method is widely used, particularly in the growing field of
graphical models.
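The naive mean field idea can be made concrete with a small sketch (my own illustrative code, not drawn from the book): for a pairwise model over spins s_i in {-1,+1} with couplings J and fields h, the factorized approximation satisfies the self-consistency equations m_i = tanh(h_i + sum_j J_ij m_j), solvable by damped fixed-point iteration:

```python
import numpy as np

def naive_mean_field(J, h, n_iter=500, damping=0.5):
    """Iterate m_i = tanh(h_i + (J m)_i) to a fixed point, with
    damping for stability. J is a symmetric coupling matrix with
    zero diagonal; h is a vector of fields. Returns approximate
    magnetizations <s_i> for spins s_i in {-1, +1}."""
    m = np.zeros(len(h))
    for _ in range(n_iter):
        m = damping * m + (1 - damping) * np.tanh(h + J @ m)
    return m

# Example: two weakly coupled spins (values chosen for illustration).
J = np.array([[0.0, 0.2], [0.2, 0.0]])
h = np.array([0.5, -0.1])
m = naive_mean_field(J, h)
```

For weak couplings like this, the fixed point lies close to the exact marginals obtained by brute-force enumeration; the variational and TAP approaches mentioned below refine exactly this kind of factorized estimate.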
Researchers from disciplines such as statistical physics, computer science,
and mathematical statistics are studying ways to improve this and related
methods and are exploring novel application areas. Leading approaches
include the variational approach, which goes beyond factorizable
distributions to achieve systematic improvements; the TAP
(Thouless-Anderson-Palmer) approach, which incorporates correlations by
including effective reaction terms in the mean field theory; and the more
general methods of graphical models.
Bringing together ideas and techniques from these diverse disciplines, this
book covers the theoretical foundations of advanced mean field methods,
explores the relation between the different approaches, examines the
quality of the approximation obtained, and demonstrates their application
to various areas of probabilistic modeling.
Manfred Opper is a Reader and David Saad is Professor, the Neural Computing
Research Group, School of Engineering and Applied Science, Aston
University, UK.
7 x 10, 300 pp.
cloth ISBN 0-262-15054-9
Neural Information Processing series
Jud Wolfskill
Associate Publicist
MIT Press
5 Cambridge Center, 4th Floor
Cambridge, MA 02142
617.253.2079
617.253.1709 fax
wolfskil at mit.edu
From abrowne at lgu.ac.uk Thu Aug 23 07:50:50 2001
From: abrowne at lgu.ac.uk (Tony Browne)
Date: Thu, 23 Aug 2001 12:50:50 +0100 (GMT Daylight Time)
Subject: Connectionist Inference Preprint
Message-ID:
Apologies if you receive this posting more than once.
A preprint is available for download, of the paper
'Connectionist Inference Models' by Antony Browne and Ron
Sun (to appear in `Neural Networks'). 62 Pages, 155
References.
Abstract: The performance of symbolic inference tasks has
long been a challenge to connectionists. In this paper, we
present an extended survey of this area. Existing
connectionist inference systems are reviewed, with
particular reference to how they perform variable binding
and rule-based reasoning, and whether they involve
distributed or localist representations. The benefits and
disadvantages of different representations and systems are
outlined, and conclusions drawn regarding the capabilities
of connectionist inference systems when compared with
symbolic inference systems or when used for cognitive
modeling.
Keywords: Symbolic inference, resolution, variable binding,
localist representations, distributed representations.
Download Instructions: Go to
http://www.lgu.ac.uk/~abrowne/abrowne.htm and scroll down
to the section 'Downloadable Technical Reports and
Preprints'. Click on the file to download (in zipped
Postscript [190K] or Zipped PDF [228K] format).
Comments Welcome
If you have problems downloading, please e-mail me.
Tony Browne
=======================================================
Dr. Antony Browne abrowne at lgu.ac.uk
http://www.lgu.ac.uk/~abrowne/abrowne.htm
Reader in Intelligent Systems
School of Computing, Information Systems & Mathematics
London Guildhall University
100 Minories
London EC3 1JY, UK
Tel: (+44) 0207 320 1307
Fax: (+44) 0207 320 1717
=======================================================
From stefan.wermter at sunderland.ac.uk Thu Aug 23 13:02:13 2001
From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter)
Date: Thu, 23 Aug 2001 18:02:13 +0100
Subject: EmerNet book: Emergent Neural Computational Architectures
Message-ID: <3B853714.6E58E4CA@sunderland.ac.uk>
Emergent Neural Computational Architectures
based on Neuroscience
Stefan Wermter, Jim Austin, David Willshaw
2001, Springer, Heidelberg, 577p
For more detailed information, table of contents, abstracts
and chapters see:
http://www.his.sunderland.ac.uk/emernet/newbook.html
Summary:
This book is the result of a series of International
Workshops organised by the EmerNet project on
Emergent Neural Computational Architectures based
on Neuroscience sponsored by the Engineering and
Physical Sciences Research Council (EPSRC). The
overall aim of the book is to present a broad spectrum
of current research into biologically inspired
computational systems and hence encourage the
emergence of new computational approaches based
on neuroscience. It is generally understood that the
present approaches for computing do not have the
performance, flexibility and reliability of biological
information processing systems. Although there is a
massive body of knowledge regarding how processing
occurs in the brain and central nervous system, this has
had little impact on mainstream computing so far.
The process of developing biologically inspired
computerised systems involves the examination of the
functionality and architecture of the brain with an
emphasis on the information processing activities.
Biologically inspired computerised systems address
neural computation from the position of both
neuroscience and computing, using experimental
evidence to create general neuroscience-inspired
systems.
The book focuses on the main research areas of
modular organisation and robustness, timing and
synchronisation, and learning and memory storage.
The issues considered as part of these include: How
can the modularity in the brain be used to produce
large scale computational architectures? How does
the human memory manage to continue to operate
despite failure of its components? How does the brain
synchronise its processing? How does the brain
compute with relatively slow computing elements but
still achieve rapid and real-time performance? How
can we build computational models of these processes
and architectures? How can we design incremental
learning algorithms and dynamic memory
architectures? How can the natural information
processing systems be exploited for artificial
computational methods?
Emergent Neural Computational Architectures based on
Neuroscience can be ordered from Springer-Verlag using the
booking form and accessed on-line using the appropriate login
and password from Springer.
http://www.his.sunderland.ac.uk/emernet/newbook.html
http://www.springer.de/cgi-bin/search_book.pl?isbn=3-540-42363-X
--------------------------------------
***************************************
Professor Stefan Wermter
Chair for Intelligent Systems
University of Sunderland
Centre of Informatics, SCET
St Peters Way
Sunderland SR6 0DD
United Kingdom
phone: +44 191 515 3279
fax: +44 191 515 3553
email: stefan.wermter at sunderland.ac.uk
http://www.his.sunderland.ac.uk/~cs0stw/
http://www.his.sunderland.ac.uk/
****************************************
From rid at ecs.soton.ac.uk Fri Aug 24 05:56:32 2001
From: rid at ecs.soton.ac.uk (Bob Damper)
Date: Fri, 24 Aug 2001 10:56:32 +0100 (BST)
Subject: Source of a famous quotation ...
Message-ID:
Dear connectionists,
does anyone know the exact source of the famous quotation:
``neural networks are the second best way of solving every
problem'' ?
I'd be eternally grateful for an answer.
Bob.
***************************************************************
* R I Damper PhD *
* Reader and Head: *
* Image, Speech and Intelligent Systems (ISIS) *
* Research Group *
* Building 1 *
* Department of Electronics and Computer Science *
* University of Southampton *
* Southampton SO17 1BJ *
* England *
* *
* Tel: +44 (0) 23 8059 4577 (direct) *
* FAX: +44 (0) 23 8059 4498 *
* Email: rid at ecs.soton.ac.uk *
* WWW: http://www.ecs.soton.ac.uk/~rid *
* *
***************************************************************
From gabr-ci0 at wpmail.paisley.ac.uk Fri Aug 24 12:40:10 2001
From: gabr-ci0 at wpmail.paisley.ac.uk (Bogdan Gabrys)
Date: Fri, 24 Aug 2001 17:40:10 +0100
Subject: PhD studentship available
Message-ID:
PhD Studentship
Applied Computational Intelligence Research Unit (ACIRU)
School of Information and Communication Technologies,
University of Paisley, Scotland, UK
Applications are invited for a 3 year PhD research studentship which
can start from October 2001 and is jointly funded by the University of
Paisley (http://www.cis.paisley.ac.uk) and the Lufthansa Systems
Berlin GmbH (http://www.lsb.de).
The proposed research project will investigate and develop various
approaches for combining predictions (forecasts). There is a large
potential market for applications offering accurate and reliable
predictions ranging from stock market exchange to estimating the
demand for sales of goods and services. One such example, which will
be looked at in more detail in this project, is an accurate estimation
of the demand for various types of airplane tickets. Combination,
aggregation and fusion of information are major problems for all kinds
of knowledge-based systems, from image processing to decision making,
from pattern recognition to automatic learning. Various machine
learning and hybrid intelligent techniques will be used for processing
and modelling of imperfect data and information utilizing the
methodologies like probability, fuzzy, evidence and possibility
theories.
The student will be joining an enthusiastic and vibrant research group
and will be primarily based in the ACIRU in Paisley (near Glasgow),
Scotland but two extended visits to the Lufthansa Systems Berlin site
in Berlin, Germany are planned in the second and third year of the
project.
The studentship carries a remuneration of £7,500 tax-free (increased
to £8,000 and £9,000 in the second and third years respectively), with
tuition fees paid at the Home/EU rate. The stipend may be
augmented by a limited amount of teaching.
Applicants should have a strong mathematical background and hold a
first or upper second class honours degree or equivalent in
mathematics, physics, engineering, statistics, computer science or a
similar discipline. Additionally, the candidate should have strong
programming experience in any one, or a combination, of C, C++,
Matlab, or Java. Knowledge of ORACLE would be an advantage.
For further details please contact Dr. Bogdan Gabrys, e-mail:
gabr-ci0 at paisley.ac.uk.
Interested candidates should send a detailed CV and a letter of
application with the names and addresses of two referees to:
Dr. Bogdan Gabrys, School of Information and Communication
Technologies, Div. of Computing and Information Systems, University of
Paisley, Paisley PA1 2BE, Scotland, UK. The applications can be also
sent by e-mail.
******************************************************
Dr Bogdan Gabrys
Applied Computational Intelligence Research Unit
Division of Computing and Information Systems
University of Paisley
High Street, Paisley PA1 2BE
Scotland, United Kingdom
Tel: +44 (0) 141 848 3752
Fax: +44 (0) 141 848 3542
E-mail: gabr-ci0 at paisley.ac.uk
******************************************************
Legal disclaimer
--------------------------
The information transmitted is the property of the University of
Paisley and is intended only for the person or entity to which it is
addressed and may contain confidential and/or privileged material.
Statements and opinions expressed in this e-mail may not represent
those of the company. Any review, retransmission, dissemination and
other use of, or taking of any action in reliance upon, this
information by persons or entities other than the intended recipient
is prohibited. If you received this in error, please contact the
sender immediately and delete the material from any computer.
--------------------------
From cindy at cns.bu.edu Fri Aug 24 16:34:07 2001
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Fri, 24 Aug 2001 16:34:07 -0400
Subject: Call for Papers: 6th ICCNS
Message-ID: <200108242034.QAA17997@retina.bu.edu>
Apologies if you receive this more than once.
***** CALL FOR PAPERS *****
SIXTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS
Tutorials: May 29, 2002
Meeting: May 30 - June 1, 2002
Boston University
677 Beacon Street
Boston, Massachusetts 02215
http://www.cns.bu.edu/meetings/
Sponsored by Boston University's
Center for Adaptive Systems
and
Department of Cognitive and Neural Systems
with financial support from the
National Science Foundation
and the
Office of Naval Research
This interdisciplinary conference has drawn about 300 people from around
the world each time that it has been offered. Last year's conference was
attended by scientists from 31 countries. The conference is structured to
facilitate intense communication between its participants, both in the
formal sessions and during its other activities. As during previous years,
the conference will focus on solutions to the fundamental questions:
How Does the Brain Control Behavior?
How Can Technology Emulate Biological Intelligence?
The conference will include invited tutorials and lectures, and
contributed lectures and posters by experts on the biology and
technology of how the brain and other intelligent systems adapt to
a changing world. The conference is aimed at researchers and students
of computational neuroscience, connectionist cognitive science,
artificial neural networks, neuromorphic engineering, and artificial
intelligence.
A single oral or poster session enables all presented work to be
highly visible.
The abstract-based submission format encourages reporting of the latest results.
Costs are kept at a minimum without compromising the quality of
meeting handouts and social events.
CALL FOR ABSTRACTS
Session Topics:
* vision * spatial mapping and navigation
* object recognition * neural circuit models
* image understanding * neural system models
* audition * mathematics of neural systems
* speech and language * robotics
* unsupervised learning * hybrid systems (fuzzy, evolutionary, digital)
* supervised learning * neuromorphic VLSI
* reinforcement and emotion * industrial applications
* sensory-motor control * cognition, planning, and attention
* other
Contributed abstracts must be received, in English, by January 31,
2002. Notification of acceptance will be provided by email by February
28, 2002. A meeting registration fee must accompany each Abstract. See
Registration Information below for details. The fee will be returned if
the Abstract is not accepted for presentation and publication in the
meeting proceedings. Registration fees of accepted Abstracts will be
returned on request only until April 19, 2002.
Each Abstract should fit on one 8.5" x 11" white page with 1" margins
on all sides, single-column format, single-spaced, Times Roman or
similar font of 10 points or larger, printed on one side of the page
only. Fax submissions will not be accepted. Abstract title, author
name(s), affiliation(s), mailing, and email address(es) should begin
each Abstract. An accompanying cover letter should include: Full title
of Abstract; corresponding author and presenting author name, address,
telephone, fax, and email address; requested preference for oral or
poster presentation; and a first and second choice from the topics
above, including whether it is biological (B) or technological (T)
work. Example: first choice: vision (T); second choice: neural system
models (B). (Talks will be 15 minutes long. Posters will be up for a
full day. Overhead, slide, VCR, and LCD projector facilities will be
available for talks.) Abstracts which do not meet these requirements
or which are submitted with insufficient funds will be returned. Accepted
Abstracts will be printed in the conference proceedings volume. No
full-length paper will be required. The original and 3 copies of each
Abstract should
be sent to: Cynthia Bradford, Boston University, Department of Cognitive
and Neural Systems, 677 Beacon Street, Boston, MA 02215.
REGISTRATION INFORMATION: Early registration is recommended. To
register, please fill out the registration form below. Student
registrations must be accompanied by a letter of verification from a
department chairperson or faculty/research advisor. If accompanied by
an Abstract or if paying by check, mail to the address above. If
paying by credit card, mail as above, or fax to (617) 353-7755, or
email to cindy at cns.bu.edu. The registration fee will help to pay for a
reception, 6 coffee breaks, and the meeting proceedings.
STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral
fellows are available to help cover meeting travel and living costs. The
deadline to apply for fellowship support is January 31, 2002. Applicants
will be notified by email by February 28, 2002. Each application should
include the applicant's CV, including name; mailing address; email
address; current student status; faculty or PhD research advisor's name,
address, and email address; relevant courses and other educational data;
and a list of research articles. A letter from the listed faculty or PhD
advisor on official institutional stationery should accompany the
application and summarize how the candidate may benefit from the meeting.
Fellowship applicants who also submit an Abstract need to include the
registration fee with their Abstract submission. Those who are awarded
fellowships are required to register for and attend both the conference
and the day of tutorials. Fellowship checks will be distributed after
the meeting.
REGISTRATION FORM
Sixth International Conference on Cognitive and Neural Systems
Department of Cognitive and Neural Systems
Boston University
677 Beacon Street
Boston, Massachusetts 02215
Tutorials: May 29, 2002
Meeting: May 30 - June 1, 2002
FAX: (617) 353-7755
http://www.cns.bu.edu/meetings/
(Please Type or Print)
Mr/Ms/Dr/Prof: _____________________________________________________
Name: ______________________________________________________________
Affiliation: _______________________________________________________
Address: ___________________________________________________________
City, State, Postal Code: __________________________________________
Phone and Fax: _____________________________________________________
Email: _____________________________________________________________
The conference registration fee includes the meeting program,
reception, two coffee breaks each day, and meeting proceedings.
The tutorial registration fee includes tutorial notes and two
coffee breaks.
CHECK ONE:
( ) $85 Conference plus Tutorial (Regular)
( ) $55 Conference plus Tutorial (Student)
( ) $60 Conference Only (Regular)
( ) $40 Conference Only (Student)
( ) $25 Tutorial Only (Regular)
( ) $15 Tutorial Only (Student)
METHOD OF PAYMENT (please fax or mail):
[ ] Enclosed is a check made payable to "Boston University".
Checks must be made payable in US dollars and issued by
a US correspondent bank. Each registrant is responsible
for any and all bank charges.
[ ] I wish to pay my fees by credit card
(MasterCard, Visa, or Discover Card only).
Name as it appears on the card: _____________________________________
Type of card: _______________________________________________________
Account number: _____________________________________________________
Expiration date: ____________________________________________________
Signature: __________________________________________________________
From jzhu at stanford.edu Fri Aug 24 12:35:24 2001
From: jzhu at stanford.edu (Ji Zhu)
Date: Fri, 24 Aug 2001 09:35:24 -0700 (PDT)
Subject: No subject
Message-ID:
Dear all,
This is a repost of our paper "Kernel Logistic Regression and the
Import Vector Machine".
We want to apologize that we missed several important references in
our previous draft. The revised version is available at
http://www.stanford.edu/~jzhu/research/nips01.ps
Thank you!
Best regards,
-Ji Zhu
From skremer at q.cis.uoguelph.ca Mon Aug 27 16:29:04 2001
From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer)
Date: Mon, 27 Aug 2001 16:29:04 -0400 (EDT)
Subject: Announce: New Unlabeled Data Competition and Workshop
Message-ID:
Apologies if you receive multiple copies of this mailing.
ANNOUNCEMENT: The Second Annual NIPS Unlabeled Data Competition and Workshop
It's time to put up or shut up!
Synopsis:
We are pleased to announce the NIPS*2001 Unlabeled Data Competition and
Workshop, to be held in Whistler, British Columbia, Canada, Dec 7 or 8, 2001.
This competition is a challenge to the machine learning community to develop
and demonstrate methods to use unlabeled data to improve supervised
learning. We have created a web-site where participants can download
and submit problem sets and compete head to head with other contestants
in a series of challenging unlabeled-data, supervised-learning problems.
Recently, there has been much interest in applying techniques that
incorporate knowledge from unlabeled data into systems performing
supervised learning. The potential advantages of such techniques are
obvious in domains where labeled data is expensive and unlabeled data
is cheap. Many such techniques have been proposed, but only recently has
any effort been made to compare the effectiveness of different approaches
on real world problems.
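One of the simplest techniques of this kind is self-training: pseudo-label the unlabeled points the current classifier is most confident about, then refit. A minimal sketch under illustrative assumptions (a nearest-centroid base classifier and synthetic two-cluster data; nothing here reflects the contest's actual problems or protocol):

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, n_rounds=5, k_add=10):
    """Minimal binary self-training sketch: fit a nearest-centroid
    classifier on labeled data, then repeatedly pseudo-label the
    unlabeled points with the largest margin between the two
    centroid distances and refit. Returns the final centroids."""
    X, y = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    classes = np.unique(y_lab)  # assumed to be exactly two classes
    for _ in range(n_rounds):
        if len(pool) == 0:
            break
        centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
        d = np.linalg.norm(pool[:, None, :] - centroids[None], axis=2)
        pred = classes[d.argmin(axis=1)]
        margin = np.abs(d[:, 0] - d[:, 1])
        take = np.argsort(-margin)[:k_add]  # most confident points
        X = np.vstack([X, pool[take]])
        y = np.concatenate([y, pred[take]])
        pool = np.delete(pool, take, axis=0)
    return np.stack([X[y == c].mean(axis=0) for c in classes])

# Synthetic demo: two Gaussian clusters, only 2 labeled points each.
rng = np.random.default_rng(0)
X_a = rng.normal([-2.0, 0.0], 1.0, size=(60, 2))
X_b = rng.normal([2.0, 0.0], 1.0, size=(60, 2))
X_lab = np.vstack([X_a[:2], X_b[:2]])
y_lab = np.array([0, 0, 1, 1])
X_unlab = np.vstack([X_a[2:], X_b[2:]])
centroids = self_train(X_lab, y_lab, X_unlab)
```

Whether such schemes genuinely help, or merely reinforce the initial classifier's biases, is precisely the kind of question the contest below is designed to test empirically.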
Our contest presents a challenge to the proponents of methods to
incorporate unlabeled data into supervised learning. Can you really
use unlabeled data to help train a supervised classification (or
regression) system? Do recent (and not so recent) theories stand up to
the data test?
On the contest web-site you can find challenge problems where you can try
out your methods head-to-head against anyone brave enough to face
you. Then, at the end of the contest we will release the results and find
out who really knows something about using unlabeled data, and whether
unlabeled data are really useful or we are all just wasting our time. So
ask yourself, are you (and your theory) up to the challenge?? Feeling
lucky???
For more details on the competition or the workshop and to sign up for
the Unlabeled Data Mailing List, please visit our
web-page at "http://q.cis.uoguelph.ca/~skremer/NIPS2001/".
Stefan
--
Dr. Stefan C. Kremer, Assistant Prof.,
Dept. of Computing and Information Science
University of Guelph, Guelph, Ontario N1G 2W1
WWW: http://hebb.cis.uoguelph.ca/~skremer
Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323
E-mail: skremer at snowhite.cis.uoguelph.ca
From bbs at bbsonline.org Tue Aug 28 16:55:50 2001
From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor))
Date: Tue, 28 Aug 2001 16:55:50 -0400
Subject: BBS Call for Commentators--Preston & De Waal: Empathy: Its ultimate and proximate bases
Message-ID:
Dear Dr. Connectionists List User,
Below is the abstract of a forthcoming BBS target article
Empathy: Its ultimate and proximate bases
by
Stephanie D. Preston & Frans B. M. de Waal
http://www.bbsonline.org/Preprints/Preston/
or
http://www.bbsonline.org/Preprints/Preston/Preston.pdf
This article has been accepted for publication in Behavioral and Brain
Sciences (BBS), an international, interdisciplinary journal providing
Open Peer Commentary on important and controversial current research in
the biobehavioral and cognitive sciences.
Commentators must be BBS Associates or nominated by a BBS Associate. To
be considered as a commentator for this article, to suggest other
appropriate commentators, or for information about how to become a BBS
Associate, please reply by EMAIL within three (3) weeks to:
calls at bbsonline.org
The Calls are sent to 10,000 BBS Associates, so there is no expectation
(indeed, it would be calamitous) that each recipient should comment
on every occasion! Hence there is no need to reply except if you wish
to comment, or to nominate someone to comment.
If you are not a BBS Associate, please approach a current BBS
Associate (there are currently over 10,000 worldwide) who is familiar
with your work to nominate you. All past BBS authors, referees and
commentators are eligible to become BBS Associates. A full electronic
list of current BBS Associates is available at this location to help
you select a name:
http://www.bbsonline.org/Instructions/assoclist.html
If no current BBS Associate knows your work, please send us your
Curriculum Vitae and BBS will circulate it to appropriate Associates to
ask whether they would be prepared to nominate you. (In the meantime,
your name, address and email address will be entered into our database
as an unaffiliated investigator.)
To help us put together a balanced list of commentators, please give
some indication of the aspects of the topic on which you would bring
your areas of expertise to bear if you were selected as a commentator.
To help you decide whether you would be an appropriate commentator for
this article, an electronic draft is retrievable from the online
BBSPrints Archive, at the URL that follows the abstract below.
_____________________________________________________________
Empathy: Its ultimate and proximate bases
Stephanie D. Preston
Department of Psychology
3210 Tolman Hall #1650
University of California at Berkeley
Berkeley, CA 94720-1650
USA
spreston at socrates.berkeley.edu
http://socrates.berkeley.edu/~spreston
Frans B. M. de Waal
Living Links,
Yerkes Primate Center and Psychology Department,
Emory University,
Atlanta, GA 30322
USA
dewaal at rmy.emory.edu
http://www.emory.edu/LIVING_LINKS/
KEYWORDS:
altruism; cognitive empathy; comparative; emotion;
emotional contagion; empathy; evolution; human; perception-action;
perspective taking;
ABSTRACT:
There is disagreement in the literature about the exact nature of the
phenomenon of empathy. There are emotional, cognitive, and conditioning
views, applying in varying degrees across species. An adequate description
of the ultimate and proximate mechanism can integrate these views.
Proximately, the perception of an object's state activates the subject's
corresponding representations, which in turn activate somatic and
autonomic responses. This mechanism supports basic behaviors (e.g., alarm,
social facilitation, vicariousness of emotions, mother-infant
responsiveness, and the modeling of competitors and predators) that are
crucial for the reproductive success of animals living in groups. The
"Perception-Action Model" (PAM) together with an understanding of how
representations change with experience can explain the major empirical
effects in the literature (similarity, familiarity, past experience,
explicit teaching and salience). It can also predict a variety of empathy
disorders. The interaction between the PAM and prefrontal functioning can
also explain different levels of empathy across species and age groups.
This view can advance our evolutionary understanding of empathy beyond
inclusive fitness and reciprocal altruism and can explain different levels
of empathy across individuals, species, stages of development, and
situations.
http://www.bbsonline.org/Preprints/Preston/
or
http://www.bbsonline.org/Preprints/Preston/Preston.pdf
___________________________________________________________
Please do not prepare a commentary yet. Just let us know, after having
inspected it, what relevant expertise you feel you would bring to bear
on what aspect of the article. We will then let you know whether it was
possible to include your name on the final formal list of invitees.
_______________________________________________________________________
*** SUPPLEMENTARY ANNOUNCEMENTS ***
(1) The authors of scientific articles are not paid money for their
refereed research papers; they give them away. What they want is to
reach all interested researchers worldwide, so as to maximize the
potential research impact of their findings.
Subscription/Site-License/Pay-Per-View costs are accordingly
access-barriers, and hence impact-barriers for this give-away
research literature.
There is now a way to free the entire refereed journal literature,
for everyone, everywhere, immediately, by mounting interoperable
university eprint archives, and self-archiving all refereed research
papers in them.
Please see: http://www.eprints.org
http://www.openarchives.org/
http://www.dlib.org/dlib/december99/12harnad.html
---------------------------------------------------------------------
(2) All authors in the biobehavioral and cognitive sciences are
strongly encouraged to self-archive all their papers in their own
institution's Eprint Archives or in CogPrints, the Eprint Archive
for the biobehavioral and cognitive sciences:
http://cogprints.soton.ac.uk/
It is extremely simple to self-archive and will make all of our
papers available to all of us everywhere, at no cost to anyone,
forever.
Authors of BBS papers wishing to archive their already published
BBS Target Articles should submit it to BBSPrints Archive.
Information about the archiving of BBS' entire backcatalogue will
be sent to you in the near future. Meantime please see:
http://www.bbsonline.org/help/
and
http://www.bbsonline.org/Instructions/
---------------------------------------------------------------------
(3) Call for Book Nominations for BBS Multiple Book Review
In the past, Behavioral and Brain Sciences (BBS) had only been able
to do 1-2 BBS multiple book treatments per year, because of our
limited annual page quota. BBS's new expanded page quota will make
it possible for us to increase the number of books we treat per
year, so this is an excellent time for BBS Associates and
biobehavioral/cognitive scientists in general to nominate books they
would like to see accorded BBS multiple book review.
(Authors may self-nominate, but books can only be selected on the
basis of multiple nominations.) It would be very helpful if you
indicated in what way a BBS Multiple Book Review of the book(s) you
nominate would be useful to the field (and of course a rich list of
potential reviewers would be the best evidence of its potential
impact!).
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Please note: Your email address has been added to our user database for
Calls for Commentators; that is why you received this email. If you do
not wish to receive further Calls, please feel free to change your
mailshot status through your User Login link on the BBSPrints homepage,
using your username and password above:
http://www.bbsonline.org/
For information about the mailshot, please see the help file at:
http://www.bbsonline.org/help/node5.html#mailshot
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
From brody at cshl.org Tue Aug 28 18:35:49 2001
From: brody at cshl.org (Carlos Brody)
Date: Tue, 28 Aug 2001 18:35:49 -0400 (EDT)
Subject: Postdoctoral positions in computational neuroscience
Message-ID: <15244.7365.990066.563845@sonnabend.cshl.org>
-- PLEASE POST --
POSTDOCTORAL OPPORTUNITIES IN COMPUTATIONAL NEUROSCIENCE
Postdoctoral positions for computational neuroscientists and
psychophysicists are available in Carlos Brody's research group at
Cold Spring Harbor Laboratory. (see
http://www.cns.caltech.edu/~carlos/temporary/Lab). Applicants should
have an interest in quantitative approaches to neuroscience, and
should have, or be near completing, a Ph.D. in Neuroscience,
Experimental Psychology, or in a quantitative field (e.g. Physics,
Math, Engineering).
Successful applicants will be expected, after appropriate guidance
and/or any necessary self-education, to lead the group's research
efforts in one or more of the projects listed below. For more
information on each of these projects, visit the lab's web page. In
addition, those who wish to develop and pursue their own, independent,
self-originated, line(s) of research will be very much encouraged to
do so: the lab seeks an atmosphere of vigorous discussion and creative
independence. Applications from self-guided, motivated, and
independent-minded scientists are particularly welcome.
Applicants should send a CV, the names of three references, and a
summary of research interests and experience to: Carlos Brody, 1
Bungtown Road, Freeman Building, Cold Spring Harbor, NY 11724,
USA. The positions are open immediately; salaries are on the NIH pay
scale.
----------
Lab interest areas (in order of descending current emphasis in the lab):
1) Psychophysics and neurocomputational modeling of working memory.
2) Encoding and representation of time.
3) Computation with spiking neurons.
4) Automated mapping of complex receptive fields.
From engp9286 at nus.edu.sg Wed Aug 29 03:39:47 2001
From: engp9286 at nus.edu.sg (Duan Kaibo)
Date: Wed, 29 Aug 2001 15:39:47 +0800
Subject: a technical report
Message-ID: <9C4C56CDF89E0440A6BD571E76D2387FB7559B@exs23.ex.nus.edu.sg>
Dear Connectionists:
We have recently completed a technical report that evaluates some simple
performance measures for tuning hyperparameters of Support Vector Machines.
A pdf file containing this report can be downloaded from:
http://guppy.mpe.nus.edu.sg/~mpessk/comparison.shtml
Here are the details of the report...
__________________________________________________________________
Title: Evaluation of Simple Performance Measures for Tuning SVM
Hyperparameters
Authors:
Kaibo Duan (engp9286 at nus.edu.sg)
S. Sathiya Keerthi (mpessk at nus.edu.sg)
Aun Neow Poo (mpepooan at nus.edu.sg)
Abstract:
Choosing optimal hyperparameter values for support vector machines is an
important step in SVM design. This is usually done by minimizing either an
estimate of generalization error or some other related performance measure.
In this paper, we empirically study the usefulness of several simple
performance measures that are inexpensive to compute (in the sense that they
do not require expensive matrix operations involving the kernel matrix). The
results point out which of these measures are adequate functionals for
tuning SVM hyperparameters. For SVMs with L1 soft margin formulation, none
of the simple measures yields a performance uniformly as good as k-fold
cross validation; Joachims' Xi-Alpha bound and Wahba et al.'s GACV come next
and perform reasonably well. For SVMs with L2 soft margin formulation, the
radius margin bound gives a very good prediction of optimal hyperparameter
values.
__________________________________________________________________
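The k-fold cross-validation baseline that the report compares against can be sketched as follows. This is a minimal illustration, not the authors' code: a simple nearest-class-mean classifier stands in for an SVM trained at candidate hyperparameter values, and the toy data set is invented for the example.

```python
import numpy as np

def k_fold_indices(n, k, rng):
    """Shuffle indices 0..n-1 and split them into k roughly equal folds."""
    idx = rng.permutation(n)
    return np.array_split(idx, k)

def cv_error(X, y, k, train_and_predict, seed=0):
    """Estimate generalization error by k-fold cross validation:
    average the held-out misclassification rate over the k folds."""
    rng = np.random.default_rng(seed)
    folds = k_fold_indices(len(X), k, rng)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = train_and_predict(X[train], y[train], X[test])
        errs.append(np.mean(pred != y[test]))
    return float(np.mean(errs))

# Stand-in learner: nearest class mean. An SVM trained with the candidate
# hyperparameter values would be plugged in here instead.
def nearest_mean(Xtr, ytr, Xte):
    means = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
    classes = sorted(means)
    d = np.stack([np.linalg.norm(Xte - means[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

# Toy, well-separated two-class data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
err = cv_error(X, y, k=5, train_and_predict=nearest_mean)
print(err)
```

In hyperparameter tuning one would evaluate `cv_error` on a grid of candidate values (e.g. of C and the kernel width) and keep the minimizer; the simpler measures studied in the report aim to approximate this functional at far lower cost.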
We would be interested to hear about the comparative performance of the
measures that we have considered on other data sets that we haven't tried.
Best regards,
Kaibo
From mike at stats.gla.ac.uk Wed Aug 29 10:17:12 2001
From: mike at stats.gla.ac.uk (Mike Titterington)
Date: Wed, 29 Aug 2001 15:17:12 +0100 (BST)
Subject: Postdoctoral post in Glasgow
Message-ID:
(Re-advertisement)
UNIVERSITY OF GLASGOW
DEPARTMENT OF STATISTICS
POSTDOCTORAL RESEARCH ASSISTANT
Applications are invited for a Postdoctoral Research Assistantship
(IA) post in the Department of Statistics, University of Glasgow, to
work with Professor D.M. Titterington for a period of up to 3 years,
starting as soon as possible. The post is funded by the UK Engineering
and Physical Sciences Research Council.
The research topic is 'Approximate Approaches to Likelihood and
Bayesian Statistical Inference in Incomplete-data problems'.
Applications, supported by full curriculum vitae and the names of
three referees, should be sent, to arrive no later than September 21, 2001,
to Professor D. M. Titterington, Department of Statistics, University of
Glasgow, Glasgow G12 8QQ, Scotland, from whom further particulars are
available. Informal enquiries by electronic mail (mike at stats.gla.ac.uk)
are welcomed.
From juergen at idsia.ch Thu Aug 30 10:57:29 2001
From: juergen at idsia.ch (juergen@idsia.ch)
Date: Thu, 30 Aug 2001 16:57:29 +0200
Subject: metalearner
Message-ID: <200108301457.QAA01240@ruebe.idsia.ch>
I would like to draw your attention to Sepp Hochreiter's astonishing
recent result on "learning to learn."
He trains gradient-based "Long Short-Term Memory" (LSTM) recurrent
networks with roughly 5000 weights to _metalearn_ fast online learning
algorithms for nontrivial classes of functions, such as all quadratic
functions of two variables. LSTM is necessary because metalearning
typically involves huge time lags between important events, and standard
gradient-based recurrent nets cannot deal with these. After a month
of metalearning on a PC he freezes all weights, then uses the frozen
net as follows: He selects some new function f, and feeds a sequence of
random training exemplars of the form ...data/target/data/target/data...
into the input units, one sequence element at a time. After about 30
exemplars the frozen recurrent net correctly predicts target inputs before
it sees them. No weight changes! How is this possible? After metalearning
the frozen net implements a sequential learning algorithm which apparently
computes something like error signals from data inputs and target inputs
and translates them into changes of internal estimates of f. Parameters
of f, errors, temporary variables, counters, computations of f and of
parameter updates are all somehow represented in the form of circulating
activations. Remarkably, the new - and quite opaque - online learning
algorithm running on the frozen network is much faster than standard
backprop with optimal learning rate. This indicates that one can use
gradient descent to metalearn learning algorithms that outperform gradient
descent. Furthermore, the metalearning procedure automatically avoids
overfitting in a principled way, since it punishes overfitting online
learners just like it punishes slow ones, simply because overfitters
and slow learners cause more cumulative errors during metalearning.
Hochreiter himself admits the paper is not well-written. But the results
are quite amazing: http://www.cs.colorado.edu/~hochreit
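The input protocol described above can be sketched as follows. This is an illustrative construction, not the paper's exact encoding: one metalearning sequence for a single task (a random quadratic function of two variables), where each sequence element carries the current input together with the previous step's target, so the network must predict each target before it arrives.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_quadratic(rng):
    """Sample f(x) = x^T A x + b^T x + c with random coefficients."""
    A = rng.normal(size=(2, 2))
    b = rng.normal(size=2)
    c = rng.normal()
    return lambda x: float(x @ A @ x + b @ x + c)

f = random_quadratic(rng)          # one new task the frozen net has never seen
n_exemplars = 30                   # roughly the number cited above
xs = rng.uniform(-1, 1, size=(n_exemplars, 2))
ys = np.array([f(x) for x in xs])

# Sequence element t: (x_t[0], x_t[1], y_{t-1}); the first target slot is
# zero-padded. Rows are fed to the recurrent net one per time step, so the
# target for x_t appears only on the *next* step, after the prediction.
prev_y = np.concatenate([[0.0], ys[:-1]])
seq = np.column_stack([xs, prev_y])
print(seq.shape)
```

Under this scheme all "learning" on the new task happens in the frozen net's internal activations as the rows stream in; no weight is ever updated at test time.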
@inproceedings{Hochreiter:01meta,
  author    = "S. Hochreiter and A. S. Younger and P. R. Conwell",
  title     = "Learning to learn using gradient descent",
  booktitle = "Lecture Notes in Computer Science 2130,
               Proc. Intl. Conf. on Artificial Neural Networks (ICANN-2001)",
  editor    = "G. Dorffner and H. Bischof and K. Hornik",
  publisher = "Springer: Berlin, Heidelberg",
  pages     = "87-94",
  year      = "2001"}
-------------------------------------------------
Juergen Schmidhuber director
IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland
juergen at idsia.ch www.idsia.ch/~juergen