From avi at eecs.harvard.edu Tue Jul 1 11:04:38 2003
From: avi at eecs.harvard.edu (Avi Pfeffer)
Date: Tue, 01 Jul 2003 11:04:38 -0400
Subject: Announcing IBAL release
Message-ID: <3F01A306.5020503@eecs.harvard.edu>
Readers of this list may be interested in the following announcement:
I am pleased to announce the initial release of IBAL, a general purpose
language for probabilistic reasoning. IBAL is highly expressive, and
its inference algorithm generalizes many common frameworks as well as
allowing many new ones. It also provides parameter estimation and
decision making. All this is packaged in a programming language that
provides libraries, automatic type checking, etc.
IBAL may be downloaded from http://www.eecs.harvard.edu/~avi/IBAL.
Avi Pfeffer
From Thomas.Wennekers at neuroinformatik.ruhr-uni-bochum.de Wed Jul 2 09:36:37 2003
From: Thomas.Wennekers at neuroinformatik.ruhr-uni-bochum.de (Thomas Wennekers)
Date: Wed, 2 Jul 2003 15:36:37 +0200 (MEST)
Subject: Special issue on "Cell Assemblies"
Message-ID: <200307021336.h62DabQg005005@fsnif.neuroinformatik.ruhr-uni-bochum.de>
Dear all,
The following collection of papers recently appeared as a special issue
on "Cell Assemblies" in Theory in Biosciences.
Preprint versions of the papers are available at
http://www.neuroinformatik.rub.de/thbio/publications/specialissue/cellassemblies.html
Final versions should be available from the authors.
Best wishes,
Thomas
Thomas Wennekers, Friedrich T. Sommer, and Ad Aertsen
Editorial: Cell Assemblies
Theory in Biosciences 122 (2003) 1-4.
Thomas Wennekers and Nihat Ay
Spatial and Temporal Stochastic Interaction in Neuronal Assemblies
Theory in Biosciences 122 (2003) 5-18.
The observation of various types of spatio-temporal correlations
in spike patterns of multiple cortical neurons has shifted attention
from rate coding paradigms to computational processes based on
the precise timing of spikes in neuronal ensembles. In the present
work we develop the notion of "spatial" and "temporal interaction"
which provides measures for statistical dependences in coupled
stochastic processes like multiple unit spike trains. We show that
the classical Willshaw network and Abeles' synfire chain model both
reveal a moderate spatial interaction, but only the synfire chain
model reveals a positive temporal interaction, too. Systems that
maximize temporal interaction are shown to be almost deterministic
globally, but possess almost unpredictable firing behavior at the
single-unit level.
Anders Lansner, Erik Fransén, and Anders Sandberg
Cell assembly dynamics in detailed and abstract
attractor models of cortical associative memory
Theory in Biosciences 122 (2003) 19-36.
During the last few decades we have seen a
convergence among ideas and hypotheses regarding functional
principles underlying human memory. Hebb's conjecture concerning
synaptic plasticity and cell assemblies, now more than fifty years
old and formalized mathematically as attractor neural networks,
has remained among the most viable and productive theoretical
frameworks. It suggests plausible explanations for
Gestalt aspects of active memory like perceptual completion,
reconstruction and rivalry.
We review the biological plausibility of these theories and
discuss some critical issues concerning their associative
memory functionality in the light of simulation studies of
models with palimpsest memory properties. The focus is on
memory properties and dynamics of networks modularized in
terms of cortical minicolumns and hypercolumns. Biophysical
compartmental models demonstrate attractor dynamics that
support cell assembly operations with fast convergence and low
firing rates. Using a scaling model we obtain reasonable
relative connection densities and amplitudes. An abstract
attractor network model reproduces systems-level psychological
phenomena seen in human memory experiments, such as the Sternberg
and von Restorff effects.
We conclude that there is today considerable substance in
Hebb's theory of cell assemblies and its attractor network
formulations, and that they have contributed to increasing our
understanding of cortical associative memory function.
The criticism raised with regard to biological and
psychological plausibility, as well as to low storage capacity,
slow retrieval, etc., has largely been rebutted. Rather, this
paradigm has gained further support from new experimental data
as well as from computational modeling.
Andreas Knoblauch and Günther Palm
Synchronization of Neuronal Assemblies in
Reciprocally Connected Cortical Areas
Theory in Biosciences 122 (2003) 37-54.
To investigate scene segmentation in the visual system we
present a model of two reciprocally connected visual areas
comprising spiking neurons. The peripheral area P is modeled
similar to the primary visual cortex, while the central
area C is modeled as an associative memory representing
stimulus objects according to Hebbian learning. Without
feedback from area C, spikes corresponding to stimulus
representations in P are synchronized only locally (slow
state). Feedback from C can induce fast oscillations and
an increase of synchronization ranges (fast state).
Presenting a superposition of several stimulus objects,
scene segmentation happens on a time scale of hundreds of
milliseconds by alternating epochs of the slow and fast
state, where neurons representing the same object are
simultaneously in the fast state. We relate our simulation
results to various phenomena observed in neurophysiological
experiments, such as stimulus-dependent synchronization of
fast oscillations, synchronization on different time scales,
ongoing activity, and attention-dependent neural activity.
Friedrich T. Sommer and Thomas Wennekers
Models of distributed associative memory networks in the brain
Theory in Biosciences 122 (2003) 55-69.
Although experimental evidence for distributed cell assemblies
is growing, theories of cell assemblies are still marginalized
in theoretical neuroscience. We argue that this has to do with
shortcomings of the currently best understood assembly theories,
the ones based on formal associative memory models. These only
insufficiently reflect anatomical and physiological properties
of nervous tissue and their functionality is too restricted to
provide a framework for cognitive modeling. We describe cell
assembly models that integrate more neurobiological constraints
and review results from simulations of a simple nonlocal
associative network formed by a reciprocal topographic
projection. Impacts of nonlocal associative projections in the
brain are discussed with respect to the functionality they can
explain.
Hualou Liang and Hongbin Wang
Top-Down Anticipatory Control in Prefrontal Cortex
Theory in Biosciences 122 (2003) 70-86.
The prefrontal cortex has been implicated in a wide variety
of executive functions, many involving some form of
anticipatory attention. Anticipatory attention involves
the pre-selection of specific sensory circuits to allow
fast and efficient stimulus processing and a subsequently
fast and accurate response. It is generally agreed that the
prefrontal cortex plays a critical role in anticipatory
attention by exerting a facilitatory "top-down" bias on
sensory pathways. In this paper we review recent results
indicating that synchronized activity in prefrontal cortex,
during anticipation of a visual stimulus, can predict features
of early visual stimulus processing and of the behavioral response.
Although the mechanisms involved in anticipatory attention
are still largely unknown, we argue that synchronized
oscillation in prefrontal cortex is a plausible candidate
mechanism for sustained visual anticipation. We further propose
a learning hypothesis that explains how this top-down anticipatory
control in prefrontal cortex is learned from accumulated
prior experience by adopting a Temporal Difference learning
algorithm.
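The proposed Temporal Difference mechanism can be illustrated with a minimal tabular TD(0) sketch in Python. The cue/delay/stimulus states, reward, and all parameter values below are illustrative assumptions, not taken from the paper; the point is only that repeated experience propagates predictive value backwards to the cue, modeling an anticipatory bias that precedes the stimulus:

```python
# Minimal tabular TD(0) sketch of learned anticipation.
# Fixed episode: cue -> delay -> stimulus, reward 1.0 on reaching stimulus.
# Update rule: V[s] += alpha * (r + gamma * V[s'] - V[s])

def td0(episode, reward, alpha=0.1, gamma=0.9, n_runs=500):
    V = {s: 0.0 for s in episode}
    for _ in range(n_runs):
        for s, s_next in zip(episode, episode[1:]):
            r = reward.get(s_next, 0.0)
            V[s] += alpha * (r + gamma * V[s_next] - V[s])
    return V

V = td0(["cue", "delay", "stimulus"], reward={"stimulus": 1.0})
# V["delay"] converges to 1.0 and V["cue"] to gamma * 1.0 = 0.9:
# the cue state acquires predictive (anticipatory) value before the stimulus.
```

The anticipatory signal appears at the cue purely from accumulated prior experience, which is the learning hypothesis in miniature.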
Friedemann Pulvermüller
Sequence detectors as a basis of grammar in the brain
Theory in Biosciences 122 (2003) 87-104.
Grammar processing may build upon serial-order mechanisms
known from non-human species. A circuit similar to that
underlying direction-sensitive movement detection in arthropods
and vertebrates may become selective for sequences of words,
thus yielding grammatical sequence detectors in the human
brain. Sensitivity to the order of neuronal events arises from
unequal connection strengths between two input units and a
third element, the sequence detector. This mechanism, which
critically depends on the dynamics of the input units, can
operate at the single neuron level and may be relevant at the
level of neuronal ensembles as well. Due to the repeated
occurrence of sequences, for example word strings, the
sequence-sensitive elements become more firmly established
and, by substitution of elements between strings, a process
called auto-associative substitution learning (AASL) is
triggered. AASL links the neuronal counterparts of the
string elements involved in the substitution process to the
sequence detector, thereby providing a brain basis of what can
be described linguistically as the generalization of rules of
grammar. A network of sequence detectors may constitute
grammar circuits in the human cortex on which a separate set
of mechanisms establishing temporary binding and recursion
can operate.
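The order sensitivity described above can be sketched as a toy model: one input leaves a slowly decaying trace, the other projects directly, and only their coincidence crosses the detector's threshold. This is an illustrative sketch, not the author's implementation; the weights, decay rate, and threshold are all hypothetical:

```python
def detector_fired(seq, w_trace=1.0, w_direct=1.0, decay=0.5, theta=1.2):
    """Order-sensitive detector for 'A before B'.

    Input A leaves a decaying memory trace; input B projects directly.
    The detector crosses threshold only when B arrives while A's trace
    is still strong, so 'A then B' fires but 'B then A' does not.
    """
    trace_a, fired = 0.0, False
    for sym in seq:
        b_now = 1.0 if sym == "B" else 0.0
        if w_trace * trace_a + w_direct * b_now >= theta:
            fired = True
        # A resets its trace; otherwise the trace decays each step.
        trace_a = 1.0 if sym == "A" else trace_a * decay
    return fired
```

The same unit tolerates a short gap between A and B but loses sensitivity once the trace has decayed too far, mirroring the dependence on input-unit dynamics noted above.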
____________________________________________________________________________
Jr.Prof.Dr.Thomas Wennekers
Theoretical Neuroscience Group
Institute for Neuroinformatics
Ruhr-Universitaet Bochum
Universitaetsstrasse 150
ND 04/589a
44780 Bochum
Phone: +49-234-3224231
Fax: +49-234-3214209
Priv.: +49-160-6123416
Email: Thomas.Wennekers at neuroinformatik.rub.de
____________________________________________________________________________
From eurich at physik.uni-bremen.de Wed Jul 2 11:33:40 2003
From: eurich at physik.uni-bremen.de (Christian Eurich)
Date: Wed, 02 Jul 2003 17:33:40 +0200
Subject: PhD position in Theoretical Neuroscience
Message-ID: <3F02FB54.3C538E71@physik.uni-bremen.de>
Dear Connectionists,
a PhD position is available in the Institute for Theoretical
Neurophysics at the University of Bremen for a project on
human sensorimotor control loops.
In cooperation with experimental groups, dynamical models of action and
perception will be developed to investigate, for example, human postural
sway and the task of balancing sticks on the fingertip. Typical methods
we employ in our institute include dynamical systems theory, neural
networks, and statistical estimation theory. Our homepage is
http://www.neuro.uni-bremen.de/index.php
The University of Bremen has several institutions in the field of
Neuroscience, including a Center for Cognitive Neuroscience and a
Special Research Project "Neurocognition". There are several theoretical
and experimental groups in the Physics, Biology, and Psychology departments
working on neural network modeling, psychophysics, and electrophysiology. The
Hanse Institute for Advanced Study in Delmenhorst (which is close to
Bremen) hosts international guests from the area of Neuroscience and
Cognitive Science and also organizes Neuroscience workshops and
conferences.
Closing date is July 25, 2003. For further information and applications,
please contact Dr. Christian Eurich during the upcoming CNS conference
in Alicante or at
Universitaet Bremen
Institut fuer Theoretische Neurophysik, FB 1
Postfach 330 440
D-28334 Bremen, Germany
Phone: +49 (421) 218-4559
Fax: +49 (421) 218-9104
e-mail: eurich at physik.uni-bremen.de
From Luc.Berthouze at aist.go.jp Wed Jul 2 05:30:27 2003
From: Luc.Berthouze at aist.go.jp (Luc Berthouze)
Date: Wed, 2 Jul 2003 11:30:27 +0200
Subject: postdoc position in developmental robotics and motor learning
Message-ID: <20030630075928.1B61613B65C@aidan6.a02.aist.go.jp>
The Cognitive Neuroinformatics group in the Neuroscience Research Institute at
the Japanese National Institute of Industrial Science and Technology, Tsukuba
(Japan) is seeking an outstanding postdoctoral researcher to join our lab for
two years starting between April 2004 and September 2004. Candidates should
have a solid background in robotics and computational modeling, and a keen
interest in developmental robotics and embodied cognition.
The postdoc will be expected to contribute to our study of the acquisition of
motor skills in human infants. Our approach is interdisciplinary. On the one
hand, we exploit studies in developmental psychology to propose candidate
mechanisms; and, on the other hand, we use robots to test and validate those
hypotheses. The purpose of this approach is two-fold: (a) to contribute to the
understanding of the mechanisms underlying motor development in infants; (b)
to propose new methods for robot learning. Experience in using neural
oscillators to implement motor control models, and a good understanding of the
so-called "dynamical systems approach" will be highly appreciated.
For more information, please consult our lab's website at:
http://www.neurosci.aist.go.jp/~mechwa
The Neuroscience Research Institute, and more generally, AIST, provides an
excellent environment for interdisciplinary research, with groups engaged in
research in biology, neuroscience, psychology, cognitive science, and
robotics. See http://www.aist.go.jp (AIST website) and
http://www.neurosci.aist.go.jp (Neuroscience Research Institute) for more
information.
Candidates should contact Luc Berthouze with a CV and a one-page statement of
research interests. If electronic submission is not possible, fax your
application to +81-298-615841, directed to the attention of Luc Berthouze.
From P.Tino at cs.bham.ac.uk Thu Jul 3 11:27:30 2003
From: P.Tino at cs.bham.ac.uk (Peter Tino)
Date: Thu, 03 Jul 2003 16:27:30 +0100
Subject: papers on architectural bias of RNNs
Message-ID: <3F044B62.9060906@cs.bham.ac.uk>
Dear Connectionists,
a collection of papers
dealing with theoretical and practical aspects of
recurrent neural networks before and in the early stages of
training is available on-line.
Preprints can be found at
http://www.cs.bham.ac.uk/~pxt/my.publ.html
B. Hammer, P. Tino:
Recurrent neural networks with small weights implement definite memory
machines.
Neural Computation, accepted, 2003.
- Proves that recurrent networks are architecturally biased
towards definite memory machines/Markov models. Also contains
a rigorous learnability analysis of recurrent nets in the
early stages of learning.
P. Tino, M. Cernansky, L. Benuskova:
Markovian architectural bias of recurrent neural networks.
IEEE Transactions on Neural Networks, accepted, 2003.
- Mostly empirical study of the architectural bias phenomenon
in the context of connectionist modeling of symbolic sequences.
It is possible to extract (variable memory length) Markov models
from recurrent networks even prior to any training!
To assess the amount of useful information extracted during the training,
the networks should be compared with variable memory length Markov
models.
P. Tino, B. Hammer:
Architectural Bias in Recurrent Neural Networks - Fractal Analysis.
Neural Computation, accepted, 2003.
- Rigorous fractal analysis of recurrent activations in
recurrent networks in the early stages of
learning. The complexity of input patterns (topological entropy)
is directly reflected by the complexity of recurrent activations
(fractal dimension).
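The bias phenomenon can be seen in a few lines of code: with small random weights the untrained state-transition map is a contraction, so hidden states of sequences sharing a recent suffix end up close together, exactly the clustering a variable memory length Markov model induces. The sketch below uses arbitrary dimensions and weight scales (not the authors' experimental setup):

```python
import math
import random

random.seed(0)
H = 8  # hidden units (arbitrary)
# Small random weights: every |w| < 0.05, so each row's absolute sum is
# below 8 * 0.05 = 0.4 < 1 and the untrained transition map is a contraction.
W = [[random.uniform(-0.05, 0.05) for _ in range(H)] for _ in range(H)]
Win = {s: [random.uniform(-0.05, 0.05) for _ in range(H)] for s in "ab"}

def rnn_state(seq):
    h = [0.0] * H
    for sym in seq:
        h = [math.tanh(sum(W[i][j] * h[j] for j in range(H)) + Win[sym][i])
             for i in range(H)]
    return h

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Same 8-symbol suffix, completely different prefixes: states nearly coincide.
shared = dist(rnn_state("aaaaaaaa" + "abababab"),
              rnn_state("bbbbbbbb" + "abababab"))
# Identical prefix, different final symbol: states stay clearly apart.
differ = dist(rnn_state("a" * 16), rnn_state("a" * 15 + "b"))
# shared is far below differ, so a readout of the untrained state already
# behaves like a predictor conditioned on the recent input suffix.
```

This is why Markov models of variable memory length are the appropriate null model when assessing what training actually adds.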
Best wishes,
Peter Tino
--
Peter Tino
The University of Birmingham
School of Computer Science
Edgbaston, Birmingham B15 2TT, UK
+44 121 414 8558, fax: 414 4281
http://www.cs.bham.ac.uk/~pxt/
From gary at cs.ucsd.edu Thu Jul 3 13:31:50 2003
From: gary at cs.ucsd.edu (Gary Cottrell)
Date: Thu, 3 Jul 2003 10:31:50 -0700 (PDT)
Subject: papers on architectural bias of RNNs
Message-ID: <200307031731.h63HVoO29370@fast.ucsd.edu>
Folks interested in Peter's papers on RNNs and definite memory
machines may also be interested in our papers on TDNNs and
definite memory machines:
Clouse, Daniel S., Giles, Lee C., Horne, Bill G. and
Cottrell, G. W. (1997) Time-delay neural networks:
Representation and induction of finite state machines. IEEE
Transactions on Neural Networks.
This work attempts to characterize the capabilities of
time-delay neural networks (TDNN), and contrast two
subclasses of TDNN in the area of language induction. The
two subclasses are those with delays limited to the inputs
(IDNN), and those which also include delays on hidden units
(HDNN). Both of these architectures are capable of
representing the same languages, those representable by
definite memory machines (DMM), a subclass of finite state
machines (FSM). They have a strong representational bias
towards DMMs which can be characterized by little logic. We
demonstrate this by learning a 2048 state DMM using very few
training examples. Even though both architectures are
capable of representing the same class of languages, HDNNs
are biased towards learning languages which are
characterized by shift-invariant behavior on short input
windows in the mapping from recent inputs to the
accept/reject classification. We demonstrate this
difference in learning bias via a set of simulations and
statistical analysis.
http://www.neci.nec.com/%7Egiles/papers/IEEE.TNN.tdnn.as.fsm.ps.Z
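A definite memory machine is simply a finite state machine whose state is the window of its last k input symbols, so its behavior can be sketched as a sliding-window lookup; an IDNN computes exactly this kind of map, a feedforward function of a fixed input window. The window size and acceptance predicate below are made-up examples:

```python
from collections import deque

def run_dmm(seq, k, accept):
    """Definite memory machine: the output at each step depends only on
    the window of the last k input symbols (the machine's entire state)."""
    window = deque(maxlen=k)
    out = []
    for sym in seq:
        window.append(sym)
        out.append(accept(tuple(window)))
    return out

# Example: accept exactly when the last two symbols are ('a', 'b').
acc = lambda w: w == ("a", "b")
print(run_dmm("aabab", 2, acc))  # [False, False, True, False, True]
```

Any such machine with window size k has at most |alphabet|**k states (2048 = 2**11 in the paper's experiment corresponds to an 11-symbol binary window).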
Clouse, Daniel S., Giles, Lee C., Horne, Bill G. and
Cottrell, G. W. (1997) Representation and induction of
finite state machines using time-delay neural networks. In
Michael C. Mozer, Michael I. Jordan, and Thomas Petsche
(eds.) Advances in Neural Information Processing Systems 9,
pp. 403-409. MIT Press: Cambridge, MA, 1997.
(Similar abstract!)
http://nips.djvuzone.org/djvu/nips09/0403.djvu
From r.gayler at mbox.com.au Thu Jul 3 19:26:23 2003
From: r.gayler at mbox.com.au (Ross Gayler)
Date: Fri, 04 Jul 2003 09:26:23 +1000
Subject: Response to Jackendoff's challenges -- notice of conference
presentation and availability of paper
Message-ID: <001001c341ba$8517fc50$2402a8c0@Chennai>
The linguist, Ray Jackendoff, proposed four challenges to cognitive
neuroscience in his book "Foundations of Language". Each challenge
corresponds to an element of core linguistic functionality which Jackendoff
sees as being poorly addressed by current connectionist models.
On August 5, 2002, Jerome Feldman broadcast these challenges to the
Connectionists mailing list under the subject "Neural binding". After
receiving several responses, Feldman concluded on August 21 that "it isn't
obvious (at least to me) how to use any of the standard techniques to
specify a model that meets Jackendoff's criteria".
I have prepared a paper setting out how I believe one family of
connectionist architectures can meet Jackendoff's challenges. This will be
presented at the Joint International Conference on Cognitive Science to be
held in Sydney, Australia from 13 - 17 July, 2003
(http://www.arts.unsw.edu.au/cogsci2003/). If you will be attending the
conference and wish to hear the presentation, note that it is currently
scheduled for 1 p.m. on Thursday the 17th in the Language stream
(http://www.arts.unsw.edu.au/cogsci2003/conf_content/program_thurs_pm.htm).
An extended abstract is included below and anyone who wishes a preprint copy
of the paper (which is very condensed to fit the conference format) should
e-mail me at r.gayler at mbox.com.au
Ross Gayler
Melbourne, AUSTRALIA
r.gayler at mbox.com.au
+61 413 111 303 mobile
Vector Symbolic Architectures answer Jackendoff's challenges for cognitive
neuroscience.
Ross Gayler
Vector Symbolic Architectures (Gayler, 1998; Kanerva, 1997; Plate, 1994) are
a little-known class of connectionist models that can directly implement
functions usually taken to form the kernel of symbolic processing. They are
an enhancement of tensor product variable binding networks (Smolensky,
1990).
Like tensor product networks, VSAs can create and manipulate
recursively-structured representations in a natural and direct connectionist
fashion without requiring lengthy training. However, unlike tensor product
networks, VSAs afford a practical basis for implementations because they
require only fixed-dimension vector representations. The fact that VSAs
relate directly, without training, to both simple, practical vector
implementations and core symbolic processing functionality suggests that
they would provide a fruitful connectionist basis for the implementation of
cognitive functionality.
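As a concrete illustration of fixed-dimension binding without training, here is a small sketch in the style of Gayler's multiplicative binding over random +/-1 vectors: binding by elementwise multiplication (which is self-inverse), superposition by majority sign, and querying by unbinding with a role vector. The dimension, role/filler names, and similarity thresholds are all illustrative assumptions:

```python
import random

random.seed(1)
D = 1024  # vector dimension (illustrative)

def rand_vec():
    return [random.choice([-1, 1]) for _ in range(D)]

def bind(x, y):            # elementwise multiply; self-inverse
    return [a * b for a, b in zip(x, y)]

def bundle(*vs):           # superposition by majority sign
    return [1 if sum(c) >= 0 else -1 for c in zip(*vs)]

def sim(x, y):             # normalized dot product
    return sum(a * b for a, b in zip(x, y)) / len(x)

agent, patient = rand_vec(), rand_vec()
mary, john = rand_vec(), rand_vec()

# One fixed-dimension vector holding the whole role/filler structure.
sentence = bundle(bind(agent, mary), bind(patient, john))

# Query "who is the agent?" by unbinding with the role vector.
probe = bind(sentence, agent)
# probe is noticeably similar to mary and nearly orthogonal to john.
```

Note that the composite `sentence` has the same dimension as its components, which is what lets structures nest recursively without the dimensional blow-up of tensor products.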
Ray Jackendoff (2002) posed four challenges that linguistic combinatoriality
and rules of language present to theories of brain function. These
challenges are: the massiveness of the binding problem, the problem of
dealing with multiple instances, the problem of variables, and the
compatibility of representations in working memory and long-term memory.
The essence of these problems is the question of how to neurally instantiate
the rapid construction and transformation of the compositional structures
that are typically taken to be the domain of symbolic processing.
Drawing on work by Gary Marcus (2001), Jackendoff contended that these
challenges had not been widely recognised in the cognitive neuroscience
community and that the dialogue between linguistic theory and neural network
modelling would be relatively unproductive until the challenges were
answered by some technical innovation in connectionist models. Jerome
Feldman (2002) broadcast these challenges to the neural network modelling
community via the Connectionists Mailing List. The few responses he
received were unable to convince Feldman that any standard connectionist
techniques would meet Jackendoff's criteria.
In this paper I demonstrate that Vector Symbolic Architectures are able to
meet Jackendoff's challenges.
References
Feldman, J. (2002). Neural binding. Posted to Connectionists Mailing List,
5th August, 2002.
(http://www-2.cs.cmu.edu/afs/cs.cmu.edu/project/connect/connect-archives/arch.2002-08.gz, 0005.txt; see also 8, 9, 18, and 21)
Gayler, R. W. (1998). Multiplicative binding, representation operators, and
analogy. In K. Holyoak, D. Gentner & B. Kokinov (Eds.), Advances in analogy
research: Integration of theory and data from the cognitive, computational,
and neural sciences (p. 405). Sofia, Bulgaria: New Bulgarian University.
(http://cogprints.ecs.soton.ac.uk/archive/00000502/ see also 500 and 501)
Jackendoff, R. (2002). Foundations of language: Brain, meaning, grammar,
evolution. Oxford: Oxford University Press.
Kanerva, P. (1997). Fully distributed representation. In Proceedings Real
World Computing Symposium (RWC'97, Tokyo). Report TR-96001 (pp. 358-365).
Tsukuba-city, Japan: Real World Computing Partnership.
(http://www.rni.org/kanerva/rwc97.ps.gz see also
http://www.rni.org/kanerva/pubs.html)
Marcus, G. (2001). The algebraic mind. Cambridge, MA, USA: MIT Press.
Plate, T. A. (1994). Distributed representations and nested compositional
structure. Ph.D. thesis, Department of Computer Science, University of
Toronto.
(http://pws.prserv.net/tap/papers/plate.thesis.ps.gz see also
http://pws.prserv.net/tap/)
Smolensky, P. (1990). Tensor product variable binding and the representation
of symbolic structures in connectionist systems. Artificial Intelligence,
46, 159-216.
From j.hogan at qut.edu.au Fri Jul 4 03:44:11 2003
From: j.hogan at qut.edu.au (James Michael Hogan)
Date: Fri, 04 Jul 2003 17:44:11 +1000
Subject: Symposium on Statistical Learning
Message-ID: <200307040744.AKA05918@mail-router02.qut.edu.au>
An upcoming workshop in Sydney - organised by people from
UNSW- jh
S E C O N D A N N O U N C E M E N T
\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\
Australian Mathematical Sciences Institute symposium on
///////////////////////////////////////////////////////
:::::::::::::::::::::::::::::::
.. ..
.. STATISTICAL LEARNING ..
.. ..
:::::::::::::::::::::::::::::::
University of New South Wales
Sydney, Australia
2nd-3rd October, 2003
==========================================================
The symposium is now just 3 months away. Here are some
updates:
* The new and improved web-site is
www.maths.unsw.edu.au/~inge/symp
and information about the symposium
is continuously being added. The
latest addition is a tentative
programme.
* HOTEL BOOKING ALERT!!!!
The symposium takes place just one week
before the 2003 Rugby World Cup commences
in Sydney. Therefore you are advised to book
accommodation as soon as possible. Accommodation
suggestions have been added to the web-site.
Flights may also be affected by the World
Cup.
* Early bird special (late bird penalty).
We recommend you register as soon as possible,
but not later than 2nd September when the lower
rates expire. Full registration procedures are
now on the web.
* Speaker addition/subtraction
Drs. Markus Hegland and Alex Smola
have been added to the invited speaker
list. Professor Geoff McLachlan has
had to withdraw.
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
Inge Koch and Matt Wand
Department of Statistics
University of New South Wales
Sydney 2052, Australia
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
----- End Forwarded Message -----
From rporter at lanl.gov Mon Jul 7 14:46:41 2003
From: rporter at lanl.gov (Reid Porter)
Date: Mon, 07 Jul 2003 12:46:41 -0600
Subject: Postdoctoral position in digital neural networks
Message-ID: <5.0.0.25.2.20030707121230.02d91d78@nis-pop.lanl.gov>
Postdoctoral position available in digital neural networks
---------------------------------------------------------------------------------
The Space Data System Group (NIS-3) of Los Alamos National Laboratory seeks
outstanding candidates for a postdoctoral research position in the areas of
pattern recognition, mathematical morphology and reconfigurable computing.
The candidate will help develop high performance feature extraction and
classification algorithms to be deployed in reconfigurable computing
hardware. Applications include remotely sensed satellite imagery, unmanned
aerial vehicle video and other spatio-temporal data sets.
The main focus will be to develop novel cellular non-linear networks
suitable for digital hardware by building on mathematical morphology and
non-linear digital filter theory. The position will require research
(typically mathematics), algorithm prototyping (typically Matlab),
software development and implementation (typically in C/C++), targeting
eventual deployment with reconfigurable computing (typically in VHDL).
Required Skills: Prospective candidates should have good oral and written
communication skills, and a demonstrated ability to perform independent and
creative research. We are most interested in candidates with research
interests and experience in the following areas:
- Cellular non-linear networks and spatio-temporal processing
- Mathematical morphology and non-linear digital filters
- Machine learning, artificial intelligence and optimization
- Image, video and signal processing.
- Digital design, reconfigurable computing
Education: A PhD completed within the past five years or soon to be
completed is required.
Post-doc starting salaries are usually in the range $59,300 - $67,300
depending on experience, and generous assistance is provided with
relocation expenses. The initial contract offered would be for two years,
with good possibilities for contract extensions. Candidates may be
considered for a Director's Fellowship and outstanding candidates may be
considered for the prestigious J. Robert Oppenheimer, Richard P. Feynman or
Frederick Reines Fellowships. Please see
http://www.hr.lanl.gov/jps/regjobsearch.stm, job number #205519, for more
information.
Los Alamos is a small and very friendly town situated 7200 ft up in the
scenic Jemez mountains in northern New Mexico. The climate is very pleasant
and opportunities for outdoor recreation are numerous (skiing, hiking,
biking, climbing, etc). The Los Alamos public school system is excellent.
LANL provides a very constructive working environment with abundant
resources and support, and the opportunity to work with intelligent and
creative people on a variety of interesting projects.
Applicants are asked to send a resume, publications list and a cover letter
outlining current research interests to rporter at lanl.gov. Hard copies may
be sent to: Reid Porter, NIS-3, MS D440, Los Alamos National Lab, New
Mexico 87545, USA.
From becker at mcmaster.ca Wed Jul 9 23:05:47 2003
From: becker at mcmaster.ca (S. Becker)
Date: Wed, 9 Jul 2003 23:05:47 -0400 (EDT)
Subject: NIPS 2003 Survey
Message-ID:
To Connectionists:
We are asking for a few minutes of your time to complete an online Survey
consisting of four yes/no questions. The Survey deals with a number of major
changes to the format of the NIPS Conference that are under consideration. The
impetus for these possible changes is to accommodate the growth in submissions
in recent years (a 50% increase between 1999 and 2002), as well as the diverse
demographics of the Conference attendees.
Your responses to the four questions, as well as any additional input you
may have, will be valuable in shaping the future of NIPS. We thank you for
your participation.
https://register.nips.salk.edu/surveys/survey.php?id=1
Terrence Sejnowski
President
Neural Information Processing Systems Foundation
From terry at salk.edu Fri Jul 11 19:01:45 2003
From: terry at salk.edu (Terry Sejnowski)
Date: Fri, 11 Jul 2003 16:01:45 -0700 (PDT)
Subject: NEURAL COMPUTATION 15:8
Message-ID: <200307112301.h6BN1jT51877@purkinje.salk.edu>
Neural Computation - Contents - Volume 15, Number 8 - August 1, 2003
ARTICLE
Computation In a Single Neuron: Hodgkin and Huxley Revisited
Blaise Aguera y Arcas, Adrienne L. Fairhall and William Bialek
NOTE
Learning the Nonlinearity of Neurons from Natural Visual Stimuli
Christoph Kayser, Konrad P. Kording and Peter Konig
LETTERS
Analytic Expressions for Rate and CV of a Type I Neuron Driven
by White Gaussian Noise
Benjamin Lindner, Andre Longtin, and Adi Bulsara
What Causes a Neuron to Spike?
Blaise Aguera y Arcas and Adrienne L. Fairhall
Rate Models for Conductance-Based Cortical Neuronal Networks
Oren Shriki, David Hansel, and Haim Sompolinsky
Neural Representation of Probabilistic Information
M.J. Barber, J.W. Clark and C.H. Anderson
Learning the Gestalt Rule of Collinearity from Object Motion
Carsten Prodoehl, Rolf Wuertz and Christoph von der Malsburg
Recurrent Neural Networks With Small Weights Implement Definite Memory
Machines
Barbara Hammer and Peter Tino
Architectural Bias in Recurrent Neural Networks: Fractal Analysis
Peter Tino and Barbara Hammer
An Effective Bayesian Neural Network Classifier with a Comparison Study
to Support Vector Machine
Faming Liang
Variational Bayesian Learning of ICA with Missing Data
Kwokleung Chan, Te-Won Lee and Terrence J. Sejnowski
-----
ON-LINE - http://neco.mitpress.org/
SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES
USA Canada* Other Countries
Student/Retired $60 $64.20 $108
Individual $95 $101.65 $143
Institution $590 $631.30 $638
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu
-----
From Olivier.Buffet at loria.fr Fri Jul 11 11:13:11 2003
From: Olivier.Buffet at loria.fr (Olivier Buffet)
Date: Fri, 11 Jul 2003 17:13:11 +0200
Subject: EWRL-6 : Call for Participation
References: <3E9D2408.7000301@loria.fr>
Message-ID: <3F0ED407.7038A41@loria.fr>
--- Please excuse us if you receive multiple copies of this message ---
********* Please Distribute around you ********
Call for participation
European Workshop on Reinforcement Learning
EWRL-6
Nancy, FRANCE, September 4-5, 2003
URL: http://www.loria.fr/conferences/EWRL6/
Reinforcement learning (RL) is a growing research area. To build a
European RL community and give visibility to current RL research in
Europe, we are running a now-biennial series of workshops.
EWRL-1 took place in Brussels, Belgium (1994), EWRL-2 in Milano, Italy
(1995), EWRL-3 in Rennes, France (1997), EWRL-4 in Lugano, Switzerland
(1999), and EWRL-5 in Utrecht, the Netherlands (2001). EWRL-6 will take
place in Nancy, France.
The workshop will feature a plenary talk by Bernard Walliser, professor
at the Ecole Nationale des Ponts et Chaussées and the Ecole Polytechnique
(Paris). He is also research director of the ECCO group (CNRS), working
on cognitive economics and game theory.
The rest of the two-day workshop will be dedicated to presentations
given by selected participants. The program will be on-line next week.
A registration fee of 200 euros (100 euros for students) will cover
local organization expenses, lunch, coffee breaks, the proceedings, and
a social dinner on Thursday evening.
The registration procedure is detailed at:
http://www.loria.fr/conferences/EWRL6/Inscription/inscription_form.htm
If you have any questions, please contact dutech at loria.fr and
buffet at loria.fr
From jason at cs.jhu.edu Sat Jul 12 13:21:56 2003
From: jason at cs.jhu.edu (Jason Eisner)
Date: Sat, 12 Jul 2003 13:21:56 -0400 (EDT)
Subject: postdoc opportunities at Johns Hopkins
Message-ID: <200307121721.h6CHLus17035@emu.cs.jhu.edu>
Johns Hopkins University seeks to hire outstanding postdoctoral
researchers immediately at its Center for Language and Speech
Processing (CLSP). Candidates should have previous experience in
quantitative approaches to machine learning, speech, language, or
other AI domains. Strong computational and mathematical skills are
required.
CLSP is a leading center for research on speech and language. It
specializes in formal and quantitative approaches such as
probabilistic modeling, unsupervised machine learning, and grammar
formalisms.
Our core faculty presently include:
Luigi Burzio        Cognitive Science
Bill Byrne          Electrical & Computer Engineering
Jason Eisner        Computer Science
Bob Frank           Cognitive Science
Fred Jelinek        Electrical & Computer Engineering
Sanjeev Khudanpur   Electrical & Computer Engineering
Paul Smolensky      Cognitive Science
David Yarowsky      Computer Science
We are looking for postdocs to contribute to one or more of the
following long-term projects funded by NSF and/or DoD. Postdocs
participating in these highly visible projects can expect to gain
considerable research experience in speech and language technology.
Speech Recognition
* MALACH: Multilingual Access to Large Spoken Archives
* ASR for Rich Transcription of Conversational Mandarin
Machine Translation
* Improving Statistical Translation Models Via Text Analyzers Trained
from Parallel Corpora
Algorithmic Infrastructure
* Weighted Dynamic Programming and Finite-State Modeling
for Statistical NLP
Applicants are invited to email us a CV, a one-page statement of
research interests, a list of three references, and a cover letter
that briefly summarizes qualifications. Applications may be sent to
Sue Porterfield at sec at clsp.jhu.edu (fax to +1 410 516 5050 if email
is not possible).
Johns Hopkins University is located in Baltimore, Maryland, USA.
Our URL is http://www.clsp.jhu.edu/.
From bolshausen at rni.org Sun Jul 13 16:53:42 2003
From: bolshausen at rni.org (Bruno Olshausen)
Date: Sun, 13 Jul 2003 13:53:42 -0700
Subject: Workshop on Inference and Prediction in Neocortical Circuits
Message-ID: <3F11C6D6.2070700@rni.org>
Dear Connectionists,
The American Institute of Mathematics will be hosting a
workshop on "Inference and Prediction in Neocortical Circuits,"
September 21-24, in Palo Alto, California. Please see
http://www.aimath.org/ARCC/workshops/brain.html
Space and funding are available for a few more participants.
If you would like to participate, please apply by filling out
the on-line form at
http://koutslts.bucknell.edu/~aimath/WWN/cgi-bin/participantapply.prl?workshop=14
no later than August 1, 2003. Applications are open to all,
and we especially encourage women, underrepresented minorities,
junior mathematicians, and researchers from primarily
undergraduate institutions to apply.
Bruno
--
Bruno A. Olshausen (650) 321-8282 x233
Redwood Neuroscience Institute (650) 321-8585 (fax)
1010 El Camino Real http://www.rni.org
Menlo Park, CA 94025 bolshausen at rni.org
From no-spam-please-find-my-address-typing-frasconi-email at google.com Sun Jul 13 20:41:26 2003
From: no-spam-please-find-my-address-typing-frasconi-email at google.com (Paolo Frasconi)
Date: Mon, 14 Jul 2003 02:41:26 +0200
Subject: New book: Modeling the Internet and the Web (Wiley 2003)
Message-ID:
Some of the readers of this list might be interested in the following
book
Pierre Baldi, Paolo Frasconi, and Padhraic Smyth, Modeling the
Internet and the Web: Probabilistic Methods and Algorithms. Wiley, 2003,
ISBN 0-470-84906-1.
It covers various models and algorithms for the Web, including
generative models of networks, IR and machine learning algorithms for
text analysis, link analysis, focused crawling, methods for modeling
user behavior, and methods for mining Web e-commerce data.
1. Mathematical Background - 2. Basic WWW Technologies - 3. Web Graphs -
4. Text Analysis - 5. Link analysis - 6. Advanced Crawling Techniques -
7. Modeling and Understanding Human Behavior on the Web - 8. Commerce
on the Web: Models and Applications - Appendix A Mathematical
Complements - Appendix B List of Main Symbols and Abbreviations
The webpage http://ibook.ics.uci.edu/ contains more details, a
hyperlinked bibliography, and a sample chapter in pdf.
Regards,
Paolo Frasconi
http://www.dsi.unifi.it/~paolo/
From ddlewis4 at worldnet.att.net Sun Jul 13 22:32:32 2003
From: ddlewis4 at worldnet.att.net (ddlewis4@worldnet.att.net)
Date: Sun, 13 Jul 2003 21:32:32 -0500
Subject: Research Software Developer w/ Ornarose, Inc. (short term, Chicago or NJ)
Message-ID: <676e01c349b0$6cb21df0$0500a8c0@colussus>
Company: Ornarose Inc.
Location: Northern New Jersey or Chicago, IL
Title: Research Software Developer - Data
Mining/Statistics/Text Classification
Requirements:
B.S., M.S., or Ph.D. in computer science, statistics,
applied mathematics, or related field.
5+ years professional software development experience.
2+ years professional experience with one or more of the
following: machine learning, data mining, statistics, pattern
recognition, numerical optimization, numerical analysis.
Experience with designing and running computational
experiments in computer science or statistics highly
desirable. Also desirable is experience in information
retrieval, text categorization, natural language processing,
computational linguistics, or text mining.
C and Perl proficiency required. C++ proficiency
desirable.
Unix/Linux experience required. Windows experience
desirable.
Excellent verbal and written communication skills in
English.
Demonstrated ability to meet deadlines and communicate
effectively when working from home.
Responsibilities:
This is a short-term (5 to 6 month) position for an
SBIR-supported startup company. Developer will be
responsible for prototyping and testing advanced algorithms
for supervised and unsupervised learning, prediction,
classification, etc. Work also includes preparation and
cleaning of large text and non-text data sets,
experimentation with new algorithms and modeling techniques,
and measuring the effectiveness of these techniques and
their demands for computing resources.
Interested candidates should send a resume (leads also
welcome) in ASCII or PDF to job2003a at ornarose.com.
Ornarose, Inc. is an equal opportunity employer.
Because the position begins immediately, the candidate must
be eligible to work legally in the United States throughout
2003.
From gbarreto at sel.eesc.sc.usp.br Mon Jul 14 13:44:03 2003
From: gbarreto at sel.eesc.sc.usp.br (Guilherme de Alencar Barreto)
Date: Mon, 14 Jul 2003 14:44:03 -0300 (EST)
Subject: Papers on Self-Organizing Neural Networks
Message-ID:
Dear Connectionists,
The following papers may be of interest to those working
with unsupervised neural networks and their applications to
generative modeling and robotics.
Abstracts and downloadable draft
versions can be found at http://www.deti.ufc.br/~guilherme/publicacoes.htm
Best regards,
Guilherme A. Barreto
Department of Teleinformatics Engineering
Federal University of Ceara, BRAZIL
------------------------------------
Paper (1):
Barreto, G.A., Araújo, A.F.R. and Kremer, S. (2003).
"A taxonomy for spatiotemporal connectionist networks revisited: The
unsupervised case."
Neural Computation, 15(6):1255-1320.
------------------------------------
Paper (2):
Barreto, G.A., Araújo, A.F.R. and Ritter, H. (2003).
"Self-organizing feature maps for modeling and control of robotic
manipulators."
Journal of Intelligent and Robotic Systems, 36(4):407-450.
------------------------------------
Paper (3):
Barreto, G.A., Araújo, A.F.R., Dücker, C. and Ritter, H. (2002).
"A distributed robotic control system based on a Temporal Self-Organizing
Neural Network"
IEEE Transactions on Systems, Man, and Cybernetics, C-32(4):347-357.
From amasuoka at atr.co.jp Mon Jul 14 07:03:34 2003
From: amasuoka at atr.co.jp (Aya Masuoka)
Date: Mon, 14 Jul 2003 20:03:34 +0900
Subject: ATR CNS Labs Inaugural Symposium
Message-ID:
Dear members of the Connectionists,
We are happy to announce the Inaugural Symposium of ATR Computational
Neuroscience Laboratories (CNS), which opened in May 2003. In addition
to speakers from the new laboratories, we will have two keynote
speakers: Dr. Dietmar Plenz from the National Institutes of Health and
Dr. Miguel A.L. Nicolelis from Duke University.
The symposium is open to everyone.
Please join us to commemorate this event together.
ATR CNS Labs Inaugural Symposium
Date: Monday, August 4, 2003
Place: ATR Main Conference Room
Hosted by: ATR Computational Neuroscience Laboratories
For latest information on the symposium: http://www.cns.atr.co.jp/events.html
To register, please complete the registration form below and return it
by July 30 by email to: amasuoka at atr.co.jp
***********************************************************
Registration Form
Last Name:
First Name:
MI:
Institution/Agency:
Email Address:
Please type 'X' next to the following options.
I will attend :
the symposium
the lab tour
the reception
all of the above
I have special dietary requirement:
Please state clearly your requirement.
Fee: ¥1,000 for attending the reception.
There is no fee for the symposium and lab tour.
Method of payment: please pay in cash at the on-site registration
desk.
***********************************************************
12:30 PM - Registration
1:00 PM -1:05 PM Mitsuo Kawato, Director (ATR Computational
Neuroscience Laboratories)
Opening Remarks
1:05 PM -1:30 PM Kenji Doya (Department Head, Computational
Neurobiology, ATR, CNS)
"Neural mechanisms of reinforcement learning"
1:30 PM - 1:55 PM Hiroshi Imamizu (Department Head, Cognitive
Neuroscience, ATR, CNS)
"Internal models for cognitive functions"
1:55 PM - 2:20 PM Gordon Cheng (Department Head, Humanoid
Robotics and Computational Neuroscience, ATR, CNS)
"Paving the paths to the brain with humanoid robotics"
2:20 PM - 2:40 PM Coffee Break
2:40 PM - 3:40 PM Dr. Dietmar Plenz (National Institutes of Health)
3:40 PM - 4:40 PM Dr. Miguel A.L. Nicolelis (Duke University)
4:40 PM - 5:05 PM Mitsuo Kawato, Director (ATR, CNS)
"Controversies in computational motor control."
5:05 PM - 6:00 PM Lab tour
6:00 PM - Reception at the ATR Cafeteria
***********************************************************
Thank you.
We look forward to seeing you at the symposium.
Mitsuo Kawato, Director
ATR Computational Neuroscience Laboratories
--
---------------------------
Aya Masuoka,
Computational Neuroscience Laboratories, ATR International
Department of Computational Neurobiology
2-2-2 Hikaridai,Keihanna Science City
Kyoto,619-0288 Japan
TEL +81-774-95-1252 FAX +81-774-95-1259
EMAIL amasuoka at atr.co.jp
==============================
CNS was established on May 1, 2003!
==============================
From canete at ctima.uma.es Mon Jul 14 07:18:40 2003
From: canete at ctima.uma.es (Javier Fernández de Cañete)
Date: Mon, 14 Jul 2003 13:18:40 +0200
Subject: EANN'03 Final Programme (8-10 September 2003, Malaga SPAIN)
Message-ID: <002801c349f9$ade29d60$836dd696@isa.uma.es>
Dear colleagues:
This e-mail is to inform you that the Final Programme of the
Engineering Applications of Neural Networks conference (EANN'03) is
available at the web page
http://www.isa.uma.es/eann03
With regards
Javier Fernandez de Canete
EANN'03 Secretariat
eann03 at ctima.uma.es
Prof. Javier Fernandez de Canete. Ph. D.
Dpto. de Ingeniería de Sistemas y Automática
E.T.S.I. Informatica
Campus de Teatinos, 29071 Malaga (SPAIN)
Phone: +34-95-2132887
FAX: +34-95-2133361
e_mail: canete at ctima.uma.es
From James-Johnson at nyc.rr.com Tue Jul 15 10:28:03 2003
From: James-Johnson at nyc.rr.com (James Johnson)
Date: Tue, 15 Jul 2003 10:28:03 -0400
Subject: A Generative Theory of Shape
References: <000101c342a5$8aec87e0$66df75d8@thinkpad>
Message-ID: <000f01c34add$4def6bb0$ff5a6c42@ibmntgzhmy5bef>
The following book has just been published by Springer-Verlag.
A Generative Theory of Shape
Michael Leyton
Springer-Verlag, 550 pages
--------------------------------------------------------------------
The purpose of the book is to develop a generative theory of shape
that has two properties regarded as fundamental to intelligence -
maximizing transfer of structure and maximizing recoverability of the
generative operations. These two properties are particularly important
in the representation of complex shape - which is the main concern of
the book. The primary goal of the theory is the conversion of
complexity into understandability. For this purpose, a mathematical
theory is presented of how understandability is created in a
structure. This is achieved by developing a group-theoretic approach
to formalizing transfer and recoverability. To handle complex shape, a
new class of groups is developed, called unfolding groups. These
unfold structure from a maximally collapsed version of that
structure. A principal aspect of the theory is that it develops a
group-theoretic formalization of major object-oriented concepts such
as inheritance. The result is an object-oriented theory of geometry.
The algebraic theory is applied in detail to CAD, perception, and
robotics. In CAD, lengthy chapters are presented on mechanical and
architectural design. For example, using the theory of unfolding
groups, the book works in detail through the main stages of mechanical
CAD/CAM: part-design, assembly and machining. And within part-design,
an extensive algebraic analysis is given of sketching, alignment,
dimensioning, resolution, editing, sweeping, feature-addition, and
intent-management. The equivalent analysis is also done for
architectural design. In perception, extensive theories are given for
grouping and the main Gestalt motion phenomena (induced motion,
separation of systems, the Johansson relative/absolute motion
effects); as well as orientation and form. In robotics, several levels
of analysis are developed for manipulator structure, using the
author's algebraic theory of object-oriented structure.
--------------------------------------------------------------------
This book can be viewed electronically at the following site:
http://link.springer.de/link/service/series/0558/tocs/t2145.htm
--------------------------------------------------------------------
Author's address:
Professor Michael Leyton,
Center for Discrete Mathematics
& Theoretical Computer Science (DIMACS),
Rutgers University, Busch Campus,
New Brunswick, NJ 08854,
USA
E-mail address: mleyton at dimacs.rutgers.edu
--------------------------------------------------------------------
From juhn at utopiacompression.com Tue Jul 15 16:48:14 2003
From: juhn at utopiacompression.com (Juhn Maing)
Date: Tue, 15 Jul 2003 13:48:14 -0700
Subject: Job posting: Sr. machine learning scientist
Message-ID: <004501c34b12$69c7eb70$066fa8c0@JUHN>
JOB POSTING: SENIOR MACHINE LEARNING SCIENTIST
UtopiaCompression (UC) is an early-stage, intelligent imaging solutions
company. UC's core offering is an intelligent image compression
technology, which was recognized in 2002 as one of the top emerging
technologies in the US by the National Institute of Standards and
Technology
(http://jazz.nist.gov/atpcf/prjbriefs/prjbrief.cfm?ProjectNumber=00-00-4936).
Job Description
UtopiaCompression is looking for a highly qualified candidate with
extensive experience and knowledge in machine learning, data mining and
knowledge discovery. The candidate is required to have an MS or Ph.D. in
the areas mentioned above from a highly reputable university.
Post-doctorate and/or industry experience is strongly preferred. The
ideal candidate will be thoroughly versed in the latest research,
methods, developments and theories in machine learning and data mining
as well as possess in-depth experience applying them to commercial,
scientific or industrial applications. The candidate is also required to
be a visionary, highly creative and a great problem solver capable of
proposing solutions to multiple problems in parallel, and mentoring and
guiding R&D engineers in developing and implementing the solutions.
Permanent residents or US citizens are preferred.
This position is ideally suited for full-time employment, but part-time,
contract and contract-to-hire arrangements may also be considered.
Skills & Qualifications
1 - In-depth knowledge and experience in statistical analysis, reasoning
and learning (e.g., Bayesian learning, expectation maximization and
maximum likelihood algorithms, and feature extraction problems),
(statistical) combinatorial optimization and learning (e.g., simulated
annealing, genetic programming), neural networks, inductive and
rule-generation learning, fuzzy reasoning, (numeric) decision tree
learning, search methods, (image) data mining and understanding, etc.
Candidates are expected to have knowledge and working experience in
various learning regimes: for instance, for layered neural nets,
dexterous familiarity with the back-propagation algorithm, radial basis
functions, etc.; for decision tree learning, working experience with the
information gain measure, the category utility function, tree pruning, etc.
2 - Dexterous familiarity with various machine learning and statistical
software tools.
3 - Fluency in software analysis, design and development in the C
programming environment. Candidates must be well versed and experienced
in C. Working experience in C++ (and Java) is a plus.
4 - Knowledge of and working experience with image compression
techniques and with image analysis and processing are a big plus.
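As a concrete illustration of one item in the list above (not part of the posting itself), the information gain measure used to choose decision-tree splits can be sketched in a few lines of Python; the function names here are mine, chosen only for this example:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting (rows, labels) on column attr."""
    total = entropy(labels)
    n = len(labels)
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(lab)
    return total - sum(len(part) / n * entropy(part)
                       for part in by_value.values())

# toy data: attribute 0 perfectly predicts the label, attribute 1 is noise
rows   = [('a', 'x'), ('a', 'y'), ('b', 'x'), ('b', 'y')]
labels = [1, 1, 0, 0]
print(information_gain(rows, labels, 0))  # 1.0 (perfect predictor)
print(information_gain(rows, labels, 1))  # 0.0 (uninformative)
```

A decision-tree learner greedily picks the attribute with the highest gain at each node, which is why a pure predictor scores 1.0 bit here and an uninformative one scores 0.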
Contact:
Juhn Maing
Product Manager
UtopiaCompression
Tel: 310-473-1500 x104
Email: juhn at utopiacompression.com
From poznan at iub-psych.psych.indiana.edu Wed Jul 16 16:50:40 2003
From: poznan at iub-psych.psych.indiana.edu (Roman Poznanski)
Date: Wed, 16 Jul 2003 13:50:40 -0700
Subject: JIN, Vol. 2, No. 1, June 2003
Message-ID: <3F15BAA0.7040109@iub-psych.psych.indiana.edu>
[ Moderator's note: this journal special issue may be of interest to
readers of Connectionists, but only the article abstracts are
available free online. Full text requires a subscription. -- DST ]
Special Issue: Complex Nonlinear Neural Dynamics: Experimental
Advances and Theoretical Interpretations
Editorial
Peter Andras, Robert Kozma and Peter Erdi 1
The Wave Packet: An Action Potential for the 21st Century
Walter J. Freeman 3
Two Species of Gamma Oscillations in the Olfactory Bulb: Dependence on
Behavioral State and Synaptic Interactions
Leslie M. Kay 31
The Global Effects of Stroke on the Human Electroencephalogram
Rudolph C. Hwa, Wei He and Thomas C. Ferree 45
A Model for Emergent Complex Order in Small Neural Networks
Peter Andras 55
Dimension Change, Coarse Grained Coding and Pattern Recognition in
Spatio-Temporal Nonlinear Systems
David DeMaris 71
On the Formation of Persistent States in Neuronal Network Models of
Feature Selectivity
Evan C. Haskell and Paul C. Bressloff 103
Basic Principles of the KIV Model and its Application to the Navigation
Problem
Robert Kozma and Walter J. Freeman 125
Book Review
Book Review: "Computational Neuroanatomy: Principles and Methods", G. A.
Ascoli, ed., (2002)
A. Garenne and G. A. Chauvet 147
--
Roman R. Poznanski, Ph.D
Associate Editor,
Journal of Integrative Neuroscience
Department of Psychology
Indiana University
1101 E. 10th St.
Bloomington, IN 47405-7007
email: poznan at iub-psych.psych.indiana.edu
phone (Office): (812) 856-0838
http://www.worldscinet.com/jin/mkt/editorial.shtml
From stefan.wermter at sunderland.ac.uk Thu Jul 17 08:23:49 2003
From: stefan.wermter at sunderland.ac.uk (Stefan Wermter)
Date: Thu, 17 Jul 2003 13:23:49 +0100
Subject: Stipends for MSc Intelligent Systems
Message-ID: <3F169555.446E5938@sunderland.ac.uk>
Stipends available for MSc Intelligent Systems
----------------------------------
We are pleased to announce that, for eligible EU students, we have
obtained funding to offer a bursary for our new MSc Intelligent Systems
worth up to 9,000 euros as a fee waiver and stipend.
***Please forward to students who may be interested.***
The School of Computing and Technology, University of Sunderland
is delighted to announce the launch of its new MSc Intelligent Systems
programme for October 2003. Building on the School's leading-edge
research in intelligent systems, this masters programme will be
funded via the ESF scheme (see below).
Intelligent Systems is an exciting field of study for science and
industry, since existing computing systems often still fall short of
human performance in many respects.
"Intelligent Systems" is a term for software systems and
methods that simulate aspects of intelligent behaviour. The intention
is to learn from nature and human performance in order to build more
powerful computing systems. The aim is to learn from cognitive science,
neuroscience, biology, engineering, and linguistics for building more
powerful computational system architectures. In this programme a
wide variety of novel and exciting techniques will be taught including
neural networks, intelligent robotics, machine learning, natural language
processing, vision, evolutionary genetic computing, data mining,
information retrieval, Bayesian computing, knowledge-based systems,
fuzzy methods, and hybrid intelligent architectures.
Programme Structure
--------------
The following lectures/modules are available (at least the modules
marked with * are intended to be available for the October 2003 cohort)
Neural Networks *
Intelligent Systems Architectures *
Learning Agents *
Evolutionary Computation
Cognitive Neural Science *
Knowledge Based Systems and Data Mining *
Bayesian Computation
Vision and Intelligent Robots *
Natural Language Processing *
Dynamics of Adaptive Systems
Intelligent Systems Programming *
Funding of up to 6,000 pounds (about 9,000 euros) for eligible students
------------------------------
The Bursary Scheme applies to this Masters programme commencing in
October 2003; we have obtained the funding through the European
Social Fund (ESF). ESF support enables the University to waive the
normal tuition fee and provide a bursary of 75 pounds per week for 45
weeks for eligible EU students, together worth up to 6,000 pounds
(about 9,000 euros).
For further information in the first instance please see:
http://www.his.sunderland.ac.uk/Teaching_frame.html
http://osiris.sund.ac.uk/webedit/allweb/courses/progmode.php?prog=G550A&mode=FT&mode2=&dmode=C
For information on applications and start dates contact:
gillian.potts at sunderland.ac.uk Tel: 0191 515 2758
For academic information about the programme contact:
alfredo.moscardini at sunderland.ac.uk
Please forward to interested students.
Stefan
***************************************
Stefan Wermter
Professor for Intelligent Systems
Centre for Hybrid Intelligent Systems
School of Computing and Technology
University of Sunderland
St Peters Way
Sunderland SR6 0DD
United Kingdom
phone: +44 191 515 3279
fax: +44 191 515 3553
email: stefan.wermter at sunderland.ac.uk
http://www.his.sunderland.ac.uk/~cs0stw/
http://www.his.sunderland.ac.uk/
****************************************
From norbert at cn.stir.ac.uk Fri Jul 18 10:09:23 2003
From: norbert at cn.stir.ac.uk (Norbert Krueger)
Date: Fri, 18 Jul 2003 15:09:23 +0100
Subject: Special Session: NEXT GENERATION VISION SYSTEMS
Message-ID: <3F17FF93.5DA48668@cn.stir.ac.uk>
Dear Colleagues,
I would like to point you to the special session
NEXT GENERATION VISION SYSTEMS
to be held at the
Fourth International ICSC Symposium at the
ENGINEERING OF INTELLIGENT SYSTEMS (EIS 2004)
With best regards
Norbert Krueger
_______________________________________________________
Special Session
NEXT GENERATION VISION SYSTEMS
Fourth International ICSC Symposium at the
ENGINEERING OF INTELLIGENT SYSTEMS (EIS 2004)
http://www.icsc-naiso.org/conferences/eis2004/index.html
February 29 - March 2, 2004 at the University of Madeira,
Island of Madeira, Portugal
Organisers:
Dr. Norbert Krueger
University of Stirling
Email: norbert at cn.stir.ac.uk
http://www.cn.stir.ac.uk/~norbert
Dr. Volker Krueger
Aalborg University, Esbjerg
Email: vok at cs.aue.auc.dk
Dr. Florentin Woergoetter
University of Stirling
Stirling FK9 4LA Scotland, UK
Email: worgott at cn.stir.ac.uk
Abstract
Vision-based devices have been entering the industrial and
private world with increasing success: face recognition
systems control access to buildings; airports and train
stations are monitored by video surveillance devices; and
cars are increasingly equipped with vision-based driver
assistance systems.
However, the gap between human performance and the
top performance of today's artificial visual systems remains
considerable. In particular, scene analysis in unfamiliar
environments that allows for highly reliable action is still an
outstanding quality of biological systems.
The next generation of vision systems will have to show
stable and reliable performance in uncontrolled environments
in real time. To achieve reliability these systems need to make
use of regularities in visual data. In this respect, the
representation of the temporal structure of visual data as
well as the fusion of visual sub-modalities are crucial.
Such systems also need to be equipped with a sufficient
amount of prestructured knowledge as well as the ability to
deal with uncertainties and to learn in complex environments.
The invited session focusses on requirements for and prospects
of future vision systems. This covers all questions of visual
representation and integration as well as questions of hardware
and software design.
Submission Deadline: 15.9.2003
Maximum number of pages: Fifteen pages (including
diagrams and references)
Papers (either as PDF or PostScript) should be sent to
norbert at cn.stir.ac.uk
From levys at wlu.edu Fri Jul 18 16:03:04 2003
From: levys at wlu.edu (Simon Levy)
Date: Fri, 18 Jul 2003 16:03:04 -0400
Subject: Software Release Announcement: SNARLI, free/open-source Java package
for neural nets
Message-ID: <3F185278.3040403@wlu.edu>
Dear Connectionists,
I would like to announce the release of a free, open-source Java package
that may be of interest to members of this list. This package is
currently available at http://snarli.sourceforge.net, and is described
below.
Please feel free to download this package, and contact me with questions,
criticism, or suggestions. I am especially interested in hearing from
educators and researchers who find the package useful in their work, and
anyone who has a feature or neural architecture that they would like to
see implemented.
Thanks,
Simon
========================
Simon D. Levy
Assistant Professor
Computer Science Department
Washington & Lee University
Lexington, VA 24450
540-458-8419 (voice)
540-458-8479 (fax)
levys at wlu.edu
http://www.cs.wlu.edu/~levy
SNARLI (Simple Neural ARchitecture LIbrary) is a Java
package containing two classes: BPLayer, a general back-prop layer
class, and SOM, a class for the Kohonen Self-Organizing Map. BPLayer
also supports sigma-pi connections and back-prop-through-time, allowing
you to build just about any kind of back-prop network found in the
literature.
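To make the idea of a single reusable back-prop layer class concrete, here is a minimal, library-independent sketch of what such a class encapsulates: a forward pass plus a gradient-descent weight update. The class and method names below are hypothetical illustrations, not SNARLI's actual Java API (the sketch is in Python for brevity):

```python
import math
import random

class BPLayerSketch:
    """Hypothetical stand-in for a back-prop layer class: one fully
    connected sigmoid layer with a plain gradient-descent update."""

    def __init__(self, n_in, n_out, seed=0):
        rng = random.Random(seed)
        self.w = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                  for _ in range(n_out)]
        self.b = [0.0] * n_out

    def forward(self, x):
        # y_j = sigmoid(w_j . x + b_j); input is cached for the update
        self.x = x
        self.y = [1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(row, x)) + b)))
                  for row, b in zip(self.w, self.b)]
        return self.y

    def backward(self, target, lr=0.5):
        # output-layer delta for sigmoid units under squared error
        for j, (y, t) in enumerate(zip(self.y, target)):
            delta = (y - t) * y * (1.0 - y)
            for i, xi in enumerate(self.x):
                self.w[j][i] -= lr * delta * xi
            self.b[j] -= lr * delta

# train a single-unit "network" on logical AND (linearly separable)
data = [([0, 0], [0]), ([0, 1], [0]), ([1, 0], [0]), ([1, 1], [1])]
layer = BPLayerSketch(2, 1)
for _ in range(5000):
    for x, t in data:
        layer.forward(x)
        layer.backward(t)
preds = [round(layer.forward(x)[0]) for x, _ in data]
print(preds)  # recovers the AND truth table: [0, 0, 0, 1]
```

A library built around one such class can chain layers, share them across time steps for back-prop-through-time, or add sigma-pi connections; this sketch omits all of that and shows only the core forward/backward contract.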
SNARLI differs from existing neural-net packages in two important
ways. First, it is not GUI-based. Instead, it is meant as a code
resource that can be linked directly to new or existing Java-based
projects, for those who want to try a neural-network approach without
having to write a lot of new code. Given the variety of platforms that
currently interface to Java, from HTML to Matlab, it made more sense
to me to focus on the neural net algorithms and leave the GUI
development to others.
Second, SNARLI gets a great deal of mileage out of a single class
(BPLayer), instead of adding a new class for each type of network. Using
this class, my students and I have been able to construct a large
variety of back-prop networks, from simple perceptrons through Pollack's
RAAM, with very little additional coding. We have used these networks
successfully in coursework, thesis projects, and research.
Future versions of SNARLI may include classes to support other popular
architectures, such as Support Vector Machines (SVMs), Hopfield Nets,
and Long Short-Term Memory (LSTM), as user interest dictates.
From ahu at cs.stir.ac.uk Fri Jul 18 20:51:44 2003
From: ahu at cs.stir.ac.uk (Dr. Amir Hussain)
Date: Sat, 19 Jul 2003 01:51:44 +0100
Subject: Final Call for Papers: IJRA Journal Special Issue on Neuromorphic Systems ( IASTED / ACTA Press, Vol.19, 2004)
Message-ID: <002101c34d8f$ee4cf9b0$4f98fc3e@DrAmir>
Please post and distribute to colleagues and friends:
http://www.actapress.com/journals/specialra6.htm
Final Call for Papers: (with apologies for cross-postings!)
Note the extended paper submission deadline (granted upon request from
numerous authors): 1 September 2003
For readership of the International Journal of Robotics & Automation
(IJRA), please see the parent organization (IASTED's) website:
http://www.iasted.org/
----------
Call for Papers: Special Issue on "Neuromorphic Systems" International
Journal of Robotics & Automation (IJRA),
IASTED / ACTA Press, Vol.19, 2004
There has recently been a growing interest in neuromorphic systems
research, which is part of the larger field of computational
neuroscience. Neuromorphic systems are implementations in silicon of
systems whose architecture and design are based on neurobiology. In
general, however, neuromorphic systems research is not restricted to one
specific implementation technology. This growing area proffers exciting
possibilities, such as sensory systems that can compete with human
senses, pattern recognition systems that can run in real time, and
neuron models that can truly emulate living neurons. Neuromorphic
systems are at the intersection of neuroscience, computer science, and
electrical engineering.
The earliest neuromorphic systems were concerned with providing an
engineering approximation of some aspects of sensory systems, such as
the detection of sound in the auditory system or the detection of light
in the visual system. More recently, there has been considerable work on
robot control systems, on modelling various types of neurons, and on
including adaptation in hardware systems. Biorobotics, or the
intersection between biology and robotics, is a growing area in
neuromorphic systems. Biorobotics aims to investigate biological
sensorimotor control systems by building robot models of them. This
includes the development of novel sensors and actuators, hardware and
software emulations of neural control systems, and embedding and testing
devices in real environments.
The aim of this Special Issue on Neuromorphic Systems is to bring
together active researchers from different areas of this
interdisciplinary field, and to report on the latest advances in the
area.
Contributions are sought from (amongst others):
- engineers interested in designing and implementing systems based on
neurobiology
- neurobiologists interested in engineering implementations of systems
- modellers and theoreticians from all the relevant disciplines
Any topic relevant to neuromorphic systems and theory, sensory
neuromorphic systems, and neuromorphic hardware will be considered.
Instructions for Manuscripts:
All manuscripts should be e-mailed to the ACTA Press office at
calgary at actapress.com by September 1, 2003. On the e-mail subject line
please put "Submission for the IJRA Special Issue on Neuromorphic
Systems." The paper submission should include the name(s) of the
author(s) and their affiliations, addresses, fax numbers, and e-mail
addresses.
Manuscripts should strictly follow the guidelines of ACTA Press, given
at the following website:
http://www.actapress.com/journals/submission.htm
Important Dates:
Deadline for paper submission: September 1, 2003
Notification of acceptance: December 1, 2003
Final Manuscripts due: January 31, 2004
Publication in special issue: Vol.19, 2004
Guest Editors:
Dr. Amir Hussain
& Prof. L.S.Smith
Dept. of Computing Science & Mathematics
University of Stirling, Stirling FK9 4LA, Scotland, UK
Email: a.hussain at cs.stir.ac.uk Website: http://www.cs.stir.ac.uk/~ahu/
From aude.billard at epfl.ch Fri Jul 18 08:51:58 2003
From: aude.billard at epfl.ch (aude billard)
Date: Fri, 18 Jul 2003 14:51:58 +0200
Subject: Workshop on Robot Learning by Demonstration
Message-ID: <0a4f01c34d2b$5fd1e6f0$7391b280@sti.intranet.epfl.ch>
=======================
Call For Papers
======================
IROS-2003 Workshop on
Robot Learning by Demonstration
http://asl.epfl.ch/events/iros03Workshop/index.php
Friday 31st of October 2003, 12-5pm
IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems: IROS 2003
Bally's Las Vegas Hotel, October 27-31, 2003
Objectives:
===================
Programming by demonstration has become a key research topic in
robotics, with an impact on both fundamental research and
application-oriented studies. Work in this area tackles the
development of robust algorithms for motor control, motor learning,
gesture recognition and visuo-motor integration. While the field has
been active for more than twenty years, recent developments, taking
inspiration from biological mechanisms of imitation, have brought a new
perspective, which this workshop aims to assess.
Call for Papers:
===================
We solicit papers relevant to the general workshop theme in the three
categories:
- research papers
- application papers
- challenge/position statements (typically only 1 or 2 pages)
Relevant workshop topics include (non-exhaustive list):
- Programming by Demonstration
- Imitation learning
- Task and Skill Learning
- Motor control
- Motor learning
- Visuo-motor Integration
- Gesture recognition
Important Dates:
===============
- August 4, 2003 Deadline for paper submission
- August 15, 2003 Notification of acceptance
- August 26, 2003 Deadline for final contributions
Papers should not exceed 8 pages and conform to the single column,
10pt, A4 format. Papers should be submitted by email to:
aude.billard at epfl.ch
Proceedings will be distributed at the Workshop. A number of papers
presented in this workshop will be selected for publication in a
special issue of the Robotics and Autonomous Systems journal.
Detailed Information:
===================
For more detailed information, please visit the workshop website at
http://asl.epfl.ch/events/iros03Workshop/index.php
Program Chairs:
===================
Aude Billard &
Roland Siegwart
Autonomous Systems Lab
EPFL, Swiss Institute of Technology
CH-1015 Lausanne, Switzerland
http://asl.epfl.ch
Program Committee:
====================
Luc Berthouze, ETL, Japan
Henrik Christensen, KTH, Sweden
Kerstin Dautenhahn, University of Hertfordshire, UK
Yiannis Demiris, Imperial College London, UK
Rudiger Dillmann, Karlsruhe, Germany
Auke Jan Ijspeert, EPFL, CH
Helge Ritter, University of Bielefeld, Germany
Stefan Schaal, University of Southern California, USA
Ales Ude, ATR, Japan
Jianwei Zhang, University of Hamburg, Germany
From wolpert at hera.ucl.ac.uk Fri Jul 18 05:43:35 2003
From: wolpert at hera.ucl.ac.uk (Daniel Wolpert)
Date: Fri, 18 Jul 2003 10:43:35 +0100
Subject: Postdoctoral Positions in Sensorimotor Control
Message-ID: <003901c34d11$0e9674f0$51463ec1@aphrodite>
Two Postdoctoral Research Fellows
Sensorimotor Control Laboratory
Sobell Department of Motor Neuroscience & Movement Disorders
Institute of Neurology
University College London
The Sensorimotor Control Laboratory, under the direction of Professor
Daniel Wolpert has openings for two highly motivated Postdoctoral
Research Fellows in the area of computational and experimental human
motor control. The Fellows will join a team investigating planning,
control and learning of skilled action. One Research Fellow will work
on modelling of the motor system using optimal control and Bayesian
approaches and should have a background in a computational field
(e.g. Computational Neuroscience, Engineering, Physics, Maths). The
other will work on psychophysical studies of human motor control and
should have a background in an experimental field (e.g. Neuroscience,
Psychology). Applicants should ideally have a PhD, plus technical
expertise and computational skills relevant to the study of human
movement. Further details of both posts and laboratory facilities can
be found on www.hera.ucl.ac.uk/vacancies. Informal enquiries can be
addressed to Professor Daniel Wolpert by email to
wolpert at hera.ucl.ac.uk.
The positions are available for two years in the first instance with a
starting date from September 2003. Starting salary is up to £32,794
pa inclusive, depending on experience, superannuable. Applicants
should provide (ideally by email) by August 11th 2003:
- a maximum 1 page statement of research interests relevant to the
project
- copy of CV (2 if sent by post)
- names and contact details of 3 referees
- 1 copy of Declaration (required - see further details of posts)
- Equal Opportunities form (optional - see further details of posts)
to:
Miss E Bertram,
Assistant Secretary (Personnel)
Institute of Neurology
Queen Square
London WC1N 3BG
Fax: +44 (0)20 7278 5069
Email: e.bertram at ion.ucl.ac.uk
Taking Action for Equality
From dhwang at cs.latrobe.edu.au Sun Jul 20 21:47:01 2003
From: dhwang at cs.latrobe.edu.au (Dianhui Wang)
Date: Mon, 21 Jul 2003 11:47:01 +1000
Subject: Call for Papers
References: <3EC48075.E8BBA612@cs.latrobe.edu.au>
Message-ID: <3F1B4614.4A072F26@cs.latrobe.edu.au>
Dear Colleagues,
This email solicits submissions for the Invited Session on Advances in
Design, Analysis and Applications of Neural/Neuro-fuzzy Classifiers at
KES2004: 8th International Conference on Knowledge-Based Intelligent
Information & Engineering Systems, 21st-24th September 2004, Hotel
Intercontinental, Wellington, New Zealand.
Details of the Call for Papers can be found at
http://homepage.cs.latrobe.edu.au/dhwang/KES04.htm
I am looking forward to receiving your submissions.
Kind regards,
Dr Dianhui Wang (Session Chair)
Department of Computer Science and Computer Engineering
La Trobe University, Melbourne, VIC 3083, Australia
Tel: +61 3 9479 3034 Fax:+61 3 9479 3060
Email: dhwang at cs.latrobe.edu.au
From bengio at idiap.ch Mon Jul 21 08:37:20 2003
From: bengio at idiap.ch (Samy Bengio)
Date: Mon, 21 Jul 2003 14:37:20 +0200 (CEST)
Subject: Open position for a senior in Machine Learning - IDIAP
Message-ID:
Open position for a Senior Researcher in Machine Learning
---------------------------------------------------------
The Dalle Molle Institute for Perceptual Artificial Intelligence (IDIAP,
http://www.idiap.ch) seeks qualified applicants to fill the position of
Senior Researcher in its Machine Learning group.
Given the current scientific strengths of IDIAP, the ideal candidate should
have strong research experience in machine learning problems related to speech
processing, vision processing, and above all, multimodal processing.
The ideal candidate will have been active for several years in the machine
learning research community and will have a strong publication record. He
or she is expected to supervise PhD students and postdoctoral fellows in
machine learning, propose new research projects at a national and European
level, and be open to eventually giving lectures (either at IDIAP, or at the
nearby EPFL engineering school, http://www.epfl.ch). In fact, given the strong
relationship between IDIAP and EPFL, extremely qualified and experienced
candidates have the possibility of being offered an academic title of
professor at EPFL, while working at IDIAP.
IDIAP has recently been awarded several large research projects in multimodal
processing, both at the national and European level (see for instance
http://www.im2.ch), and the ideal candidate will be interested in the research
projects associated with this funding.
IDIAP is an equal opportunity employer and is actively involved in the
European initiative involving the Advancement of Women in Science. IDIAP seeks
to maintain a principle of open competition (on the basis of merit) to appoint
the best candidate, provide equal opportunity for all candidates, and equally
encourages both females and males to consider employment with IDIAP.
Although IDIAP is located in the French part of Switzerland, English is the
main working language. Free English and French lessons are provided.
IDIAP is located in the town of Martigny (http://www.martigny.ch) in Valais,
a scenic region in the south of Switzerland, surrounded by the highest
mountains of Europe, and offering exciting recreational activities, including
hiking, climbing and skiing, as well as varied cultural activities. It is
within close proximity to Montreux (Jazz Festival) and Lausanne.
Interested candidates should send a letter of application, along with their
detailed CV to jobs at idiap.ch. More information can also be obtained by
contacting Samy Bengio (bengio at idiap.ch).
----
Samy Bengio
Research Director. Machine Learning Group Leader.
IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland.
tel: +41 27 721 77 39, fax: +41 27 721 77 12.
mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio
From ckiw at inf.ed.ac.uk Mon Jul 21 07:37:53 2003
From: ckiw at inf.ed.ac.uk (Chris Williams)
Date: Mon, 21 Jul 2003 12:37:53 +0100 (BST)
Subject: Faculty positions at the British University in Dubai
Message-ID:
[note that machine learning is one of the areas highlighted --- Chris]
The British University in Dubai
Institute of Informatics and Communications
Chair and 4 Lectureships
* Context
The British University in Dubai is an important development in higher
education, providing cutting-edge research and education in key areas
of science and technology, and is supported in its early growth by the
University of Edinburgh and by other front-ranked UK universities.
Early research and teaching programmes will be developed in
association with the University of Edinburgh's 5*-rated School of
Informatics. Newly appointed staff will spend part of their first year
working with colleagues in Edinburgh and be eligible for Honorary
Fellowships in the University of Edinburgh. (This is intended to help
cement the foundations for continuing collaborative research
projects and exchanges.)
* Posts
The new Professor will be Director of the Institute, provide
leadership in creating innovative research and teaching programmes and
be involved in appointments to the lectureships. Appointment to the
chair and the 4 lectureships will be made in areas of Informatics
related to the first programmes to be developed by the Institute in:
- Natural Language and Speech Engineering
- Knowledge Management & Engineering
- Machine Learning
* Closing date
11 August 2003.
* Remuneration etc.
Full details are available at:
http://www.jobs.thes.co.uk/rs6/cl.asp?action=view_ad&ad_id=15134
These will shortly be copied to:
http://www.informatics.ed.ac.uk/events/vacancies/
You are encouraged to consult Professor Michael Fourman
(buid at inf.ed.ac.uk) to learn further details, and to discuss potential
ways of taking up a post.
From ncopp at jsd.claremont.edu Wed Jul 23 13:18:51 2003
From: ncopp at jsd.claremont.edu (Newton Copp)
Date: Wed, 23 Jul 2003 19:18:51 +0200
Subject: POSITION AVAILABLE - ENDOWED CHAIR
Message-ID: <5.1.0.14.0.20030718103048.00b13ee0@jsd.claremont.edu>
To all:
I would appreciate it if you would consider the position described below or
pass this announcement along to someone who might be interested.
Thank you,
Newt Copp
Regarding the William R. Kenan Professorship in Computational Neuroscience
at The Claremont Colleges;
The undergraduate colleges in the Claremont consortium (Claremont McKenna,
Harvey Mudd, Pitzer, Pomona, and Scripps Colleges) seek an accomplished,
broadly trained neuroscientist with expertise in computational work to fill
the William R. Kenan Chair beginning in September of 2004. The Kenan
Professorship was formed as an all-Claremont position to be held by a
person who has achieved a record of distinction in an interdisciplinary
area. The successful candidate will have an unusual opportunity to take a
leadership role in an intercollegiate, interdisciplinary Neuroscience
Program that focuses on undergraduate education and research and involves
faculty members in Biology, Psychology, Engineering, and Philosophy. A
commitment to excellence in undergraduate teaching, an interest in
exploring interdisciplinary collaborations, and an active research program
are expected. Area of research interest is open. Preference will be given
to candidates at the associate professor level or higher, although
outstanding candidates at the advanced assistant professor level may be
considered.
The Claremont Colleges include five highly selective liberal arts colleges,
the Claremont Graduate University, and the Keck Graduate Institute for
Applied Life Sciences (see http://www.claremont.edu/about.html). The hire
will be made within the Joint Science Department (see
http://www.jsd.claremont.edu/), a department of 22 faculty members in
Biology (12), Chemistry (7) and Physics (4) that is co-sponsored by
Claremont McKenna, Pitzer, and Scripps Colleges.
Send a curriculum vita, copies of three publications, statements of
research interests and teaching interests/philosophy to Kenan Search
Committee, W. M. Keck Science Center, 925 N. Mills Ave., Claremont, CA
91711. Arrange to have three letters of recommendation sent to the same
address. Please direct inquiries to Newton Copp, Professor of Biology and
Chair of the Search Committee (tel: 909 621-8298; E-mail:
ncopp at jsd.claremont.edu). Review of applications will begin on Dec. 1,
2003 and continue until the position is filled. (This is a re-posting of a
position unfilled last year.)
In a continuing effort to enrich our academic environment and provide equal
educational and employment opportunities, The Claremont Colleges actively
encourage applications from women and members of historically
under-represented groups in higher education.
________________________
Newton Copp
Professor of Biology
Joint Science Department
The Claremont Colleges
Claremont, CA 91711
tel: 909 607-2932
fax: 909 621-8588
From bolshausen at rni.org Wed Jul 23 13:18:50 2003
From: bolshausen at rni.org (Bruno Olshausen)
Date: Wed, 23 Jul 2003 19:18:50 +0200
Subject: Workshop on Inference and Prediction in Neocortical Circuits
Message-ID: <3F11C7B5.4080501@rni.org>
The American Institute of Mathematics will be hosting a
workshop on "Inference and Prediction in Neocortical Circuits,"
September 21-24, in Palo Alto, California. Please see
http://www.aimath.org/ARCC/workshops/brain.html
Space and funding are available for a few more participants.
If you would like to participate, please apply by filling out
the on-line form at
http://koutslts.bucknell.edu/~aimath/WWN/cgi-bin/participantapply.prl?workshop=14
no later than August 1, 2003. Applications are open to all,
and we especially encourage women, underrepresented minorities,
junior mathematicians, and researchers from primarily
undergraduate institutions to apply.
--
Bruno A. Olshausen (650) 321-8282 x233
Redwood Neuroscience Institute (650) 321-8585 (fax)
1010 El Camino Real http://www.rni.org
Menlo Park, CA 94025 bolshausen at rni.org
From H.Bowman at kent.ac.uk Thu Jul 24 09:51:33 2003
From: H.Bowman at kent.ac.uk (hb5)
Date: Thu, 24 Jul 2003 14:51:33 +0100
Subject: NCPW 8 Call for Participation
Message-ID: <3F1FE465.A2BC7FB3@ukc.ac.uk>
Please distribute this call for participation to anybody you think might
be interested in this event. Apologies for multiple copies.
--------------------------------------------
CALL FOR PARTICIPATION
Eighth Neural Computation and Psychology Workshop (NCPW 8)
Connectionist Models of Cognition, Perception and Emotion
28-30 August 2003 at the
University of Kent at Canterbury, UK
The Eighth Neural Computation and Psychology Workshop (NCPW8)
will be held in Canterbury, England from 28-30th August 2003.
The NCPW series is now a well established and lively forum
that brings together researchers from such diverse disciplines
as artificial intelligence, cognitive science, computer science,
neuroscience, philosophy and psychology. 35 papers will be
presented, of which eight will be invited papers. In addition to the
high quality of the papers presented, this Workshop takes
place in an informal setting, in order to encourage interaction
among the researchers present.
Website
-------
More details, including registration information, can be found on
the conference website,
http://www.cs.ukc.ac.uk/events/conf/2003/ncpw/
The Programme
-------------
Highlights of the programme include a session on modelling face
perception, including three invited papers,
Gary Cottrell
University of California, San Diego, USA
Modeling Face Perception
Peter Hancock, Mike Burton and Rob Jenkins
Stirling University, Scotland
Face Recognition: Average or Exemplar?
C.J. Solomon, S.J. Gibson, A. Pallares-Bejarano and M. Maylin
University of Kent at Canterbury
Exploring the Case for a Psychological "Face-space"
Five more invited papers have been scheduled,
John A. Bullinaria
The University of Birmingham
On the Evolution of Irrational Behaviour
Bob French
University of Liege, Belgium
The bottom-up nature of category acquisition in 3- to 4-month old
infants:
Predictions of a connectionist model and empirical data
Richard Shillcock and Padraic Monaghan
University of Edinburgh, Scotland
Sublexical units in the computational modelling of visual word
recognition
John G. Taylor
King's College Strand, University of London
Through Attention to Consciousness by CODAM
Marius Usher and Eddy Davelaar
Birkbeck College, University of London
Short/long term memory in terms of activation versus weight based
processes
The full programme can be found at the following site,
http://www.cs.kent.ac.uk/events/conf/2003/ncpw/prog/
Conference Chair
----------------
Howard Bowman, University of Kent, UK
Conference Organisers
---------------------
Howard Bowman, UKC
Colin G. Johnson, UKC
Miguel Mendao, UKC
Vikki Roberts, UKC
Proceedings Editors
--------------------
Howard Bowman, UKC
Christophe Labiouse, Liege
Publication
-----------
Proceedings of the workshop will appear in the series Progress
in Neural Processing, which is published by World Scientific.
From shih at ini.phys.ethz.ch Thu Jul 24 07:30:09 2003
From: shih at ini.phys.ethz.ch (Shih-Chii Liu)
Date: Thu, 24 Jul 2003 13:30:09 +0200 (CEST)
Subject: NIPS03 demonstration track
Message-ID:
Would you like to show off a demo of your hardware system, robots, or
software system to people who are interested in all aspects of neural
and statistical computation?
The Neural Information Processing conference has a relatively new
Demonstration track for submissions of this sort. The participants in
this track will have a chance to show their interactive demos in
areas such as hardware technology, neuromorphic systems,
biologically-inspired and biomimetic systems, robotics, and
software systems. The only hard rule is that the demo must be live.
Check out the web site,
http://www.nips.cc/Conferences/2003/CFP/CallForDemos.php
for details.
Students can also apply for a limited number of travel funds provided
by the Institute of Neuromorphic Engineering for submissions that
are accepted to the Demonstration Track.
The conference will be held in Vancouver, Canada on Dec 8-10 2003.
The DEADLINE for submissions to this track is on Aug 1, 2003.
Note that acceptance to this track does not imply automatic
acceptance of your submission to the main conference.
Regards
Shih-Chii Liu and Tobi Delbruck
Co-Chairs NIPS 2003 Demonstration Track
From thrun at robotics.Stanford.EDU Fri Jul 25 11:48:21 2003
From: thrun at robotics.Stanford.EDU (Sebastian Thrun)
Date: Fri, 25 Jul 2003 08:48:21 -0700
Subject: NIPS - Deadline Reminder
Message-ID: <200307251548.h6PFmL421238@robo.Stanford.EDU>
Dear Connectionists:
A brief reminder that NIPS workshop proposals and submission to the
demonstration track are due August 1, 2003. Please consult
nips.cc
for details. You are encouraged to submit.
Sebastian Thrun
NIPS*2003 General Chair
From ted.carnevale at yale.edu Mon Jul 28 16:49:12 2003
From: ted.carnevale at yale.edu (Ted Carnevale)
Date: Mon, 28 Jul 2003 16:49:12 -0400
Subject: NEURON course at SFN 2003 meeting
Message-ID: <3F258C48.2010402@yale.edu>
Short Course Announcement
USING THE NEURON SIMULATION ENVIRONMENT
Satellite Symposium, Society for Neuroscience Meeting
9 AM - 5 PM on Friday, Nov. 7, 2003
Speakers to include M.L. Hines and N.T. Carnevale
This 1 day course with lectures and live demonstrations will
present information essential for teaching and research
applications of NEURON, an advanced simulation environment
that handles realistic models of biophysical mechanisms,
individual neurons, and networks of cells. The emphasis is
on practical issues that are key to the most productive use
of this powerful and convenient modeling tool.
Features that will be covered include:
constructing and managing models of cells and networks
importing detailed morphometric data
expanding NEURON's repertoire of biophysical mechanisms
database resources for empirically-based modeling
Each registrant will receive a comprehensive set of notes which
include material that has not appeared elsewhere in print.
Registration is limited to 50 individuals on a first-come,
first-served basis.
For more information see
http://www.neuron.yale.edu/no2003.html
--Ted
From Thomas.Wennekers at neuroinformatik.ruhr-uni-bochum.de Wed Jul 2 09:36:37 2003
From: Thomas.Wennekers at neuroinformatik.ruhr-uni-bochum.de (Thomas Wennekers)
Date: Wed, 2 Jul 2003 15:36:37 +0200 (MEST)
Subject: Special issue on "Cell Assemblies"
Message-ID: <200307021336.h62DabQg005005@fsnif.neuroinformatik.ruhr-uni-bochum.de>
Dear all,
the following collection of papers appeared recently as a special issue
on "Cell Assemblies" at "Theory in Biosciences".
Preprint versions of the papers are available under
http://www.neuroinformatik.rub.de/thbio/publications/specialissue/cellassemblies.html
Final versions should be available from the authors.
Best wishes,
Thomas
Thomas Wennekers, Friedrich T. Sommer, and Ad Aertsen
Editorial: Cell Assemblies
Theory in Biosciences 122 (2003) 1-4.
Thomas Wennekers and Nihat Ay
Spatial and Temporal Stochastic Interaction in Neuronal Assemblies
Theory in Biosciences 122 (2003) 5-18.
The observation of various types of spatio-temporal correlations
in spike patterns of multiple cortical neurons has shifted attention
from rate coding paradigms to computational processes based on
the precise timing of spikes in neuronal ensembles. In the present
work we develop the notion of "spatial" and "temporal interaction"
which provides measures for statistical dependences in coupled
stochastic processes like multiple unit spike trains. We show that
the classical Willshaw network and Abeles' synfire chain model both
reveal a moderate spatial interaction, but only the synfire chain
model reveals a positive temporal interaction, too. Systems that
maximize temporal interaction are shown to be almost deterministic
globally, but possess almost unpredictable firing behavior at the
single unit level.
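[Editor's note: the contrast between global determinism and single-unit unpredictability can be illustrated with a toy synfire-chain simulation. This is a minimal sketch, not the paper's model; pool sizes, threshold and dropout count are arbitrary.]

```python
import random

# Synfire-chain sketch: pools of binary neurons wired pool-to-pool.
# Each pool fires when enough neurons in the previous pool fired; a fixed
# number of neurons drop out at random per pool, so the wave propagates
# reliably at the global level while any single neuron is unpredictable.
def run_chain(pools=6, width=10, threshold=5, dropouts=2, seed=1):
    rng = random.Random(seed)
    activity = [[1] * width]                  # pool 0 fully active
    for _ in range(pools - 1):
        if sum(activity[-1]) >= threshold:    # enough synchronous drive
            fired = [1] * width
            for i in rng.sample(range(width), dropouts):
                fired[i] = 0                  # stochastic single-unit failures
            activity.append(fired)
        else:
            activity.append([0] * width)      # wave dies out
    return activity
```

With width - dropouts well above threshold, the wave always reaches the last pool, yet which individual neurons fire varies from run to run.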
Anders Lansner, Erik Fransén, and Anders Sandberg
Cell assembly dynamics in detailed and abstract
attractor models of cortical associative memory
Theory in Biosciences 122 (2003) 19-36.
During the last few decades we have seen a
convergence among ideas and hypotheses regarding functional
principles underlying human memory. Hebb's now more than
fifty-year-old conjecture concerning synaptic plasticity and cell
assemblies, formalized mathematically as attractor neural
networks, has remained among the most viable and productive
theoretical frameworks. It suggests plausible explanations for
Gestalt aspects of active memory like perceptual completion,
reconstruction and rivalry.
We review the biological plausibility of these theories and
discuss some critical issues concerning their associative
memory functionality in the light of simulation studies of
models with palimpsest memory properties. The focus is on
memory properties and dynamics of networks modularized in
terms of cortical minicolumns and hypercolumns. Biophysical
compartmental models demonstrate attractor dynamics that
support cell assembly operations with fast convergence and low
firing rates. Using a scaling model we obtain reasonable
relative connection densities and amplitudes. An abstract
attractor network model reproduces systems level psychological
phenomena seen in human memory experiments, such as the Sternberg
and von Restorff effects.
We conclude that there is today considerable substance in
Hebb's theory of cell assemblies and its attractor network
formulations, and that they have contributed to increasing our
understanding of cortical associative memory function.
The criticism raised with regard to biological and
psychological plausibility, as well as low storage capacity,
slow retrieval, etc., has largely been disproved. Rather, this
paradigm has gained further support from new experimental data
as well as computational modeling.
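[Editor's note: the attractor-network account of completion and reconstruction can be sketched with a minimal Hopfield-style network. This is a toy illustration, not the paper's biophysical or modular model; pattern and size are arbitrary.]

```python
# Hopfield-style attractor sketch: store one bipolar pattern with a
# Hebbian outer-product rule, then complete a degraded cue by repeated
# threshold updates until the network settles in the stored attractor.
def train(patterns, n):
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]   # Hebbian weight update
    return W

def recall(W, state, steps=5):
    n = len(state)
    s = list(state)
    for _ in range(steps):
        for i in range(n):                   # asynchronous updates
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s
```

Starting from a cue with two flipped bits, the dynamics restore the stored pattern, a minimal analogue of perceptual completion.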
Andreas Knoblauch and Günther Palm
Synchronization of Neuronal Assemblies in
Reciprocally Connected Cortical Areas
Theory in Biosciences 122 (2003) 37-54.
To investigate scene segmentation in the visual system we
present a model of two reciprocally connected visual areas
comprising spiking neurons. The peripheral area P is modeled
similar to the primary visual cortex, while the central
area C is modeled as an associative memory representing
stimulus objects according to Hebbian learning. Without
feedback from area C, spikes corresponding to stimulus
representations in P are synchronized only locally (slow
state). Feedback from C can induce fast oscillations and
an increase of synchronization ranges (fast state).
Presenting a superposition of several stimulus objects,
scene segmentation happens on a time scale of hundreds of
milliseconds by alternating epochs of the slow and fast
state, where neurons representing the same object are
simultaneously in the fast state. We relate our simulation
results to various phenomena observed in neurophysiological
experiments, such as stimulus-dependent synchronization of
fast oscillations, synchronization on different time scales,
ongoing activity, and attention-dependent neural activity.
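[Editor's note: the feedback-induced synchronization effect can be caricatured with two coupled phase oscillators standing in for areas P and C. This is a deliberately crude stand-in for the paper's spiking model; all constants are arbitrary.]

```python
import math

# Two-oscillator sketch: phase oscillators with equal natural frequency.
# With feedback coupling on, the phases pull together (synchronization);
# with coupling off, the initial phase gap persists.
def phase_gap(coupling, steps=2000, dt=0.01):
    p, c = 0.0, 2.0                       # initial phases (radians)
    w_p, w_c = 1.0, 1.0                   # equal natural frequencies
    for _ in range(steps):
        dp = dt * (w_p + coupling * math.sin(c - p))
        dc = dt * (w_c + coupling * math.sin(p - c))
        p, c = p + dp, c + dc
    return abs(math.sin((c - p) / 2.0))   # 0 when the phases coincide
```

With coupling 1.0 the gap decays essentially to zero within the simulated window; with coupling 0.0 it stays at its initial value.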
Friedrich T. Sommer and Thomas Wennekers
Models of distributed associative memory networks in the brain
Theory in Biosciences 122 (2003) 55-69.
Although experimental evidence for distributed cell assemblies
is growing, theories of cell assemblies are still marginalized
in theoretical neuroscience. We argue that this has to do with
shortcomings of the currently best understood assembly theories,
the ones based on formal associative memory models. These only
insufficiently reflect anatomical and physiological properties
of nervous tissue and their functionality is too restricted to
provide a framework for cognitive modeling. We describe cell
assembly models that integrate more neurobiological constraints
and review results from simulations of a simple nonlocal
associative network formed by a reciprocal topographic
projection. Impacts of nonlocal associative projections in the
brain are discussed with respect to the functionality they can
explain.
Hualou Liang and Hongbin Wang
Top-Down Anticipatory Control in Prefrontal Cortex
Theory in Biosciences 122 (2003) 70-86.
The prefrontal cortex has been implicated in a wide variety
of executive functions, many involving some form of
anticipatory attention. Anticipatory attention involves
the pre-selection of specific sensory circuits to allow
fast and efficient stimulus processing and a subsequently
fast and accurate response. It is generally agreed that the
prefrontal cortex plays a critical role in anticipatory
attention by exerting a facilitatory "top-down" bias on
sensory pathways. In this paper we review recent results
indicating that synchronized activity in prefrontal cortex,
during anticipation of a visual stimulus, can predict features
of early visual stimulus processing and behavioral response.
Although the mechanisms involved in anticipatory attention
are still largely unknown, we argue that the synchronized
oscillation in prefrontal cortex is a plausible candidate
during sustained visual anticipation. We further propose a
learning hypothesis that explains how this top-down anticipatory
control in prefrontal cortex is learned based on accumulated
prior experience by adopting a Temporal Difference learning
algorithm.
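[Editor's note: the Temporal Difference learning algorithm the authors adopt can be sketched on a toy three-state chain. This illustrates TD(0) value learning only; the states, reward and parameters are hypothetical, not the paper's model.]

```python
# TD(0) sketch: learn state values on a deterministic 3-state chain that
# always ends with a reward of 1, showing how anticipatory value
# propagates backward from the rewarded outcome to earlier states.
def td0(episodes=500, alpha=0.1, gamma=1.0):
    V = [0.0, 0.0, 0.0, 0.0]          # states 0..2 plus terminal state 3
    for _ in range(episodes):
        for s in range(3):            # deterministic walk 0 -> 1 -> 2 -> 3
            r = 1.0 if s == 2 else 0.0
            target = r + gamma * V[s + 1]       # TD target
            V[s] += alpha * (target - V[s])     # TD(0) update
    return V[:3]
```

After repeated experience, every state predicts the eventual reward, i.e. the earliest state already "anticipates" the outcome.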
Friedemann Pulvermüller
Sequence detectors as a basis of grammar in the brain
Theory in Biosciences 122 (2003) 87-104.
Grammar processing may build upon serial-order mechanisms
known from non-human species. A circuit similar to that
underlying direction-sensitive movement detection in arthropods
and vertebrates may become selective for sequences of words,
thus yielding grammatical sequence detectors in the human
brain. Sensitivity to the order of neuronal events arises from
unequal connection strengths between two input units and a
third element, the sequence detector. This mechanism, which
critically depends on the dynamics of the input units, can
operate at the single neuron level and may be relevant at the
level of neuronal ensembles as well. Due to the repeated
occurrence of sequences, for example word strings, the
sequence-sensitive elements become more firmly established
and, by substitution of elements between strings, a process
called auto-associative substitution learning (AASL) is
triggered. AASL links the neuronal counterparts of the
string elements involved in the substitution process to the
sequence detector, thereby providing a brain basis of what can
be described linguistically as the generalization of rules of
grammar. A network of sequence detectors may constitute
grammar circuits in the human cortex on which a separate set
of mechanisms establishing temporary binding and recursion
can operate.
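[Editor's note: the order-sensitivity mechanism described above, unequal connection strengths plus decaying input activity, can be sketched in a few lines. Weights, decay and threshold are illustrative values, not taken from the paper.]

```python
# Sequence-detector sketch: input units A and B project to a detector
# with unequal weights. Each unit's activity decays after its event, so
# the summed drive at the detector differs for "A then B" and "B then A",
# and only one order crosses the detection threshold.
def detect(order, w_a=0.4, w_b=1.0, decay=0.5, threshold=1.1):
    trace = {"A": 0.0, "B": 0.0}
    for unit in order:
        trace["A"] *= decay           # earlier events fade
        trace["B"] *= decay
        trace[unit] = 1.0             # current event at full strength
    drive = w_a * trace["A"] + w_b * trace["B"]
    return drive >= threshold
```

The weak weight from A and the strong weight from B mean the detector fires for the sequence A-B (faded A plus fresh B) but not for B-A.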
____________________________________________________________________________
Jr.Prof.Dr.Thomas Wennekers
Theoretical Neuroscience Group
Institute for Neuroinformatics
Ruhr-Universitaet Bochum
Universitaetsstrasse 150
ND 04/589a
44780 Bochum
Phone: +49-234-3224231
Fax: +49-234-3214209
Priv.: +49-160-6123416
Email: Thomas.Wennekers at neuroinformatik.rub.de
____________________________________________________________________________
From eurich at physik.uni-bremen.de Wed Jul 2 11:33:40 2003
From: eurich at physik.uni-bremen.de (Christian Eurich)
Date: Wed, 02 Jul 2003 17:33:40 +0200
Subject: PhD position in Theoretical Neuroscience
Message-ID: <3F02FB54.3C538E71@physik.uni-bremen.de>
Dear Connectionists,
a PhD position is available in the Institute for Theoretical
Neurophysics at the University of Bremen for a project on
human sensorimotor control loops.
In cooperation with experimental groups, dynamical models of action and
perception will be developed to investigate, for example, human postural
sway and the task of balancing sticks on the fingertip. Typical methods
we employ in our institute include dynamical systems theory, neural
networks, and statistical estimation theory. Our homepage is
http://www.neuro.uni-bremen.de/index.php
The University of Bremen has several institutions in the field of
Neuroscience, including a Center for Cognitive Neuroscience and a
Special Research Project "Neurocognition". There are several theoretical
and experimental groups in the Physics, Biology and Psychology working
on neural network modeling, psychophysics, and electrophysiology. The
Hanse Institute for Advanced Study in Delmenhorst (which is close to
Bremen) hosts international guests from the area of Neuroscience and
Cognitive Science and also organizes Neuroscience workshops and
conferences.
Closing date is July 25, 2003. For further information and applications,
please contact Dr. Christian Eurich during the upcoming CNS conference
in Alicante or at
Universitaet Bremen
Institut fuer Theoretische Neurophysik, FB 1
Postfach 330 440
D-28334 Bremen, Germany
Phone: +49 (421) 218-4559
Fax: +49 (421) 218-9104
e-mail: eurich at physik.uni-bremen.de
From Luc.Berthouze at aist.go.jp Wed Jul 2 05:30:27 2003
From: Luc.Berthouze at aist.go.jp (Luc Berthouze)
Date: Wed, 2 Jul 2003 11:30:27 +0200
Subject: postdoc position in developmental robotics and motor learning
Message-ID: <20030630075928.1B61613B65C@aidan6.a02.aist.go.jp>
The Cognitive Neuroinformatics group in the Neuroscience Research Institute at
the Japanese National Institute of Industrial Science and Technology, Tsukuba
(Japan) is seeking an outstanding postdoctoral researcher to join our lab for
two years starting between April 2004 and September 2004. Candidates should
have a solid background in robotics and computational modeling, and a keen
interest in developmental robotics and embodied cognition.
The postdoc will be expected to contribute to our study of the acquisition of
motor skills in human infants. Our approach is interdisciplinary. On the one
hand, we exploit studies in developmental psychology to propose candidate
mechanisms; and, on the other hand, we use robots to test and validate those
hypotheses. The purpose of this approach is two-fold: (a) to contribute to the
understanding of the mechanisms underlying motor development in infants; (b)
to propose new methods for robot learning. Experience in using neural
oscillators to implement motor control models, and a good understanding of the
so-called "dynamical systems approach" will be highly appreciated.
For more information, please consult our lab's website at:
http://www.neurosci.aist.go.jp/~mechwa
The Neuroscience Research Institute, and more generally, AIST, provides an
excellent environment for interdisciplinary research, with groups engaged in
research in biology, neuroscience, psychology, cognitive science, and
robotics. See http://www.aist.go.jp (AIST website) and
http://www.neurosci.aist.go.jp (Neuroscience Research Institute) for more
information.
Candidates should contact Luc Berthouze with a CV and a one-page statement of
research interests. If electronic submission is not possible, fax your
application to +81-298-615841, directed to the attention of Luc Berthouze.
From P.Tino at cs.bham.ac.uk Thu Jul 3 11:27:30 2003
From: P.Tino at cs.bham.ac.uk (Peter Tino)
Date: Thu, 03 Jul 2003 16:27:30 +0100
Subject: papers on architectural bias of RNNs
Message-ID: <3F044B62.9060906@cs.bham.ac.uk>
Dear Connectionists,
a collection of papers
dealing with theoretical and practical aspects of
recurrent neural networks before and in the early stages of
training is available on-line.
Preprints can be found at
http://www.cs.bham.ac.uk/~pxt/my.publ.html
B. Hammer, P. Tino:
Recurrent neural networks with small weights implement definite memory
machines.
Neural Computation, accepted, 2003.
- Proves that recurrent networks are architecturally biased
towards definite memory machines/Markov models. Also contains
a rigorous learnability analysis of recurrent nets in the
early stages of learning.
P. Tino, M. Cernansky, L. Benuskova:
Markovian architectural bias of recurrent neural networks.
IEEE Transactions on Neural Networks, accepted, 2003.
- Mostly empirical study of the architectural bias phenomenon
in the context of connectionist modeling of symbolic sequences.
It is possible to extract (variable memory length) Markov models
from recurrent networks even prior to any training!
To assess the amount of useful information extracted during the training,
the networks should be compared with variable memory length Markov
models.
P. Tino, B. Hammer:
Architectural Bias in Recurrent Neural Networks - Fractal Analysis.
Neural Computation, accepted, 2003.
- Rigorous fractal analysis of recurrent activations in
recurrent networks in the early stages of
learning. The complexity of input patterns (topological entropy)
is directly reflected by the complexity of recurrent activations
(fractal dimension).
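The Markovian architectural bias these papers analyze can be seen in a toy experiment: in an untrained recurrent network with small weights, the dynamics are contractive, so the hidden state is dominated by the recent input suffix rather than the full history. A minimal sketch (network size, weight scale, and test sequences are invented for illustration):

```python
import numpy as np

# Untrained RNN with small random weights: contractive dynamics make
# the final hidden state depend mainly on the last few input symbols.
rng = np.random.default_rng(0)
n_hidden, scale = 10, 0.1
W_in = rng.normal(0, scale, (n_hidden, 2))    # one-hot input, symbols 0/1
W_rec = rng.normal(0, scale, (n_hidden, n_hidden))

def final_state(seq):
    h = np.zeros(n_hidden)
    for s in seq:
        h = np.tanh(W_in @ np.eye(2)[s] + W_rec @ h)
    return h

# a and b have very different histories but share the last 3 symbols;
# c matches a everywhere except the final symbol.
a = final_state([0, 0, 0, 0, 1, 0, 1])
b = final_state([1, 1, 1, 1, 1, 0, 1])
c = final_state([0, 0, 0, 0, 1, 0, 0])
same_suffix = np.linalg.norm(a - b)   # small: suffix agrees
diff_suffix = np.linalg.norm(a - c)   # larger: last symbol differs
print(same_suffix < diff_suffix)
```

Clustering such states by suffix is essentially how a variable memory length Markov model can be read out of the network before any training.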
Best wishes,
Peter Tino
--
Peter Tino
The University of Birmingham
School of Computer Science
Edgbaston, Birmingham B15 2TT, UK
+44 121 414 8558 , fax: 414 4281
http://www.cs.bham.ac.uk/~pxt/
From gary at cs.ucsd.edu Thu Jul 3 13:31:50 2003
From: gary at cs.ucsd.edu (Gary Cottrell)
Date: Thu, 3 Jul 2003 10:31:50 -0700 (PDT)
Subject: papers on architectural bias of RNNs
Message-ID: <200307031731.h63HVoO29370@fast.ucsd.edu>
Folks interested in Peter's paper on RNNs and definite memory
machines may also be interested in our papers on TDNNs and
definite memory machines:
Clouse, Daniel S., Giles, Lee C., Horne, Bill G. and
Cottrell, G. W. (1997) Time-delay neural networks:
Representation and induction of finite state machines. IEEE
Transactions on Neural Networks.
This work attempts to characterize the capabilities of
time-delay neural networks (TDNN), and contrast two
subclasses of TDNN in the area of language induction. The
two subclasses are those with delays limited to the inputs
(IDNN), and those which include delays also on hidden units
(HDNN). Both of these architectures are capable of
representing the same languages, those representable by
definite memory machines (DMM), a subclass of finite state
machines (FSM). They have a strong representational bias
towards DMMs which can be characterized by little logic. We
demonstrate this by learning a 2048 state DMM using very few
training examples. Even though both architectures are
capable of representing the same class of languages, HDNNs
are biased towards learning languages which are
characterized by shift-invariant behavior on short input
windows in the mapping from recent inputs to the
accept/reject classification. We demonstrate this
difference in learning bias via a set of simulations and
statistical analysis.
http://www.neci.nec.com/%7Egiles/papers/IEEE.TNN.tdnn.as.fsm.ps.Z
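A definite memory machine, the language class discussed in the abstract above, is simply a sequential map whose output depends only on the last k inputs, i.e. a lookup over a sliding window, which is exactly the function class an input-delay network represents. A minimal sketch (the parity-of-last-3-bits machine is an invented example, not one from the paper):

```python
from collections import deque

K = 3  # memory depth of the definite memory machine

def dmm_output(bits):
    """Emit, at each position, a function of the last K inputs only:
    here, the parity of the most recent K bits."""
    window = deque([0] * K, maxlen=K)  # sliding window of inputs
    out = []
    for b in bits:
        window.append(b)
        out.append(sum(window) % 2)
    return out

print(dmm_output([1, 0, 1, 1, 0, 0]))  # -> [1, 1, 0, 0, 0, 1]
```

Because the output at each step is fully determined by the window, any two histories that agree on their last K symbols produce the same output, which is the defining property of the DMM class.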
Clouse, Daniel S., Giles, Lee C., Horne, Bill G. and
Cottrell, G. W. (1997) Representation and induction of
finite state machines using time-delay neural networks. In
Michael C. Mozer, Michael I. Jordan, and Thomas Petsche
(eds.) Advances in Neural Information Processing Systems 9,
pp. 403-409. MIT Press: Cambridge, MA, 1997.
(Similar abstract!)
http://nips.djvuzone.org/djvu/nips09/0403.djvu
From r.gayler at mbox.com.au Thu Jul 3 19:26:23 2003
From: r.gayler at mbox.com.au (Ross Gayler)
Date: Fri, 04 Jul 2003 09:26:23 +1000
Subject: Response to Jackendoff's challenges -- notice of conference
presentation and availability of paper
Message-ID: <001001c341ba$8517fc50$2402a8c0@Chennai>
The linguist, Ray Jackendoff, proposed four challenges to cognitive
neuroscience in his book "Foundations of Language". Each challenge
corresponds to an element of core linguistic functionality which Jackendoff
sees as being poorly addressed by current connectionist models.
On August 5, 2002, Jerome Feldman broadcast these challenges to the
Connectionists mailing list under the subject "Neural binding". After
receiving several responses, Feldman concluded on August 21 that "it isn't
obvious (at least to me) how to use any of the standard techniques to
specify a model that meets Jackendoff's criteria".
I have prepared a paper setting out how I believe one family of
connectionist architectures can meet Jackendoff's challenges. This will be
presented at the Joint International Conference on Cognitive Science to be
held in Sydney, Australia from 13 - 17 July, 2003
(http://www.arts.unsw.edu.au/cogsci2003/). If you will be attending the
conference and wish to hear the presentation, note that it is currently
scheduled for 1 p.m. on Thursday the 17th in the Language stream
(http://www.arts.unsw.edu.au/cogsci2003/conf_content/program_thurs_pm.htm).
An extended abstract is included below and anyone who wishes a preprint copy
of the paper (which is very condensed to fit the conference format) should
e-mail me at r.gayler at mbox.com.au
Ross Gayler
Melbourne, AUSTRALIA
r.gayler at mbox.com.au
+61 413 111 303 mobile
Vector Symbolic Architectures answer Jackendoff's challenges for cognitive
neuroscience.
Ross Gayler
Vector Symbolic Architectures (Gayler, 1998; Kanerva, 1997; Plate, 1994) are
a little-known class of connectionist models that can directly implement
functions usually taken to form the kernel of symbolic processing. They are
an enhancement of tensor product variable binding networks (Smolensky,
1990).
Like tensor product networks, VSAs can create and manipulate
recursively structured representations in a natural and direct connectionist
fashion without requiring lengthy training. However, unlike tensor product
networks, VSAs afford a practical basis for implementations because they
require only fixed-dimension vector representations. The fact that VSAs
relate directly, without training, to both simple, practical vector
implementations and core symbolic processing functionality suggests that
they would provide a fruitful connectionist basis for the implementation of
cognitive functionality.
Ray Jackendoff (2002) posed four challenges that linguistic combinatoriality
and rules of language present to theories of brain function. These
challenges are: the massiveness of the binding problem, the problem of
dealing with multiple instances, the problem of variables, and the
compatibility of representations in working memory and long-term memory.
The essence of these problems is the question of how to neurally instantiate
the rapid construction and transformation of the compositional structures
that are typically taken to be the domain of symbolic processing.
Drawing on work by Gary Marcus (2001), Jackendoff contended that these
challenges had not been widely recognised in the cognitive neuroscience
community and that the dialogue between linguistic theory and neural network
modelling would be relatively unproductive until the challenges were
answered by some technical innovation in connectionist models. Jerome
Feldman (2002) broadcast these challenges to the neural network modelling
community via the Connectionists Mailing List. The few responses he
received were unable to convince Feldman that any standard connectionist
techniques would meet Jackendoff's criteria.
In this paper I demonstrate that Vector Symbolic Architectures are able to
meet Jackendoff's challenges.
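A minimal sketch of the kind of vector-symbolic binding the paper builds on, in the style of Plate's holographic reduced representations: roles and fillers are random vectors of fixed dimension, binding is circular convolution, and unbinding uses the approximate inverse. (The dimension, the toy "agent"/"patient" roles, and the cleanup-by-dot-product step are illustrative choices, not the paper's exact construction.)

```python
import numpy as np

rng = np.random.default_rng(1)
D = 1024  # fixed vector dimension, regardless of structure depth

def randvec():
    return rng.normal(0, 1 / np.sqrt(D), D)  # approx unit-norm vector

def bind(a, b):
    # circular convolution, computed via the convolution theorem
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def inverse(a):
    # involution: approximate inverse of a under circular convolution
    return np.concatenate(([a[0]], a[:0:-1]))

agent, patient, mary, john = (randvec() for _ in range(4))

# "Mary chases John": superpose role-filler bindings in ONE vector of
# fixed dimension -- no growth with the number of bound pairs.
sentence = bind(agent, mary) + bind(patient, john)

# Unbind the agent role, then identify the filler by similarity
retrieved = bind(inverse(agent), sentence)
sims = {name: float(v @ retrieved)
        for name, v in [("mary", mary), ("john", john)]}
print(max(sims, key=sims.get))
```

The retrieved vector is a noisy copy of the bound filler, so a cleanup comparison against the known item vectors recovers "mary" with high similarity while "john" scores near zero; the fixed dimensionality is what distinguishes this scheme from tensor product binding.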
References
Feldman, J. (2002). Neural binding. Posted to Connectionists Mailing List,
5th August, 2002.
(http://www-2.cs.cmu.edu/afs/cs.cmu.edu/project/connect/connect-archives/arch.2002-08.gz
0005.txt; see also 8, 9, 18, and 21)
Gayler, R. W. (1998). Multiplicative binding, representation operators, and
analogy. In K. Holyoak, D. Gentner & B. Kokinov (Eds.), Advances in analogy
research: Integration of theory and data from the cognitive, computational,
and neural sciences (p. 405). Sofia, Bulgaria: New Bulgarian University.
(http://cogprints.ecs.soton.ac.uk/archive/00000502/ see also 500 and 501)
Jackendoff, R. (2002). Foundations of language: Brain, meaning, grammar,
evolution. Oxford: Oxford University Press.
Kanerva, P. (1997). Fully distributed representation. In Proceedings Real
World Computing Symposium (RWC'97, Tokyo). Report TR-96001 (pp. 358-365).
Tsukuba-city, Japan: Real World Computing Partnership.
(http://www.rni.org/kanerva/rwc97.ps.gz see also
http://www.rni.org/kanerva/pubs.html)
Marcus, G. (2001). The algebraic mind. Cambridge, MA, USA: MIT Press.
Plate, T. A. (1994). Distributed representations and nested compositional
structure. Ph.D. thesis, Department of Computer Science, University of
Toronto.
(http://pws.prserv.net/tap/papers/plate.thesis.ps.gz see also
http://pws.prserv.net/tap/)
Smolensky, P. (1990). Tensor product variable binding and the representation
of symbolic structures in connectionist systems. Artificial Intelligence,
46, 159-216.
From j.hogan at qut.edu.au Fri Jul 4 03:44:11 2003
From: j.hogan at qut.edu.au (James Michael Hogan)
Date: Fri, 04 Jul 2003 17:44:11 +1000
Subject: Symposium on Statistical Learning
Message-ID: <200307040744.AKA05918@mail-router02.qut.edu.au>
An upcoming workshop in Sydney - organised by people from
UNSW- jh
S E C O N D A N N O U N C E M E N T
\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\
Australian Mathematical Sciences Institute symposium on
///////////////////////////////////////////////////////
:::::::::::::::::::::::::::::::
.. ..
.. STATISTICAL LEARNING ..
.. ..
:::::::::::::::::::::::::::::::
University of New South Wales
Sydney, Australia
2nd-3rd October, 2003
==========================================================
The symposium is now just 3 months away. Here are some
updates:
* The new and improved web-site is
www.maths.unsw.edu.au/~inge/symp
and information about the symposium
is continuously being added. The
latest addition is a tentative
programme.
* HOTEL BOOKING ALERT!!!!
The symposium takes place just one week
before the 2003 Rugby World Cup commences
in Sydney. Therefore you are advised to book
accommodation as soon as possible. Accommodation
suggestions have been added to the web-site.
Flights may also be affected by the World
Cup.
* Early bird special (late bird penalty).
We recommend you register as soon as possible,
but not later than 2nd September when the lower
rates expire. Full registration procedures are
now on the web.
* Speaker addition/subtraction
Drs. Markus Hegland and Alex Smola
have been added to the invited speaker
list. Professor Geoff McLachlan has
had to withdraw.
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
Inge Koch and Matt Wand
Department of Statistics
University of New South Wales
Sydney 2052, Australia
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
From rporter at lanl.gov Mon Jul 7 14:46:41 2003
From: rporter at lanl.gov (Reid Porter)
Date: Mon, 07 Jul 2003 12:46:41 -0600
Subject: Postdoctoral position in digital neural networks
Message-ID: <5.0.0.25.2.20030707121230.02d91d78@nis-pop.lanl.gov>
Postdoctoral position available in digital neural networks
---------------------------------------------------------------------------------
The Space Data System Group (NIS-3) of Los Alamos National Laboratory seeks
outstanding candidates for a postdoctoral research position in the areas of
pattern recognition, mathematical morphology and reconfigurable computing.
The candidate will help develop high performance feature extraction and
classification algorithms to be deployed in reconfigurable computing
hardware. Applications include remotely sensed satellite imagery, unmanned
aerial vehicle video and other spatio-temporal data sets.
The main focus will be to develop novel cellular non-linear networks
suitable for digital hardware by building on mathematical morphology and
non-linear digital filter theory. The position will involve research
(typically mathematics), algorithm prototyping (typically Matlab),
software development and implementation (typically in C/C++), and
eventual deployment on reconfigurable computing hardware (typically in VHDL).
Required Skills: Prospective candidates should have good oral and written
communication skills, and a demonstrated ability to perform independent and
creative research. We are most interested in candidates with research
interests and experience in the following areas:
- Cellular non-linear networks and spatio-temporal processing
- Mathematical morphology and non-linear digital filters
- Machine learning, artificial intelligence and optimization
- Image, video and signal processing.
- Digital design, reconfigurable computing
Education: A PhD completed within the past five years or soon to be
completed is required.
Post-doc starting salaries are usually in the range $59,300 - $67,300
depending on experience, and generous assistance is provided with
relocation expenses. The initial contract offered would be for two years,
with good possibilities for contract extensions. Candidates may be
considered for a Director's Fellowship and outstanding candidates may be
considered for the prestigious J. Robert Oppenheimer, Richard P. Feynman or
Frederick Reines Fellowships. Please see
http://www.hr.lanl.gov/jps/regjobsearch.stm, job number #205519, for more
information.
Los Alamos is a small and very friendly town situated 7200 ft up in the
scenic Jemez mountains in northern New Mexico. The climate is very pleasant
and opportunities for outdoor recreation are numerous (skiing, hiking,
biking, climbing, etc). The Los Alamos public school system is excellent.
LANL provides a very constructive working environment with abundant
resources and support, and the opportunity to work with intelligent and
creative people on a variety of interesting projects.
Applicants are asked to send a resume, publications list and a cover letter
outlining current research interests to rporter at lanl.gov. Hard copies may
be sent to: Reid Porter, NIS-3, MS D440, Los Alamos National Lab, New
Mexico 87545, USA.
From becker at mcmaster.ca Wed Jul 9 23:05:47 2003
From: becker at mcmaster.ca (S. Becker)
Date: Wed, 9 Jul 2003 23:05:47 -0400 (EDT)
Subject: NIPS 2003 Survey
Message-ID:
To Connectionists:
We are asking for a few minutes of your time to complete an online Survey
consisting of four yes/no questions. The Survey deals with a number of major
changes to the format of the NIPS Conference that are under consideration. The
impetus for these possible changes is to accommodate the growth in submissions
in recent years (a 50% increase between 1999 and 2002), as well as the diverse
demographics of the Conference attendees.
Your responses to the four questions, as well as any additional input you
may have, will be valuable in shaping the future of NIPS. We thank you for
your participation.
https://register.nips.salk.edu/surveys/survey.php?id=1
Terrence Sejnowski
President
Neural Information Processing Systems Foundation
From terry at salk.edu Fri Jul 11 19:01:45 2003
From: terry at salk.edu (Terry Sejnowski)
Date: Fri, 11 Jul 2003 16:01:45 -0700 (PDT)
Subject: NEURAL COMPUTATION 15:8
Message-ID: <200307112301.h6BN1jT51877@purkinje.salk.edu>
Neural Computation - Contents - Volume 15, Number 8 - August 1, 2003
ARTICLE
Computation In a Single Neuron: Hodgkin and Huxley Revisited
Blaise Aguera y Arcas, Adrienne L. Fairhall and William Bialek
NOTE
Learning the Nonlinearity of Neurons from Natural Visual Stimuli
Christoph Kayser, Konrad P. Kording and Peter Konig
LETTERS
Analytic Expressions for Rate and CV of a Type I Neuron Driven
by White Gaussian Noise
Benjamin Lindner, Andre Longtin, and Adi Bulsara
What Causes a Neuron to Spike?
Blaise Aguera y Arcas and Adrienne L. Fairhall
Rate Models for Conductance-Based Cortical Neuronal Networks
Oren Shriki, David Hansel, and Haim Sompolinsky
Neural Representation of Probabilistic Information
M.J. Barber, J.W. Clark and C.H. Anderson
Learning the Gestalt Rule of Collinearity from Object Motion
Carsten Prodoehl, Rolf Wuertz and Christoph von der Malsburg
Recurrent Neural Networks With Small Weights Implement Definite Memory
Machines
Barbara Hammer and Peter Tino
Architectural Bias in Recurrent Neural Networks: Fractal Analysis
Peter Tino and Barbara Hammer
An Effective Bayesian Neural Network Classifier with a Comparison Study
to Support Vector Machine
Faming Liang
Variational Bayesian Learning of ICA with Missing Data
Kwokleung Chan, Te-Won Lee and Terrence J. Sejnowski
-----
ON-LINE - http://neco.mitpress.org/
SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES
USA Canada* Other Countries
Student/Retired $60 $64.20 $108
Individual $95 $101.65 $143
Institution $590 $631.30 $638
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu
-----
From Olivier.Buffet at loria.fr Fri Jul 11 11:13:11 2003
From: Olivier.Buffet at loria.fr (Olivier Buffet)
Date: Fri, 11 Jul 2003 17:13:11 +0200
Subject: EWRL-6 : Call for Participation
References: <3E9D2408.7000301@loria.fr>
Message-ID: <3F0ED407.7038A41@loria.fr>
-- Please excuse us if you receive multiple copies of this message ---
********* Please Distribute around you ********
Call for participation
European Workshop on Reinforcement Learning
EWRL-6
Nancy, FRANCE, September 4-5, 2003
URL: http://www.loria.fr/conferences/EWRL6/
Reinforcement learning (RL) is a growing research area. To build a
European RL community and give visibility to current RL research in
Europe, we run a now-biennial series of workshops.
EWRL-1 took place in Brussels, Belgium (1994), EWRL-2 in Milano, Italy
(1995), EWRL-3 in Rennes, France (1997), EWRL-4 in Lugano, Switzerland
(1999), and EWRL-5 in Utrecht, the Netherlands (2001). EWRL-6 will take
place in Nancy, France.
The workshop will feature a plenary talk by Bernard Walliser, professor
at the Ecole Nationale des Ponts et Chaussées and the Ecole Polytechnique
(Paris) and research director of the ECCO group (CNRS). He works on
cognitive economics and game theory.
The rest of the two-day workshop will be dedicated to presentations
given by selected participants. The program will be on-line next week.
A registration fee of 200 euros (100 euros for students) will cover
local organization expenses, lunches, coffee breaks, the proceedings, and
a social dinner on Thursday evening.
The registration procedure is detailed at:
http://www.loria.fr/conferences/EWRL6/Inscription/inscription_form.htm
If you have any questions, please contact dutech at loria.fr and
buffet at loria.fr
From jason at cs.jhu.edu Sat Jul 12 13:21:56 2003
From: jason at cs.jhu.edu (Jason Eisner)
Date: Sat, 12 Jul 2003 13:21:56 -0400 (EDT)
Subject: postdoc opportunities at Johns Hopkins
Message-ID: <200307121721.h6CHLus17035@emu.cs.jhu.edu>
Johns Hopkins University seeks to hire outstanding postdoctoral
researchers immediately at its Center for Language and Speech
Processing (CLSP). Candidates should have previous experience in
quantitative approaches to machine learning, speech, language, or
other AI domains. Strong computational and mathematical skills are
required.
CLSP is a leading center for research on speech and language. It
specializes in formal and quantitative approaches such as
probabilistic modeling, unsupervised machine learning, and grammar
formalisms.
Our core faculty presently include:
Luigi Burzio Cognitive Science
Bill Byrne Electrical & Computer Engineering
Jason Eisner Computer Science
Bob Frank Cognitive Science
Fred Jelinek Electrical & Computer Engineering
Sanjeev Khudanpur Electrical & Computer Engineering
Paul Smolensky Cognitive Science
David Yarowsky Computer Science
We are looking for postdocs to contribute to one or more of the
following long-term projects funded by NSF and/or DoD. Postdocs
participating in these highly visible projects can expect to gain
considerable research experience in speech and language technology.
Speech Recognition
* MALACH: Multilingual Access to Large Spoken Archives
* ASR for Rich Transcription of Conversational Mandarin
Machine Translation
* Improving Statistical Translation Models Via Text Analyzers Trained
from Parallel Corpora
Algorithmic Infrastructure
* Weighted Dynamic Programming and Finite-State Modeling
for Statistical NLP
Applicants are invited to email us a CV, a one-page statement of
research interests, a list of three references, and a cover letter
that briefly summarizes qualifications. Applications may be sent to
Sue Porterfield at sec at clsp.jhu.edu (fax to +1 410 516 5050 if email
is not possible).
Johns Hopkins University is located in Baltimore, Maryland, USA.
Our URL is http://www.clsp.jhu.edu/.
From bolshausen at rni.org Sun Jul 13 16:53:42 2003
From: bolshausen at rni.org (Bruno Olshausen)
Date: Sun, 13 Jul 2003 13:53:42 -0700
Subject: Workshop on Inference and Prediction in Neocortical Circuits
Message-ID: <3F11C6D6.2070700@rni.org>
Dear Connectionists,
The American Institute of Mathematics will be hosting a
workshop on "Inference and Prediction in Neocortical Circuits,"
September 21-24, in Palo Alto, California. Please see
http://www.aimath.org/ARCC/workshops/brain.html
Space and funding are available for a few more participants.
If you would like to participate, please apply by filling out
the on-line form at
http://koutslts.bucknell.edu/~aimath/WWN/cgi-bin/participantapply.prl?workshop=14
no later than August 1, 2003. Applications are open to all,
and we especially encourage women, underrepresented minorities,
junior mathematicians, and researchers from primarily
undergraduate institutions to apply.
Bruno
--
Bruno A. Olshausen (650) 321-8282 x233
Redwood Neuroscience Institute (650) 321-8585 (fax)
1010 El Camino Real http://www.rni.org
Menlo Park, CA 94025 bolshausen at rni.org
From no-spam-please-find-my-address-typing-frasconi-email at google.com Sun Jul 13 20:41:26 2003
From: no-spam-please-find-my-address-typing-frasconi-email at google.com (Paolo Frasconi)
Date: Mon, 14 Jul 2003 02:41:26 +0200
Subject: New book: Modeling the Internet and the Web (Wiley 2003)
Message-ID:
Some of the readers of this list might be interested in the following
book
Pierre Baldi, Paolo Frasconi, and Padhraic Smyth, Modeling the
Internet and the Web Probabilistic Methods and Algorithms Wiley, 2003,
ISBN: 0-470-84906-1.
It covers various models and algorithms for the Web including
generative models of networks, IR and machine learning algorithms for
text analysis, link analysis, focused crawling, methods for modeling
user behavior, and for mining Web e-commerce data.
1. Mathematical Background - 2. Basic WWW Technologies - 3. Web Graphs -
4. Text Analysis - 5. Link analysis - 6. Advanced Crawling Techniques -
7. Modeling and Understanding Human Behavior on the Web - 8. Commerce
on the Web: Models and Applications - Appendix A Mathematical
Complements - Appendix B List of Main Symbols and Abbreviations
The webpage http://ibook.ics.uci.edu/ contains more details, a
hyperlinked bibliography, and a sample chapter in pdf.
Regards,
Paolo Frasconi
http://www.dsi.unifi.it/~paolo/
From ddlewis4 at worldnet.att.net Sun Jul 13 22:32:32 2003
From: ddlewis4 at worldnet.att.net (ddlewis4@worldnet.att.net)
Date: Sun, 13 Jul 2003 21:32:32 -0500
Subject: Research Software Developer w/ Ornarose, Inc. (short term, Chicago or NJ)
Message-ID: <676e01c349b0$6cb21df0$0500a8c0@colussus>
Company: Ornarose Inc.
Location: Northern New Jersey or Chicago, IL
Title: Research Software Developer - Data
Mining/Statistics/Text Classification
Requirements:
B.S., M.S., or Ph.D. in computer science, statistics,
applied mathematics, or related field.
5+ years professional software development experience.
2+ years professional experience with one or more of the
following: machine learning, data mining, statistics, pattern
recognition, numerical optimization, numerical analysis.
Experience with designing and running computational
experiments in computer science or statistics highly
desirable. Also desirable is experience in information
retrieval, text categorization, natural language processing,
computational linguistics, or text mining.
C and Perl proficiency required. C++ proficiency
desirable.
Unix/Linux experience required. Windows experience
desirable.
Excellent verbal and written communication skills in
English.
Demonstrated ability to meet deadlines and communicate
effectively when working from home.
Responsibilities:
This is a short-term (5 to 6 month) position for an
SBIR-supported startup company. Developer will be
responsible for prototyping and testing advanced algorithms
for supervised and unsupervised learning, prediction,
classification, etc. Work also includes preparation and
cleaning of large text and non-text data sets,
experimentation with new algorithms and modeling techniques,
and measuring the effectiveness of these techniques and
their demands for computing resources.
Interested candidates should send a resume (leads also
welcome) in ASCII or PDF to job2003a at ornarose.com.
Ornarose, Inc. is an equal opportunity employer.
Because the position begins immediately, the candidate must
be eligible to work legally in the United States throughout
2003.
From gbarreto at sel.eesc.sc.usp.br Mon Jul 14 13:44:03 2003
From: gbarreto at sel.eesc.sc.usp.br (Guilherme de Alencar Barreto)
Date: Mon, 14 Jul 2003 14:44:03 -0300 (EST)
Subject: Papers on Self-Organizing Neural Networks
Message-ID:
Dear Connectionists,
The following papers may be of interest to those working
with unsupervised neural networks and their applications to
generative modeling and robotics.
Abstracts and downloadable draft
versions can be found at http://www.deti.ufc.br/~guilherme/publicacoes.htm
Best regards,
Guilherme A. Barreto
Department of Teleinformatics Engineering
Federal University of Ceara, BRAZIL
------------------------------------
Paper (1):
Barreto, G.A., Araújo, A.F.R. and Kremer, S. (2003).
"A taxonomy for spatiotemporal connectionist networks revisited: The
unsupervised case."
Neural Computation, 15(6):1255-1320.
------------------------------------
Paper (2):
Barreto, G.A., Araújo, A.F.R. and Ritter, H. (2003).
"Self-organizing feature maps for modeling and control of robotic
manipulators."
Journal of Intelligent and Robotic Systems, 36(4):407-450.
------------------------------------
Paper (3):
Barreto, G.A., Araújo, A.F.R., Dücker, C. and Ritter, H. (2002).
"A distributed robotic control system based on a Temporal Self-Organizing
Neural Network"
IEEE Transactions on Systems, Man, and Cybernetics, C-32(4):347-357.
From amasuoka at atr.co.jp Mon Jul 14 07:03:34 2003
From: amasuoka at atr.co.jp (Aya Masuoka)
Date: Mon, 14 Jul 2003 20:03:34 +0900
Subject: ATR CNS Labs Inaugural Symposium
Message-ID:
Dear members of the Connectionists,
We are happy to announce the Inaugural Symposium of the ATR Computational
Neuroscience Laboratories, which opened in May 2003. In addition to
speakers from the new laboratories, we will have two keynote speakers,
Dr. Dietmar Plenz from the National Institutes of Health and
Dr. Miguel A.L. Nicolelis from Duke University.
The symposium is open to everyone.
Please join us to commemorate this event together.
ATR CNS Labs Inaugural Symposium
Date: Monday, August 4, 2003
Place: ATR Main Conference Room
Hosted by: ATR Computational Neuroscience Laboratories
For latest information on the symposium: http://www.cns.atr.co.jp/events.html
Please complete the registration form provided below and return it by
email to amasuoka at atr.co.jp by July 30.
***********************************************************
Registration Form
Last Name:
First Name:
MI:
Institution/Agency:
Email Address:
Please type 'X' next to the following options.
I will attend :
the symposium
the lab tour
the reception
all of the above
I have a special dietary requirement:
Please state your requirement clearly.
Fee: ¥1,000 for attending the reception.
There is no fee for the symposium and lab tour.
Method of Payment: Please pay in cash at the on-site registration desk.
***********************************************************
12:30 PM - Registration
1:00 PM -1:05 PM Mitsuo Kawato, Director (ATR Computational
Neuroscience Laboratories)
Opening Remarks
1:05 PM -1:30 PM Kenji Doya (Department Head, Computational
Neurobiology, ATR, CNS)
"Neural mechanisms of reinforcement learning"
1:30 PM - 1:55 PM Hiroshi Imamizu (Department Head, Cognitive
Neuroscience, ATR, CNS)
"Internal models for cognitive functions"
1:55 PM - 2:20 PM Gordon Cheng (Department Head, Humanoid
Robotics and Computational Neuroscience, ATR, CNS)
"Paving the paths to the brain with humanoid robotics"
2:20 PM - 2:40 PM Coffee Break
2:40 PM - 3:40 PM Dr. Dietmar Plenz (National Institutes of Health)
3:40 PM - 4:40 PM Dr. Miguel A.L. Nicolelis (Duke University)
4:40 PM - 5:05 PM Mitsuo Kawato, Director (ATR, CNS)
"Controversies in computational motor control."
5:05 PM - 6:00 PM Lab tour
6:00 PM - Reception at the ATR Cafeteria
***********************************************************
Thank you.
We look forward to seeing you at the symposium.
Mitsuo Kawato, Director
ATR Computational Neuroscience Laboratories
--
---------------------------
Aya Masuoka,
Computational Neuroscience Laboratories, ATR International
Department of Computational Neurobiology
2-2-2 Hikaridai, Keihanna Science City
Kyoto 619-0288, Japan
TEL +81-774-95-1252 FAX +81-774-95-1259
EMAIL amasuoka at atr.co.jp
==============================
CNS was established on May 1, 2003!
==============================
From canete at ctima.uma.es Mon Jul 14 07:18:40 2003
From: canete at ctima.uma.es (Javier Fernández de Cañete)
Date: Mon, 14 Jul 2003 13:18:40 +0200
Subject: EANN'03 Final Programme (8-10 September 2003, Malaga SPAIN)
Message-ID: <002801c349f9$ade29d60$836dd696@isa.uma.es>
Dear colleagues:
This e-mail is to inform you that the Final Programme of the
Engineering Applications of Neural Networks conference (EANN'03) is
available at the web page
http://www.isa.uma.es/eann03
With regards
Javier Fernandez de Canete
EANN'03 Secretariat
eann03 at ctima.uma.es
Prof. Javier Fernandez de Canete. Ph. D.
Dpto. de Ingeniería de Sistemas y Automática
E.T.S.I. Informatica
Campus de Teatinos, 29071 Malaga (SPAIN)
Phone: +34-95-2132887
FAX: +34-95-2133361
e-mail: canete at ctima.uma.es
From James-Johnson at nyc.rr.com Tue Jul 15 10:28:03 2003
From: James-Johnson at nyc.rr.com (James Johnson)
Date: Tue, 15 Jul 2003 10:28:03 -0400
Subject: A Generative Theory of Shape
References: <000101c342a5$8aec87e0$66df75d8@thinkpad>
Message-ID: <000f01c34add$4def6bb0$ff5a6c42@ibmntgzhmy5bef>
The following book has just been published by Springer-Verlag.
A Generative Theory of Shape
Michael Leyton
Springer-Verlag, 550 pages
--------------------------------------------------------------------
The purpose of the book is to develop a generative theory of shape
that has two properties regarded as fundamental to intelligence -
maximizing transfer of structure and maximizing recoverability of the
generative operations. These two properties are particularly important
in the representation of complex shape - which is the main concern of
the book. The primary goal of the theory is the conversion of
complexity into understandability. For this purpose, a mathematical
theory is presented of how understandability is created in a
structure. This is achieved by developing a group-theoretic approach
to formalizing transfer and recoverability. To handle complex shape, a
new class of groups is developed, called unfolding groups. These
unfold structure from a maximally collapsed version of that
structure. A principal aspect of the theory is that it develops a
group-theoretic formalization of major object-oriented concepts such
as inheritance. The result is an object-oriented theory of geometry.
The algebraic theory is applied in detail to CAD, perception, and
robotics. In CAD, lengthy chapters are presented on mechanical and
architectural design. For example, using the theory of unfolding
groups, the book works in detail through the main stages of mechanical
CAD/CAM: part-design, assembly and machining. And within part-design,
an extensive algebraic analysis is given of sketching, alignment,
dimensioning, resolution, editing, sweeping, feature-addition, and
intent-management. The equivalent analysis is also done for
architectural design. In perception, extensive theories are given for
grouping and the main Gestalt motion phenomena (induced motion,
separation of systems, the Johansson relative/absolute motion
effects); as well as orientation and form. In robotics, several levels
of analysis are developed for manipulator structure, using the
author's algebraic theory of object-oriented structure.
--------------------------------------------------------------------
This book can be viewed electronically at the following site:
http://link.springer.de/link/service/series/0558/tocs/t2145.htm
--------------------------------------------------------------------
Author's address:
Professor Michael Leyton,
Center for Discrete Mathematics,
& Theoretical Computer Science (DIMACS)
Rutgers University, Busch Campus,
New Brunswick, NJ 08854,
USA
E-mail address: mleyton at dimacs.rutgers.edu
--------------------------------------------------------------------
From juhn at utopiacompression.com Tue Jul 15 16:48:14 2003
From: juhn at utopiacompression.com (Juhn Maing)
Date: Tue, 15 Jul 2003 13:48:14 -0700
Subject: Job posting: Sr. machine learning scientist
Message-ID: <004501c34b12$69c7eb70$066fa8c0@JUHN>
JOB POSTING: SENIOR MACHINE LEARNING SCIENTIST
UtopiaCompression (UC) is an early-stage, intelligent imaging solutions
company. UC's core offering is an intelligent image compression
technology, which was recognized in 2002 as one of the top emerging
technologies in the US by the National Institute of Standards and
Technology
(http://jazz.nist.gov/atpcf/prjbriefs/prjbrief.cfm?ProjectNumber=00-00-4936).
Job Description
UtopiaCompression is looking for a highly qualified candidate with
extensive experience and knowledge in machine learning, data mining and
knowledge discovery. The candidate is required to have an MS or Ph.D. in
the areas mentioned above from a highly reputable university.
Post-doctorate and/or industry experience is strongly preferred. The
ideal candidate will be thoroughly versed in the latest research,
methods, developments and theories in machine learning and data mining
as well as possess in-depth experience applying them to commercial,
scientific or industrial applications. The candidate is also required to
be a visionary, highly creative and a great problem solver capable of
proposing solutions to multiple problems in parallel, and mentoring and
guiding R&D engineers in developing and implementing the solutions.
Permanent residents or US citizens are preferred.
This position is ideally suited for full-time employment, but part-time,
contract and contract-to-hire arrangements may also be considered.
Skills & Qualifications
1 - In-depth knowledge of and experience with statistical analysis,
reasoning and learning (e.g., Bayesian learning, expectation-maximization
and maximum-likelihood algorithms, and feature extraction problems),
(statistical) combinatorial optimization and learning (e.g., simulated
annealing, genetic programming), neural networks, inductive and
rule-generation learning, fuzzy reasoning, (numeric) decision tree
learning, search methods, (image) data mining and understanding, etc.
Candidates are expected to have knowledge and working experience in
various learning regimes: for instance, for layered neural networks,
dexterous familiarity with the back-propagation algorithm, radial basis
functions, etc.; for decision tree learning, working experience with the
information gain measure, the category utility function, tree pruning, etc.
2 - Dexterous familiarity with various machine learning and statistical
software tools.
3 - Fluency in software analysis, design and development in a C
programming environment. Candidates must be well versed and experienced
in C. Working experience in C++ (and Java) is a plus.
4 - Knowledge of and working experience with image compression
techniques, and with image analysis and processing, are a big plus.
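Purely as an illustration (and not part of the original posting), the
information gain measure named under item 1 is the reduction in label
entropy achieved by splitting a dataset on an attribute. A minimal sketch,
with function names of my own invention:

```python
# Illustrative only: information gain for a toy attribute split.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy reduction when `labels` are partitioned into `groups`."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups)
    return entropy(labels) - remainder

# Splitting 4 labels perfectly by an attribute recovers the full 1 bit.
labels = ["yes", "yes", "no", "no"]
print(information_gain(labels, [["yes", "yes"], ["no", "no"]]))  # prints 1.0
```

A decision-tree learner evaluates this quantity for every candidate
attribute and splits on the one with the largest gain.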
Contact:
Juhn Maing
Product Manager
UtopiaCompression
Tel: 310-473-1500 x104
Email: juhn at utopiacompression.com
From poznan at iub-psych.psych.indiana.edu Wed Jul 16 16:50:40 2003
From: poznan at iub-psych.psych.indiana.edu (Roman Poznanski)
Date: Wed, 16 Jul 2003 13:50:40 -0700
Subject: JIN, Vol. 2, No. 1, June 2003
Message-ID: <3F15BAA0.7040109@iub-psych.psych.indiana.edu>
[ Moderator's note: this journal special issue may be of interest to
readers of Connectionists, but only the article abstracts are
available free online. Full text requires a subscription. -- DST ]
Special Issue: Complex Nonlinear Neural Dynamics: Experimental
Advances and Theoretical Interpretations
Editorial
Peter Andras, Robert Kozma and Peter Erdi 1
The Wave Packet: An Action Potential for the 21st Century
Walter J. Freeman 3
Two Species of Gamma Oscillations in the Olfactory Bulb: Dependence on
Behavioral State and Synaptic Interactions
Leslie M. Kay 31
The Global Effects of Stroke on the Human Electroencephalogram
Rudolph C. Hwa, Wei He and Thomas C. Ferree 45
A Model for Emergent Complex Order in Small Neural Networks
Peter Andras 55
Dimension Change, Coarse Grained Coding and Pattern Recognition in
Spatio-Temporal Nonlinear Systems
David DeMaris 71
On the Formation of Persistent States in Neuronal Network Models of
Feature Selectivity
Evan C. Haskell and Paul C. Bressloff 103
Basic Principles of the KIV Model and its Application to the Navigation
Problem
Robert Kozma and Walter J. Freeman 125
Book Review
Book Review: "Computational Neuroanatomy: Principles and Methods", G. A.
Ascoli, ed., (2002)
A. Garenne and G. A. Chauvet 147
--
Roman R. Poznanski, Ph.D
Associate Editor,
Journal of Integrative Neuroscience
Department of Psychology
Indiana University
1101 E. 10th St.
Bloomington, IN 47405-7007
email: poznan at iub-psych.psych.indiana.edu
phone (Office): (812) 856-0838
http://www.worldscinet.com/jin/mkt/editorial.shtml
From bogus@does.not.exist.com Wed Jul 16 11:13:28 2003
From: bogus@does.not.exist.com ()
Date: Wed, 16 Jul 2003 11:13:28 -0400
Subject: postdoc position available
Message-ID:
From stefan.wermter at sunderland.ac.uk Thu Jul 17 08:23:49 2003
From: stefan.wermter at sunderland.ac.uk (Stefan Wermter)
Date: Thu, 17 Jul 2003 13:23:49 +0100
Subject: Stipends for MSc Intelligent Systems
Message-ID: <3F169555.446E5938@sunderland.ac.uk>
Stipends available for MSc Intelligent Systems
----------------------------------
We are pleased to announce that, for eligible EU students, we have
obtained funding to offer a bursary for our new MSc Intelligent Systems
worth up to 9,000 euros as a fee waiver and stipend.
***Please forward to students who may be interested.***
The School of Computing and Technology, University of Sunderland
is delighted to announce the launch of its new MSc Intelligent Systems
programme for October 2003. Building on the School's leading edge
research in intelligent systems this masters programme will be
funded via the ESF scheme (see below).
Intelligent Systems is an exciting field of study for science and
industry, since existing computing systems still fall short of many
aspects of human performance. "Intelligent Systems" is a term for
software systems and methods that simulate aspects of intelligent
behaviour. The intention is to learn from nature and from human
performance, drawing on cognitive science, neuroscience, biology,
engineering, and linguistics, in order to build more powerful
computational system architectures. The programme teaches a wide
variety of novel and exciting techniques, including neural networks,
intelligent robotics, machine learning, natural language processing,
vision, evolutionary and genetic computing, data mining, information
retrieval, Bayesian computing, knowledge-based systems, fuzzy methods,
and hybrid intelligent architectures.
Programme Structure
--------------
The following lectures/modules are available (modules marked * are
intended to be available for the October 2003 entry cohort):
Neural Networks *
Intelligent Systems Architectures *
Learning Agents *
Evolutionary Computation
Cognitive Neural Science *
Knowledge Based Systems and Data Mining *
Bayesian Computation
Vision and Intelligent Robots *
Natural Language Processing *
Dynamics of Adaptive Systems
Intelligent Systems Programming *
Funding up to 6,000 pounds (about 9,000 euros) for eligible students
------------------------------
The Bursary Scheme applies to this Masters programme commencing
October 2003, for which we have obtained funding through the European
Social Fund (ESF). ESF support enables the University to waive the
normal tuition fee and provide a bursary of 75 pounds per week for 45
weeks for eligible EU students, together worth up to 6,000 pounds or
9,000 euros.
For further information in the first instance please see:
http://www.his.sunderland.ac.uk/Teaching_frame.html
http://osiris.sund.ac.uk/webedit/allweb/courses/progmode.php?prog=G550A&mode=FT&mode2=&dmode=C
For information on applications and start dates contact:
gillian.potts at sunderland.ac.uk Tel: 0191 515 2758
For academic information about the programme contact:
alfredo.moscardini at sunderland.ac.uk
Please forward to interested students.
Stefan
***************************************
Stefan Wermter
Professor for Intelligent Systems
Centre for Hybrid Intelligent Systems
School of Computing and Technology
University of Sunderland
St Peters Way
Sunderland SR6 0DD
United Kingdom
phone: +44 191 515 3279
fax: +44 191 515 3553
email: stefan.wermter at sunderland.ac.uk
http://www.his.sunderland.ac.uk/~cs0stw/
http://www.his.sunderland.ac.uk/
****************************************
From norbert at cn.stir.ac.uk Fri Jul 18 10:09:23 2003
From: norbert at cn.stir.ac.uk (Norbert Krueger)
Date: Fri, 18 Jul 2003 15:09:23 +0100
Subject: Special Session: NEXT GENERATION VISION SYSTEMS
Message-ID: <3F17FF93.5DA48668@cn.stir.ac.uk>
Dear Colleagues,
I would like to point you to the special session
NEXT GENERATION VISION SYSTEMS
to be held at the
Fourth International ICSC Symposium on
ENGINEERING OF INTELLIGENT SYSTEMS (EIS 2004)
With best regards
Norbert Krueger
_______________________________________________________
Special Session
NEXT GENERATION VISION SYSTEMS
Fourth International ICSC Symposium on
ENGINEERING OF INTELLIGENT SYSTEMS (EIS 2004)
http://www.icsc-naiso.org/conferences/eis2004/index.html
February 29 - March 2, 2004 at the University of Madeira,
Island of Madeira, Portugal
Organisers:
Dr. Norbert Krueger
University of Stirling
Email: norbert at cn.stir.ac.uk
http://www.cn.stir.ac.uk/~norbert
Dr. Volker Krueger
Aalborg University, Esbjerg
Email: vok at cs.aue.auc.dk
Dr. Florentin Woergoetter
University of Stirling
Stirling FK9 4LA Scotland, UK
Email: worgott at cn.stir.ac.uk
Abstract
Vision-based devices have been entering the industrial and
private world more and more successfully: face recognition
systems control access to buildings; airports and train
stations are monitored by video surveillance devices; and
cars are increasingly equipped with vision-based driver
assistance systems.
However, the gap between human performance and the
top performance of today's artificial visual systems is
considerable. In particular, scene analysis in unfamiliar
environments that allows for highly reliable action remains an
outstanding quality of biological systems.
The next generation of vision systems will have to show
stable and reliable performance in uncontrolled environments
in real time. To achieve reliability these systems need to make
use of regularities in visual data. In this respect, the
representation of the temporal structure of visual data as
well as the fusion of visual sub-modalities are crucial.
Such systems also need to be equipped with a sufficient
amount of prestructured knowledge as well as the ability to
deal with uncertainties and to learn in complex environments.
The invited session focusses on requirements for and prospects
of future vision systems. This covers all questions of visual
representation and integration as well as questions of hardware
and software design.
Submission Deadline: 15.9.2003
Maximum number of pages: Fifteen pages (including
diagrams and references)
Papers (either as PDF or PostScript) should be sent to
norbert at cn.stir.ac.uk
From levys at wlu.edu Fri Jul 18 16:03:04 2003
From: levys at wlu.edu (Simon Levy)
Date: Fri, 18 Jul 2003 16:03:04 -0400
Subject: Software Release Announcement: SNARLI, free/open-source Java package
for neural nets
Message-ID: <3F185278.3040403@wlu.edu>
Dear Connectionists,
I would like to announce the release of a free, open-source Java package
that may be of interest to members of this list. This package is
currently available at http://snarli.sourceforge.net, and is described
below.
Please feel free to download this package, and contact me with
questions, criticisms, or suggestions. I am especially interested in hearing from
educators and researchers who find the package useful in their work, and
anyone who has a feature or neural architecture that they would like to
see implemented.
Thanks,
Simon
========================
Simon D. Levy
Assistant Professor
Computer Science Department
Washington & Lee University
Lexington, VA 24450
540-458-8419 (voice)
540-458-8479 (fax)
levys at wlu.edu
http://www.cs.wlu.edu/~levy
SNARLI (Simple Neural ARchitecture LIbrary) is a Java
package containing two classes: BPLayer, a general back-prop layer
class, and SOM, a class for the Kohonen Self-Organizing Map. BPLayer
also supports sigma-pi connections and back-prop-through-time, allowing
you to build just about any kind of back-prop network found in the
literature.
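As an aside, the core mechanism a class like BPLayer encapsulates, a layer
object that owns its weights, runs its own forward pass, and propagates
error backward, can be sketched in a few lines. The sketch below is in
Python with NumPy purely for brevity; it is not SNARLI's actual Java API,
and all names in it are my own:

```python
# Minimal sketch of a self-contained back-prop layer (NOT SNARLI's API).
import numpy as np

class BPLayer:
    """One fully connected sigmoid layer of a back-prop network."""

    def __init__(self, n_in, n_out, lr=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.5, size=(n_in, n_out))
        self.b = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.x = np.asarray(x, float)
        self.y = 1.0 / (1.0 + np.exp(-(self.x @ self.w + self.b)))
        return self.y

    def backward(self, grad_out):
        """Take dE/dy, apply a gradient step, return dE/dx for the layer below."""
        delta = grad_out * self.y * (1.0 - self.y)   # through the sigmoid
        grad_in = delta @ self.w.T
        self.w -= self.lr * np.outer(self.x, delta)
        self.b -= self.lr * delta
        return grad_in

# Chain two layers into a 2-4-1 network and train it on XOR.
hidden, top = BPLayer(2, 4), BPLayer(4, 1)
data = [([0, 0], 0.0), ([0, 1], 1.0), ([1, 0], 1.0), ([1, 1], 0.0)]

def sq_error():
    return sum((top.forward(hidden.forward(x))[0] - t) ** 2 for x, t in data)

before = sq_error()
for _ in range(2000):
    for x, t in data:
        y = top.forward(hidden.forward(x))
        hidden.backward(top.backward(y - t))  # dE/dy of squared error, up to a factor
print(f"squared error: {before:.3f} -> {sq_error():.3f}")
```

Calling `backward` in reverse layer order is what makes such layer objects
composable into arbitrary feed-forward topologies, which is presumably how
a single-class design covers so many network types.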
SNARLI differs from existing neural-net packages in two important
ways. First, it is not GUI-based. Instead, it is meant as a code
resource that can be linked directly to new or existing Java-based
projects, for those who want to try a neural-network approach without
having to write a lot of new code. Given the variety of platforms that
currently interface to Java, from HTML to Matlab, it made more sense
to me to focus on the neural net algorithms, and leave the GUI
development to others.
Second, SNARLI gets a great deal of mileage out of a single class
(BPLayer), instead of adding a new class for each type of network. Using
this class, my students and I have been able to construct a large
variety of back-prop networks, from simple perceptrons through Pollack's
RAAM, with very little additional coding. We have used these networks
successfully in coursework, thesis projects, and research.
Future versions of SNARLI may include classes to support other popular
architectures, such as Support Vector Machines (SVMs), Hopfield nets,
and Long Short-Term Memory (LSTM), as user interest dictates.
From ahu at cs.stir.ac.uk Fri Jul 18 20:51:44 2003
From: ahu at cs.stir.ac.uk (Dr. Amir Hussain)
Date: Sat, 19 Jul 2003 01:51:44 +0100
Subject: Final Call for Papers: IJRA Journal Special Issue on Neuromorphic Systems ( IASTED / ACTA Press, Vol.19, 2004)
Message-ID: <002101c34d8f$ee4cf9b0$4f98fc3e@DrAmir>
Please post and distribute to colleagues and friends:
http://www.actapress.com/journals/specialra6.htm
Final Call for Papers: (with apologies for cross-postings!)
Note the extended paper submission deadline (granted at the request of
numerous authors): 1 September 2003
For readership of the International Journal of Robotics & Automation
(IJRA), please see the parent organization (IASTED's) website:
http://www.iasted.org/
----------
Call for Papers: Special Issue on "Neuromorphic Systems" International
Journal of Robotics & Automation (IJRA),
IASTED / ACTA Press, Vol.19, 2004
There has recently been a growing interest in neuromorphic systems
research, which is part of the larger field of computational
neuroscience. Neuromorphic systems are implementations in silicon of
systems whose architecture and design are based on neurobiology. In
general, however, neuromorphic systems research is not restricted to one
specific implementation technology. This growing area proffers exciting
possibilities, such as sensory systems that can compete with human
senses, pattern recognition systems that can run in real time, and
neuron models that can truly emulate living neurons. Neuromorphic
systems are at the intersection of neuroscience, computer science, and
electrical engineering.
The earliest neuromorphic systems were concerned with providing an
engineering approximation of some aspects of sensory systems, such as
the detection of sound in the auditory system or the detection of light
in the visual system. More recently, there has been considerable work on
robot control systems, on modelling various types of neurons, and on
including adaptation in hardware systems. Biorobotics, or the
intersection between biology and robotics, is a growing area in
neuromorphic systems. Biorobotics aims to investigate biological
sensorimotor control systems by building robot models of them. This
includes the development of novel sensors and actuators, hardware and
software emulations of neural control systems, and embedding and testing
devices in real environments.
The aim of this Special Issue on Neuromorphic Systems is to bring
together active researchers from different areas of this
interdisciplinary field, and to report on the latest advances in this
area.
Contributions are sought from (amongst others):
- engineers interested in designing and implementing systems based on
neurobiology
- neurobiologists interested in engineering implementations of systems
- modellers and theoreticians from all the relevant disciplines
Any topic relevant to neuromorphic systems and theory, sensory
neuromorphic systems, and neuromorphic hardware will be considered.
Instructions for Manuscripts:
All manuscripts should be e-mailed to the ACTA Press office at
calgary at actapress.com by September 1, 2003. On the e-mail subject line
please put "Submission for the IJRA Special Issue on Neuromorphic
Systems." The paper submission should include the name(s) of the
author(s) and their affiliations, addresses, fax numbers, and e-mail
addresses.
Manuscripts should strictly follow the guidelines of ACTA Press, given
at the following website:
http://www.actapress.com/journals/submission.htm
Important Dates:
Deadline for paper submission: September 1, 2003
Notification of acceptance: December 1, 2003
Final Manuscripts due: January 31, 2004
Publication in special issue: Vol.19, 2004
Guest Editors:
Dr. Amir Hussain
& Prof. L.S.Smith
Dept. of Computing Science & Mathematics
University of Stirling, Stirling FK9 4LA, Scotland, UK
Email: a.hussain at cs.stir.ac.uk Website: http://www.cs.stir.ac.uk/~ahu/
From aude.billard at epfl.ch Fri Jul 18 08:51:58 2003
From: aude.billard at epfl.ch (aude billard)
Date: Fri, 18 Jul 2003 14:51:58 +0200
Subject: Workshop on Robot Learning by Demonstration
Message-ID: <0a4f01c34d2b$5fd1e6f0$7391b280@sti.intranet.epfl.ch>
=======================
Call For Papers
======================
IROS-2003 Workshop on
Robot Learning by Demonstration
http://asl.epfl.ch/events/iros03Workshop/index.php
Friday 31st of October 2003, 12-5pm
IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems: IROS 2003
Bally's Las Vegas Hotel, October 27-31, 2003
Objectives:
===================
Programming by demonstration has become a key research topic in
robotics. It impacts both fundamental research and
application-oriented studies. Work in that area tackles the
development of robust algorithms for motor control, motor learning,
gesture recognition and visuo-motor integration. While the field has
been active for more than twenty years, recent developments, taking
inspiration from biological mechanisms of imitation, have brought a new
perspective, which this workshop aims to assess.
Call for Papers:
===================
We solicit papers relevant to the general workshop theme in the three
categories:
- research papers
- application papers
- challenge/position statements (typically only 1 or 2 pages)
Relevant workshop topics include (non-exhaustive list):
- Programming by Demonstration
- Imitation learning
- Task and Skill Learning
- Motor control
- Motor learning
- Visuo-motor Integration
- Gesture recognition
Important Dates:
===============
- August 4, 2003 Deadline for paper submission
- August 15, 2003 Notification of acceptance
- August 26, 2003 Deadline for final contributions
Papers should not exceed 8 pages and conform to the single column,
10pt, A4 format. Papers should be submitted by email to:
aude.billard at epfl.ch
Proceedings will be distributed at the Workshop. A number of papers
presented in this workshop will be selected for publication in a
special issue of the Robotics and Autonomous Systems journal.
Detailed Information:
===================
For more detailed information, please visit the workshop website at
http://asl.epfl.ch/events/iros03Workshop/index.php
Program Chairs:
===================
Aude Billard &
Roland Siegwart
Autonomous Systems Lab
EPFL, Swiss Institute of Technology
CH-1015 Lausanne, Switzerland
http://asl.epfl.ch
Program Committee:
====================
Luc Berthouze, ETL, Japan
Henrik Christensen, KTH, Sweden
Kerstin Dautenhahn, University of Hertfordshire, UK
Yiannis Demiris, Imperial College London, UK
Rudiger Dillmann, Karlsruhe, Germany
Auke Jan Ijspeert, EPFL, CH
Helge Ritter, University of Bielefeld, Germany
Stefan Schaal, University of Southern California, USA
Ales Ude, ATR, Japan
Jianwei Zhang, University of Hamburg, Germany
From wolpert at hera.ucl.ac.uk Fri Jul 18 05:43:35 2003
From: wolpert at hera.ucl.ac.uk (Daniel Wolpert)
Date: Fri, 18 Jul 2003 10:43:35 +0100
Subject: Postdoctoral Positions in Sensorimotor Control
Message-ID: <003901c34d11$0e9674f0$51463ec1@aphrodite>
Two Postdoctoral Research Fellows
Sensorimotor Control Laboratory
Sobell Department of Motor Neuroscience & Movement Disorders
Institute of Neurology
University College London
The Sensorimotor Control Laboratory, under the direction of Professor
Daniel Wolpert, has openings for two highly motivated Postdoctoral
Research Fellows in the area of computational and experimental human
motor control. The Fellows will join a team investigating planning,
control and learning of skilled action. One Research Fellow will work
on modelling of the motor system using optimal control and Bayesian
approaches and should have a background in a computational field
(e.g. Computational Neuroscience, Engineering, Physics, Maths). The
other will work on psychophysical studies of human motor control and
should have a background in an experimental field (e.g. Neuroscience,
Psychology). Applicants should ideally have a PhD, plus technical
expertise and computational skills relevant to the study of human
movement. Further details of both posts and laboratory facilities can
be found on www.hera.ucl.ac.uk/vacancies. Informal enquiries can be
addressed to Professor Daniel Wolpert by email to
wolpert at hera.ucl.ac.uk.
The positions are available for two years in the first instance with a
starting date from September 2003. Starting salary is up to £32,794
pa inclusive, depending on experience, superannuable. Applicants
should provide (ideally by email) by August 11th 2003:
- a maximum 1 page statement of research interests relevant to the
project
- copy of CV (2 if sent by post)
- names and contact details of 3 referees
- 1 copy of Declaration (required - see further details of posts)
- Equal Opportunities form (optional - see further details of posts)
to:
Miss E Bertram,
Assistant Secretary (Personnel)
Institute of Neurology
Queen Square
London WC1N 3BG
Fax: +44 (0)20 7278 5069
Email: e.bertram at ion.ucl.ac.uk
Taking Action for Equality
From dhwang at cs.latrobe.edu.au Sun Jul 20 21:47:01 2003
From: dhwang at cs.latrobe.edu.au (Dianhui Wang)
Date: Mon, 21 Jul 2003 11:47:01 +1000
Subject: Call for Papers
References: <3EC48075.E8BBA612@cs.latrobe.edu.au>
Message-ID: <3F1B4614.4A072F26@cs.latrobe.edu.au>
Dear Colleagues,
This email solicits your submission for the Invited Session on Advances
in Design, Analysis and Applications of Neural/Neuro-fuzzy Classifiers
at KES2004: the 8th International Conference on Knowledge-Based
Intelligent Information & Engineering Systems, 21st-24th September
2004, Hotel Intercontinental, Wellington, New Zealand.
Details of the Call for Papers can be found at
http://homepage.cs.latrobe.edu.au/dhwang/KES04.htm
I am looking forward to receiving your submissions.
Kind regards,
Dr Dianhui Wang (Session Chair)
Department of Computer Science and Computer Engineering
La Trobe University, Melbourne, VIC 3083, Australia
Tel: +61 3 9479 3034 Fax:+61 3 9479 3060
Email: dhwang at cs.latrobe.edu.au
From bengio at idiap.ch Mon Jul 21 08:37:20 2003
From: bengio at idiap.ch (Samy Bengio)
Date: Mon, 21 Jul 2003 14:37:20 +0200 (CEST)
Subject: Open position for a senior in Machine Learning - IDIAP
Message-ID:
Open position for a Senior Researcher in Machine Learning
---------------------------------------------------------
The Dalle Molle Institute for Perceptual Artificial Intelligence (IDIAP,
http://www.idiap.ch) seeks qualified applicants to fill the position of
Senior Researcher in its Machine Learning group.
Given the current scientific strengths of IDIAP, the ideal candidate should
have strong research experience in machine learning problems related to speech
processing, vision processing, and above all, multimodal processing.
The ideal candidate will have been active for several years in the machine
learning research community, and have a strong publication record. He
or she is expected to supervise PhD students and postdoctoral fellows in
machine learning, propose new research projects at a national and European
level, and be open to eventually giving lectures (either at IDIAP, or at the
nearby EPFL engineering school, http://www.epfl.ch). In fact, given the strong
relationship between IDIAP and EPFL, extremely qualified and experienced
candidates have the possibility of being offered an academic title of
professor at EPFL, while working at IDIAP.
IDIAP has recently been awarded several large research projects in multimodal
processing, both at the national and European level (see for instance
http://www.im2.ch), and the ideal candidate will be interested in the research
projects associated with this funding.
IDIAP is an equal opportunity employer and is actively involved in the
European initiative involving the Advancement of Women in Science. IDIAP seeks
to maintain a principle of open competition (on the basis of merit) to appoint
the best candidate, provide equal opportunity for all candidates, and equally
encourages both females and males to consider employment with IDIAP.
Although IDIAP is located in the French part of Switzerland, English is the
main working language. Free English and French lessons are provided.
IDIAP is located in the town of Martigny (http://www.martigny.ch) in Valais,
a scenic region in the south of Switzerland, surrounded by the highest
mountains of Europe, and offering exciting recreational activities, including
hiking, climbing and skiing, as well as varied cultural activities. It is
within close proximity to Montreux (Jazz Festival) and Lausanne.
Interested candidates should send a letter of application, along with their
detailed CV to jobs at idiap.ch. More information can also be obtained by
contacting Samy Bengio (bengio at idiap.ch).
----
Samy Bengio
Research Director. Machine Learning Group Leader.
IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland.
tel: +41 27 721 77 39, fax: +41 27 721 77 12.
mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio
From ckiw at inf.ed.ac.uk Mon Jul 21 07:37:53 2003
From: ckiw at inf.ed.ac.uk (Chris Williams)
Date: Mon, 21 Jul 2003 12:37:53 +0100 (BST)
Subject: Faculty positions at the British University in Dubai
Message-ID:
[note that machine learning is one of the areas highlighted --- Chris]
The British University in Dubai
Institute of Informatics and Communications
Chair and 4 Lectureships
* Context
The British University in Dubai is an important development in higher
education, providing cutting-edge research and education in key areas
of science and technology, and is supported in its early growth by the
University of Edinburgh and by other leading UK universities.
Early research and teaching programmes will be developed in
association with the University of Edinburgh's 5*-rated School of
Informatics. Newly appointed staff will spend part of their first year
working with colleagues in Edinburgh and be eligible for Honorary
Fellowships in the University of Edinburgh. (This is intended to help
cement the foundations for continuing collaborative research
projects and exchanges.)
* Posts
The new Professor will be Director of the Institute, provide
leadership in creating innovative research and teaching programmes and
be involved in appointments to the lectureships. Appointment to the
chair and the 4 lectureships will be made in areas of Informatics
related to the first programmes to be developed by the Institute in:
- Natural Language and Speech Engineering
- Knowledge Management & Engineering
- Machine Learning
* Closing date
11 August 2003.
* Remuneration etc.
Full details are available at:
http://www.jobs.thes.co.uk/rs6/cl.asp?action=view_ad&ad_id=15134
These will shortly be copied to:
http://www.informatics.ed.ac.uk/events/vacancies/
You are encouraged to contact Professor Michael Fourman
(buid at inf.ed.ac.uk) for further details, and to discuss potential
ways of taking up a post.
From ncopp at jsd.claremont.edu Wed Jul 23 13:18:51 2003
From: ncopp at jsd.claremont.edu (Newton Copp)
Date: Wed, 23 Jul 2003 19:18:51 +0200
Subject: POSITION AVAILABLE - ENDOWED CHAIR
Message-ID: <5.1.0.14.0.20030718103048.00b13ee0@jsd.claremont.edu>
To all:
I would appreciate it if you would consider the position described below or
pass this announcement along to someone who might be interested.
Thank you,
Newt Copp
Regarding the William R. Kenan Professorship in Computational Neuroscience
at The Claremont Colleges;
The undergraduate colleges in the Claremont consortium (Claremont McKenna,
Harvey Mudd, Pitzer, Pomona, and Scripps Colleges) seek an accomplished,
broadly trained neuroscientist with expertise in computational work to fill
the William R. Kenan Chair beginning in September of 2004. The Kenan
Professorship was formed as an all-Claremont position to be held by a
person who has achieved a record of distinction in an interdisciplinary
area. The successful candidate will have an unusual opportunity to take a
leadership role in an intercollegiate, interdisciplinary Neuroscience
Program that focuses on undergraduate education and research and involves
faculty members in Biology, Psychology, Engineering, and Philosophy. A
commitment to excellence in undergraduate teaching, an interest in
exploring interdisciplinary collaborations, and an active research program
are expected. Area of research interest is open. Preference will be given
to candidates at the associate professor level or higher, although
outstanding candidates at the advanced assistant professor level may be
considered.
The Claremont Colleges include five highly selective liberal arts colleges,
the Claremont Graduate University, and the Keck Graduate Institute for
Applied Life Sciences (see http://www.claremont.edu/about.html). The hire
will be made within the Joint Science Department (see
http://www.jsd.claremont.edu/), a department of 22 faculty members in
Biology (12), Chemistry (7) and Physics (4) that is co-sponsored by
Claremont McKenna, Pitzer, and Scripps Colleges.
Send a curriculum vitae, copies of three publications, and statements of
research interests and teaching interests/philosophy to Kenan Search
Committee, W. M. Keck Science Center, 925 N. Mills Ave., Claremont, CA
91711. Arrange to have three letters of recommendation sent to the same
address. Please direct inquiries to Newton Copp, Professor of Biology and
Chair of the Search Committee (tel: 909 621-8298; E-mail:
ncopp at jsd.claremont.edu). Review of applications will begin on Dec. 1,
2003 and continue until the position is filled. (This is a re-posting of a
position unfilled last year.)
In a continuing effort to enrich our academic environment and provide equal
educational and employment opportunities, The Claremont Colleges actively
encourage applications from women and members of historically
under-represented groups in higher education.
________________________
Newton Copp
Professor of Biology
Joint Science Department
The Claremont Colleges
Claremont, CA 91711
tel: 909 607-2932
fax: 909 621-8588
From bolshausen at rni.org Wed Jul 23 13:18:50 2003
From: bolshausen at rni.org (Bruno Olshausen)
Date: Wed, 23 Jul 2003 19:18:50 +0200
Subject: Workshop on Inference and Prediction in Neocortical Circuits
Message-ID: <3F11C7B5.4080501@rni.org>
The American Institute of Mathematics will be hosting a
workshop on "Inference and Prediction in Neocortical Circuits,"
September 21-24, in Palo Alto, California. Please see
http://www.aimath.org/ARCC/workshops/brain.html
Space and funding are available for a few more participants.
If you would like to participate, please apply by filling out
the on-line form at
http://koutslts.bucknell.edu/~aimath/WWN/cgi-bin/participantapply.prl?workshop=14
no later than August 1, 2003. Applications are open to all,
and we especially encourage women, underrepresented minorities,
junior mathematicians, and researchers from primarily
undergraduate institutions to apply.
--
Bruno A. Olshausen (650) 321-8282 x233
Redwood Neuroscience Institute (650) 321-8585 (fax)
1010 El Camino Real http://www.rni.org
Menlo Park, CA 94025 bolshausen at rni.org
From H.Bowman at kent.ac.uk Thu Jul 24 09:51:33 2003
From: H.Bowman at kent.ac.uk (hb5)
Date: Thu, 24 Jul 2003 14:51:33 +0100
Subject: NCPW 8 Call for Participation
Message-ID: <3F1FE465.A2BC7FB3@ukc.ac.uk>
Please distribute this call for participation to anybody you think might
be interested in this event. Apologies for multiple copies.
--------------------------------------------
CALL FOR PARTICIPATION
Eighth Neural Computation and Psychology Workshop (NCPW 8)
Connectionist Models of Cognition, Perception and Emotion
28-30 August 2003 at the
University of Kent at Canterbury, UK
The Eighth Neural Computation and Psychology Workshop (NCPW8)
will be held in Canterbury, England from 28-30th August 2003.
The NCPW series is now a well established and lively forum
that brings together researchers from such diverse disciplines
as artificial intelligence, cognitive science, computer science,
neuroscience, philosophy and psychology. Thirty-five papers will be
presented, of which eight will be invited papers. In addition to the
high quality of the papers presented, the workshop takes
place in an informal setting in order to encourage interaction
among the researchers present.
Website
-------
More details, including registration information, can be found on
the conference website,
http://www.cs.ukc.ac.uk/events/conf/2003/ncpw/
The Programme
-------------
Highlights of the programme include a session on modelling face
perception, including three invited papers,
Gary Cottrell
University of California, San Diego, USA
Modeling Face Perception
Peter Hancock, Mike Burton and Rob Jenkins
Stirling University, Scotland
Face Recognition: Average or Exemplar?
C.J. Solomon, S.J. Gibson, A. Pallares-Bejarano and M. Maylin
University of Kent at Canterbury
Exploring the Case for a Psychological "Face-space"
Five more invited papers have been scheduled,
John A. Bullinaria
The University of Birmingham
On the Evolution of Irrational Behaviour
Bob French
University of Liege, Belgium
The bottom-up nature of category acquisition in 3- to 4-month old
infants:
Predictions of a connectionist model and empirical data
Richard Shillcock and Padraic Monaghan
University of Edinburgh, Scotland
Sublexical units in the computational modelling of visual word
recognition
John G. Taylor
King's College Strand, University of London
Through Attention to Consciousness by CODAM
Marius Usher and Eddy Davelaar
Birkbeck College, University of London
Short/long term memory in terms of activation versus weight based
processes
The full programme can be found at the following site,
http://www.cs.kent.ac.uk/events/conf/2003/ncpw/prog/
Conference Chair
----------------
Howard Bowman, University of Kent, UK
Conference Organisers
---------------------
Howard Bowman, UKC
Colin G. Johnson, UKC
Miguel Mendao, UKC
Vikki Roberts, UKC
Proceedings Editors
--------------------
Howard Bowman, UKC
Christophe Labiouse, Liege
Publication
-----------
Proceedings of the workshop will appear in the series Progress
in Neural Processing, which is published by World Scientific.
From shih at ini.phys.ethz.ch Thu Jul 24 07:30:09 2003
From: shih at ini.phys.ethz.ch (Shih-Chii Liu)
Date: Thu, 24 Jul 2003 13:30:09 +0200 (CEST)
Subject: NIPS03 demonstration track
Message-ID:
Would you like to show off a demo of your hardware system, robots, or
software system to people who are interested in all aspects of neural
and statistical computation?
The Neural Information Processing conference has a relatively new
Demonstration track for submissions of this sort. The participants in
this track will have a chance to show their interactive demos in the
areas of, for example, hardware technology, neuromorphic systems,
biologically-inspired and biomimetic systems, robotics, and also
software systems. The only hard rule is that the demo must be live.
Check out the web site,
http://www.nips.cc/Conferences/2003/CFP/CallForDemos.php
for details.
Students can also apply for a limited number of travel funds provided
by the Institute of Neuromorphic Engineering for submissions that
are accepted to the Demonstration Track.
The conference will be held in Vancouver, Canada, on Dec 8-10, 2003.
The DEADLINE for submissions to this track is on Aug 1, 2003.
Note that acceptance to this track does not imply automatic
acceptance of your submission to the main conference.
Regards
Shih-Chii Liu and Tobi Delbruck
Co-Chairs NIPS 2003 Demonstration Track
From thrun at robotics.Stanford.EDU Fri Jul 25 11:48:21 2003
From: thrun at robotics.Stanford.EDU (Sebastian Thrun)
Date: Fri, 25 Jul 2003 08:48:21 -0700
Subject: NIPS - Deadline Reminder
Message-ID: <200307251548.h6PFmL421238@robo.Stanford.EDU>
Dear Connectionists:
A brief reminder that NIPS workshop proposals and submission to the
demonstration track are due August 1, 2003. Please consult
nips.cc
for details. You are encouraged to submit.
Sebastian Thrun
NIPS*2003 General Chair
From ted.carnevale at yale.edu Mon Jul 28 16:49:12 2003
From: ted.carnevale at yale.edu (Ted Carnevale)
Date: Mon, 28 Jul 2003 16:49:12 -0400
Subject: NEURON course at SFN 2003 meeting
Message-ID: <3F258C48.2010402@yale.edu>
Short Course Announcement
USING THE NEURON SIMULATION ENVIRONMENT
Satellite Symposium, Society for Neuroscience Meeting
9 AM - 5 PM on Friday, Nov. 7, 2003
Speakers to include M.L. Hines and N.T. Carnevale
This one-day course with lectures and live demonstrations will
present information essential for teaching and research
applications of NEURON, an advanced simulation environment
that handles realistic models of biophysical mechanisms,
individual neurons, and networks of cells. The emphasis is
on practical issues that are key to the most productive use
of this powerful and convenient modeling tool.
Features that will be covered include:
constructing and managing models of cells and networks
importing detailed morphometric data
expanding NEURON's repertoire of biophysical mechanisms
database resources for empirically-based modeling
Each registrant will receive a comprehensive set of notes, which
include material that has not appeared elsewhere in print.
Registration is limited to 50 individuals on a first-come,
first-served basis.
For more information see
http://www.neuron.yale.edu/no2003.html
--Ted