From madrenas at eel.upc.es Fri Feb 1 11:38:08 2002 From: madrenas at eel.upc.es (Jordi Madrenas) Date: Fri, 1 Feb 2002 17:38:08 +0100 Subject: PhD position on VLSI implementation of neuromorphic systems Message-ID: PhD POSITION ON VLSI IMPLEMENTATION OF NEUROMORPHIC SYSTEMS DEPARTMENT OF ELECTRONIC ENGINEERING TECHNICAL UNIVERSITY OF CATALUNYA BARCELONA - SPAIN A 4-year PhD position is available at the Advanced Hardware Architectures Group of the Technical University of Catalunya (UPC) for thesis research on a funded project on Bioinspired (Neuromorphic) VLSI Implementation of NN for Image Segmentation. We are seeking candidates with a good academic record and a strong interest in the fields of microelectronics (especially mixed-signal), neuromorphic systems, neural networks and complex systems. Interested candidates should get in touch as soon as possible (by 12 February) by sending their CV to the following e-mail address: bioseg at eel.upc.es For more information about the group activities, see http://www-eel.upc.es/aha or contact: Dr. Jordi Madrenas (madrenas at eel.upc.es) or Assistant Prof. Jordi Cosp (jcosp at eel.upc.es) Jordi Madrenas ------------------------------------------------------------- Jordi MADRENAS Professor Titular / Associate Professor AHA (Advanced Hardware Architectures) Group Departament d'Enginyeria Electronica/Department of Electronic Engineering Universitat Politecnica de Catalunya/Technical University of Catalunya Location: Campus Nord UPC, Edifici C5, Despatx 102 Postal Address: Dept. d'Enginyeria Electronica, Campus Nord UPC Jordi Girona, 1 i 3, Edifici C4 08034 BARCELONA (SPAIN) Tel: +34 93 401 67 47 Fax: +34 93 401 67 56 E-mail: madrenas at eel.upc.es URL: http://www-eel.upc.es/aha ------------------------------------------------------------- From sab-wmc at dai.ed.ac.uk Fri Feb 1 07:26:09 2002 From: sab-wmc at dai.ed.ac.uk (SAB'02 Workshop on Motor Control) Date: Fri, 1 Feb 2002 12:26:09 +0000 (GMT) Subject: CFP: SAB'02-Workshop in Motor Control in Humans and Robots Message-ID: [Apologies if you receive this message more than once] ******************************************************************* CALL FOR PAPERS ******************************************************************* SAB'2002 Workshop on MOTOR CONTROL IN HUMANS AND ROBOTS On the interplay of real brains and artificial devices ------------------ August 10, 2002, Edinburgh, Scotland (UK) http://www.dai.ed.ac.uk/~sab-wmc Recent advances in motor control for humans and robots come from different approaches to Neuroscience: experimental investigation, via psychophysical and neurophysiological experiments; theoretical investigation, via mathematical and computational modelling; and, more recently, biorobotic investigation, via robotic modelling of aspects of the mechanisms described in the biological literature. Merging these approaches is a trend that aims at fruitful insights in all directions. Robotics requires models of perception, sensorimotor coordination, and motor control, whereas Neuroscience studies the biological counterparts of these mechanisms. In these terms, the intersection of Robotics and Neuroscience is making significant progress in the field of neuroprosthetics. For instance, neural ensemble recording techniques have been successfully used in the motor cortices of rats and monkeys for the control of artificial limbs. Further, these techniques also provide a window for looking into the brain.
Among other things, this can prove invaluable to neurobiological approaches to humanoid robot design. For instance, the discovery of the mirror neurons in the pre-motor cortex of macaque monkeys gave rise to a plethora of issues, such as basic perceptual-motor coupling, imitation mechanisms, and even theories on the origins, evolution and/or prehistory of language. The aim of this workshop is to gather people from these different approaches to motor control and promote the interplay among them. The workshop will include discussions on advances in neural ensemble recording techniques and the associated signal processing, human-machine interfaces for neuroprosthetic applications, psychophysical experiments that validate motor control models, as well as discussions on neurophysiologically inspired models of robot motor control. A list that summarises, but does not limit, the topics of this workshop follows. - Robotic and human motor control - Object manipulation and grasping - Perceptual-motor systems and imitation - Models of motor control for integration into neuroprosthetics - Human-Machine haptics - Brain-Computer interfaces - Multi-Electrode recording techniques - Data analysis of neural ensemble recordings - Neuroprosthetics for restoring limb movement Submissions =========== Papers not exceeding 8 pages in 10pt, two-column format, should be submitted electronically (PDF or PS) as attachment files to the following email address: sab-wmc at dai.ed.ac.uk A Latex template and further formatting instructions can be found at http://www.isab.org.uk/sab02/submit/. Accepted submissions will be published in the SAB workshop proceedings. Authors of selected papers will be asked to submit an extended version after the workshop for publication in a journal or collection. Further instructions to authors will be posted on the workshop's web page: http://www.dai.ed.ac.uk/~sab-wmc/ Since this is an interdisciplinary workshop involving quite diverse fields, authors should make an effort to make their papers accessible to readers outside their field, for instance by including a brief introduction at the beginning of the article. Important Dates =============== 1 March, 2002: Submission of papers 12 April, 2002: Notification of acceptance 1 May, 2002: Deadline for camera-ready papers 10 August, 2002: Workshop Scientific committee ==================== Yiannis Demiris, Imperial College, UK Juan Domingo, Universitat de Valencia, Spain John P. Donoghue, Brown University, USA Andrew Fagg, University of Massachusetts, USA Heba Lakany, Essex University, UK Chris Malcolm, University of Edinburgh, UK Joe McIntyre, College de France, France Jose del R. Millan, JRC European Commission, Italy Ferdinando Mussa-Ivaldi, Northwestern University, USA Miguel Nicolelis, Duke University, USA Angel P. del Pobil, Universitat Jaume-I, Spain John Semmlow, Rutgers University, USA Stefan Schaal, University of Southern California, USA Mandayam Srinivasan, MIT, USA Johan Wessberg, Goteborg University, Sweden Organisers ========== Jose M. Carmena Department of Neurobiology Duke University Durham, NC 27710 USA jose at dai.ed.ac.uk George Maistros IPAB, Division of Informatics University of Edinburgh Edinburgh, EH1 2QL, UK georgem at dai.ed.ac.uk From bp1 at cn.stir.ac.uk Sun Feb 3 12:44:06 2002 From: bp1 at cn.stir.ac.uk (Bernd Porr) Date: Sun, 3 Feb 2002 17:44:06 +0000 (GMT) Subject: Workshop: "Modulation and Modification of Sensor-Motor Coupling" in Stirling, Scotland, Feb.
22-24, 2002 (fwd) Message-ID: sorry for duplicate postings Modulation and Modification of Sensor-Motor Coupling Workshop at Stirling University, Feb. 22-24, 2002. This workshop will bring together scientists from different fields, from experimental and theoretical neuroscientists to robotics researchers, who are interested in issues of sensor-motor coupling, (temporal sequence) learning and modulation of sensory-motor pathways. The traditional view of sensor-motor systems, which describes them by means of a non-linear transfer function used to transform the sensor input(s) into a motor action, has been replaced in recent years by a more sophisticated view. For some time it has been acknowledged that such systems can be modified by learning, and substantial research has been undertaken to understand the underlying neuronal mechanisms. More recently, aspects of motivation- and/or attention-driven modulation of the efficiency of sensor-motor coupling have also been investigated. In particular, it has been realized that all sensor-motor systems (all animals) interact with their immediate surroundings, forming a closed loop with the environment, which adds to the complexity of the problem. All this shows that a multi-disciplinary approach is necessary in trying to solve the "sensor-motor coupling problem", and therefore the main objective of the forthcoming workshop at Stirling is to facilitate discussion between researchers from the different fields involved in sensor-motor coupling. To that end the workshop will consist of a number of invited talks, with plenty of time allowed for discussion. So far the following researchers have agreed to speak: Holk Cruse Ansgar Büschges Peter Dayan Bernard Hommel Orjan Ekeberg David Wolpert Jeff Krichmar Paul Verschure Attendance at the workshop is open to all interested participants. We would, however, appreciate it if you used our web form at http://www.cn.stir.ac.uk/SensMot/ to give us some personal details to facilitate organization. Organizers: Barbara Webb (b.h.webb at stir.ac.uk) & Florentin Wörgötter (worgott at cn.stir.ac.uk) Dept. of Psychology and INCITE University of Stirling Scotland, UK Webmaster: Bernd Porr (bp1 at cn.stir.ac.uk) From aonishi at bsp.brain.riken.go.jp Sun Feb 3 21:07:16 2002 From: aonishi at bsp.brain.riken.go.jp (Toru Aonishi) Date: Mon, 04 Feb 2002 11:07:16 +0900 Subject: preprint: paper on coupled oscillator systems Message-ID: <20020204110716S.aonishi@bsp.brain.riken.go.jp> Dear Connectionists, We are pleased to announce the availability of our recent paper and of two potentially related papers. Recent paper: ------------- Acceleration effect of coupled oscillator systems T. Aonishi, K. Kurata and M. Okada, Physical Review E (in press) Available at http://arXiv.org/abs/cond-mat/0201453 Abstract: We have developed a curved isochron clock (CIC) by modifying the radial isochron clock to provide a clean example of the acceleration (deceleration) effect. By analyzing a two-body system of coupled CICs, we determined that an unbalanced mutual interaction caused by curved isochron sets is the minimum mechanism needed for generating the acceleration (deceleration) effect in coupled oscillator systems. From this we can see that the Sakaguchi and Kuramoto (SK) model, which is a class of non-frustrated mean field models, has an acceleration (deceleration) effect mechanism.
To study frustrated coupled oscillator systems, we extended the SK model to two oscillator associative memory models, one with symmetric and one with asymmetric dilution of coupling, which also have the minimum mechanism of the acceleration (deceleration) effect. We theoretically found that the {\it Onsager reaction term} (ORT), which is unique to frustrated systems, plays an important role in the acceleration (deceleration) effect. These two models are ideal for evaluating the effect of the ORT because, with the exception of the ORT, they have the same order parameter equations. We found that the two models have identical macroscopic properties, except for the acceleration effect caused by the ORT. By comparing the results of the two models, we can extract the effect of the ORT from only the rotation speeds of the oscillators. Related papers: -------------- Multibranch entrainment and slow evolution among branches in coupled oscillators T. Aonishi and M. Okada, Physical Review Letters, 88[2], 024102 (2002) Available at http://prl.aps.org/ http://arXiv.org/abs/cond-mat/0104526 Abstract: In globally coupled oscillators, it is believed that strong higher harmonics of coupling functions are essential for {\it multibranch entrainment} (MBE), in which there exist many stable states, whose number scales as $\sim$ $O(\exp N)$ (where $N$ is the system size). The existence of MBE implies the non-ergodicity of the system. Then, because this apparent breaking of ergodicity is caused by {\it microscopic} energy barriers, this seems to be in conflict with a basic principle of statistical physics. In this paper, using macroscopic dynamical theories, we demonstrate that there is no such ergodicity breaking, and such a system slowly evolves among branch states, jumping over microscopic energy barriers due to the influence of thermal noise. This phenomenon can be regarded as an example of slow dynamics driven by a perturbation along a neutrally stable manifold consisting of an infinite number of branch states. ---- Statistical mechanics of an oscillator associative memory with scattered natural frequencies T. Aonishi, K. Kurata and M. Okada, Physical Review Letters, 82[13], pp. 2800--2803 (1999) Available at http://prl.aps.org/ http://arXiv.org/abs/cond-mat/9808090 Abstract: Analytic treatment of a non-equilibrium random system with large degrees of freedoms is one of most important problems of physics. However, little research has been done on this problem as far as we know. In this paper, we propose a new mean field theory that can treat a general class of a non-equilibrium random system. We apply the present theory to an analysis for an associative memory with oscillatory elements, which is a well-known typical random system with large degrees of freedoms. --------------------------------------------------------------- Regards, Toru Aonishi (Ph.D) Laboratory for Advanced Brain Signal Processing Brain Science Institute The Institute of Physical and Chemical Research (RIKEN) Hirosawa, 2-1, Wako-shi, Saitama, 351-0198, Japan E-mail: aonishi at brain.riken.go.jp URL: http://www.bsp.brain.riken.go.jp/~aonishi/ From ruppin at tau.ac.il Mon Feb 4 04:51:48 2002 From: ruppin at tau.ac.il (Eytan Ruppin) Date: Mon, 4 Feb 2002 11:51:48 +0200 Subject: Comp. 
Neuroscience with Evolutionary Agents - A Review Paper Message-ID: <200202040951.LAA26772@tau.ac.il> Evolutionary Autonomous Agents: A Neuroscience Perspective ----------------------------------------------------------- Nature Reviews Neuroscience, 3(2), February issue, p. 132 - 142, 2002. http://www.nature.com/cgi-taf/DynaPage.taf?file=/nrn/journal/v3/n2/index.html Abstract: This paper examines the research paradigm of neurally-driven Evolutionary Autonomous Agents (EAAs), from a neuroscience perspective. Two fundamental questions are addressed: 1. Can EAA studies shed new light on the structure and function of biological nervous systems? 2. Can these studies lead to the development of new neuroscientific analysis tools? The value and significant potential of EAA modeling in both respects is demonstrated and discussed. While the study of EAAs as a neuroscience research methodology still faces difficult conceptual and technical challenges, it is a promising and timely endeavor. The paper may also be downloaded from http://www.math.tau.ac.il/~ruppin/. Best, Eytan Ruppin From Gunnar.Raetsch at anu.edu.au Mon Feb 4 08:32:43 2002 From: Gunnar.Raetsch at anu.edu.au (Gunnar Raetsch) Date: Tue, 05 Feb 2002 00:32:43 +1100 Subject: PhD thesis on Boosting available Message-ID: <3C5E8D7B.9080206@anu.edu.au> Dear Connectionists, I am pleased to announce that my PhD thesis entitled "Robust Boosting via Convex Optimization" is now available at http://www.boosting.org/papers/thesis.ps.gz (and .pdf) Please find the summary of my thesis below. Gunnar Summary ======= In this work we consider statistical learning problems. A learning machine aims to extract information from a set of training examples such that it is able to predict the associated label on unseen examples. We consider the case where the resulting classification or regression rule is a combination of simple rules - also called base hypotheses. The so-called boosting algorithms iteratively find a weighted linear combination of base hypotheses that predict well on unseen data. We study the following issues: o The statistical learning theory framework for analyzing boosting methods. We study learning theoretic guarantees on the prediction performance on unseen examples. Recently, large margin classification techniques have emerged as a practical result of the theory of generalization, in particular Boosting and Support Vector Machines. A large margin implies a good generalization performance. Hence, we analyze how large the margins in boosting are and find an improved algorithm that is able to generate the maximum margin solution. o How can boosting methods be related to mathematical optimization techniques? To analyze the properties of the resulting classification or regression rule, it is of high importance to understand whether and under which conditions boosting converges. We show that boosting can be used to solve large scale constrained optimization problems, whose solutions are well characterizable. To show this, we relate boosting methods to methods known from mathematical optimization, and derive convergence guarantees for a quite general family of boosting algorithms. o How to make Boosting noise robust? One of the problems of current boosting techniques is that they are sensitive to noise in the training sample. In order to make boosting robust, we transfer the soft margin idea from support vector learning to boosting. We develop theoretically motivated regularized algorithms that exhibit a high noise robustness. 
o How to adapt boosting to regression problems? Boosting methods are originally designed for classification problems. To extend the boosting idea to regression problems, we use the previous convergence results and relations to semi- infinite programming to design boosting-like algorithms for regression problems. We show that these leveraging algorithms have desirable properties - from both, the theoretical and the practical side. o Can boosting techniques be useful in practice? The presented theoretical results are guided by simulation results either to illustrate properties of the proposed algorithms or to show that they work well in practice. We report on successful applications in a non-intrusive power monitoring system, chaotic time series analysis and the drug discovery process. -- +-----------------------------------------------------------------+ Gunnar R"atsch http://mlg.anu.edu.au/~raetsch Australian National University mailto:Gunnar.Raetsch at anu.edu.au Research School for Information Tel: (+61) 2 6125-8647 Sciences and Engineering Fax: (+61) 2 6125-8651 Canberra, ACT 0200, Australia From cindy at bu.edu Mon Feb 4 16:26:12 2002 From: cindy at bu.edu (Cynthia Bradford) Date: Mon, 4 Feb 2002 16:26:12 -0500 Subject: Neural Networks 15(1) Message-ID: <200202042126.g14LQCr21060@cns-pc75.bu.edu> NEURAL NETWORKS 15(1) Contents - Volume 15, Number 1 - 2002 ------------------------------------------------------------------ Editorial for 2002: A Time of Exuberant Development Neural Networks Referees used in 2001 NEURAL NETWORKS LETTER: Modeling inferior olive neuron dynamics Manuel G. Velarde, Vladimir I. Nekorkin, Viktor B. Kazantsev, Vladimir I. Makarenko, and Rodolfo Llinas INVITED ARTICLE: A review of evidence of health benefit from artificial neural networks in medical intervention P.J.G. Lisboa CONTRIBUTED ARTICLES: ***** Neuroscience and Neuropsychology ***** Attention modulation of neural tuning through peak and base rate in correlated firing H. Nakahara and S.-I. Amari ***** Mathematical and Computational Analysis ***** Space-filling curves and Kolmogorov superposition-based neural networks David A. Sprecher and Sorin Draghici The bifurcating neuron network 2: An analog associative memory Geehyuk Lee and Nabil H. Farhat Hybrid independent component analysis by adaptive LUT activation function neurons Simone Fiori A new approach to stability of neural networks with time-varying delays Jigen Peng, Hong Qiao, and Zong-ben Xu ***** Engineering and Design ***** Projective ART for clustering data sets in high dimensional spaces Yongqiang Cao and Jianhong Wu Equivariant nonstationary source separation Seungjin Choi, Andrzej Cichocki, and Shunichi Amari ***** Technology and Applications ***** Fractional Fourier transform pre-processing for neural networks and its application to object recognition Billur Barshan and Birsel Ayrulu LETTERS TO THE EDITOR Comments for Rivals and Personnaz (2000): Construction of confidence intervals for neural networks based on least squares estimation Jan Larsen and Lars Kai Hansen Response to comments for Rivals and Personnaz (2000): Construction of confidence intervals for neural networks based on least squares estimation I. Rivals and L. Personnaz BOOK REVIEWS Review of "Model systems and the neurobiology of associative learning" edited by J.E. Steinmetz, M.A. Gluck, and P.R. Solomon by Nestor A. Schmajuk Review of "Oscillations in neural systems" edited by D.S. Levine, V.R. Brown, and V.T. 
Shirey by Andrzej Przybyszewski and Mark Kon CURRENT EVENTS ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. ---------------------------------------------------------------------------- Membership Type INNS ENNS JNNS ---------------------------------------------------------------------------- membership with $80 (regular) SEK 660 (regular) Y 13,000 (regular) Neural Networks (plus 2,000 enrollment fee) $20 (student) SEK 460 (student) Y 11,000 (student) (plus 2,000 enrollment fee) ----------------------------------------------------------------------------- membership without $30 SEK 200 not available to Neural Networks non-students (subscribe through another society) Y 5,000 (student) (plus 2,000 enrollment fee) ----------------------------------------------------------------------------- Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. 
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Takashi Nagano Faculty of Engineering Hosei University 3-7-2, Kajinocho, Koganei-shi Tokyo 184-8584 Japan 81 42 387 6350 (phone and fax) jnns at k.hosei.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From pam_reinagel at hms.harvard.edu Mon Feb 4 17:16:27 2002 From: pam_reinagel at hms.harvard.edu (Pamela Reinagel) Date: Mon, 4 Feb 2002 17:16:27 -0500 Subject: Natural Scenes conference Message-ID: 2nd and final post Deadline reminder: March 1, 2002 ------------------------------- A New Gordon Research Conference "Sensory coding and the natural environment: Probabilistic models of perception" June 30 - July 5, 2002 Mount Holyoke College, MA Pamela Reinagel & Bruno Olshausen, Chairs List of speakers and talk titles, as well as instructions on how to apply, are available at: http://www.klab.caltech.edu/~pam/NSS2002.htm. This conference will bring together researchers from diverse disciplines to discuss the statistical structure of natural sensory stimuli, and how nervous systems exploit these statistics to form useful representations of the environment. Topics include sensory neurophysiology, perceptual psychology, and the mathematics of signal statistics, applied to a variety of sensory modalities and organisms. Applications received by MARCH 1, 2002 will receive full consideration. (Late applications may be considered if there is space available.) From butz at illigal.ge.uiuc.edu Mon Feb 4 17:28:22 2002 From: butz at illigal.ge.uiuc.edu (Martin Butz) Date: Mon, 4 Feb 2002 16:28:22 -0600 (CST) Subject: CFP: Adaptive Behavior in Anticipatory Learning Systems Workshop (ABiALS 2002) Message-ID: (We apologize if you received more than one copy of this message) ########################################################################### C A L L F O R P A P E R S ABiALS Workshop 2002 Adaptive Behavior in Anticipatory Learning Systems ########################################################################### August 11., 2002 Edinburgh, Scotland http://www-illigal.ge.uiuc.edu/ABiALS to be held during the seventh international conference on Simulation of Adaptive Behavior (SAB'02) http://www.isab.org.uk/sab02/ This workshops aims for an interdisciplinary gathering of people interested in how anticipations can guide behavior as well as how an anticipatory influence can be implemented in an adaptive behavior system. Particularly, we are looking for adaptive behavior systems that incorporate some online anticipation mechanisms. ___________________________________________________________________________ Aim and Objectives: Most of the research over the last years in artificial adaptive behavior with respect to model learning and anticipatory behavior has focused on the model learning side. Research is particularly engaged in online generalized model learning. Up to now, though, exploitation of the model has been done mainly to show that exploitation is possible or that an appropriate model exists in the first place. Only very few applications exist that show the utility of the model for the simulation of anticipatory processes and a consequent adaptive behavior. The aim of this workshop is to bring together researchers that are interested in anticipatory processes and essentially anticipatory adaptive behavior. 
It is aimed for an interdisciplinary gathering that brings together researchers from distinct areas so as to discuss the different guises that takes anticipation in these different perspectives. But the workshop intends to focus on anticipations in the form of low-level computational processes rather than high-level processes such as explicit planning. ___________________________________________________________________________ Essential questions: * How can anticipations influence the adaptive behavior of an artificial learning system? * How can anticipatory adaptive behavior be implemented in an artificial learning system? * How does an incomplete model influence anticipatory behavior? * How do anticipations guide further model learning? * How do anticipations control attention? * Can anticipations be used for the detection of special environmental properties? * What are the benefits of anticipations for adaptive behavior? * What is the trade-off between simple bottom-up stimulus-response driven behavior and more top-down anticipatory driven behavior? * In what respect does anticipation mediate between low-level environmental processing and more complex cognitive simulation? * What role do anticipations play for the implementation of motivations and emotions? ___________________________________________________________________________ Submission: Submissions for the workshop should address or at least be related to one of the questions listed above. However, other approaches to anticipatory adaptive behavior are encouraged as well. The workshop is not limited to one particular type of anticipatory learning system or a particular representation of anticipations. However, the learning system should learn its anticipatory representation online rather than being provided by a model of the world beforehand. Nonetheless, background knowledge of a typical environment can be incorporated (and is probably inevitably embodied in the provided sensors, actions, and the coding in any adaptive system). Since this is a full day workshop, we hope to be able to provide more time for presentations and discussions. In that way, the advantages and disadvantages of the different learning systems should become clearer. It is also aimed for several discussion sessions in which anticipatory influences will be discussed in a broader sense. Papers will be reviewed for acceptance by the program committee and the organizers. Papers should be submitted electronically to one of the organizers via email in pdf or ps format. Electronic submission is strongly encouraged. If you cannot submit your contribution electronically, please contact one of the organizers. Submitted papers should be between 10 and 20 pages in 10pt, one-column format. The LNCS Springer-Verlag style is preferred (see http://www.springer.de/comp/lncs/authors.html). Submission deadline is the 31st of March 2002. Dependent on the quality and number of contributions we hope to be able to publish Post-Workshop proceedings as either a Springer LNAI volume or a special issue of a journal. 
For more information please refer to http://www-illigal.ge.uiuc.edu/ABiALS/ ___________________________________________________________________________ Important Dates: 31.March 2002: Deadline for Submissions 15.May 2002: Notification of Acceptance 15.June 2002: Camera Ready Version for SAB Workshop Proceedings 11.August 2002: Workshop ABiALS ___________________________________________________________________________ Program Committee: Emmanuel Daucé Faculté des sciences du sport Université de la Méditerranée Marseille, France Ralf Moeller Cognitive Robotics Max Planck Institute for Psychological Research Munich, Germany Wolfgang Stolzmann DaimlerChrysler AG Berlin, Germany Jun Tani Lab. for Behavior and Dynamic Cognition Brain Science Institute, RIKEN 2-1 Hirosawa, Wako-shi, Saitama, 351-0198 Japan Stewart W. Wilson President Prediction Dynamics USA ___________________________________________________________________________ Organizers: Martin V. Butz, Illinois Genetic Algorithms Laboratory (IlliGAL), University of Illinois at Urbana-Champaign, Illinois, USA also: Department of Cognitive Psychology University of Wuerzburg, Germany butz at illigal.ge.uiuc.edu http://www-illigal.ge.uiuc.edu/~butz Pierre Gérard, AnimatLab, University Paris VI, Paris, France pierre.gerard at lip6.fr http://animatlab.lip6.fr/Gerard Olivier Sigaud AnimatLab, University Paris VI, Paris, France olivier.sigaud at lip6.fr http://animatlab.lip6.fr/Sigaud From jose at psychology.rutgers.edu Tue Feb 5 17:31:37 2002 From: jose at psychology.rutgers.edu (Stephen Hanson) Date: Tue, 05 Feb 2002 17:31:37 -0500 Subject: POSTDOC Available--- RUTGERS UNIVERSITY --RUMBA LABS Message-ID: <3C605D49.514DE31@psychology.rutgers.edu> COGNITIVE/COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION at RUTGERS UNIVERSITY, Newark Campus. The Rutgers University Mind/Brain Analysis (RUMBA) Project anticipates making one postdoctoral appointment, which is to begin in the SUMMER (June/July) of 2002. This position is for a minimum of 2 years, with the possibility of continuation for 1 more year, and will be in the area of specialization of cognitive neuroscience, with emphasis on the development of new paradigms and methods in neuroimaging, mathematical modeling, signal processing or data analysis in functional brain imaging. Particular interest is in methods and algorithms for fusion of EEG/fMRI. Applications are welcomed beginning immediately and review will continue until the position is filled. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send a CV, three letters of recommendation and 1 reprint to Professor S.J. Hanson, Department of Psychology, Rutgers University, Newark, NJ 07102. Email enquiries can be made to jose at psychology.rutgers.edu; please put "RUMBA POSTDOC" in your subject field. Also see http://www.rumba.rutgers.edu.
From tbl at cin.ufpe.br Wed Feb 6 11:32:31 2002 From: tbl at cin.ufpe.br (Teresa Bernarda Ludermir) Date: Wed, 06 Feb 2002 14:32:31 -0200 Subject: First CFP SBRN 2002 Message-ID: <3C615A9F.4885B75F@cin.ufpe.br> --------------------------- Apologies for cross-posting --------------------------- FIRST CALL FOR PAPERS ********************************************************************** SBRN'2002 - VII BRAZILIAN SYMPOSIUM ON NEURAL NETWORKS (http://www.cin.ufpe.br/~sbiarn02) Recife, November 11-14, 2002 ********************************************************************** The biannual Brazilian Symposium on Artificial Neural Networks (SBRN) - of which this is the 7th event - is a forum dedicated to Neural Networks (NNs) and other models of computational intelligence. The emphasis of the Symposium will be on original theories and novel applications of these computational models. The Symposium welcomes paper submissions from researchers, practitioners, and students worldwide. The proceedings will be published by the IEEE Computer Society. Selected, extended, and revised papers from SBRN'2002 will be also considered for publication in a special issue of the International Journal of Neural Systems and of the Journal of Intelligent and Fuzzy Systems. SBRN'2002 is sponsored by the Brazilian Computer Society (SBC) and co-sponsored by SIG/INNS/Brazil Special Interest Group of the International Neural Networks Society in Brazil. It will take place November 11-14, and will be held in Recife at a beach resort. Recife, located on the northeast coast of Brazil, is known as the "Brazilian Venice" because of its many canals and waterways and the innumerable bridges that span them. It is the major gateway to the Northeast with regular flights to all major cities in Brazil as well as Lisbon, London, Frankfurt, and Miami. See more information about the place ( http://www.braziliantourism.com.br/pe-pt1-en.html) that will host the event. SBRN'2002 will be held in conjunction with the XVI Brazilian Symposium on Artificial Intelligence (http://www.cin.ufpe.br/~sbiarn02) (SBIA). SBIA has its main focus on symbolic AI. Crossfertilization of these fields will be strongly encouraged. Both Symposiums will feature keynote speeches and tutorials by world-leading researchers. The deadline for submissions is April 15, 2002. More details on paper submission and conference registration will be coming soon. 
Sponsored by the Brazilian Computer Society (SBC) Co-Sponsored by SIG/INNS/Brazil Special Interest Group of the International Neural Networks Society in Brazil Organised by the Federal University of Pernambuco (UFPE)/Centre of Informatics (CIn) Published by the IEEE Computer Society Deadlines: Submission: 15 April 2002 Acceptance: 17 June 2002 Camera-ready: 22 August 2002 Non-exhaustive list of topics which will be covered during SBRN'2002: Applications: finances, data mining, neurocontrol, time series analysis, bioinformatics; Architectures: cellular NNs, hardware and software implementations, new models, weightless models; Cognitive Sciences: adaptive behaviour, natural language, mental processes; Computational Intelligence: evolutionary systems, fuzzy systems, hybrid systems; Learning: algorithms, evolutionary and fuzzy techniques, reinforcement learning; Neurobiological Systems: bio-inspired systems, biologically plausible networks, vision; Neurocontrol: robotics, dynamic systems, adaptive control; Neurosymbolic processing: hybrid approaches, logical inference, rule extraction, structured knowledge; Pattern Recognition: signal processing, artificial/computational vision; Theory: radial basis functions, Bayesian systems, function approximation, computability, learnability, computational complexity. Paper Submission: Prospective authors are invited to submit 6-page, 11-point, double-column papers (postscript or pdf format) written in English, Portuguese or Spanish - see the style file at http://computer.org/cspress/instruct.htm. More details on paper submission and conference registration will be coming soon. The first volume of the Proceedings will be published by IEEE Computer Society Press, in time for distribution at the symposium. It will include only accepted papers written in English and abstracts of accepted papers written in Portuguese or Spanish. A second volume will be issued as a CD-ROM, and will contain accepted papers originally written in Portuguese or Spanish. General Chair: Teresa B. Ludermir (UFPE/CIn, Brazil) tbl at cin.ufpe.br Program Chair: Marcilio C. P. de Souto (UFPE/CIn, Brazil) mcps at cin.ufpe.br Publications Chair: Marley Vellasco (PUC-RJ, Brazil) marley at ele.puc-rio.br Organising Committee Teresa B. Ludermir (UFPE, BR) Marcílio C. P. de Souto (UFPE, BR) Steering Committee: Aluizio F. R. Araujo (USP-SC, BR) Antonio de P. Braga (UFMG, BR) Andre P. L. F. de Carvalho (USP-SC, BR) Teresa B. Ludermir (UFPE, BR) Carlos H. C. Ribeiro (ITA, BR) Marcílio C. P. de Souto (UFPE, BR) Marley Vellasco (PUC-RJ, BR) Gerson Zaverucha (UFRJ, BR) Program Committee (Preliminary) Igor Aleksander (Imperial College, UK) Aluizio F. R. Araujo (USP-SC, BR) Pierre Baldi (Univ. of California at Irvine, USA) Valmir Barbosa (UFRJ, BR) Allan Kardec D. Barros (UFMA, BR) Antonio de P. Braga (UFMG, BR) Anne M. de P. Canuto (UFRN, BR) Otavio Carpiteiro (EFEI, BR) Andre P. L. F. Carvalho (USP-SC, BR) Alejandro Ceccatto (Univ. of Rosario, AR) Phillipe DeWilde (Imperial College, UK) Paulo M. Engel (UFRGS, BR) Felipe Franca (UFRJ, BR) Fernando Gomide (UNICAMP, BR) Maria Eunice Gonzales (UNESP-Marilia, BR) Stephen Grossberg (Boston University, USA) Bart Kosko (University of Southern California, USA) Teresa B. Ludermir (UFPE, BR) Wolfgang Maass (Technische Univ. Graz, AUSTRIA) Marcio L. de Andrade Netto (UNICAMP, BR) Jose R. C. Piqueira (USP, BR) Jose Príncipe (Univ. of Florida, USA) Carlos H. C. Ribeiro (ITA, BR) Jude W. Shavlik (Univ. of Wisconsin, USA) Marcilio C. P.
de Souto (UFPE, BR) Harold Szu (Univ. of SW Lousiana, USA) Germano C. Vasconcelos (UFPE, BR) Marley Vellasco (PUC-RJ, BR) Takashi Yoneyama (ITA, BR) Gerson Zaverucha (UFRJ, BR) Jack M. Zurada (Univ. of Louisville, USA) From samengo at cab.cnea.gov.ar Wed Feb 6 12:35:47 2002 From: samengo at cab.cnea.gov.ar (=?iso-8859-1?Q?Ines?= Samengo) Date: Wed, 06 Feb 2002 14:35:47 -0300 Subject: paper on limited sampling Message-ID: <3C616973.3E55650E@cab.cnea.gov.ar> Dear connectionists, the following paper may be of interest to you. Thank you very much, Ines. Estimating probabilities from experimental frequencies Ines Samengo, to be published in Physical Review E, 2002 Estimating the probability distribution 'q' governing the behaviour of a certain variable by sampling its value a finite number of times most typically involves an error. Successive measurements allow the construction of a histogram, or frequency count 'f', of each of the possible outcomes. In this work, the probability that the true distribution be 'q', given that the frequency count 'f' was sampled, is studied. Such a probability may be written as a Gibbs distribution. A thermodynamic potential, which allows an easy evaluation of the mean Kullback-Leibler divergence between the true and measured distribution, is defined. For a large number of samples, the expectation value of any function of 'q' is expanded in powers of the inverse number of samples. As an example, the moments, the entropy and the mutual information are analyzed. http://www.cab.cnea.gov.ar/users/samengo/pub.html -- ______________________________________________________ Ines Samengo samengo at cab.cnea.gov.ar http://www.cab.cnea.gov.ar/users/samengo/samengo.html tel: +54 2944 445100 - fax: +54 2944 445299 Centro Atomico Bariloche (8.400) San Carlos de Bariloche Rio Negro, Argentina ______________________________________________________ From wolfskil at MIT.EDU Wed Feb 6 15:38:36 2002 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Wed, 06 Feb 2002 15:38:36 -0500 Subject: book announcement--Herbrich Message-ID: <5.0.2.1.2.20020206153729.00ae5470@po14.mit.edu> I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/026208306X/ Thank you! Best, Jud Learning Kernel Classifiers Theory and Algorithms Ralf Herbrich Linear classifiers in kernel spaces have emerged as a major topic within the field of machine learning. The kernel technique takes the linear classifier--a limited, but well-established and comprehensively studied model--and extends its applicability to a wide range of nonlinear pattern-recognition tasks such as natural language processing, machine vision, and biological sequence analysis. This book provides the first comprehensive overview of both the theory and algorithms of kernel classifiers, including the most recent developments. It begins by describing the major algorithmic advances: kernel perceptron learning, kernel Fisher discriminants, support vector machines, relevance vector machines, Gaussian processes, and Bayes point machines. Then follows a detailed introduction to learning theory, including VC and PAC-Bayesian theory, data-dependent structural risk minimization, and compression bounds. Throughout, the book emphasizes the interaction between theory and algorithms: how learning algorithms work and why. The book includes many examples, complete pseudo code of the algorithms presented, and an extensive source code library. 
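As a rough illustration of the kernel idea described above, here is a minimal kernel-perceptron sketch in Python. This code is not taken from the book; the RBF kernel choice, the toy data set and all parameter values are assumptions made purely for illustration.

import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    # Gaussian (RBF) kernel: k(x, z) = exp(-gamma * ||x - z||^2)
    return np.exp(-gamma * np.sum((x - z) ** 2))

def train_kernel_perceptron(X, y, kernel, epochs=20):
    # Dual (kernel) perceptron: one coefficient alpha_i per training point;
    # the decision function is f(x) = sum_i alpha_i * y_i * k(x_i, x).
    n = len(X)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            f = sum(alpha[j] * y[j] * kernel(X[j], X[i]) for j in range(n))
            if y[i] * f <= 0:   # mistake-driven update, as in the linear perceptron
                alpha[i] += 1.0
    return alpha

def predict(x, X, y, alpha, kernel):
    f = sum(alpha[j] * y[j] * kernel(X[j], x) for j in range(len(X)))
    return 1 if f >= 0 else -1

# Toy data that is not linearly separable in the input space:
# points near the origin are labelled +1, points far from it -1.
X = np.array([[0.0, 0.0], [0.1, -0.1], [-0.2, 0.1],
              [1.5, 1.5], [-1.6, 1.4], [1.4, -1.5]])
y = np.array([1, 1, 1, -1, -1, -1])

alpha = train_kernel_perceptron(X, y, rbf_kernel)
print([predict(x, X, y, alpha, rbf_kernel) for x in X])

With a linear kernel k(x, z) = x . z this reduces to the ordinary perceptron; swapping in other kernel functions, without changing the base algorithm, is what gives the family of kernel methods listed above their flexibility.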
Ralf Herbrich is a Postdoctoral Researcher in the Machine Learning and Perception Group at Microsoft Research Cambridge and a Research Fellow of Darwin College, University of Cambridge. 7 x 9, 384 pp., 0-262-08306-X Adaptive Computation and Machine Learning series Jud Wolfskill Associate Publicist MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617.253.2079 617.253.1709 fax wolfskil at mit.edu From ken at phy.ucsf.edu Thu Feb 7 14:18:19 2002 From: ken at phy.ucsf.edu (Ken Miller) Date: Thu, 7 Feb 2002 11:18:19 -0800 Subject: Paper available: Neural Noise and Power-Law Nonlinearities Message-ID: <15458.54011.801331.179542@coltrane.ucsf.edu> The following paper is now available from ftp://ftp.keck.ucsf.edu/pub/ken/miller_troyer02.pdf or from http://www.keck.ucsf.edu/~ken (click on 'Publications', then on 'Models of Neuronal Integration and Circuitry') This is a final draft of a paper that appeared as Journal of Neurophysiology 87, 653-659 (2002). Neural Noise Can Explain Expansive, Power-Law Nonlinearities in Neural Response Functions Kenneth D. Miller and Todd W. Troyer Abstract: Many phenomenological models of the responses of simple cells in primary visual cortex have concluded that a cell's firing rate should be given by its input raised to a power greater than one. This is known as an expansive power-law nonlinearity. However, intracellular recordings have shown that a different nonlinearity, a linear-threshold function, appears to give a good prediction of firing rate from a cell's low-pass-filtered voltage response. Using a model based on a linear-threshold function, Anderson et al. (2000) showed that voltage noise was critical to converting voltage responses with contrast-invariant orientation tuning into spiking responses with contrast-invariant tuning. We present two separate results clarifying the connection between noise-smoothed linear-threshold functions and power-law nonlinearities. First, we prove analytically that a power-law nonlinearity is the only input-output function that converts contrast-invariant input tuning into contrast-invariant spike tuning. Second, we examine simulations of a simple model that assumes (i) instantaneous spike rate is given by a linear-threshold function of voltage, and (ii) voltage responses include significant noise. We show that the resulting average spike rate is well described by an expansive power law of the average voltage (averaged over multiple trials), provided that average voltage remains less than about 1.5 standard deviations of the noise above threshold. Finally, we use this model to show that the noise levels recorded by Anderson et al. (2000) are consistent with the degree to which the orientation tuning of spiking responses is more sharply tuned than the orientation tuning of voltage responses. Thus, neuronal noise can robustly generate power-law input-output functions of the form frequently postulated for simple cells. Kenneth D. Miller telephone: (415) 476-8217 Associate Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 From wolfskil at MIT.EDU Thu Feb 7 14:21:00 2002 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Thu, 07 Feb 2002 14:21:00 -0500 Subject: book announcement: Schlkopf Message-ID: <5.0.2.1.2.20020207141501.02fd7bc8@po14.mit.edu> MIT has recently published another book I thought Connectionist readers might be interested in. 
For more information, please visit http://mitpress.mit.edu/0262194759/ Thank you! Best, Jud Learning with Kernels Support Vector Machines, Regularization, Optimization, and Beyond Bernhard Schlkopf and Alexander J. Smola In the 1990s, a new type of learning algorithm was developed, based on results from statistical learning theory: the Support Vector Machine (SVM). This gave rise to a new class of theoretically elegant learning machines that use a central concept of SVMs--kernels--for a number of learning tasks. Kernel machines provide a modular framework that can be adapted to different tasks and domains by the choice of the kernel function and the base algorithm. They are replacing neural networks in a variety of fields, including engineering, information retrieval, and bioinformatics. Learning with Kernels provides an introduction to SVMs and related kernel methods. Although the book begins with the basics, it also includes the latest research. It provides all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms and to understand and apply the powerful algorithms that have been developed over the last few years. 8 x 10, 632 pp., 138 illus., cloth, ISBN 0-262-19475-9 Adaptive Computation and Machine Learning series Jud Wolfskill Associate Publicist MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617.253.2079 617.253.1709 fax wolfskil at mit.edu From becker at meitner.psychology.mcmaster.ca Fri Feb 8 22:59:34 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Fri, 8 Feb 2002 22:59:34 -0500 (EST) Subject: FACULTY POSITION IN COMPUTATIONAL BIOLOGY Message-ID: ASSISTANT OR ASSOCIATE PROFESSOR COMPUTATIONAL BIOLOGY McMASTER UNIVERSITY McMaster University is a research-intensive institution and leading centre for biological and biomedical research. The Department of Biology is expanding and over the next year will fill six new faculty positions. We invite applications for a tenure-track position in Computational Biology at the Assistant or Associate Professor level, effective July 1, 2002. Candidates must hold a Ph.D. in Biology or a related field, possess at least one year of postdoctoral experience, and have a productive research record in an area of Computational Biology. We encourage applications from a broad range of individuals applying mathematics, statistics, and/or computer science to the study of biological questions. Research areas include but are not limited to bioinformatics, basic developmental biology, genomics, molecular biology, molecular evolution, neurobiology, ecology, population biology and population genetics. In addition, candidates that make use of parallel programming/computers are particularly encouraged to apply and will be able to take advantage of the local Shared Hierarchical Academic Research Computing network (SHARCNET) cluster of over 300 processors. The successful applicant will be expected to establish and maintain an independent and externally funded research program and contribute to the education of undergraduate and graduate students. Applicants should submit a curriculum vitae, a statement of their research interests, a statement of their teaching interests and experience, and three of their most important publications. Applicants should arrange for three letters of recommendation to be sent to - Dr. T.M. 
Finan, Chair of Biology, McMaster University, Department of Biology, 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada. Evaluation of applicants will begin March 25, 2002 but the position will remain open until filled. Please refer to the position to which you are applying in your covering letter. (see http://www.science.mcmaster.ca/Biology/Dept.html) All qualified candidates are encouraged to apply; however Canadian citizens and permanent residents will be given priority. McMaster University is committed to employment equity and encourages applications from qualified candidates, including, members of visible minorities, aboriginal peoples, persons with disabilities and women. From jose.dorronsoro at iic.uam.es Fri Feb 8 13:10:11 2002 From: jose.dorronsoro at iic.uam.es (Jose Dorronsoro) Date: Fri, 08 Feb 2002 19:10:11 +0100 Subject: ICANN 2002 Submission Deadline Extension Message-ID: <1.5.4.32.20020208181011.01274134@iic.uam.es> Note: efforts have been made to avoid duplicate postings of this message. Apologies if, nevertheless, you are getting them. ICANN 2002 Submission Deadline Extension Because of numerous requests, the February 15 deadline for submission of papers to the 12th International Conference on Artificial Neural Networks, ICANN 2002, has been extended to February 28. Acceptance or rejection will be notified by April 15, 2002 Submissions must be in postscript or pdf format and can be either uploaded or sent by surface mail or e-mail attach. Please check the author's instructions in the ICANN 2002 web page, www.ii.uam.es/icann2002. Notice also that, in any case, a Unique Tracking Number must be obtained first for each submission. The very simple procedure for UTN getting can also be started from the ICANN 2002 web page. Jose Dorronsoro ETS Informatica Universidad Autonoma de Madrid 28049 Madrid jose.dorronsoro at iic.uam.es Tlfno: 34 91 348 2329 Fax: 34 91 348 2334 From gbarreto at sel.eesc.sc.usp.br Sun Feb 10 21:33:38 2002 From: gbarreto at sel.eesc.sc.usp.br (Guilherme de Alencar Barreto) Date: Sun, 10 Feb 2002 23:33:38 -0300 (EST) Subject: Papers on Unsupervised Temporal Sequence Processing Message-ID: Dear Connectionists, The following three papers, on unsupervised temporal sequence processing, are available from http://www.sel.eesc.sc.usp.br/lasi/www/gbarreto/publicacoes.htm 1) Arajo, A.F.R. and Barreto, G.A. (2002). Context in temporal sequence processing: A self-organizing approach and its application to robotics. IEEE Transactions on Neural Networks, Vol. 13, No. 1, pp. 45-57, January Issue. Abstract: A self-organizing neural network for learning and recall of complex temporal sequences is developed and applied to robot trajectory planning. We consider trajectories with both repeated and shared states. Both cases give rise to ambiguities during reproduction of stored trajectories which are resolved via temporal context information. Feedforward weights encode spatial features of the input trajectories, while the temporal order is learned by lateral weights through a time-delayed Hebbian learning rule. After training is completed, the network model operates in an anticipative fashion by always recalling the successor of the current input state. Redundancy in sequence representation improves the robustness of the network to noise and faults. The network uses memory resources efficiently by reusing neurons that have previously stored repeated/shared states. 
Simulations have been carried out to evaluate the performance of the network in terms of trajectory reproduction, convergence time and memory usage, tolerance to fault and noise, and sensitivity to trajectory sampling rate. The results show that the network model is fast, accurate and robust. Its performance is discussed in comparison with other neural network models. Keywords: Context, temporal sequences, self-organization, Hebbian learning, robotics, trajectory planning. 2) Barreto, G.A. and Araújo, A.F.R. (2001). Time in self-organizing maps: An overview of models. International Journal of Computer Research, Special Issue on Neural Networks: Past, Present and Future, 10(2):139-179. Abstract: We review a number of neural models of self-organizing feature maps designed to process sequential patterns in engineering and cognitive applications. This type of pattern inherently holds information of both a spatial and a temporal nature. The latter includes the temporal order, relative duration of the time interval, and temporal correlations of the items in the sequence. We present the main concepts related to the processing of spatiotemporal sequences and then discuss how the time dimension can be incorporated into the network dynamics through the use of various short-term memory models. The vast majority of the models are based on Kohonen's self-organizing map, being organized according to the network architecture and learning rules, and presented in nearly chronological order. We conclude the paper by suggesting possible directions for further research on temporal sequence processing through self-organizing maps. Keywords: Self-organizing maps, unsupervised learning, time dimension, temporal sequence, short-term memory, temporal context. 3) Barreto, G.A. and Araújo, A.F.R. (2001). Unsupervised learning and temporal context to recall complex robot trajectories. International Journal of Neural Systems, 11(1):11-22. Abstract: An unsupervised neural network is proposed to learn and recall complex robot trajectories. Two cases are considered: (i) A single trajectory in which a particular arm configuration (state) may occur more than once, and (ii) trajectories sharing states with each other. Ambiguities occur in both cases during recall of such trajectories. The proposed model consists of two groups of synaptic weights trained by competitive and Hebbian learning laws. They are responsible for encoding spatial and temporal features of the input sequences, respectively. Three mechanisms allow the network to deal with repeated or shared states: local and global context units, neurons disabled from learning, and redundancy. The network reproduces the current and the next state of the learned sequences and is able to resolve ambiguities. The model was simulated over various sets of robot trajectories in order to evaluate learning and recall, trajectory sampling effects and robustness. Guilherme de A. Barreto Dept. of Electrical Engineering University of São Paulo (USP) São Carlos, SP, BRAZIL FAX: 55- 16 - 273 9372 PHONE: 55- 16 - 273 9357 From d.mareschal at bbk.ac.uk Mon Feb 11 06:16:19 2002 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Mon, 11 Feb 2002 12:16:19 +0100 Subject: Phd Studentships Message-ID: Readers of this list may be interested in the following PhD positions. PLEASE DO NOT RESPOND DIRECTLY TO ME. --------------------------- The School of Psychology, Birkbeck College has a number of PhD studentships on offer for PhDs starting in October 2002.
Birkbeck College is part of the University of London and is situated in the central Bloomsbury area of London, in close proximity to University College London, the Institute of Cognitive Neuroscience, the Gatsby Computational Neuroscience Unit, the Institute of Child Health, and the Institute of Education. The School of Psychology has a very active, internationally recognised research programme with particular interests in cognitive sciences, cognitive neurosciences, computational neuroscience, and cognitive and social development. However, the School welcomes applications for studentships in all areas of psychology. For more information about the School's research profile and the studentships available, please visit our website: www.psyc.bbk.ac.uk OR contact: Ms Mina Daniel Postgraduate Administrator Tel.: 020 7631 6862 E-mail: s.daniel at psychology.bbk.ac.uk ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 (0)20 7631-6582/6226 reception: 6207 fax +44 (0)20 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From becker at meitner.psychology.mcmaster.ca Mon Feb 11 16:54:29 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Mon, 11 Feb 2002 16:54:29 -0500 (EST) Subject: COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION Message-ID: COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION A postdoctoral candidate is sought to develop computational models of the role of ascending neuromodulatory systems in both learning and motivated behaviour.
Topics of interest include the role of dopamine in gating signal transmission in pathways involved in generating motivated action, development of fears and paranoias in hyperdopaminergic conditions, learning aversive and emotional conditioned responses, and the biological bases of emotional memory formation in structures including the hippocampus and amygdala. It is anticipated that this project will lead to fundamental contributions to the literature on models of self-organization, by combining unsupervised and semi-supervised (reinforcement) learning methods. The work will also have important implications for the team's research in schizophrenia, wherein the actions of new classes of antipsychotic drugs are being investigated both in clinical trials and using brain imaging and behavioural pharmacology. The candidate must have a PhD in cognitive science, computer science, or a related discipline, and experience in neural network modelling. The model development will proceed in close collaboration with researchers at the CAMH and University of Toronto investigating learning and memory using behavioural pharmacology. Depending upon the interests of the candidate, opportunities also exist to acquire training in human functional neuroimaging, and conduct studies with clinical populations. The position is available for a minimum of two years. This research is part of a collaborative effort involving Dr. S. Becker, Department of Psychology, McMaster University (computational neuroscience), Dr. S. Kapur, Centre for Addiction and Mental Health (CAMH) and Department of Psychiatry, University of Toronto (behavioural pharmacology, human neuroimaging with PET and fMRI) and Dr. P. Fletcher, Department of Psychology, University of Toronto, and CAMH (animal models and behavioural pharmacological studies). For further information on the research interests of the team see www.science.mcmaster.ca/Psychology/sb.html http://www.camh.net/research/research_ar2001/schizophrenia.html http://www.rotman-baycrest.on.ca/content/people/profiles/kapur.html http://www.camh.net/research/research_ar2001/biopsychology.html Interested candidates should send a letter of intention, a CV and two letters of recommendation to Dr. S. Becker at the address below. Dr. Sue Becker Department of Psychology McMaster University 1280 Main Street West, Hamilton, Ont. L8S 4K1 becker at mcmaster.ca Fax: (905)529-6225 From reggia at cs.umd.edu Tue Feb 12 11:48:22 2002 From: reggia at cs.umd.edu (James A. Reggia) Date: Tue, 12 Feb 2002 11:48:22 -0500 (EST) Subject: Postdoc Position Message-ID: <200202121648.LAA12502@avion.cs.umd.edu> A two-year postdoc position is available within the context of the fellowship program described below. The position involves the use of neural networks, genetic algorithms/programming, or related computational methods to study hemispheric specialization or related aspects of the neurobiological basis of language. A strong computational background is expected for this position. Applications arriving by March 15, 2002, will receive full consideration, with a target starting date of this coming summer or early fall. Jim Reggia -------- Post-doctoral Fellowships: Cognitive Neuroscience of Language & its Disorders Two-year National Research Service Award fellowships are available at the University of Maryland, Baltimore and College Park campuses. 
Training opportunities will provide experience in the application of contemporary research methods (including computational modeling, cognitive neuropsychology, event-related potentials and functional neuroimaging) to the topic of normal and/or disordered language processing. Applicants with doctoral degrees in related basic science areas (computer science, neuroscience, linguistics, cognitive psychology, etc.) and clinical disciplines (speech/language pathology; clinical neuropsychology) are invited to apply. Applicants must be U.S. citizens or permanent residents to be considered, under the terms of the NRSA program. Inquiries may be directed to Rita Berndt at rberndt at umaryland.edu or to Jim Reggia at reggia at cs.umd.edu . To apply, send your C.V., the names and addresses of three referees, your contact information, and a statement of research interests and career goals to: James A. Reggia email: reggia at cs.umd.edu Department of Computer Science A. V. Williams Bldg. University of Maryland fax: (301) 405-6707 College Park MD 20742 USA Applications may be sent by mail, fax or email electronic attachments. From C.Campbell at bristol.ac.uk Wed Feb 13 11:47:01 2002 From: C.Campbell at bristol.ac.uk (Colin Campbell, Engineering Mathematics) Date: Wed, 13 Feb 2002 16:47:01 +0000 (GMT Standard Time) Subject: CFP: Bioinformatics Special Section/Analysis of Microarray Data Message-ID: The following may be of interest: Special Section of the Journal Bioinformatics on: Analysis of Microarray Data Organisers: Colin Campbell (University of Bristol) and Shayan Mukherjee (MIT) Microarray technology is rapidly accelerating progress in many areas of biomedical research. For the first time this technology gives a global view of the expression level of thousands of genes. This Special Section of Bioinformatics will focus on new algorithmic or theoretical techniques for analyzing such datasets. This Special Section was announced at the NIPS2001 Workshop on Machine Learning Techniques for Bioinformatics held at the Whistler Resort, British Columbia, Canada on December, 2001. Analysis of microarray data frequently utilizes machine learning techniques such as cluster analysis, classification, feature selection, regression, sample complexity, determination of network structures and feature dependencies, for example. However, we also welcome papers from researchers interested in analytical methods beyond machine learning (e.g. statistics) which may include techniques for evaluating the effect of noise, imputing missing values, discovering outliers, scoring features, etc. We welcome case studies in which the techniques described above are applied to new datasets, illustrating practical problems and the successful use of these methods. Further details can be found at: http://lara.enm.bris.ac.uk/cig/nips01/bioss.htm The deadline for submissions is *** 30th April 2002 ***. -------------------------------------------- Dr. Colin Campbell, Dept. of Engineering Mathematics, Bristol University, Bristol BS8 1TR, United Kingdom http://lara.enm.bris.ac.uk/cig/ Tel +44 (0) 117 928 9858 C.Campbell at bristol.ac.uk From skoenig at cc.gatech.edu Wed Feb 13 12:59:00 2002 From: skoenig at cc.gatech.edu (Sven Koenig) Date: Wed, 13 Feb 2002 12:59:00 -0500 (EST) Subject: CFP: SARA 2002 Message-ID: <200202131759.g1DHx0U16794@cleon.cc.gatech.edu> Our apologies if you receive more than one call for papers for SARA 2002 (Symposium on Abstraction, Reformulation and Approximation). 
The deadline for indicating an intent to submit is February 20, 2002. Cheers, Robert Holte Sven Koenig ---------------------------------------------------------------------- CALL FOR PAPERS SARA-2002 Symposium on Abstraction, Reformulation and Approximation Kananaskis Mountain Lodge, Kananaskis, Alberta, Canada August 2-4, 2002 (immediately after AAAI-2002) OVERVIEW SARA-2002 is an Artificial Intelligence symposium on all aspects of abstraction, reformulation, and approximation. Like past SARAs, it will consist of stimulating technical presentations spanning the traditional boundaries that fragment Artificial Intelligence research. Three invited speakers will give their perspectives on abstraction, reformulation, and approximation. Attendance is limited to approximately 50 participants. Some funding is available to subsidize the cost of graduate students whose research involves techniques of abstraction, reformulation or approximation. SARA-2002 will be situated amidst the spectacular Rocky Mountains of the Kananaskis Valley, 60 miles west of Calgary, Alberta, and 45 miles southeast of Banff, Alberta. To make it convenient for AAAI-2002 attendees to participate in SARA, a luxurious bus will drive from the AAAI conference site to the SARA site the afternoon of August 1. *** PAPER SUBMISSIONS *** Submissions are due on February 25, and may be either full papers or extended abstracts. Authors are requested to send a notification of intent to submit, together with a draft title and short abstract, by February 20, 2002 to sara-submission at cc.gatech.edu *** FOR MORE INFORMATION *** Additional information, including a complete call for papers, may be obtained from the symposium home page http://www.cs.ualberta.ca/~holte/SARA2002/ If you would like to receive updates about the conference, please send email to holte at cs.ualberta.ca and ask to be added to the SARA mailing list. We gratefully acknowledge support from AAAI and NASA. -- Robert Holte holte at cs.ualberta.ca Sven Koenig skoenig at cc.gatech.edu SARA-2002 co-chairs From mpp at us.ibm.com Thu Feb 14 09:59:45 2002 From: mpp at us.ibm.com (Michael Perrone) Date: Thu, 14 Feb 2002 09:59:45 -0500 Subject: IBM Graduate Summer Intern Positions in Handwriting Recognition Message-ID: _________________________________________________________________________________ Graduate Summer Intern Positions at IBM _________________________________________________________________________________ The Pen Technologies Group at the IBM T.J. Watson Research Center is looking for graduate students to fill summer R&D positions in the area of large-vocabulary, unconstrained, handwriting recognition. Candidates should have the following qualifications: - Currently enrolled in a PhD program in EE, CS, Math, Physics or similar field - Research experience in handwriting recognition or IR - Strong mathematics/probability background - Excellent programming skills (in C and/or C++ and/or Java) - Creativity Our current projects include: - HMM-based, unconstrained, handwriting recognition - Language and grammar modeling - Accurate, high-speed, search methods - Document understanding and processing - Pen computing - Handwritten document retrieval The IBM T.J. Watson Research Center is one of the top industrial laboratories in the world. We offer an exciting research environment with the opportunity to become involved in all aspects of cutting edge technology in the computer industry. We encourage your early reply as positions fill quickly. 
______________________ Please send CV's to: Michael P. Perrone mpp at us.ibm.com -or- Michael P. Perrone IBM T.J. Watson Research Center - 36-207 Route 134 Yorktown Heights, NY 10598 914-945-1779 From ddepi001 at umaryland.edu Fri Feb 15 14:29:45 2002 From: ddepi001 at umaryland.edu (Didier A. Depireux Dr) Date: Fri, 15 Feb 2002 14:29:45 -0500 (EST) Subject: Post-Doct: Encoding of Dynamic Spectrum in Auditory Cortex Message-ID: Several post-doctoral positions are available in the Department of Anatomy and Neurobiology of the School of Medicine of the Univ. of Maryland (in Baltimore) to work in the laboratory of Dr. Didier Depireux. The overall goal of the research is to determine how the shape of the acoustic spectrum is represented in the unit responses of auditory cortex of the awake and alert ferret. We develop system models to characterize response features that are extensions of the classical concepts of response areas and impulse response functions. The project could involve correlated psychophysical studies in ferrets and/or human subjects. The successful candidate will have training experience in either electrophysiology in animals or a strong interest in applying quantitative methods to the field of neuroscience. A representative paper of the techniques used can be found at http://www.isr.umd.edu/~didier/torcs.pdf A representative paper of the recording methods and results in cortex is http://www.isr.umd.edu/~didier/1220.pdf I prefer email applications. Please contact me for questions and applications at ddepi001 at umaryland.edu Didier -- Didier A Depireux ddepi001 at umaryland.edu 685 W.Baltimore Str http://neurobiology.umaryland.edu/depireux.htm Anatomy and Neurobiology Phone: 410-706-1272 (off) University of Maryland -1273 (lab) Baltimore MD 21201 USA Fax: 1-301-314-9920 From tewon at salk.edu Fri Feb 15 21:15:40 2002 From: tewon at salk.edu (Te-Won Lee) Date: Fri, 15 Feb 2002 18:15:40 -0800 Subject: ICA-2001 Proceedings Online Message-ID: <000a01c1b68f$d471f060$5293ef84@redmond.corp.microsoft.com> Third International Conference on Independent Component Analysis and Blind Signal Separation, December 9-12, 2001 - San Diego, California, USA. Editors: T.-W. Lee, T.-P. Jung, S. Makeig, and T. J. Sejnowski The complete Program and Proceedings of this conference are available on-line at: http://www.ica2001.org Information on how to obtain a CD ROM and hardcopies of the proceedings can also be found at this URL. Te-Won Te-Won Lee, Ph.D. Institute for Neural Computation - MC0523 University of California, San Diego La Jolla, CA 92039-0523, USA 858-534-9662 office 858-534-2014 fax http://rhythm.ucsd.edu/~tewon From: esann To: "Connectionists at cs.cmu.edu" References: From bogus@does.not.exist.com Fri Feb 15 11:14:43 2002 From: bogus@does.not.exist.com () Date: Fri, 15 Feb 2002 17:14:43 +0100 Subject: ESANN'2002 programme ( European Symposium on Artificial Neural Networks) Message-ID: ---------------------------------------------------- | | | ESANN'2002 | | | | 10th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 24-25-26, 2002 | | | | Preliminary programme | ---------------------------------------------------- The preliminary programme of the ESANN'2002 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate if you could add the above URL to your list; thank you very much! 
We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. Over the past 10 years the ESANN conference has become a major event in the field of neural computation. ESANN is a human-size conference focusing on fundamental aspects of artificial neural networks (theory, models, algorithms, links with statistics, data analysis, biological background,...). This year, 81 scientific communications will be presented, covering most areas of the neural computation field. The programme of the conference can be found at the URL http://www.dice.ucl.ac.be/esann, together with practical information about the conference venue, registration,... Other information can be obtained by sending an e-mail to esann at dice.ucl.ac.be . ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From harnad at coglit.ecs.soton.ac.uk Sat Feb 16 10:25:40 2002 From: harnad at coglit.ecs.soton.ac.uk (S.Harnad) Date: Sat, 16 Feb 2002 15:25:40 GMT Subject: Budapest Open Access Initiative Message-ID: <200202161525.PAA13050@coglit.ecs.soton.ac.uk> This message is addressed to scholars and scientists and it concerns the Budapest Open Access Initiative (BOAI) http://www.soros.org/openaccess launched on 14 February by George Soros's Open Society Institute. To be useful, research must be used. To be used (read, cited, applied, extended) it must be accessible. There are currently 20,000 peer-reviewed journals of scientific and scholarly research worldwide, publishing over 4 million articles per year, every single one of them given away for free by its researcher-authors and their research-institutions, with the sole goal of maximizing their uptake and usage by further researchers, and hence their impact on worldwide research, to the benefit of learning and of humanity. Yet access to those 4 million annual research articles can only be had for a fee. Hence they are accessible only to the lucky researchers at that minority of the world's research institutions that can pay for them. And even the wealthiest of these institutions can only afford a small and shrinking proportion of those annual 20,000 journals. The result is exactly as if all those 4 million articles had been written for royalties or fees, just the way most of the normal literature is written, rather than having been given away for free by their authors and their institutions for the benefit of research and humanity. As a consequence, other researchers' access to all this work, and hence its potential impact on and benefit to research progress, is being minimized by access tolls that most research institutions and individuals worldwide cannot afford to pay. Those access tolls were necessary, and hence justified, in the Gutenberg era of print-on-paper, with its huge real costs, and no alternatives.
But they are no longer necessary or justified, and are instead in direct conflict with what is best for research, researchers, and society, in today's PostGutenberg era of on-line-eprints, when virtually all of those Gutenberg costs have vanished, and those remaining costs can be covered in a way that allows open access. The Budapest Open Access Initiative is dedicated to freeing online access to this all-important but anomalous (because give-away) literature, now that open access has at long last become possible, by (I) providing universities with the means of freeing online access to their own annual peer-reviewed research output (as published in the 20,000 established journals) through institutional self-archiving, as well as by (II) providing support for new alternative journals that offer open online access to their full text contents directly (and for established journals that are committed to making the transition to offering open full-text access online). It is entirely fitting that it should be George Soros's Open Society Institute that launches this initiative to open access to the world's refereed research literature at last. Open access is now accessible, indeed already overdue, at a mounting cost in lost benefits to research and to society while we delay implementing it. What better way to open society than to open access to the fruits of its science and scholarship, already freely donated by its creators, but until now not freely accessible to all of its potential users? Fitting too is the fact that this initiative should originate from a part of the world that has known all too long and all too well the privations of a closed society and access denial. Please have a look at the BOAI at http://www.soros.org/openaccess and, if you or your organization are implementing, or planning to implement either Strategy I or Strategy II, I hope you will sign the BOAI, either as an individual or an organization. Below, I append links to some of the press coverage of the BOAI so far. Sincerely, Stevan Harnad Declan Butler, Soros Offers Access to Science Papers (for Nature) http://makeashorterlink.com/?U21535A6 Ivan Noble, Boost for Research Paper Access (for BBC) http://news.bbc.co.uk/hi/english/sci/tech/newsid_1818000/1818652.stm Michael Smith, Soros Backs Academic Rebels (for UPI) http://www.upi.com/view.cfm?StoryID=12022002-031227-9710r [Alexander Grimwade, Open Societies Need Open Access (The Scientist) http://www.the-scientist.com/yr2002/feb/comm_020218.html ] [Denis Delbecq, L'abordage des revvues scientifiques (Liberation, Paris) http://www.liberation.com/quotidien/semaine/020214-050019088SCIE.html ] [http://slashdot.org/] From klikharev at notes.cc.sunysb.edu Mon Feb 18 07:30:07 2002 From: klikharev at notes.cc.sunysb.edu (Konstantin Likharev) Date: Mon, 18 Feb 2002 07:30:07 -0500 Subject: Postdoctoral position References: <12415.1013994928@ammon.boltz.cs.cmu.edu> Message-ID: <002901c1b878$00175540$9d383181@likharev> I am looking for an outstanding postdoctoral candidate to work at Stony Brook University (www.sunysb.edu) on the development of large scale self-evolving neural networks based on nanoscale latching switches. The basic ideas of this effort are described in our recent IJCNN'01 presentation (see http://rsfq1.physics.sunysb.edu/~likharev/nano/IJCNN'01.pdf). 
The person in this position will be responsible for the network design and simulation (using, in particular, our new 162-processor cluster Njal), and is supposed to work in close contact with other members of our multi-disciplinary Stony Brook - centered collaboration. The collaboration is working on all aspects of the development of self-evolving networks, including their conceptual design and globally supervised training (Dr. J. Barhen of ORNL, Prof. M. Bender and myself), CMOS VLSI prototyping (Prof. A. Leuciuc), design of nanoscale single-electron latching switches (Prof. P. Allen and myself), and their molecular implementation (Dr. B. Brunschwig of BNL, Prof. J. Lukens, Prof. A. Mayr). The position is initially for one year, with a nearly-automatic extension for at least one more year if the work is running smoothly. Effort compensation is in the range $35-50K/yr (depending on candidate's credentials) plus a generous fringe benefit package. Stony Brook University is an EO/AA employer. Interested persons should send me their C.V.'s including a list of publications and names of 3 references. (No reference letters without a specific request, please.) Additional questions by e-mail are welcome. Regards, K. Likharev ________________________________________________________________ Konstantin K. Likharev Professor of Physics State University of New York at Stony Brook Stony Brook, NY 11794-3800 Phone 631-632-8159 Fax 631-632-4977 E-mail klikharev at notes.cc.sunysb.edu (or likharev at rsfq1.physics.sunysb.edu) Web page http://rsfq1.physics.sunysb.edu/~likharev/personal/index.html From giugliano at pyl.unibe.ch Tue Feb 19 10:36:02 2002 From: giugliano at pyl.unibe.ch (Michele Giugliano) Date: Tue, 19 Feb 2002 16:36:02 +0100 Subject: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL -- Univ. of Genova (Italy) -- June 10-15, 2002 Message-ID: <017201c1b95b$23a6eff0$9da25c82@physio.unibe.ch> Apologies if you receive this more than once. ***** NE.W.S. : NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL ***** ***** sponsored by the University of Genova, Italy ***** ***** CALL FOR APPLICATION ***** June 10 - 15 2002 University of Genova, Italy http://130.251.89.117/homepage/workshop2002.htm Registration fee: 50 Euro (students), 100 Euro (others). MAIN GOAL: to understand, modify and use brain plasticity in order to advance Neuroscience at the network level and to inspire new computer architectures. Scientific background: Bioengineering, Electronics, Informatics, Neuroscience.
How to reach the main goal: By interfacing in vitro neurons to standard and microelectronic transducers capable to monitor and modify the neuron electrophysiological activity By creating hybrid neuro-electronic systems By developing neuro-prostheses By computer simulating plasticity at the network level By developing neuromorphic silicon neurons INVITED SPEAKERS: Alain Destexhe, Unite de Neuroscience Integratives et Computationelles, CNRS, Gif-sur-Yvette, France Michael Rudolph, Unite de Neuroscience Integratives et Computationelles, CNRS, Gif-sur-Yvette, France Ferdinando Mussa-Ivaldi, Department of Physiology Northwestern University Medical School, Chicago (IL), USA John Nicholls, SISSA, Trieste, Italy Miguel Nicolelis, Department of Neurobiology, Duke University, Durham (NC), USA Stefano Fusi, Institute of Physiology, University of Bern, Bern, Switzerland Michele Giugliano, Institute of Physiology, University of Bern, Bern, Switzerland Giacomo Indiveri, Institute for Neuroinformatics, ETH / University of Zurich, Switzerland Pietro Morasso, Department of Communications, Computer and System Sciences, University of Genova, Italy Rinaldo Poluzzi, ST Microelectronics, Italy Giulio Sandini, Department of Communications, Computer and System Sciences, University of Genova, Italy REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below and send it by email to news2002 at bio_nt.dibe.unige.it . REGISTRATION FORM (Please send it by email to: news2002 at bio_nt.dibe.unige.it ) NE.W.S. : NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL June 10 - 15 2002 University of Genova, Department of Biophysical and Electronic Engineering V. Opera Pia 11a, 16145 Genova Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ Registration fee: CHECK ONE: ( ) Euro 50 Registration Fee (Student) ( ) Euro 100 Registration Fee (Regular) PREFERRED METHOD OF PAYMENT : [ ] Bank transfer: Bank CARIGE, Agency 41, ABI:6175, CAB: 1472 c/c DIBE - University of Genoa 5341/90. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (VISA), check or cash at the very beginning of the workshop. From genesee at ego.psych.mcgill.ca Tue Feb 19 11:01:05 2002 From: genesee at ego.psych.mcgill.ca (Fred Genesee) Date: Tue, 19 Feb 2002 11:01:05 -0500 Subject: McGill Job Announcement Message-ID: <200202191600.LAA05931@ego.psych.mcgill.ca> McGill University Department of Psychology Canada Research Chair in Psychology of Language The Department of Psychology of McGill University invites applications from exceptional candidates for a Tier II Canada Research Chair in Psychology of Language. The successful applicant will have a tenure-track appointment at the Assistant or junior Associate Professor level. Consideration will be given to candidates with interests in any domain of scientific language research including, acquisition, speech and language perception and processing, neural representation, and language disorders. 
The Department has excellent facilities for interdisciplinary research through the Centre for Language, Mind, and Brain which links researchers in related academic units at McGill University (Linguistics, Communication Sciences and Disorders, and Education), the Montreal Neurological Institute, and other universities in Montreal. Applicants are expected to have a doctorate in psychology or a closely related field, a record of significant, externally-funded research, an aptitude for undergraduate and graduate teaching and the ability and interest to work collaboratively in an interdisciplinary research environment. Consideration of applications will begin March 1 and continue until suitable candidates have been identified. Applicants should submit a curriculum vitae, a description of research interests and philosophy, a statement of teaching interests and philosophy, selected reprints of publications, and should arrange for three confidential letters of recommendation to be sent to Chair, Psychology of Language Search Committee Department of Psychology McGill University 1205 Dr. Penfield Avenue Montreal, Quebec, Canada H3A 1B1. All qualified candidates are encouraged to apply, however Canadians and permanent residents will be given priority. Psychology Department phone: (514) 398-6022 McGill University fax: (514) 398-4896 1205 Docteur Penfield Ave. Montreal, Quebec Canada H3A 1B1 From m.stetter at mchp.siemens.de Tue Feb 19 11:21:58 2002 From: m.stetter at mchp.siemens.de (Martin Stetter) Date: Tue, 19 Feb 2002 17:21:58 +0100 Subject: Book Announcement: Exploration of Cortical Function Message-ID: <3C727BA6.4139505A@mchp.siemens.de> Dear collegues, I would like to announce my new book: Exploration of Cortical Function Imaging and Modeling Cortical Population Coding Strategies Martin Stetter Exploration of Cortical Function summarizes recent efforts aiming at the revelation of cortical population coding and signal processing strategies. Topics include optical detection techniques of population activity in the submillimeter range, advanced methods for the statistical analysis of these data, and biologically inspired neuronal models for population activities in the framework of optimal coding, statistical learning theory and meanfield recurrent networks. The book covers one complete branch of population-based brain research ranging from methods for data acquisition over data analysis up to neuronal models for the quantification of functional principles. The volume covers an area which is of great current interest to researchers working on cerebral cortex. The combination of models and image analysis techniques to examine the activity of large cohorts of neurons is especially intriguing and prone to considerable debate. Readership is aimed at students and researchers from many disciplines including neuroscience, biology, physics and computer science interested in how an interdisciplinary framework from biology, statistics and computational neuroscience can be used to gather a quantitative understanding of cortical function. Experimentalists may gain insight into statistical and neuronal modeling techniques, whereas theoreticians will find an introductory treatment of neuroanatomy, neurophysiology and measurement techniques. Kluwer Academic Publishers 269 pp., 132 illus. ISBN 1-4020-0435-4 (hardcover) ISBN 1-4020-0436-2 (paperback) -- ================================================================== Dr. 
Martin Stetter loc : Mch-P 63-418 Siemens AG, CT IC 4 phone : +49-89-636-55734 Corporate Technology fax : +49-89-636-49767 D-81730 Muenchen, Germany mailto: martin.stetter at mchp.siemens.de ================================================================== From nestor at ftnp.ft.uam.es Tue Feb 19 16:13:57 2002 From: nestor at ftnp.ft.uam.es (Nestor Parga Carballeda) Date: Tue, 19 Feb 2002 22:13:57 +0100 (MET) Subject: Systems Neuroscience / Madrid Message-ID: Interested candidates are invited to apply for a "Ramon y Cajal" position for experimental research work in Systems Neuroscience at the group of Computational Neuroscience of the Universidad Autonoma de Madrid, Spain. These are five-year positions co-funded by the Spanish Ministry of Science and Technology and the Universities. This is the second year that these contracts are given in Spain and there will be a third call next year. More information about them can be found at the web address: http://www.mcyt.es/cajal/default.htm Applicants should submit a research proposal (of no more than 2,000 words). They are expected to work in interaction with the theoretical team of the Computational Neuroscience group. The basic idea is to carry out joint work on information processing in the brain by combining theoretical and experimental approaches. This leaves a rather broad field within which the candidates can make their proposals. Details about the current (theoretical) work of the group can be found at the web site: http://ket.ft.uam.es/~neurociencia/ Apart from the research proposal, applicants should also provide a full CV, a list of all publications and a statement of research interests to: Nestor Parga at one of the two following e-mail addresses: nestor at ftnp.ft.uam.es parga at delta.ft.uam.es The Universities should make a decision about the type of projects that they are willing to fund by the end of February. For this reason applications should be sent to the address above preferably before February 25. This is the first stage of the selection process. A formal application and evaluation will be done later this year (see http://www.mcyt.es/cajal/default.htm for details) ------------------------------------------------------------------- | Nestor Parga | | | | Phone : (+34) 91-397-4542 | | Dpto. de Fisica Teorica, C-XI | Fax : (+34) 91-397-3936 | | Universidad Autonoma de Madrid | E-mail: nestor at ftnp.ft.uam.es| | 28049 Madrid, SPAIN | parga at delta.ft.uam.es| | | | http://ket.ft.uam.es/~neurociencia/nestor | ------------------------------------------------------------------- From piuri at fusberta.elet.polimi.it Tue Feb 19 15:35:09 2002 From: piuri at fusberta.elet.polimi.it (Vincenzo Piuri) Date: Tue, 19 Feb 2002 21:35:09 +0100 Subject: CALL FOR PAPERS: DEADLINE EXTENSION TO MARCH 8 !!! VIMS 2002 IN CONJUNCTION WITH IMTC 2002 Message-ID: <5.1.0.14.0.20020219213424.02e23110@pop3.norton.antivirus> VIMS 2002 2002 IEEE INTERNATIONAL SYMPOSIUM ON VIRTUAL AND INTELLIGENT MEASUREMENT SYSTEMS Mt.
Alyeska Resort Hotel (nearby Anchorage), AK, USA - 19-20 May 2002 Sponsored by IEEE Instrumentation and Measurement Society & IEEE Neural Network Council With the technical cooperation of International Neural Network Society Instrumentation, Systems, and Automation Society VIMS2002 is held in conjunction with IEEE World Congress on Computational Intelligence - WCCI'02, Honolulu, HW, USA, 12-17 May 2002 IEEE Instrumentation and Measurement Technology Conference - IMTC2002, Anchorage, AK, USA, 21-23 May 2002 >>>> PAPER SUBMISSION DEADLINE: EXTENDED TO 8 MARCH 2002 <<<< ALL DETAILED INFORMATION ARE AVAILABLE AT http://ewh.ieee.org/soc/im/vims/ Virtual environments become highly attractive to afford complex application problems where simulation plays a relevant role to model and analyze the behavior of complex systems, to design innovative solutions for industrial production processes and products, to assess the feasibility and the effectiveness of processes and products. Besides, realization of virtual systems on computer-based environments allows for re-using components and adapting their behaviors to the application needs, hence reducing production cost and time. On the other hand, adaptive and evolving solutions become increasingly more and more relevant in applications requiring an adaptable behavior for the system according to the changing needs of the users, the application, and the environment. Intelligent techniques based on soft-computing (i.e., neural networks, fuzzy logic, and genetic algorithms) have been proved effective to support such an adaptation. Nowadays, integration of different components, realized by using heterogeneous computing paradigms, is important to save investment and exploit the features offered by well-assessed algorithmic approaches. Globalization and the need of distributed sensing, monitoring, and control drive also the inclusion of computer science technologies (e.g., distributed networks, agents, cooperative systems, web, mobile systems, micro- and nano-robots) to achieve the appropriate and modular integration in larger and more complex systems. This issue is becoming relevant in environmental monitoring, power distribution, and many other industrial applications. Intelligent distributed virtual environments will therefore play a key role in the industry as well as in the daily life to support the evolving needs of the users and the economy. Suppliers continually offer more affordable hardware and software components to implement these innovative approaches. Industries, government agencies, and research institutions widely consider and use these techniques. Up to now, analysis and experiments have been performed mainly at qualitative level by scientists and practitioners, aiming to understand the underlying technologies and methodologies, but without any specific focus on the mandatory need of a quantitative assessment and a metrological analysis. The VIMS 2002 symposium is therefore directed to fill this gap in knowledge and practice, especially by focusing on the quantitative aspect of instrumentation and measurement issues. Sessions will cover all aspects of soft computing technologies and virtual environments related to instrumentation and measurement, from the point of view both of theory and practical applications. General Co-Chairs: Vincenzo Piuri, University of Milan, Italy Enrique H. 
Ruspini, SRI International, USA Technical Program Co-Chairs: Cesare Alippi, Politecnico di Milano, Italy Evangelia Micheli-Tzanakou, Rutgers University, USA Mel Siegel, Carnegie Mellon University, USA Symposium Coordinator: Robert Myers, Myers-Smith Inc., USA From edwin at cs.brandeis.edu Tue Feb 19 18:21:18 2002 From: edwin at cs.brandeis.edu (Edwin de Jong) Date: Tue, 19 Feb 2002 18:21:18 -0500 (EST) Subject: CFP: ICML Workshop on Development of Representations Message-ID: [Apologies if you receive multiple copies of this announcement.] -------------------------------------------------------------------------- CALL FOR PAPERS ICML Workshop Development of Representations July 9th, 2002, Sydney, Australia http://www.demo.cs.brandeis.edu/icml02ws/ DESCRIPTION The representation of a learning problem has long been known to be a major factor in learning performance. The nature of appropriate representations and representational change as a part of the learning process have been studied in a variety of forms in a number of subfields within machine learning, artificial intelligence and, more recently, other communities. Despite this fact, representations are typically hand-coded rather than acquired automatically. The goal of this workshop is to explore problems in the area of automated development of representations and to build ties between the various relevant communities. From juergen at idsia.ch Tue Feb 19 10:14:15 2002 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Tue, 19 Feb 2002 16:14:15 +0100 Subject: optimal predictors Message-ID: <3C726BC7.32E774BF@idsia.ch> There is an optimal way of predicting the future, given past observations. Normally we do not know the true conditional probability distribution p(next event | past). But assume we do know that p is in some set P of distributions. Choose a fixed weight w_q for each q in P such that the w_q add up to 1 (for simplicity, let P be countable). Then construct the Bayesmix M(x) = Sum_q w_q q(x), and predict using M instead of the optimal but unknown p. How wrong is it to do that? The recent exciting work of Marcus Hutter (IDSIA) provides general and sharp (!) loss bounds: Let LM(n) and Lp(n) be the total expected losses of the M-predictor and the p-predictor, respectively, for the first n events. Then LM(n)-Lp(n) is at most of the order of sqrt[Lp(n)]. That is, M is not much worse than p. And in general, no other predictor can do better than that! In particular, if p is deterministic, then the M-predictor soon won't make any errors any more. If P contains ALL computable distributions, then M becomes the celebrated enumerable universal prior. That is, after decades of somewhat stagnating research we now have sharp loss bounds for Solomonoff's universal (but incomputable) induction scheme. Similarly, if we replace M by the Speed Prior S - where S(x) is small if x is hard to compute by any method - we obtain appropriate loss bounds for computable S-based induction. Alternatively, reduce M to what you get if you just add up weighted estimated future finance data probabilities generated by 1000 commercial stock-market prediction software packages. If only one of them happens to work fine (but you do not know which) you still should get rich. Note that the approach is much more general than what is normally done in traditional statistical learning theory, where the often quite unrealistic assumption is that the observations are statistically independent. 
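For readers who want to see the mixture construction in code, the following is a minimal Python sketch (an illustration, not taken from the posting or the cited paper): it predicts the next bit of a binary sequence with a Bayes mixture over a small hypothetical class P of Bernoulli models. The biases, prior weights and toy data are assumptions chosen only for the example.

# Minimal sketch of Bayes-mixture sequence prediction (illustrative assumptions only).
# Model class P: Bernoulli models with fixed biases; prior weights w_q sum to 1.

def bayes_mixture_predict(sequence, biases, prior_weights):
    """Return M(next bit = 1 | past) after each prefix of the observed sequence."""
    assert abs(sum(prior_weights) - 1.0) < 1e-9
    # w[q] tracks the unnormalised posterior weight w_q * q(x_1..x_t).
    w = list(prior_weights)
    predictions = []
    for bit in sequence + [None]:
        total = sum(w)
        # Mixture prediction: M(1 | past) = sum_q w_q q(past) q(1) / sum_q w_q q(past).
        p_next_one = sum(wq * b for wq, b in zip(w, biases)) / total
        predictions.append(p_next_one)
        if bit is None:
            break
        # Multiply in each model's likelihood of the bit that was actually observed.
        w = [wq * (b if bit == 1 else 1.0 - b) for wq, b in zip(w, biases)]
    return predictions

if __name__ == "__main__":
    # Hypothetical model class: three coin biases with uniform prior weights.
    biases = [0.2, 0.5, 0.9]
    prior = [1.0 / 3] * 3
    data = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]  # toy observations
    for t, p in enumerate(bayes_mixture_predict(data, biases, prior)):
        print(f"after {t} bits: M(next=1 | past) = {p:.3f}")

As the posterior mass concentrates on the model closest to the true source, the mixture's per-step loss approaches that of the best model in the class, which is the informal content of the sqrt[Lp(n)] bound quoted above.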
To learn more, please read Optimality of Universal Bayesian Sequence Prediction for General Loss and Alphabet: ftp://ftp.idsia.ch/pub/techrep/IDSIA-02-02.ps.gz and also check out Hutter's other recent papers at ICML, ECML, NIPS, Int. J. of Foundations of CS: www.idsia.ch/~marcus ------------------------------------------------- Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland juergen at idsia.ch http://www.idsia.ch/~juergen From terry at salk.edu Wed Feb 20 20:27:16 2002 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 20 Feb 2002 17:27:16 -0800 (PST) Subject: NEURAL COMPUTATION 14:3 In-Reply-To: <200201221744.g0MHiB917659@purkinje.salk.edu> Message-ID: <200202210127.g1L1RGo36508@dax.salk.edu> Neural Computation - Contents - Volume 14, Number 3 - March 1, 2002 VIEW What Geometric Visual Hallucinations Tell Us About the Visual Cortex Paul C. Bressloff, Jack D. Cowan, Martin Golubitsky, Peter J. Thomas, and Matthew C. Wiener LETTERS An Amplitude Equation Approach to Contextual Effects in Visual Cortex Paul C. Bressloff and Jack D. Cowan Derivation of the Visual Contrast Response Function by Maximizing Information Rate Allan Gottschalk A Bayesian Framework for Sensory Adaptation Norberto M. Grzywacz and Rosario M. Balboa Analysis of Oscillations in a Reciprocally Inhibitory Network with Synaptic Depression Adam L. Taylor, Garrison W. Cottrell, William B. Kristan, Jr. Activity-Dependent Development of Axonal and Dendritic Delays or, Why Synaptic Transmission Should Be Unreliable Walter Senn, Martin Schneider, and Berthold Ruf Impact of Geometrical Structures on the Output of Neuronal Models: A Theoretical and Numerical Analysis Jianfeng Feng and Guibin Li Sparse On-Line Gaussian Processes Lehel Csato and Manfred Opper Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem Mark Girolami Natural Discriminant Analysis using Interactive Potts Models Jiann-Ming Wu ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2002 - VOLUME 14 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $506 $451.42 $554 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From K.Branney at elsevier.nl Wed Feb 20 09:01:01 2002 From: K.Branney at elsevier.nl (Branney, Kate (ELS)) Date: Wed, 20 Feb 2002 09:01:01 -0500 Subject: Call for papers: Special issue of NEUROCOMPUTING on Bioinformatic s Message-ID: <46414F09B351C64BAA875CE0B37BE07101446312@elsamsvexch02.elsevier.nl> Apologies for cross-postings. CALL FOR PAPERS NEUROCOMPUTING An International Journal published by Elsevier Science B.V., vol. 42-47, 24 issues, in 2002 ISNN 0925-2312, URL: http://www.elsevier.com/locate/neucom Special Issue on Bioinformatics Paper Submission Deadline: July 31st, 2002 Bioinformatics applies -- simply stated -- computational methods to the solution of biological problems. Bioinformatics, genomics, molecular biology, molecular evolution, computational biology, and affine fields are at the intersection between two axes: data sequences/physiology and information technology. Sequences include DNA sequences (gene, genome, organization), molecular evolution, protein structure, folding, function, and interaction, metabolic pathways, regulation signaling networks, physiology and cell biology (interspecies, interaction), as well as ecology and environment. 
Information technology in this context includes hardware and instrumentation, computation, as well as mathematical and physical models. The intersection between two subfields, one in each axis, generates areas including those known as genome sequencing, proteomics, functional genomics (microarrays, 2D-PAGE, ...), high-tech field ecology, genomic data analysis, statistical genomics, protein structure, prediction, protein dynamics, protein folding and design, data standards, data representations, analytical tools for complex biological data, dynamical systems modeling, as well as computational ecology. Research in these fields comprises property abstraction from the biological system, design and development of data analysis algorithms, as well as of databases and data access web-tools. Genome sequencing and related projects generate vast amounts of data that needs to be analyzed, thus emphasizing the relevance of efficient methods of data analysis and of the whole discipline. The Neurocomputing journal invites original contributions for the forthcoming special issue on Bioinformatics from a broad scope of areas. Some topics relevant to this special issue include, but are not restricted to: -- Theoretical foundations, algorithms, implementations, and complete systems -- Sequence analysis (single, multiple), alignment, annotation, etc. -- Improvements in databases and web-tools for bioinformatics -- Novel metrics and biological data preprocessing for posterior analysis -- Systems biology models and data modeling techniques including statistical inference, stochastic processes, random walks, Markov chains, hidden Markov models, motifs, profiles, dynamic programming, pattern recognition techniques, neural networks, support vector machines, evolutionary models, tree estimation, etc. -- Pathway inference, e.g. to determine where to target a drug using gene expression data and address side effects by providing information on where else a target metabolite appears. -- Key applications in diverse fields including bioinformatics, genomics, molecular biology, molecular evolution, computational biology, drug design, etc. Please send two hardcopies of the manuscript before July 31st, 2002, to: V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130, Pasadena, CA 91116-6130, U.S.A. Street address: 1149 Wotkyns Drive Pasadena, CA 91103, U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net including abstract, keywords, a cover page containing the title and author names, corresponding author name's complete address including telephone, fax, and email address, and clear indication to be a submission to the Special Issue on Bioinformatics. Guest Editors Harvey J. Greenberg Center for Computational Biology University of Colorado at Denver P.O. Box 173364 Denver, CO 80217-3364 Phone: (303) 556-8464 Fax: (303) 556-8550 Email: Harvey.Greenberg at cudenver.edu Lawrence Hunter Center for Computational Pharmacology University of Colorado Health Science Center 4200 E. Ninth Ave. Denver, CO 80262 Phone: (303) 315-1094 Fax: (303) 315-1098 Email: Larry.Hunter at uchsc.edu Satoru Miyano Human Genome Center Institute of Medical Science University of Tokyo 4-6-1 Shirokanedai, Minato-ku, Tokyo 108-8639, Japan. 
Phone: +81-3-5449-5615 Fax: +81-3-5449-5442 Email: miyano at ims.u-tokyo.ac.jp Ralf Zimmer Praktische Informatik und Bioinformatik Institut fr Informatik LMU Mnchen Theresienstrasse 39 D-80333 Mnchen Phone: +49-89-2180-4447 Fax: +49-89-2180-4054 Email: zimmer at bio.informatik.uni-muenchen.de V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130 Pasadena, CA 91116-6130, U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net From Gustavo.Deco at mchp.siemens.de Thu Feb 21 05:02:31 2002 From: Gustavo.Deco at mchp.siemens.de (Gustavo Deco) Date: Thu, 21 Feb 2002 11:02:31 +0100 Subject: New Book Announcement Message-ID: <3C74C5B7.9C642A94@mchp.siemens.de> NEW BOOK ANNOUNCEMENT "Computational Neuroscience of Vision" Edmund T. Rolls University of Oxford, Department of Experimental Psychology and Gustavo Deco Siemens Corporate Technology, Germany Oxford University Press, 2002 588 pages, numerous figures, 238X168mm ISBN 0-19-852489-7 Hardback ISBN 0-19-852488-9 Paperback This exciting new book describes visual information processing in the brain. The book focusses on the visual information processing and computational operations in the visual system that lead to representations of objects in the brain, and on the mechanisms that underlie attentional processes. In addition to visual processing, it also considers how visual inputs reach and are involved in the computations underlying a wide range of behaviour, including short term memory, long term memory and emotion, thus providing a foundation for understanding the operation of a number of different brain systems. This fascinating book will be of value to all those interested in understanding how the brain works, and in understanding vision, attention, memory, emotion, motivation and action. The book combines a neurocomputational approach with neurophysiological, neuropsychological, and neuroimaging approaches. Readership: Neuroscientists, psychologists, and neuropsychologists interested in vision. Computational neuroscientists. Vision scientists. Contents Preface 1 Introduction 2 The primary visual cortex 3 Extrastriate visual areas 4 The parietal cortex 5 Inferior temporal cortical visual areas 6 Visual attentional mechanisms 7 Neural network models 8 Models of invariant object recognition 9 The cortical neurodynamics of visual attention - a model 10 Visual search: Attentional neurodynamics at work 11 A computational approach to the neuropsychology of visual attention 12 Outputs of visual processing 13 Principles and conclusions Appendix A. Introduction to linear algebra for neural networks Appendix B. Information theory References Index The book can be ordered directly from Oxford University Press, http://www.oup.co.uk/isbn/0-19-852488-9, and is available in bookshops. Updates to the publications cited are available at www.cns.ox.ac.uk From gai at gmdh.kiev.ua Thu Feb 21 12:44:10 2002 From: gai at gmdh.kiev.ua (Gregory Ivakhnenko) Date: Thu, 21 Feb 2002 19:44:10 +0200 Subject: ICIM 2002 - Deadline extended Message-ID: <013301c1baff$77155c80$06bc5dc2@niss.gov.ua> Hello, Because of numerous requests for extension, the deadline for submission of papers to the I International Conference on Inductive Modelling (ICIM 2002) is now March 20, 2002. 
Information regarding the ICIM'2002 can be found on our website at: http://www.niss.gov.ua/Center/ICIM/index.htm This I International Conference on Inductive Modelling will be held in Lviv, Ukraine, from June 25-28, 2002 and will present the latest results in the growing field of neural networks in data mining and forecasting, pattern recognition and parallel computing. Regards, Gregory Ivakhnenko -- National Institute for Strategic Studies Kyiv, Ukraine http://www.GMDH.net From meesad at okstate.edu Thu Feb 21 15:53:04 2002 From: meesad at okstate.edu (Phayung Meesad) Date: Thu, 21 Feb 2002 14:53:04 -0600 Subject: IJCNN'02: Preliminary Technical Program Message-ID: <003501c1bb19$c1f01dc0$323b4e8b@okstate.edu> Dear prospective participants, We want to update you on some of the activities with regard to the IJCNN'02 which is a part of the 2002 IEEE World Congress on Computational Intelligence (WCCI2002). The preliminary technical programs are now available on the web at http://www.wcci2002.org under the WCCI Program link. These data are preliminary and may be revised, but can offer you a good picture of the diverse range of papers that will be presented at the WCCI2002 meeting. In addition, under the special technical program link, you'll find the information for the plenary and special lectures, and the tutorials that will be presented on Sunday, May 12. Each of the tutorials has an accompanying abstract so you can learn more about it and its presenter. If you registered but didn't sign up for any tutorials and would like to revise your registration to include one or more tutorials, please let me know. If you registered before Feb. 1, 2002 we'll be pleased to offer the early registration prices for any tutorials that you would like to attend. If you have not already made your hotel reservations at the Hilton Hawaiian Village, please do so at your earliest convenience. There is a link from our main web page at http://www.wcci2002.org for hotel reservations. The hotel is reserving space for the WCCI2002 conference participants. We look forward to seeing you in Honolulu. Sincerely, David Fogel, General Chairman, WCCI2002 Gary Yen and Phayung Meesad, IJCNN'02 Publicity From dtl at marr.bsee.swin.edu.au Fri Feb 22 02:35:22 2002 From: dtl at marr.bsee.swin.edu.au (Dr David Liley) Date: Fri, 22 Feb 2002 17:35:22 +1000 Subject: Australia (Melbourne) - Postdoctoral position in Theoretical Neurobiology Message-ID: Post-doctoral Research Fellow, Center for Intelligent Systems and Complex Processes, School of Biophysical Sciences and Electrical Engineering, Swinburne University of Technology, A$ 36,460 - 49,337 Available immediately for 2 1/2 years with the possibility of extension A postdoctoral researcher is required for a 3-year Australian Research Council funded project available from April 2002. The project concerns the experimental validation of a physiologically specific mathematical theory of alpha electroencephalographic (8-13 Hz) activity. This theory has been developed by researchers within the Center for Intelligent Systems and Complex Processes and suggests a novel basis for electroencephalographic rhythmogenesis that depends upon local inhibitory-inhibitory neuronal population interactions. This non-linear theory provides good descriptions of scalp recordable alpha activity in the context of plausible physiological and anatomical parameterization and gives rise to relatively specific predictions regarding the form of evoked electroencephalographic activity.
Further information on this theory can be found at http://marr.bsee.swin.edu.au It is anticipated that many of the anatomical and physiologically specific parameters can be estimated by curve fitting an analytical white noise fluctuation (linear) spectrum, obtained from the theory, to spontaneous alpha EEG activity. Novel methods based on evolutionary and swarm (collective intelligence) directed search strategies will be developed to robustly and efficiently estimate these parameters. The full non-linear theory will then be used to constrain all of the remaining model parameters by solving a set of boundary value non-linear ordinary differential equations. The validity of these parameter sets will then be assessed by using them to predict the latency and amplitude of the middle and late components of the corresponding evoked cortical EEG activity. This project will be based within the School of Biophysical Sciences and Electrical Engineering which has significant facilities for high performance computing (a 64 node Compaq Alpha cluster), data visualization (including stereoscopic viewing facilities) and high density EEG data collection (64 channel Neuroscan and EGI systems). A strong mathematical and computer modeling background is required in order to parametrically constrain systems of partial differential equations that describe the most pertinent dynamical features of scalp recordable EEG. Practical knowledge of digital signal processing and time series analysis is necessary. Applicants must have a relevant PhD in Physics, Applied Mathematics, Computational Physics or Theoretical Neuroscience. Informal enquiries regarding these positions and the project in general should be addressed to the project co-ordinator, Dr David T J Liley, School of Biophysical Sciences and Electrical Engineering, Swinburne University of Technology, Hawthorn VIC 3122, Australia + 61 3 9214 8812, email: dliley at swin.edu.au, WWW: http://marr.bsee.swin.edu.au. Further details regarding this position (position number 22591) and application instructions are available at http://www.swin.edu.au/corporate/hr/posvac/external.htm Closing Date: 22th March 2002 ----------------------------------------------------------------------- Dr David Liley MBChB, PhD Senior Lecturer in Biophysics School of Biophysical Sciences and Electrical Engineering Swinburne University of Technology P.O. Box 218 Hawthorn VIC 3122 Australia ph: +61 3 9214 8812 fax: +61 3 9819 0856 email: dliley at swin.edu.au WWW: http://marr.bsee.swin.edu.au/~dtl ----------------------------------------------------------------------- From rothschild at cs.haifa.ac.il Fri Feb 22 08:08:39 2002 From: rothschild at cs.haifa.ac.il (Rothschild Institute) Date: Fri, 22 Feb 2002 15:08:39 +0200 (IST) Subject: Postdoc Position and Visitors at the Caesarea Rothschild Institute Message-ID: Please forward to interested candidates. Apologies in advance if you are subscribed to multiple mailinglists. -------------------------------------------------------------------------- The Caesarea Edmond Benjamin de Rothschild Institute for Interdisciplinary Applications of Computer Science invites applications from new postdoctoral graduates and established researchers who wish to visit the University of Haifa for short-term or long-term periods during 2002-04. The Institute was founded in 2001 and has an active program of workshops, seminars and collaborative projects. 
Focus areas include Combinatorial and Graph Theoretic Algorithms, Artificial Intelligence, Computational Linguistics and Neural Science, CS Applications in Statistics, Multimedia and Education, Vision and Psychology. On our website http://www.rothschild.haifa.ac.il you will find Calls for Proposals for visitors, workshops, new interdisciplinary courses, and other information about our programs, including Research-in-Pairs. The University of Haifa is located on Mt. Carmel overlooking the Carmel forest and the Mediterranean Sea. POST-DOC POSITION in ALGORITHMIC GRAPH THEORY As part of our ongoing research and our focus year on Applications of Graph Theory and Algorithms, outstanding candidates with a recent doctoral degree in computer science and a strong publication record are encouraged to apply. A working knowledge of Hebrew is an asset but is not required. Please send applications to Prof. Martin Golumbic. From nils at nero.uni-bonn.de Fri Feb 22 11:46:09 2002 From: nils at nero.uni-bonn.de (ws-grow@nero.uni-bonn.de) Date: Fri, 22 Feb 2002 17:46:09 +0100 Subject: CFP: SAB'02-Workshop: On Growing up Artifacts that Live Message-ID: <200202221646.g1MGk5u19039@marvin.nero.uni-bonn.de> [Apologies if you receive this message more than once] ########################################################## ### ### ### CALL FOR PAPERS ### ### ### ########################################################## SAB'2002 Workshop ON GROWING UP ARTIFACTS THAT LIVE Basic Principles and Future Trends http://www.nero.uni-bonn.de/ws-grow.html August 10, 2002, Edinburgh, Scotland (UK) To be held in conjunction with SAB'02 Conference http://www.isab.org.uk/sab02 Important dates ========================== 5 April, 2002: Submission of papers, up to 10 pages 3 May, 2002: Notification of acceptance 14 June, 2002: Deadline for camera-ready papers 4-9 August, 2002: SAB'02 Conference 10 August, 2002: Workshop, Edinburgh Call for Participation =========================== One of the most challenging features of living artifacts is the ability to grow. One of the most interesting features of growing is the special capability to grow up. Aim and scope ================= The aim of the workshop is to shed light on the basic principles and fundamental requirements for creating artifacts that can grow up. To "Grow Up" means that the system starts with a basic, pre-structured set of functionalities and develops its individual capabilities during its lifetime in close interaction with the environment. A schedule for temporal development will drive the artefact through a well-defined sequence of stages from the infancy state to an individually matured entity. Along this sequence the artefact will learn with respect to, and in interaction with, the environment, thus piling up experience, and leading to new qualitative stages of behaviour. Besides adequate learning and adaptation rules, the organisation of the memory and the modular structure of the system must be designed to enable this ontogenetic process of development. Below you will find a brief summary of theses and principles that are said to lead to a living, up-growing artefact: - One of the most challenging features of living artifacts is the ability to grow. - One of the most interesting features of a growing artefact is the special capability of growing up. - Growing up means the evolution from an infant-like pre-defined state to a fully matured entity. - Growing up requires a special organisational structure of the entire artefact, one that allows it to grow up. 
- Growing up requires interaction with the environment, including the interaction with other "living artifacts". - Growing up requires the capability of learning from the experience acquired in interaction with the environment. - Learning from experience requires a specialised structure of the underlying system. - The specialised structure (e.g. systemic architecture) covers: adaptive structures, learning schemes, organisation of memory and reasoning, ... Fundamentals from psychology, from memory organisation, from theory of learning (machine learning and psychology), underlying systemic architectures enabling the required capabilities, cognitive science and behavioural knowledge and further principles are within the scope of the workshop. The workshop will cover, but not be limited to, the topics listed below: - Internal models and representation - Architectures for autonomous agents - Behavioural sequencing - Learning and development - Psychology of learning - Motivation and emotion - Emergent structures and behaviours - Evolutionary and co-evolutionary approaches The workshop focuses not only on the state of the art, but also on current and novel ideas and future trends. Unconventional, blue-sky ideas are especially welcome, and will be considered valuable for presentation and discussion within the workshop. Therefore an open, brainstorming-style discussion will be part of the workshop. The talks and the posters will be on an open basis, encouraging scientists to present even unusual ideas. Paper submission and publication ===================================== Papers not exceeding 10 pages in 10pt, one-column format (Springer LNCS style), should be submitted electronically (PDF or PS) as attachment files to the following email address: ws-grow at nero.uni-bonn.de In case electronic submission is causing problems, please contact the organisers. Formatting instructions, including a Latex template: http://www.springer.de/comp/lncs/authors.html All submissions will be reviewed for acceptance as talks or poster presentations by the program committee and the organisers. Authors of selected papers will be asked for an extended paper submission after the workshop for publication. Since the topic of the workshop aims beyond state-of-the-art development and involves a variety of different fields, authors are asked to ease access to the content of their contribution by including a brief introductory passage at the beginning of the article. Important dates ========================== 5 April, 2002: Submission of papers 3 May, 2002: Notification of acceptance 14 June, 2002: Deadline for camera-ready papers 4-9 August, 2002: SAB'02 Conference 10 August, 2002: Workshop, On Growing up Artifacts that Live Programme / Scientific Committee ====================================== Alois Knoll, Technical University Munich (TUM), Germany Andy M. 
Tyrell, The University of York, United Kingdom Horst-Michael Gross, Ilmenau Technical University, Germany Tim Pearce, University of Leicester, United Kingdom Ulrich Rueckert, University of Paderborn, Germany Giulio Sandini, University of Genova, Italy Thomas Christaller, Fraunhofer Institute AiS, Germany Bruno Apolloni, University of Milan, Italy Peter Ross , School of Computing, Napier University, Edinburgh, Scotland (UK) Georg Dorffner, Austrian Research Institute for Artificial Intelligence (OFAI), Austria Erich Prem, Austrian Research Institute for Artificial Intelligence (OFAI), Austria David Willshaw, Institute for Adaptive and Neural Computation, The University of Edinburgh, Scotland (UK) Giovanna Morgavi, Istituto per i Circuiti Elettronici, National Research Council (ICECNR), Italy Nils Goerke, Neuroinformatics, University of Bonn, Germany Organisers ================ Nils Goerke Division of Neuroinformatics (NERO), University of Bonn Roemerstr. 164, D-53117 Bonn, Germany http://www.nero.uni-bonn.de E-Mail: goerke at nero.uni-bonn.de Peter Ross School of Computing, Napier University, Edinburgh, Scotland (UK) http://www.soc.napier.ac.uk Georg Dorffner Erich Prem Austrian Research Institute for Artificial Intelligence (OFAI), Vienna, Austria http://www.ai.univie.ac.at/oefai/oefai.html Giovanna Morgavi Istituto per i Circuiti Elettronici, National Research Council, (ICECNR), Genova, Italy http://www.ge.cnr.it David Willshaw Institute for Adaptive and Neural Computation, The University of Edinburgh, Edinburgh, Scotland (UK) http://www.informatics.ed.ac.uk/research/ianc/ PLEASE DISTRIBUTE THIS CALL FOR PAPERS ============================================= Hope to see you in Edinburgh for the workshop Best regards Nils Goerke From dblazis at mbl.edu Fri Feb 22 10:37:15 2002 From: dblazis at mbl.edu (Diana Blazis) Date: Fri, 22 Feb 2002 10:37:15 -0500 Subject: Methods in Computational Neuroscience Message-ID: <5.1.0.14.0.20020222103625.00a48ec0@mail.mbl.edu> 2002 Marine Biological Laboratory Special Topics Course Methods in Computational Neuroscience August 4 - September 1, 2002 Directors: William Bialek, Princeton University Rob de Ruyter, NEC Research Institute. Financial assistance is available for this course Deadline extended to: March 7, 2002 Animals interact with a complex world, encountering a wide variety of challenges: they must gather data about the environment, discover useful structures in these data, store and recall information about past events, plan and guide actions, learn the consequences of these actions, etc. These are, in part, computational problems that are solved by networks of neurons, from roughly 100 cells in a small worm to 100 billion in humans. Careful study of the natural context for these tasks leads to new mathematical formulations of the problems that brains are solving, and these theoretical approaches in turn suggest new experiments to characterize neurons and networks. This interplay between theory and experiment is the central theme of this course. For more information and application forms please visit http://courses.mbl.edu/ or contact Carol Hamel, Admissions Coordinator at 508/289-7401 or admissions at mbl.edu Diana E.J. Blazis, Ph.D. 
dblazis at mbl.edu Staff Scientist and Director, CASSLS at the Marine Biological Laboratory PH (508) 289-7535 7 MBL Street, Woods Hole, MA 02543 FAX (508) 289-7951 http://www.mbl.edu/CASSLS From cl at andrew.cmu.edu Sun Feb 24 22:40:45 2002 From: cl at andrew.cmu.edu (Christian Lebiere) Date: Sun, 24 Feb 2002 22:40:45 -0500 Subject: postdoctoral positions at CMU Message-ID: <148772.3223579245@[10.0.1.2]> Dear colleagues, I apologize in advance if you received this message more than once. We have several postdoctoral positions available at CMU on combined computational and behavioral approaches to the study of cognition. Please advise any suitable candidates who might be interested. The application deadline is March 15. The applicants must be US citizens or nationals and should be interested in learning to develop computational models of cognition (or continuing to train in that area). Below is the list of members of the training grant. In addition to contacting a prospective advisor, interested applicants should let me know that they will be sending in an application. Thanks for your help. Sincerely, Lynne Reder Dr. John Anderson Dr. Marlene Behrmann Dr. Patricia Carpenter Dr. Albert Corbett Dr. Bonnie John Dr. Marcel Just Dr. Roberta Klatzky Dr. Kenneth Koedinger Dr. Kenneth Kotovsky Dr. Christian Lebiere Dr. Marsha Lovett Dr. James McClelland Dr. David Plaut Dr. Lynne Reder Dr. Robert Siegler Dr. David Touretzky Dr. Raul Valdes-Perez Lynne M. Reder, Professor Department of Psychology Carnegie Mellon University Pittsburgh, PA 15213 phone: (412)268-3792 fax: (412) 268-2844 email: reder at cmu.edu URL: http://www.andrew.cmu.edu/~reder/reder.html From ericwan at ece.ogi.edu Mon Feb 25 18:38:59 2002 From: ericwan at ece.ogi.edu (Eric Wan) Date: Mon, 25 Feb 2002 15:38:59 -0800 Subject: Postdoc Position in Neural Controls Message-ID: <005f01c1be55$99230030$a85a5f81@ece.ogi.edu> POST-DOCTORAL RESEARCH ASSOCIATE The OGI School of Science and Engineering at OHSU has an opening for a post-doctoral research associate to participate in an interdisciplinary UAV neural controls project. Project overview: This project involves the design and implementation of nonlinear reconfigurable controllers using neural networks that exploit the coupled dynamics between a vehicle model (e.g., helicopter) and adaptive models of the environment. New model-predictive neural control techniques are developed to perform on-line optimization of vehicle control trajectories under dynamic and situational constraints. Now entering its 3rd year, the project's main focus is on 1) increased simulation realism for ship-based VTOL, and 2) demonstration of the approaches using an instrumented RC helicopter. The successful candidate will work closely with an interdisciplinary team of software and control engineers, with specific responsibility for various aspects pertaining to control design, vehicle and aerodynamic modeling, and system integration. Home page: http://www.cse.ogi.edu/PacSoft/projects/sec/ Requirements: Candidate should have a Ph.D. with expertise in nonlinear control and/or a strong background in flight dynamics modeling for rotorcraft, including rotor and airframe aerodynamics. Salary range $45,000 - $55,000 plus benefits. Location: OHSU's OGI School of Science and Engineering campus is in Hillsboro, Oregon, approximately 11 miles west of downtown Portland. Sponsor: DARPA Oregon Health & Science University is an Equal Opportunity Employer. Please send inquiries and background information to Prof. 
Eric A. Wan: ericwan at ece.ogi.edu. Eric A. Wan Associate Professor Department of Electrical and Computer Engineering Center for Spoken Language Understanding OGI School of Science and Engineering, OHSU http://www.ece.ogi.edu/~ericwan/ Note: On July 1, 2001, the Oregon Graduate Institute merged with the Oregon Health & Science University, becoming the OGI School of Science and Engineering at OHSU. From juergen at idsia.ch Tue Feb 26 05:23:45 2002 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Tue, 26 Feb 2002 11:23:45 +0100 Subject: PhD fellowship Message-ID: <3C7B6231.9555447B@idsia.ch> We are seeking a PhD student interested in optimal search algorithms & universal learning algorithms & reinforcement learning in partially observable environments. Please see http://www.idsia.ch/~juergen/phd2002.html ------------------------------------------------- Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland juergen at idsia.ch www.idsia.ch/~juergen From mhc27 at cornell.edu Wed Feb 27 11:38:03 2002 From: mhc27 at cornell.edu (Morten H. Christiansen) Date: Wed, 27 Feb 2002 11:38:03 -0500 Subject: No subject Message-ID: Apologies if you receive more than one copy of this announcement. The following book may be of interest to the readers of this list: Christiansen, M.H. & Chater, N. (Eds.) (2001). Connectionist Psycholinguistics. Westport, CT: Ablex. Description: Setting forth the state of the art, leading researchers present a survey on the fast-developing field of Connectionist Psycholinguistics: using connectionist or "neural" networks, which are inspired by brain architecture, to model empirical data on human language processing. Connectionist psycholinguistics has already had a substantial impact on the study of a wide range of aspects of language processing, ranging from inflectional morphology, to word recognition, to parsing and language production. Christiansen and Chater begin with an extended tutorial overview of Connectionist Psycholinguistics which is followed by the latest research by leading figures in each area of research. The book also focuses on the implications and prospects for connectionist models of language, not just for psycholinguistics, but also for computational and linguistic perspectives on natural language. The interdisciplinary approach will be relevant for, and accessible to psychologists, cognitive scientists, linguists, philosophers, and researchers in artificial intelligence. The book is suitable as a text book for advanced courses in connectionist approaches to language processing. It can also be used as a recommended or complementary text book in graduate and advanced undergraduate courses on the psychology of language. Table of Contents: Preface 1. Connectionist Psycholinguistics: The Very Idea Morten H. Christiansen and Nick Chater PART I: THE STATE OF THE ART 2. Connectionist Psycholinguistics in Perspective Morten H. Christiansen and Nick Chater 3. Simulating Parallel Activation in Spoken Word Recognition M. Gareth Gaskell and William D. Marslen-Wilson 4. A Connectionist Model of English Past Tense and Plural Morphology Kim Plunkett and Patrick Juola 5. Finite Models of Infinite Language: A Connectionist Approach to Recursion Morten H. Christiansen and Nick Chater 6. Dynamic Systems for Sentence Processing Whitney Tabor and Michael K. Tanenhaus 7. Connectionist Models of Language Production: Lexical Access and Grammatical Encoding Gary S. Dell, Franklin Chang, and Zenzi M. Griffin 8. 
A Connectionist Approach to Word Reading and Acquired Dyslexia: Extension to Sequential Processing David C. Plaut PART II: FUTURE PROSPECTS 9. Constraint Satisfaction in Language Acquisition and Processing Mark S. Seidenberg and Maryellen C. MacDonald 10. Grammar-based Connectionist Approach to Language Paul Smolensky 11. Connectionist Sentence Processing in Perspective Mark Steedman Index About the Editors and Contributors Connectionist psycholinguistics Edited by Morten H. Christiansen and Nick Chater ISBN: 1-56750-595-3 (pbk.) ISBN: 1-56750-594-5 (hc.) 400 pages, figures, tables Ablex Publishing Best regards, Morten Christiansen -- ------------------------------------------------------------------------ Morten H. Christiansen Assistant Professor Phone: +1 (607) 255-3570 Department of Psychology Fax: +1 (607) 255-8433 Cornell University Email: mhc27 at cornell.edu Ithaca, NY 14853 Office: 240 Uris Hall Web: http://www.psych.cornell.edu/faculty/people/Christiansen_Morten.htm Lab Web Site: http://cnl.psych.cornell.edu ------------------------------------------------------------------------ From ps629 at columbia.edu Thu Feb 28 18:25:28 2002 From: ps629 at columbia.edu (Paul Sajda) Date: Thu, 28 Feb 2002 18:25:28 -0500 Subject: BCI Data Competition Message-ID: <3C7EBC67.B5DDE058@columbia.edu> NIPS 2001: Post Workshop Data Competition In an effort to foster development of machine learning techniques and evaluate different algorithms for brain computer interfaces (BCI), we are announcing a data analysis competition. Datasets are available for download from http://newton.bme.columbia.edu/competition.htm . Participants are asked to follow a few simple rules: 1. All data sets should be evaluated single-trial--you should not average across multiple trials. 2. Report your classification results (i.e. labels) on the test set. You can report results for any or all of the datasets. 3. Please submit a short (one page or less) description of the pattern classifier you used. Please include a description of any pre/post processing that you may have done. 4. Use of these datasets implies that the participant agrees to cite the origin of the data in any publication (e.g. see each dataset description for bibTex entry). ALL SUBMISSIONS ARE DUE JUNE 1st 2002 . Announcement of results will be coordinated with the June 2002 BCI Workshop (in upstate NY). Email submissions to nips-bci at newton.bme.columbia.edu. Questions should be directed to nips-bci at newton.bme.columbia.edu. New information on the competition and datasets will be posted periodically on this site. Good Luck - Paul Sajda, Ph.D. Associate Professor Department of Biomedical Engineering Columbia University 351 Engineering Terrace Building, Mail Code 8904 1210 Amsterdam Avenue New York, NY 10027 tel: (212) 854-5279 fax: (212) 854-8725 email: ps629 at columbia.edu http://newton.bme.columbia.edu 
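For readers new to this kind of competition, the sketch below shows in outline what a single-trial submission pipeline can look like: every trial is classified on its own, and only the predicted test-set labels are written out. It is a generic illustration under stated assumptions -- the random arrays stand in for feature vectors extracted from the actual datasets, and the Fisher-discriminant classifier, array shapes, and output file name are not part of the competition rules.

import numpy as np

def fit_fisher(X, y):
    # Two-class Fisher discriminant: w = Sw^{-1} (m1 - m0), with a small ridge term.
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
    b = -0.5 * w @ (m0 + m1)
    return w, b

def predict_single_trial(w, b, X):
    # Each row of X is one trial; no averaging across trials.
    return (X @ w + b > 0).astype(int)

# Toy stand-in data: 100 labelled training trials and 20 unlabelled test trials,
# each already reduced to an 8-dimensional feature vector (an assumption).
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0.0, 1.0, (50, 8)), rng.normal(1.0, 1.0, (50, 8))])
y_train = np.repeat([0, 1], 50)
X_test = rng.normal(0.5, 1.0, (20, 8))

w, b = fit_fisher(X_train, y_train)
np.savetxt("test_labels.txt", predict_single_trial(w, b, X_test), fmt="%d")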
From bp1 at cn.stir.ac.uk Sun Feb 3 12:44:06 2002 From: bp1 at cn.stir.ac.uk (Bernd Porr) Date: Sun, 3 Feb 2002 17:44:06 +0000 (GMT) Subject: Workshop: "Modulation and Modification of Sensor-Motor Coupling" in Stirling, Scotland, Feb. 22-24, 2002 (fwd) Message-ID: sorry for duplicate postings Modulation and Modification of Sensor-Motor Coupling Workshop at Stirling University, Feb. 22-24, 2002. This workshop will bring together scientists from different fields, from experimental and theoretical neuroscientists to robotics researchers, who are interested in issues of sensor-motor coupling, (temporal sequence) learning and modulation of sensory-motor pathways. 
The traditional view of sensor-motor systems, which describes them by means of a non-linear transfer function used to transform the sensor input(s) into a motor action, has been replaced in recent years by a more sophisticated view. For some time it has been acknowledged that such systems can be modified by learning and substantial research has been undertaken to understand the underlying neuronal mechanisms. More recently, aspects of motivation and/or attention driven modulation of the efficiency of sensor-motor coupling have also been investigated. In particular, it has been realized that all sensor-motor systems (all animals) interact with their immediate surroundings, forming a closed loop with the environment, which adds to the complexity of the problem. All this shows that a multi-disciplinary approach is necessary in trying to solve the "sensor-motor coupling problem" and therefore, the main objective of the forthcoming workshop at Stirling is to facilitate the discussion between researchers from different fields in sensor-motor coupling. To that end the workshop will consist of a number of invited talks, with plenty of time allowed for discussion. So far the following researchers have agreed to speak: Holk Cruse Ansgar Büschges Peter Dayan Bernard Hommel Orjan Ekeberg David Wolpert Jeff Krichmar Paul Verschure Attendance at the workshop is open to all interested participants. We would, however, appreciate it if you used our web form http://www.cn.stir.ac.uk/SensMot/ to give us some personal details to facilitate organization. Organizers: Barbara Webb (b.h.webb at stir.ac.uk) & Florentin Wörgötter (worgott at cn.stir.ac.uk) Dept. of Psychology and INCITE University of Stirling Scotland, UK Webmaster: Bernd Porr (bp1 at cn.stir.ac.uk) From aonishi at bsp.brain.riken.go.jp Sun Feb 3 21:07:16 2002 From: aonishi at bsp.brain.riken.go.jp (Toru Aonishi) Date: Mon, 04 Feb 2002 11:07:16 +0900 Subject: preprint: paper on coupled oscillator systems Message-ID: <20020204110716S.aonishi@bsp.brain.riken.go.jp> Dear Connectionists, We are pleased to announce the availability of our recent paper and of two potentially related papers. Recent paper: ------------- Acceleration effect of coupled oscillator systems T. Aonishi, K. Kurata and M. Okada, Physical Review E (in press) Available at http://arXiv.org/abs/cond-mat/0201453 Abstract: We have developed a curved isochron clock (CIC) by modifying the radial isochron clock to provide a clean example of the acceleration (deceleration) effect. By analyzing a two-body system of coupled CICs, we determined that an unbalanced mutual interaction caused by curved isochron sets is the minimum mechanism needed for generating the acceleration (deceleration) effect in coupled oscillator systems. From this we can see that the Sakaguchi and Kuramoto (SK) model, which is a class of non-frustrated mean field models, has an acceleration (deceleration) effect mechanism. To study frustrated coupled oscillator systems, we extended the SK model to two oscillator associative memory models, one with symmetric and one with asymmetric dilution of coupling, which also have the minimum mechanism of the acceleration (deceleration) effect. We theoretically found that the {\it Onsager reaction term} (ORT), which is unique to frustrated systems, plays an important role in the acceleration (deceleration) effect. These two models are ideal for evaluating the effect of the ORT because, with the exception of the ORT, they have the same order parameter equations. 
We found that the two models have identical macroscopic properties, except for the acceleration effect caused by the ORT. By comparing the results of the two models, we can extract the effect of the ORT from only the rotation speeds of the oscillators. Related papers: -------------- Multibranch entrainment and slow evolution among branches in coupled oscillators T. Aonishi and M. Okada, Physical Review Letters, 88[2], 024102 (2002) Available at http://prl.aps.org/ http://arXiv.org/abs/cond-mat/0104526 Abstract: In globally coupled oscillators, it is believed that strong higher harmonics of coupling functions are essential for {\it multibranch entrainment} (MBE), in which there exist many stable states, whose number scales as $\sim$ $O(\exp N)$ (where $N$ is the system size). The existence of MBE implies the non-ergodicity of the system. Then, because this apparent breaking of ergodicity is caused by {\it microscopic} energy barriers, this seems to be in conflict with a basic principle of statistical physics. In this paper, using macroscopic dynamical theories, we demonstrate that there is no such ergodicity breaking, and such a system slowly evolves among branch states, jumping over microscopic energy barriers due to the influence of thermal noise. This phenomenon can be regarded as an example of slow dynamics driven by a perturbation along a neutrally stable manifold consisting of an infinite number of branch states. ---- Statistical mechanics of an oscillator associative memory with scattered natural frequencies T. Aonishi, K. Kurata and M. Okada, Physical Review Letters, 82[13], pp. 2800--2803 (1999) Available at http://prl.aps.org/ http://arXiv.org/abs/cond-mat/9808090 Abstract: Analytic treatment of a non-equilibrium random system with many degrees of freedom is one of the most important problems of physics. However, little research has been done on this problem as far as we know. In this paper, we propose a new mean field theory that can treat a general class of non-equilibrium random systems. We apply the present theory to an analysis of an associative memory with oscillatory elements, which is a well-known typical random system with many degrees of freedom. --------------------------------------------------------------- Regards, Toru Aonishi (Ph.D) Laboratory for Advanced Brain Signal Processing Brain Science Institute The Institute of Physical and Chemical Research (RIKEN) Hirosawa, 2-1, Wako-shi, Saitama, 351-0198, Japan E-mail: aonishi at brain.riken.go.jp URL: http://www.bsp.brain.riken.go.jp/~aonishi/ From ruppin at tau.ac.il Mon Feb 4 04:51:48 2002 From: ruppin at tau.ac.il (Eytan Ruppin) Date: Mon, 4 Feb 2002 11:51:48 +0200 Subject: Comp. Neuroscience with Evolutionary Agents - A Review Paper Message-ID: <200202040951.LAA26772@tau.ac.il> Evolutionary Autonomous Agents: A Neuroscience Perspective ----------------------------------------------------------- Nature Reviews Neuroscience, 3(2), February issue, p. 132 - 142, 2002. http://www.nature.com/cgi-taf/DynaPage.taf?file=/nrn/journal/v3/n2/index.html Abstract: This paper examines the research paradigm of neurally-driven Evolutionary Autonomous Agents (EAAs) from a neuroscience perspective. Two fundamental questions are addressed: 1. Can EAA studies shed new light on the structure and function of biological nervous systems? 2. Can these studies lead to the development of new neuroscientific analysis tools? The value and significant potential of EAA modeling in both respects are demonstrated and discussed. 
While the study of EAAs as a neuroscience research methodology still faces difficult conceptual and technical challenges, it is a promising and timely endeavor. The paper may also be downloaded from http://www.math.tau.ac.il/~ruppin/. Best, Eytan Ruppin From Gunnar.Raetsch at anu.edu.au Mon Feb 4 08:32:43 2002 From: Gunnar.Raetsch at anu.edu.au (Gunnar Raetsch) Date: Tue, 05 Feb 2002 00:32:43 +1100 Subject: PhD thesis on Boosting available Message-ID: <3C5E8D7B.9080206@anu.edu.au> Dear Connectionists, I am pleased to announce that my PhD thesis entitled "Robust Boosting via Convex Optimization" is now available at http://www.boosting.org/papers/thesis.ps.gz (and .pdf) Please find the summary of my thesis below. Gunnar Summary ======= In this work we consider statistical learning problems. A learning machine aims to extract information from a set of training examples such that it is able to predict the associated label on unseen examples. We consider the case where the resulting classification or regression rule is a combination of simple rules - also called base hypotheses. The so-called boosting algorithms iteratively find a weighted linear combination of base hypotheses that predict well on unseen data. We study the following issues: o The statistical learning theory framework for analyzing boosting methods. We study learning theoretic guarantees on the prediction performance on unseen examples. Recently, large margin classification techniques have emerged as a practical result of the theory of generalization, in particular Boosting and Support Vector Machines. A large margin implies a good generalization performance. Hence, we analyze how large the margins in boosting are and find an improved algorithm that is able to generate the maximum margin solution. o How can boosting methods be related to mathematical optimization techniques? To analyze the properties of the resulting classification or regression rule, it is of high importance to understand whether and under which conditions boosting converges. We show that boosting can be used to solve large scale constrained optimization problems, whose solutions are well characterizable. To show this, we relate boosting methods to methods known from mathematical optimization, and derive convergence guarantees for a quite general family of boosting algorithms. o How to make Boosting noise robust? One of the problems of current boosting techniques is that they are sensitive to noise in the training sample. In order to make boosting robust, we transfer the soft margin idea from support vector learning to boosting. We develop theoretically motivated regularized algorithms that exhibit a high noise robustness. o How to adapt boosting to regression problems? Boosting methods are originally designed for classification problems. To extend the boosting idea to regression problems, we use the previous convergence results and relations to semi-infinite programming to design boosting-like algorithms for regression problems. We show that these leveraging algorithms have desirable properties - from both the theoretical and the practical side. o Can boosting techniques be useful in practice? The presented theoretical results are accompanied by simulation results either to illustrate properties of the proposed algorithms or to show that they work well in practice. We report on successful applications in a non-intrusive power monitoring system, chaotic time series analysis and the drug discovery process. 
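As a rough illustration of the basic mechanism the thesis starts from -- iteratively building a weighted combination of base hypotheses -- here is a minimal sketch of AdaBoost with decision stumps. It is a plain textbook variant under stated assumptions (labels in {-1, +1}, a brute-force stump learner); the maximum-margin, soft-margin, and regression extensions developed in the thesis are not shown.

import numpy as np

def stump_predict(X, feat, thresh, sign):
    # A decision stump: +/-1 depending on one feature vs. one threshold.
    return sign * np.where(X[:, feat] > thresh, 1, -1)

def fit_stump(X, y, w):
    # Exhaustively pick the stump with the lowest weighted training error.
    best = (0, 0.0, 1, np.inf)
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1, -1):
                err = np.sum(w * (stump_predict(X, feat, thresh, sign) != y))
                if err < best[3]:
                    best = (feat, thresh, sign, err)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        feat, thresh, sign, err = fit_stump(X, y, w)
        err = min(max(err, 1e-12), 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)      # hypothesis weight
        pred = stump_predict(X, feat, thresh, sign)
        w *= np.exp(-alpha * y * pred)             # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(X, f, t, s) for a, f, t, s in ensemble)
    return np.sign(score)

# toy usage on linearly separable data
X = np.random.default_rng(1).normal(size=(80, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y, rounds=10)
print(np.mean(predict(model, X) == y))   # training accuracy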
-- +-----------------------------------------------------------------+ Gunnar R"atsch http://mlg.anu.edu.au/~raetsch Australian National University mailto:Gunnar.Raetsch at anu.edu.au Research School for Information Tel: (+61) 2 6125-8647 Sciences and Engineering Fax: (+61) 2 6125-8651 Canberra, ACT 0200, Australia From cindy at bu.edu Mon Feb 4 16:26:12 2002 From: cindy at bu.edu (Cynthia Bradford) Date: Mon, 4 Feb 2002 16:26:12 -0500 Subject: Neural Networks 15(1) Message-ID: <200202042126.g14LQCr21060@cns-pc75.bu.edu> NEURAL NETWORKS 15(1) Contents - Volume 15, Number 1 - 2002 ------------------------------------------------------------------ Editorial for 2002: A Time of Exuberant Development Neural Networks Referees used in 2001 NEURAL NETWORKS LETTER: Modeling inferior olive neuron dynamics Manuel G. Velarde, Vladimir I. Nekorkin, Viktor B. Kazantsev, Vladimir I. Makarenko, and Rodolfo Llinas INVITED ARTICLE: A review of evidence of health benefit from artificial neural networks in medical intervention P.J.G. Lisboa CONTRIBUTED ARTICLES: ***** Neuroscience and Neuropsychology ***** Attention modulation of neural tuning through peak and base rate in correlated firing H. Nakahara and S.-I. Amari ***** Mathematical and Computational Analysis ***** Space-filling curves and Kolmogorov superposition-based neural networks David A. Sprecher and Sorin Draghici The bifurcating neuron network 2: An analog associative memory Geehyuk Lee and Nabil H. Farhat Hybrid independent component analysis by adaptive LUT activation function neurons Simone Fiori A new approach to stability of neural networks with time-varying delays Jigen Peng, Hong Qiao, and Zong-ben Xu ***** Engineering and Design ***** Projective ART for clustering data sets in high dimensional spaces Yongqiang Cao and Jianhong Wu Equivariant nonstationary source separation Seungjin Choi, Andrzej Cichocki, and Shunichi Amari ***** Technology and Applications ***** Fractional Fourier transform pre-processing for neural networks and its application to object recognition Billur Barshan and Birsel Ayrulu LETTERS TO THE EDITOR Comments for Rivals and Personnaz (2000): Construction of confidence intervals for neural networks based on least squares estimation Jan Larsen and Lars Kai Hansen Response to comments for Rivals and Personnaz (2000): Construction of confidence intervals for neural networks based on least squares estimation I. Rivals and L. Personnaz BOOK REVIEWS Review of "Model systems and the neurobiology of associative learning" edited by J.E. Steinmetz, M.A. Gluck, and P.R. Solomon by Nestor A. Schmajuk Review of "Oscillations in neural systems" edited by D.S. Levine, V.R. Brown, and V.T. Shirey by Andrzej Przybyszewski and Mark Kon CURRENT EVENTS ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. 
Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type      INNS            ENNS                JNNS
----------------------------------------------------------------------------
membership with      $80 (regular)   SEK 660 (regular)   Y 13,000 (regular)
Neural Networks                                          (plus 2,000 enrollment fee)
                     $20 (student)   SEK 460 (student)   Y 11,000 (student)
                                                         (plus 2,000 enrollment fee)
----------------------------------------------------------------------------
membership without   $30             SEK 200             not available to non-students
Neural Networks                                          (subscribe through another society)
                                                         Y 5,000 (student)
                                                         (plus 2,000 enrollment fee)
----------------------------------------------------------------------------
Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. 
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Takashi Nagano Faculty of Engineering Hosei University 3-7-2, Kajinocho, Koganei-shi Tokyo 184-8584 Japan 81 42 387 6350 (phone and fax) jnns at k.hosei.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From pam_reinagel at hms.harvard.edu Mon Feb 4 17:16:27 2002 From: pam_reinagel at hms.harvard.edu (Pamela Reinagel) Date: Mon, 4 Feb 2002 17:16:27 -0500 Subject: Natural Scenes conference Message-ID: 2nd and final post Deadline reminder: March 1, 2002 ------------------------------- A New Gordon Research Conference "Sensory coding and the natural environment: Probabilistic models of perception" June 30 - July 5, 2002 Mount Holyoke College, MA Pamela Reinagel & Bruno Olshausen, Chairs List of speakers and talk titles, as well as instructions on how to apply, are available at: http://www.klab.caltech.edu/~pam/NSS2002.htm. This conference will bring together researchers from diverse disciplines to discuss the statistical structure of natural sensory stimuli, and how nervous systems exploit these statistics to form useful representations of the environment. Topics include sensory neurophysiology, perceptual psychology, and the mathematics of signal statistics, applied to a variety of sensory modalities and organisms. Applications received by MARCH 1, 2002 will receive full consideration. (Late applications may be considered if there is space available.) From butz at illigal.ge.uiuc.edu Mon Feb 4 17:28:22 2002 From: butz at illigal.ge.uiuc.edu (Martin Butz) Date: Mon, 4 Feb 2002 16:28:22 -0600 (CST) Subject: CFP: Adaptive Behavior in Anticipatory Learning Systems Workshop (ABiALS 2002) Message-ID: (We apologize if you received more than one copy of this message) ########################################################################### C A L L F O R P A P E R S ABiALS Workshop 2002 Adaptive Behavior in Anticipatory Learning Systems ########################################################################### August 11, 2002 Edinburgh, Scotland http://www-illigal.ge.uiuc.edu/ABiALS to be held during the seventh international conference on Simulation of Adaptive Behavior (SAB'02) http://www.isab.org.uk/sab02/ This workshop aims for an interdisciplinary gathering of people interested in how anticipations can guide behavior as well as how an anticipatory influence can be implemented in an adaptive behavior system. In particular, we are looking for adaptive behavior systems that incorporate some online anticipation mechanisms. ___________________________________________________________________________ Aim and Objectives: Most of the research in recent years in artificial adaptive behavior with respect to model learning and anticipatory behavior has focused on the model learning side. Research is particularly engaged in online generalized model learning. Up to now, though, exploitation of the model has been done mainly to show that exploitation is possible or that an appropriate model exists in the first place. Only very few applications exist that show the utility of the model for the simulation of anticipatory processes and consequent adaptive behavior. The aim of this workshop is to bring together researchers who are interested in anticipatory processes and essentially anticipatory adaptive behavior. 
We aim for an interdisciplinary gathering that brings together researchers from distinct areas so as to discuss the different guises that anticipation takes in these different perspectives. However, the workshop intends to focus on anticipations in the form of low-level computational processes rather than high-level processes such as explicit planning. ___________________________________________________________________________ Essential questions: * How can anticipations influence the adaptive behavior of an artificial learning system? * How can anticipatory adaptive behavior be implemented in an artificial learning system? * How does an incomplete model influence anticipatory behavior? * How do anticipations guide further model learning? * How do anticipations control attention? * Can anticipations be used for the detection of special environmental properties? * What are the benefits of anticipations for adaptive behavior? * What is the trade-off between simple bottom-up stimulus-response driven behavior and more top-down anticipatory driven behavior? * In what respect does anticipation mediate between low-level environmental processing and more complex cognitive simulation? * What role do anticipations play for the implementation of motivations and emotions? ___________________________________________________________________________ Submission: Submissions for the workshop should address or at least be related to one of the questions listed above. However, other approaches to anticipatory adaptive behavior are encouraged as well. The workshop is not limited to one particular type of anticipatory learning system or a particular representation of anticipations. However, the learning system should learn its anticipatory representation online rather than being provided with a model of the world beforehand. Nonetheless, background knowledge of a typical environment can be incorporated (and is probably inevitably embodied in the provided sensors, actions, and the coding in any adaptive system). Since this is a full-day workshop, we hope to be able to provide more time for presentations and discussions. In that way, the advantages and disadvantages of the different learning systems should become clearer. We also aim for several discussion sessions in which anticipatory influences will be discussed in a broader sense. Papers will be reviewed for acceptance by the program committee and the organizers. Papers should be submitted electronically to one of the organizers via email in pdf or ps format. Electronic submission is strongly encouraged. If you cannot submit your contribution electronically, please contact one of the organizers. Submitted papers should be between 10 and 20 pages in 10pt, one-column format. The LNCS Springer-Verlag style is preferred (see http://www.springer.de/comp/lncs/authors.html). Submission deadline is the 31st of March 2002. Depending on the quality and number of contributions, we hope to be able to publish Post-Workshop proceedings as either a Springer LNAI volume or a special issue of a journal. 
For more information please refer to http://www-illigal.ge.uiuc.edu/ABiALS/ ___________________________________________________________________________ Important Dates: 31.March 2002: Deadline for Submissions 15.May 2002: Notification of Acceptance 15.June 2002: Camera Ready Version for SAB Workshop Proceedings 11.August 2002: Workshop ABiALS ___________________________________________________________________________ Program Committee: Emmanuel Daucé Faculté des sciences du sport Université de la Méditerranée Marseille, France Ralf Moeller Cognitive Robotics Max Planck Institute for Psychological Research Munich, Germany Wolfgang Stolzmann DaimlerChrysler AG Berlin, Germany Jun Tani Lab. for Behavior and Dynamic Cognition Brain Science Institute, RIKEN 2-1 Hirosawa, Wako-shi, Saitama, 351-0198 Japan Stewart W. Wilson President Prediction Dynamics USA ___________________________________________________________________________ Organizers: Martin V. Butz, Illinois Genetic Algorithms Laboratory (IlliGAL), University of Illinois at Urbana-Champaign, Illinois, USA also: Department of Cognitive Psychology University of Wuerzburg, Germany butz at illigal.ge.uiuc.edu http://www-illigal.ge.uiuc.edu/~butz Pierre Gérard, AnimatLab, University Paris VI, Paris, France pierre.gerard at lip6.fr http://animatlab.lip6.fr/Gerard Olivier Sigaud AnimatLab, University Paris VI, Paris, France olivier.sigaud at lip6.fr http://animatlab.lip6.fr/Sigaud From jose at psychology.rutgers.edu Tue Feb 5 17:31:37 2002 From: jose at psychology.rutgers.edu (Stephen Hanson) Date: Tue, 05 Feb 2002 17:31:37 -0500 Subject: POSTDOC Available--- RUTGERS UNIVERSITY --RUMBA LABS Message-ID: <3C605D49.514DE31@psychology.rutgers.edu> COGNITIVE/COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION at RUTGERS UNIVERSITY, Newark Campus. The Rutgers University Mind/Brain Analysis (RUMBA) Project anticipates making one postdoctoral appointment, which is to begin in the SUMMER (June/July) of 2002. This position is for a minimum of 2 years, with the possibility of continuation for 1 more year, and will be in the area of specialization of cognitive neuroscience with emphasis on the development of new paradigms and methods in neuroimaging, mathematical modeling, signal processing or data analysis in functional brain imaging. Particular interest is in methods and algorithms for fusion of EEG/fMRI. Applications are welcomed beginning immediately and review will continue until the position is filled. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send CV and three letters of recommendation and 1 reprint to Professor S.J. Hanson, Department of Psychology, Rutgers University, Newark, NJ 07102. Email enquiries can be made to jose at psychology.rutgers.edu; please put "RUMBA POSTDOC" in your subject field. Also see http://www.rumba.rutgers.edu. 
From tbl at cin.ufpe.br Wed Feb 6 11:32:31 2002 From: tbl at cin.ufpe.br (Teresa Bernarda Ludermir) Date: Wed, 06 Feb 2002 14:32:31 -0200 Subject: First CFP SBRN 2002 Message-ID: <3C615A9F.4885B75F@cin.ufpe.br> --------------------------- Apologies for cross-posting --------------------------- FIRST CALL FOR PAPERS ********************************************************************** SBRN'2002 - VII BRAZILIAN SYMPOSIUM ON NEURAL NETWORKS (http://www.cin.ufpe.br/~sbiarn02) Recife, November 11-14, 2002 ********************************************************************** The biannual Brazilian Symposium on Artificial Neural Networks (SBRN) - of which this is the 7th event - is a forum dedicated to Neural Networks (NNs) and other models of computational intelligence. The emphasis of the Symposium will be on original theories and novel applications of these computational models. The Symposium welcomes paper submissions from researchers, practitioners, and students worldwide. The proceedings will be published by the IEEE Computer Society. Selected, extended, and revised papers from SBRN'2002 will be also considered for publication in a special issue of the International Journal of Neural Systems and of the Journal of Intelligent and Fuzzy Systems. SBRN'2002 is sponsored by the Brazilian Computer Society (SBC) and co-sponsored by SIG/INNS/Brazil Special Interest Group of the International Neural Networks Society in Brazil. It will take place November 11-14, and will be held in Recife at a beach resort. Recife, located on the northeast coast of Brazil, is known as the "Brazilian Venice" because of its many canals and waterways and the innumerable bridges that span them. It is the major gateway to the Northeast with regular flights to all major cities in Brazil as well as Lisbon, London, Frankfurt, and Miami. See more information about the place ( http://www.braziliantourism.com.br/pe-pt1-en.html) that will host the event. SBRN'2002 will be held in conjunction with the XVI Brazilian Symposium on Artificial Intelligence (http://www.cin.ufpe.br/~sbiarn02) (SBIA). SBIA has its main focus on symbolic AI. Crossfertilization of these fields will be strongly encouraged. Both Symposiums will feature keynote speeches and tutorials by world-leading researchers. The deadline for submissions is April 15, 2002. More details on paper submission and conference registration will be coming soon. 
Sponsored by the Brazilian Computer Society (SBC) Co-Sponsored by SIG/INNS/Brazil Special Interest Group of the International Neural Networks Society in Brazil Organised by the Federal University of Pernambuco (UFPE)/Centre of Informatics (CIn) Published by the IEEE Computer Society Deadlines: Submission: 15 April 2002 Acceptance: 17 June 2002 Camera-ready: 22 August 2002 Non-exhaustive list of topics which will be covered during SBRN'2002: Applications: finances, data mining, neurocontrol, time series analysis, bioinformatics; Architectures: cellular NNs, hardware and software implementations, new models, weightless models; Cognitive Sciences: adaptive behaviour, natural language, mental processes; Computational Intelligence: evolutionary systems, fuzzy systems, hybrid systems; Learning: algorithms, evolutionary and fuzzy techniques, reinforcement learning; Neurobiological Systems: bio-inspired systems, biologically plausible networks, vision; Neurocontrol: robotics, dynamic systems, adaptive control; Neurosymbolic processing: hybrid approaches, logical inference, rule extraction, structured knowledge; Pattern Recognition: signal processing, artificial/computational vision; Theory: radial basis functions, Bayesian systems, function approximation, computability, learnability, computational complexity. Paper Submission: Prospective authors are invited to submit 6-page, 11-point, double-column papers (postscript or pdf format) written in English, Portuguese or Spanish - see the style file at http://computer.org/cspress/instruct.htm. More details on paper submission and conference registration will be coming soon. The first volume of the Proceedings will be published by IEEE Computer Society Press, in time for distribution at the symposium. It will include only accepted papers written in English and abstracts of accepted papers written in Portuguese or Spanish. A second volume will be issued as a CD-ROM, and will contain accepted papers originally written in Portuguese or Spanish. General Chair: Teresa B. Ludermir (UFPE/CIn, Brazil) tbl at cin.ufpe.br Program Chair: Marcilio C. P. de Souto (UFPE/CIn, Brazil) mcps at cin.ufpe.br Publications Chair: Marley Vellasco (PUC-RJ, Brazil) marley at ele.puc-rio.br Organising Committee Teresa B. Ludermir (UFPE, BR) Marcílio C. P. de Souto (UFPE, BR) Steering Committee: Aluizio F. R. Araujo (USP-SC, BR) Antonio de P. Braga (UFMG, BR) Andre P. L. F. de Carvalho (USP-SC, BR) Teresa B. Ludermir (UFPE, BR) Carlos H. C. Ribeiro (ITA, BR) Marcílio C. P. de Souto (UFPE, BR) Marley Vellasco (PUC-RJ, BR) Gerson Zaverucha (UFRJ, BR) Program Committee (Preliminary) Igor Aleksander (Imperial College, UK) Aluizio F. R. Araujo (USP-SC, BR) Pierre Baldi (Univ. of California at Irvine, USA) Valmir Barbosa (UFRJ, BR) Allan Kardec D. Barros (UFMA, BR) Antonio de P. Braga (UFMG, BR) Anne M. de P. Canuto (UFRN, BR) Otavio Carpiteiro (EFEI, BR) Andre P. L. F. Carvalho (USP-SC, BR) Alejandro Ceccatto (Univ. of Rosario, AR) Phillipe DeWilde (Imperial College, UK) Paulo M. Engel (UFRGS, BR) Felipe Franca (UFRJ, BR) Fernando Gomide (UNICAMP, BR) Maria Eunice Gonzales (UNESP-Marilia, BR) Stephen Grossberg (Boston University, USA) Bart Kosko (University of Southern California, USA) Teresa B. Ludermir (UFPE, BR) Wolfgang Maass (Technische Univ. Graz, AUSTRIA) Marcio L. de Andrade Netto (UNICAMP, BR) Jose R. C. Piqueira (USP, BR) Jose Príncipe (Univ. of Florida, USA) Carlos H. C. Ribeiro (ITA, BR) Jude W. Shavlik (Univ. of Wisconsin, USA) Marcilio C. P. 
de Souto (UFPE, BR) Harold Szu (Univ. of SW Louisiana, USA) Germano C. Vasconcelos (UFPE, BR) Marley Vellasco (PUC-RJ, BR) Takashi Yoneyama (ITA, BR) Gerson Zaverucha (UFRJ, BR) Jack M. Zurada (Univ. of Louisville, USA) From samengo at cab.cnea.gov.ar Wed Feb 6 12:35:47 2002 From: samengo at cab.cnea.gov.ar (Ines Samengo) Date: Wed, 06 Feb 2002 14:35:47 -0300 Subject: paper on limited sampling Message-ID: <3C616973.3E55650E@cab.cnea.gov.ar> Dear connectionists, the following paper may be of interest to you. Thank you very much, Ines. Estimating probabilities from experimental frequencies Ines Samengo, to be published in Physical Review E, 2002 Estimating the probability distribution 'q' governing the behaviour of a certain variable by sampling its value a finite number of times most typically involves an error. Successive measurements allow the construction of a histogram, or frequency count 'f', of each of the possible outcomes. In this work, the probability that the true distribution be 'q', given that the frequency count 'f' was sampled, is studied. Such a probability may be written as a Gibbs distribution. A thermodynamic potential, which allows an easy evaluation of the mean Kullback-Leibler divergence between the true and measured distribution, is defined. For a large number of samples, the expectation value of any function of 'q' is expanded in powers of the inverse number of samples. As an example, the moments, the entropy and the mutual information are analyzed. http://www.cab.cnea.gov.ar/users/samengo/pub.html -- ______________________________________________________ Ines Samengo samengo at cab.cnea.gov.ar http://www.cab.cnea.gov.ar/users/samengo/samengo.html tel: +54 2944 445100 - fax: +54 2944 445299 Centro Atomico Bariloche (8.400) San Carlos de Bariloche Rio Negro, Argentina ______________________________________________________ From wolfskil at MIT.EDU Wed Feb 6 15:38:36 2002 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Wed, 06 Feb 2002 15:38:36 -0500 Subject: book announcement--Herbrich Message-ID: <5.0.2.1.2.20020206153729.00ae5470@po14.mit.edu> I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/026208306X/ Thank you! Best, Jud Learning Kernel Classifiers Theory and Algorithms Ralf Herbrich Linear classifiers in kernel spaces have emerged as a major topic within the field of machine learning. The kernel technique takes the linear classifier--a limited, but well-established and comprehensively studied model--and extends its applicability to a wide range of nonlinear pattern-recognition tasks such as natural language processing, machine vision, and biological sequence analysis. This book provides the first comprehensive overview of both the theory and algorithms of kernel classifiers, including the most recent developments. It begins by describing the major algorithmic advances: kernel perceptron learning, kernel Fisher discriminants, support vector machines, relevance vector machines, Gaussian processes, and Bayes point machines. Then follows a detailed introduction to learning theory, including VC and PAC-Bayesian theory, data-dependent structural risk minimization, and compression bounds. Throughout, the book emphasizes the interaction between theory and algorithms: how learning algorithms work and why. The book includes many examples, complete pseudo code of the algorithms presented, and an extensive source code library.
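As a small illustration of the kernel idea mentioned above (a sketch only, not code from the book), a kernel perceptron can be written in a few lines; the Gaussian (RBF) kernel, the toy circular dataset and all names in the sketch are illustrative choices.

# Kernel perceptron sketch (illustrative only): a linear classifier applied
# implicitly in a nonlinear feature space via the kernel trick.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma=1.0):
    # k(a, b) = exp(-gamma * ||a - b||^2)
    diff = A[:, None, :] - B[None, :, :]
    return np.exp(-gamma * np.sum(diff * diff, axis=2))

# Toy problem: label +1 inside a circle, -1 outside (not linearly separable in input space).
X = rng.uniform(-2.0, 2.0, size=(200, 2))
y = np.where(np.sum(X * X, axis=1) < 1.5, 1.0, -1.0)

K = rbf_kernel(X, X)
alpha = np.zeros(len(X))                   # dual coefficients, one per training point

for _ in range(20):                        # a few perceptron passes over the data
    for i in range(len(X)):
        f_i = np.dot(alpha * y, K[:, i])   # decision value for example i
        if y[i] * f_i <= 0:                # mistake -> kernel perceptron update
            alpha[i] += 1.0

pred = np.sign(K @ (alpha * y))
print("training accuracy:", np.mean(pred == y))

The dual coefficients alpha play the role of the weight vector in the implicit feature space; replacing this mistake-driven update with a margin-based criterion leads toward the support vector machines covered in the book.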
Ralf Herbrich is a Postdoctoral Researcher in the Machine Learning and Perception Group at Microsoft Research Cambridge and a Research Fellow of Darwin College, University of Cambridge. 7 x 9, 384 pp., 0-262-08306-X Adaptive Computation and Machine Learning series Jud Wolfskill Associate Publicist MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617.253.2079 617.253.1709 fax wolfskil at mit.edu From ken at phy.ucsf.edu Thu Feb 7 14:18:19 2002 From: ken at phy.ucsf.edu (Ken Miller) Date: Thu, 7 Feb 2002 11:18:19 -0800 Subject: Paper available: Neural Noise and Power-Law Nonlinearities Message-ID: <15458.54011.801331.179542@coltrane.ucsf.edu> The following paper is now available from ftp://ftp.keck.ucsf.edu/pub/ken/miller_troyer02.pdf or from http://www.keck.ucsf.edu/~ken (click on 'Publications', then on 'Models of Neuronal Integration and Circuitry') This is a final draft of a paper that appeared as Journal of Neurophysiology 87, 653-659 (2002). Neural Noise Can Explain Expansive, Power-Law Nonlinearities in Neural Response Functions Kenneth D. Miller and Todd W. Troyer Abstract: Many phenomenological models of the responses of simple cells in primary visual cortex have concluded that a cell's firing rate should be given by its input raised to a power greater than one. This is known as an expansive power-law nonlinearity. However, intracellular recordings have shown that a different nonlinearity, a linear-threshold function, appears to give a good prediction of firing rate from a cell's low-pass-filtered voltage response. Using a model based on a linear-threshold function, Anderson et al. (2000) showed that voltage noise was critical to converting voltage responses with contrast-invariant orientation tuning into spiking responses with contrast-invariant tuning. We present two separate results clarifying the connection between noise-smoothed linear-threshold functions and power-law nonlinearities. First, we prove analytically that a power-law nonlinearity is the only input-output function that converts contrast-invariant input tuning into contrast-invariant spike tuning. Second, we examine simulations of a simple model that assumes (i) instantaneous spike rate is given by a linear-threshold function of voltage, and (ii) voltage responses include significant noise. We show that the resulting average spike rate is well described by an expansive power law of the average voltage (averaged over multiple trials), provided that average voltage remains less than about 1.5 standard deviations of the noise above threshold. Finally, we use this model to show that the noise levels recorded by Anderson et al. (2000) are consistent with the degree to which the orientation tuning of spiking responses is more sharply tuned than the orientation tuning of voltage responses. Thus, neuronal noise can robustly generate power-law input-output functions of the form frequently postulated for simple cells. Kenneth D. Miller telephone: (415) 476-8217 Associate Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 From wolfskil at MIT.EDU Thu Feb 7 14:21:00 2002 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Thu, 07 Feb 2002 14:21:00 -0500 Subject: book announcement: Schölkopf Message-ID: <5.0.2.1.2.20020207141501.02fd7bc8@po14.mit.edu> MIT has recently published another book I thought Connectionist readers might be interested in.
For more information, please visit http://mitpress.mit.edu/0262194759/ Thank you! Best, Jud Learning with Kernels Support Vector Machines, Regularization, Optimization, and Beyond Bernhard Schölkopf and Alexander J. Smola In the 1990s, a new type of learning algorithm was developed, based on results from statistical learning theory: the Support Vector Machine (SVM). This gave rise to a new class of theoretically elegant learning machines that use a central concept of SVMs--kernels--for a number of learning tasks. Kernel machines provide a modular framework that can be adapted to different tasks and domains by the choice of the kernel function and the base algorithm. They are replacing neural networks in a variety of fields, including engineering, information retrieval, and bioinformatics. Learning with Kernels provides an introduction to SVMs and related kernel methods. Although the book begins with the basics, it also includes the latest research. It provides all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms and to understand and apply the powerful algorithms that have been developed over the last few years. 8 x 10, 632 pp., 138 illus., cloth, ISBN 0-262-19475-9 Adaptive Computation and Machine Learning series Jud Wolfskill Associate Publicist MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617.253.2079 617.253.1709 fax wolfskil at mit.edu From becker at meitner.psychology.mcmaster.ca Fri Feb 8 22:59:34 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Fri, 8 Feb 2002 22:59:34 -0500 (EST) Subject: FACULTY POSITION IN COMPUTATIONAL BIOLOGY Message-ID: ASSISTANT OR ASSOCIATE PROFESSOR COMPUTATIONAL BIOLOGY McMASTER UNIVERSITY McMaster University is a research-intensive institution and leading centre for biological and biomedical research. The Department of Biology is expanding and over the next year will fill six new faculty positions. We invite applications for a tenure-track position in Computational Biology at the Assistant or Associate Professor level, effective July 1, 2002. Candidates must hold a Ph.D. in Biology or a related field, possess at least one year of postdoctoral experience, and have a productive research record in an area of Computational Biology. We encourage applications from a broad range of individuals applying mathematics, statistics, and/or computer science to the study of biological questions. Research areas include but are not limited to bioinformatics, basic developmental biology, genomics, molecular biology, molecular evolution, neurobiology, ecology, population biology and population genetics. In addition, candidates who make use of parallel programming/computers are particularly encouraged to apply and will be able to take advantage of the local Shared Hierarchical Academic Research Computing network (SHARCNET) cluster of over 300 processors. The successful applicant will be expected to establish and maintain an independent and externally funded research program and contribute to the education of undergraduate and graduate students. Applicants should submit a curriculum vitae, a statement of their research interests, a statement of their teaching interests and experience, and three of their most important publications. Applicants should arrange for three letters of recommendation to be sent to Dr. T.M.
Finan, Chair of Biology, McMaster University, Department of Biology, 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada. Evaluation of applicants will begin March 25, 2002 but the position will remain open until filled. Please refer to the position to which you are applying in your covering letter. (see http://www.science.mcmaster.ca/Biology/Dept.html) All qualified candidates are encouraged to apply; however Canadian citizens and permanent residents will be given priority. McMaster University is committed to employment equity and encourages applications from qualified candidates, including members of visible minorities, aboriginal peoples, persons with disabilities and women. From jose.dorronsoro at iic.uam.es Fri Feb 8 13:10:11 2002 From: jose.dorronsoro at iic.uam.es (Jose Dorronsoro) Date: Fri, 08 Feb 2002 19:10:11 +0100 Subject: ICANN 2002 Submission Deadline Extension Message-ID: <1.5.4.32.20020208181011.01274134@iic.uam.es> Note: efforts have been made to avoid duplicate postings of this message. Apologies if, nevertheless, you are getting them. ICANN 2002 Submission Deadline Extension Because of numerous requests, the February 15 deadline for submission of papers to the 12th International Conference on Artificial Neural Networks, ICANN 2002, has been extended to February 28. Acceptance or rejection will be notified by April 15, 2002. Submissions must be in postscript or pdf format and can be either uploaded or sent by surface mail or as an e-mail attachment. Please check the author's instructions on the ICANN 2002 web page, www.ii.uam.es/icann2002. Notice also that, in any case, a Unique Tracking Number must be obtained first for each submission. The very simple procedure for obtaining a UTN can also be started from the ICANN 2002 web page. Jose Dorronsoro ETS Informatica Universidad Autonoma de Madrid 28049 Madrid jose.dorronsoro at iic.uam.es Tlfno: 34 91 348 2329 Fax: 34 91 348 2334 From gbarreto at sel.eesc.sc.usp.br Sun Feb 10 21:33:38 2002 From: gbarreto at sel.eesc.sc.usp.br (Guilherme de Alencar Barreto) Date: Sun, 10 Feb 2002 23:33:38 -0300 (EST) Subject: Papers on Unsupervised Temporal Sequence Processing Message-ID: Dear Connectionists, The following three papers, on unsupervised temporal sequence processing, are available from http://www.sel.eesc.sc.usp.br/lasi/www/gbarreto/publicacoes.htm 1) Araújo, A.F.R. and Barreto, G.A. (2002). Context in temporal sequence processing: A self-organizing approach and its application to robotics. IEEE Transactions on Neural Networks, Vol. 13, No. 1, pp. 45-57, January Issue. Abstract: A self-organizing neural network for learning and recall of complex temporal sequences is developed and applied to robot trajectory planning. We consider trajectories with both repeated and shared states. Both cases give rise to ambiguities during reproduction of stored trajectories which are resolved via temporal context information. Feedforward weights encode spatial features of the input trajectories, while the temporal order is learned by lateral weights through a time-delayed Hebbian learning rule. After training is completed, the network model operates in an anticipative fashion by always recalling the successor of the current input state. Redundancy in sequence representation improves the robustness of the network to noise and faults. The network uses memory resources efficiently by reusing neurons that have previously stored repeated/shared states.
Simulations have been carried out to evaluate the performance of the network in terms of trajectory reproduction, convergence time and memory usage, tolerance to fault and noise, and sensitivity to trajectory sampling rate. The results show that the network model is fast, accurate and robust. Its performance is discussed in comparison with other neural network models. Keywords: Context, temporal sequences, self-organization, Hebbian learning, robotics, trajectory planning. 2) Barreto, G.A. and Araújo, A.F.R. (2001). Time in self-organizing maps: An overview of models. International Journal of Computer Research, Special Issue on Neural Networks: Past, Present and Future, 10(2):139-179. Abstract: We review a number of neural models of self-organizing feature maps designed to process sequential patterns in engineering and cognitive applications. This type of pattern inherently holds information of both a spatial and a temporal nature. The latter includes the temporal order, relative duration of the time interval, and temporal correlations of the items in the sequence. We present the main concepts related to the processing of spatiotemporal sequences and then discuss how the time dimension can be incorporated into the network dynamics through the use of various short-term memory models. The vast majority of the models are based on Kohonen's self-organizing map, being organized according to the network architecture and learning rules, and presented in nearly chronological order. We conclude the paper by suggesting possible directions for further research on temporal sequence processing through self-organizing maps. Keywords: Self-organizing maps, unsupervised learning, time dimension, temporal sequence, short-term memory, temporal context. 3) Barreto, G.A. and Araújo, A.F.R. (2001). Unsupervised learning and temporal context to recall complex robot trajectories. International Journal of Neural Systems, 11(1):11-22. Abstract: An unsupervised neural network is proposed to learn and recall complex robot trajectories. Two cases are considered: (i) A single trajectory in which a particular arm configuration (state) may occur more than once, and (ii) trajectories sharing states with each other. Ambiguities occur in both cases during recall of such trajectories. The proposed model consists of two groups of synaptic weights trained by competitive and Hebbian learning laws. They are responsible for encoding spatial and temporal features of the input sequences, respectively. Three mechanisms allow the network to deal with repeated or shared states: local and global context units, neurons disabled from learning, and redundancy. The network reproduces the current and the next state of the learned sequences and is able to resolve ambiguities. The model was simulated over various sets of robot trajectories in order to evaluate learning and recall, trajectory sampling effects and robustness. Guilherme de A. Barreto Dept. of Electrical Engineering University of São Paulo (USP) São Carlos, SP, BRAZIL FAX: 55- 16 - 273 9372 PHONE: 55- 16 - 273 9357 From d.mareschal at bbk.ac.uk Mon Feb 11 06:16:19 2002 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Mon, 11 Feb 2002 12:16:19 +0100 Subject: Phd Studentships Message-ID: Readers of this list may be interested in the following PhD positions. PLEASE DO NOT RESPOND DIRECTLY TO ME. --------------------------- The School of Psychology, Birkbeck College has a number of PhD studentships on offer for PhDs starting in October 2002.
Birkbeck College is part of the University of London and is situated in the central Bloomsbury area of London, in close proximity to University College London, the Institute of Cognitive Neuroscience, the Gatsby Computational Neuroscience Unit, the Institute of Child Health, and the Institute of Education. The School of Psychology has a very active, internationally recognised research programme with particular interests in cognitive sciences, cognitive neurosciences, computational neuroscience, and cognitive and social development. However, the School welcomes applications for studentships in all areas of psychology. For more information about the School's research profile and the studentships available, please visit our website: www.psyc.bbk.ac.uk OR contact: Ms Mina Daniel Postgraduate Administrator Tel.: 020 7631 6862 E-mail: s.daniel at psychology.bbk.ac.uk ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 (0)20 7631-6582/6226 reception: 6207 fax +44 (0)20 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From becker at meitner.psychology.mcmaster.ca Mon Feb 11 16:54:29 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Mon, 11 Feb 2002 16:54:29 -0500 (EST) Subject: COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION Message-ID: COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION A postdoctoral candidate is sought to develop computational models of the role of ascending neuromodulatory systems in both learning and motivated behaviour.
Topics of interest include the role of dopamine in gating signal transmission in pathways involved in generating motivated action, development of fears and paranoias in hyperdopaminergic conditions, learning aversive and emotional conditioned responses, and the biological bases of emotional memory formation in structures including the hippocampus and amygdala. It is anticipated that this project will lead to fundamental contributions to the literature on models of self-organization, by combining unsupervised and semi-supervised (reinforcement) learning methods. The work will also have important implications for the team's research in schizophrenia, wherein the actions of new classes of antipsychotic drugs are being investigated both in clinical trials and using brain imaging and behavioural pharmacology. The candidate must have a PhD in cognitive science, computer science, or a related discipline, and experience in neural network modelling. The model development will proceed in close collaboration with researchers at the CAMH and University of Toronto investigating learning and memory using behavioural pharmacology. Depending upon the interests of the candidate, opportunities also exist to acquire training in human functional neuroimaging, and conduct studies with clinical populations. The position is available for a minimum of two years. This research is part of a collaborative effort involving Dr. S. Becker, Department of Psychology, McMaster University (computational neuroscience), Dr. S. Kapur, Centre for Addiction and Mental Health (CAMH) and Department of Psychiatry, University of Toronto (behavioural pharmacology, human neuroimaging with PET and fMRI) and Dr. P. Fletcher, Department of Psychology, University of Toronto, and CAMH (animal models and behavioural pharmacological studies). For further information on the research interests of the team see www.science.mcmaster.ca/Psychology/sb.html http://www.camh.net/research/research_ar2001/schizophrenia.html http://www.rotman-baycrest.on.ca/content/people/profiles/kapur.html http://www.camh.net/research/research_ar2001/biopsychology.html Interested candidates should send a letter of intention, a CV and two letters of recommendation to Dr. S. Becker at the address below. Dr. Sue Becker Department of Psychology McMaster University 1280 Main Street West, Hamilton, Ont. L8S 4K1 becker at mcmaster.ca Fax: (905)529-6225 From reggia at cs.umd.edu Tue Feb 12 11:48:22 2002 From: reggia at cs.umd.edu (James A. Reggia) Date: Tue, 12 Feb 2002 11:48:22 -0500 (EST) Subject: Postdoc Position Message-ID: <200202121648.LAA12502@avion.cs.umd.edu> A two-year postdoc position is available within the context of the fellowship program described below. The position involves the use of neural networks, genetic algorithms/programming, or related computational methods to study hemispheric specialization or related aspects of the neurobiological basis of language. A strong computational background is expected for this position. Applications arriving by March 15, 2002, will receive full consideration, with a target starting date of this coming summer or early fall. Jim Reggia -------- Post-doctoral Fellowships: Cognitive Neuroscience of Language & its Disorders Two-year National Research Service Award fellowships are available at the University of Maryland, Baltimore and College Park campuses. 
Training opportunities will provide experience in the application of contemporary research methods (including computational modeling, cognitive neuropsychology, event-related potentials and functional neuroimaging) to the topic of normal and/or disordered language processing. Applicants with doctoral degrees in related basic science areas (computer science, neuroscience, linguistics, cognitive psychology, etc.) and clinical disciplines (speech/language pathology; clinical neuropsychology) are invited to apply. Applicants must be U.S. citizens or permanent residents to be considered, under the terms of the NRSA program. Inquiries may be directed to Rita Berndt at rberndt at umaryland.edu or to Jim Reggia at reggia at cs.umd.edu . To apply, send your C.V., the names and addresses of three referees, your contact information, and a statement of research interests and career goals to: James A. Reggia email: reggia at cs.umd.edu Department of Computer Science A. V. Williams Bldg. University of Maryland fax: (301) 405-6707 College Park MD 20742 USA Applications may be sent by mail, fax or email electronic attachments. From C.Campbell at bristol.ac.uk Wed Feb 13 11:47:01 2002 From: C.Campbell at bristol.ac.uk (Colin Campbell, Engineering Mathematics) Date: Wed, 13 Feb 2002 16:47:01 +0000 (GMT Standard Time) Subject: CFP: Bioinformatics Special Section/Analysis of Microarray Data Message-ID: The following may be of interest: Special Section of the Journal Bioinformatics on: Analysis of Microarray Data Organisers: Colin Campbell (University of Bristol) and Shayan Mukherjee (MIT) Microarray technology is rapidly accelerating progress in many areas of biomedical research. For the first time this technology gives a global view of the expression level of thousands of genes. This Special Section of Bioinformatics will focus on new algorithmic or theoretical techniques for analyzing such datasets. This Special Section was announced at the NIPS2001 Workshop on Machine Learning Techniques for Bioinformatics held at the Whistler Resort, British Columbia, Canada on December, 2001. Analysis of microarray data frequently utilizes machine learning techniques such as cluster analysis, classification, feature selection, regression, sample complexity, determination of network structures and feature dependencies, for example. However, we also welcome papers from researchers interested in analytical methods beyond machine learning (e.g. statistics) which may include techniques for evaluating the effect of noise, imputing missing values, discovering outliers, scoring features, etc. We welcome case studies in which the techniques described above are applied to new datasets, illustrating practical problems and the successful use of these methods. Further details can be found at: http://lara.enm.bris.ac.uk/cig/nips01/bioss.htm The deadline for submissions is *** 30th April 2002 ***. -------------------------------------------- Dr. Colin Campbell, Dept. of Engineering Mathematics, Bristol University, Bristol BS8 1TR, United Kingdom http://lara.enm.bris.ac.uk/cig/ Tel +44 (0) 117 928 9858 C.Campbell at bristol.ac.uk From skoenig at cc.gatech.edu Wed Feb 13 12:59:00 2002 From: skoenig at cc.gatech.edu (Sven Koenig) Date: Wed, 13 Feb 2002 12:59:00 -0500 (EST) Subject: CFP: SARA 2002 Message-ID: <200202131759.g1DHx0U16794@cleon.cc.gatech.edu> Our apologies if you receive more than one call for papers for SARA 2002 (Symposium on Abstraction, Reformulation and Approximation). 
The deadline for indicating an intent to submit is February 20, 2002. Cheers, Robert Holte Sven Koenig ---------------------------------------------------------------------- CALL FOR PAPERS SARA-2002 Symposium on Abstraction, Reformulation and Approximation Kananaskis Mountain Lodge, Kananaskis, Alberta, Canada August 2-4, 2002 (immediately after AAAI-2002) OVERVIEW SARA-2002 is an Artificial Intelligence symposium on all aspects of abstraction, reformulation, and approximation. Like past SARAs, it will consist of stimulating technical presentations spanning the traditional boundaries that fragment Artificial Intelligence research. Three invited speakers will give their perspectives on abstraction, reformulation, and approximation. Attendance is limited to approximately 50 participants. Some funding is available to subsidize the cost of graduate students whose research involves techniques of abstraction, reformulation or approximation. SARA-2002 will be situated amidst the spectacular Rocky Mountains of the Kananaskis Valley, 60 miles west of Calgary, Alberta, and 45 miles southeast of Banff, Alberta. To make it convenient for AAAI-2002 attendees to participate in SARA, a luxurious bus will drive from the AAAI conference site to the SARA site the afternoon of August 1. *** PAPER SUBMISSIONS *** Submissions are due on February 25, and may be either full papers or extended abstracts. Authors are requested to send a notification of intent to submit, together with a draft title and short abstract, by February 20, 2002 to sara-submission at cc.gatech.edu *** FOR MORE INFORMATION *** Additional information, including a complete call for papers, may be obtained from the symposium home page http://www.cs.ualberta.ca/~holte/SARA2002/ If you would like to receive updates about the conference, please send email to holte at cs.ualberta.ca and ask to be added to the SARA mailing list. We gratefully acknowledge support from AAAI and NASA. -- Robert Holte holte at cs.ualberta.ca Sven Koenig skoenig at cc.gatech.edu SARA-2002 co-chairs From mpp at us.ibm.com Thu Feb 14 09:59:45 2002 From: mpp at us.ibm.com (Michael Perrone) Date: Thu, 14 Feb 2002 09:59:45 -0500 Subject: IBM Graduate Summer Intern Positions in Handwriting Recognition Message-ID: _________________________________________________________________________________ Graduate Summer Intern Positions at IBM _________________________________________________________________________________ The Pen Technologies Group at the IBM T.J. Watson Research Center is looking for graduate students to fill summer R&D positions in the area of large-vocabulary, unconstrained, handwriting recognition. Candidates should have the following qualifications: - Currently enrolled in a PhD program in EE, CS, Math, Physics or similar field - Research experience in handwriting recognition or IR - Strong mathematics/probability background - Excellent programming skills (in C and/or C++ and/or Java) - Creativity Our current projects include: - HMM-based, unconstrained, handwriting recognition - Language and grammar modeling - Accurate, high-speed, search methods - Document understanding and processing - Pen computing - Handwritten document retrieval The IBM T.J. Watson Research Center is one of the top industrial laboratories in the world. We offer an exciting research environment with the opportunity to become involved in all aspects of cutting edge technology in the computer industry. We encourage your early reply as positions fill quickly. 
______________________ Please send CV's to: Michael P. Perrone mpp at us.ibm.com -or- Michael P. Perrone IBM T.J. Watson Research Center - 36-207 Route 134 Yorktown Heights, NY 10598 914-945-1779 From ddepi001 at umaryland.edu Fri Feb 15 14:29:45 2002 From: ddepi001 at umaryland.edu (Didier A. Depireux Dr) Date: Fri, 15 Feb 2002 14:29:45 -0500 (EST) Subject: Post-Doct: Encoding of Dynamic Spectrum in Auditory Cortex Message-ID: Several post-doctoral positions are available in the Department of Anatomy and Neurobiology of the School of Medicine of the Univ. of Maryland (in Baltimore) to work in the laboratory of Dr. Didier Depireux. The overall goal of the research is to determine how the shape of the acoustic spectrum is represented in the unit responses of auditory cortex of the awake and alert ferret. We develop system models to characterize response features that are extensions of the classical concepts of response areas and impulse response functions. The project could involve correlated psychophysical studies in ferrets and/or human subjects. The successful candidate will have training experience in either electrophysiology in animals or a strong interest in applying quantitative methods to the field of neuroscience. A representative paper of the techniques used can be found at http://www.isr.umd.edu/~didier/torcs.pdf A representative paper of the recording methods and results in cortex is http://www.isr.umd.edu/~didier/1220.pdf I prefer email applications. Please contact me for questions and applications at ddepi001 at umaryland.edu Didier -- Didier A Depireux ddepi001 at umaryland.edu 685 W.Baltimore Str http://neurobiology.umaryland.edu/depireux.htm Anatomy and Neurobiology Phone: 410-706-1272 (off) University of Maryland -1273 (lab) Baltimore MD 21201 USA Fax: 1-301-314-9920 From tewon at salk.edu Fri Feb 15 21:15:40 2002 From: tewon at salk.edu (Te-Won Lee) Date: Fri, 15 Feb 2002 18:15:40 -0800 Subject: ICA-2001 Proceedings Online Message-ID: <000a01c1b68f$d471f060$5293ef84@redmond.corp.microsoft.com> Third International Conference on Independent Component Analysis and Blind Signal Separation, December 9-12, 2001 - San Diego, California, USA. Editors: T.-W. Lee, T.-P. Jung, S. Makeig, and T. J. Sejnowski The complete Program and Proceedings of this conference are available on-line at: http://www.ica2001.org Information on how to obtain a CD ROM and hardcopies of the proceedings can also be found at this URL. Te-Won Te-Won Lee, Ph.D. Institute for Neural Computation - MC0523 University of California, San Diego La Jolla, CA 92039-0523, USA 858-534-9662 office 858-534-2014 fax http://rhythm.ucsd.edu/~tewon From: esann To: "Connectionists at cs.cmu.edu" References: From bogus@does.not.exist.com Fri Feb 15 11:14:43 2002 From: bogus@does.not.exist.com () Date: Fri, 15 Feb 2002 17:14:43 +0100 Subject: ESANN'2002 programme ( European Symposium on Artificial Neural Networks) Message-ID: ---------------------------------------------------- | | | ESANN'2002 | | | | 10th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 24-25-26, 2002 | | | | Preliminary programme | ---------------------------------------------------- The preliminary programme of the ESANN'2002 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate if you could add the above URL to your list; thank you very much! 
We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. Over its 10 years, the ESANN conference has become a major event in the field of neural computation. ESANN is a human-size conference focusing on fundamental aspects of artificial neural networks (theory, models, algorithms, links with statistics, data analysis, biological background,...). This year, 81 scientific communications will be presented, covering most areas of the neural computation field. The programme of the conference can be found at the URL http://www.dice.ucl.ac.be/esann, together with practical information about the conference venue, registration,... Other information can be obtained by sending an e-mail to esann at dice.ucl.ac.be . ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From harnad at coglit.ecs.soton.ac.uk Sat Feb 16 10:25:40 2002 From: harnad at coglit.ecs.soton.ac.uk (S.Harnad) Date: Sat, 16 Feb 2002 15:25:40 GMT Subject: Budapest Open Access Initiative Message-ID: <200202161525.PAA13050@coglit.ecs.soton.ac.uk> This message is addressed to scholars and scientists and it concerns the Budapest Open Access Initiative (BOAI) http://www.soros.org/openaccess launched on 14 February by George Soros's Open Society Institute. To be useful, research must be used. To be used (read, cited, applied, extended) it must be accessible. There are currently 20,000 peer-reviewed journals of scientific and scholarly research worldwide, publishing over 4 million articles per year, every single one of them given away for free by its researcher-authors and their research-institutions, with the sole goal of maximizing their uptake and usage by further researchers, and hence their impact on worldwide research, to the benefit of learning and of humanity. Yet access to those 4 million annual research articles can only be had for a fee. Hence they are accessible only to the lucky researchers at that minority of the world's research institutions that can pay for them. And even the wealthiest of these institutions can only afford a small and shrinking proportion of those annual 20,000 journals. The result is exactly as if all those 4 million articles had been written for royalties or fees, just the way most of the normal literature is written, rather than having been given away for free by their authors and their institutions for the benefit of research and humanity. As a consequence, other researchers' access to all this work, and hence its potential impact on and benefit to research progress, is being minimized by access tolls that most research institutions and individuals worldwide cannot afford to pay. Those access tolls were necessary, and hence justified, in the Gutenberg era of print-on-paper, with its huge real costs, and no alternatives.
But they are no longer necessary or justified, and are instead in direct conflict with what is best for research, researchers, and society, in today's PostGutenberg era of on-line-eprints, when virtually all of those Gutenberg costs have vanished, and those remaining costs can be covered in a way that allows open access. The Budapest Open Access Initiative is dedicated to freeing online access to this all-important but anomalous (because give-away) literature, now that open access has at long last become possible, by (I) providing universities with the means of freeing online access to their own annual peer-reviewed research output (as published in the 20,000 established journals) through institutional self-archiving, as well as by (II) providing support for new alternative journals that offer open online access to their full text contents directly (and for established journals that are committed to making the transition to offering open full-text access online). It is entirely fitting that it should be George Soros's Open Society Institute that launches this initiative to open access to the world's refereed research literature at last. Open access is now accessible, indeed already overdue, at a mounting cost in lost benefits to research and to society while we delay implementing it. What better way to open society than to open access to the fruits of its science and scholarship, already freely donated by its creators, but until now not freely accessible to all of its potential users? Fitting too is the fact that this initiative should originate from a part of the world that has known all too long and all too well the privations of a closed society and access denial. Please have a look at the BOAI at http://www.soros.org/openaccess and, if you or your organization are implementing, or planning to implement either Strategy I or Strategy II, I hope you will sign the BOAI, either as an individual or an organization. Below, I append links to some of the press coverage of the BOAI so far. Sincerely, Stevan Harnad Declan Butler, Soros Offers Access to Science Papers (for Nature) http://makeashorterlink.com/?U21535A6 Ivan Noble, Boost for Research Paper Access (for BBC) http://news.bbc.co.uk/hi/english/sci/tech/newsid_1818000/1818652.stm Michael Smith, Soros Backs Academic Rebels (for UPI) http://www.upi.com/view.cfm?StoryID=12022002-031227-9710r [Alexander Grimwade, Open Societies Need Open Access (The Scientist) http://www.the-scientist.com/yr2002/feb/comm_020218.html ] [Denis Delbecq, L'abordage des revvues scientifiques (Liberation, Paris) http://www.liberation.com/quotidien/semaine/020214-050019088SCIE.html ] [http://slashdot.org/] From klikharev at notes.cc.sunysb.edu Mon Feb 18 07:30:07 2002 From: klikharev at notes.cc.sunysb.edu (Konstantin Likharev) Date: Mon, 18 Feb 2002 07:30:07 -0500 Subject: Postdoctoral position References: <12415.1013994928@ammon.boltz.cs.cmu.edu> Message-ID: <002901c1b878$00175540$9d383181@likharev> I am looking for an outstanding postdoctoral candidate to work at Stony Brook University (www.sunysb.edu) on the development of large scale self-evolving neural networks based on nanoscale latching switches. The basic ideas of this effort are described in our recent IJCNN'01 presentation (see http://rsfq1.physics.sunysb.edu/~likharev/nano/IJCNN'01.pdf). 
The person in this position will be responsible for the network design and simulation (using, in particular, our new 162-processor cluster Njal), and is expected to work in close contact with other members of our multi-disciplinary Stony Brook-centered collaboration. The collaboration is working on all aspects of the development of self-evolving networks, including their conceptual design and globally supervised training (Dr. J. Barhen of ORNL, Prof. M. Bender and myself), CMOS VLSI prototyping (Prof. A. Leuciuc), design of nanoscale single-electron latching switches (Prof. P. Allen and myself), and their molecular implementation (Dr. B. Brunschwig of BNL, Prof. J. Lukens, Prof. A. Mayr). The position is initially for one year, with a nearly-automatic extension for at least one more year if the work is running smoothly. Effort compensation is in the range $35-50K/yr (depending on candidate's credentials) plus a generous fringe benefit package. Stony Brook University is an EO/AA employer. Interested persons should send me their C.V.s, including a list of publications and the names of 3 references. (No reference letters without a specific request, please.) Additional questions by e-mail are welcome. Regards, K. Likharev ________________________________________________________________ Konstantin K. Likharev Professor of Physics State University of New York at Stony Brook Stony Brook, NY 11794-3800 Phone 631-632-8159 Fax 631-632-4977 E-mail klikharev at notes.cc.sunysb.edu (or likharev at rsfq1.physics.sunysb.edu) Web page http://rsfq1.physics.sunysb.edu/~likharev/personal/index.html From giugliano at pyl.unibe.ch Tue Feb 19 10:36:02 2002 From: giugliano at pyl.unibe.ch (Michele Giugliano) Date: Tue, 19 Feb 2002 16:36:02 +0100 Subject: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL -- Univ. of Genova (Italy) -- June 10-15, 2002 Message-ID: <017201c1b95b$23a6eff0$9da25c82@physio.unibe.ch> Apologies if you receive this more than once. ***** NE.W.S. : NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL ***** ***** sponsored by the University of Genova, Italy ***** ***** CALL FOR APPLICATION ***** June 10 - 15 2002 University of Genova, Italy http://130.251.89.117/homepage/workshop2002.htm Registration fee: 50 Euro (students), 100 Euro (others). MAIN GOAL: to understand, modify and use brain plasticity in order to advance Neuroscience at the network level and to inspire new computer architectures. Scientific background: Bioengineering, Electronics, Informatics, Neuroscience.
How to reach the main goal: By interfacing in vitro neurons to standard and microelectronic transducers capable of monitoring and modifying neuronal electrophysiological activity By creating hybrid neuro-electronic systems By developing neuro-prostheses By computer simulation of plasticity at the network level By developing neuromorphic silicon neurons INVITED SPEAKERS: Alain Destexhe, Unite de Neuroscience Integratives et Computationelles, CNRS, Gif-sur-Yvette, France Michael Rudolph, Unite de Neuroscience Integratives et Computationelles, CNRS, Gif-sur-Yvette, France Ferdinando Mussa-Ivaldi, Department of Physiology Northwestern University Medical School, Chicago (IL), USA John Nicholls, SISSA, Trieste, Italy Miguel Nicolelis, Department of Neurobiology, Duke University, Durham (NC), USA Stefano Fusi, Institute of Physiology, University of Bern, Bern, Switzerland Michele Giugliano, Institute of Physiology, University of Bern, Bern, Switzerland Giacomo Indiveri, Institute for Neuroinformatics, ETH / University of Zurich, Switzerland Pietro Morasso, Department of Communications, Computer and System Sciences, University of Genova, Italy Rinaldo Poluzzi, ST Microelectronics, Italy Giulio Sandini, Department of Communications, Computer and System Sciences, University of Genova, Italy REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below and send it by email to news2002 at bio_nt.dibe.unige.it . REGISTRATION FORM (Please send it by email to: news2002 at bio_nt.dibe.unige.it ) NE.W.S. : NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL June 10 - 15 2002 University of Genova, Department of Biophysical and Electronic Engineering V. Opera Pia 11a, 16145 Genova Mr/Ms/Dr/Prof: ________________________________________ Name: ________________________________________ Affiliation: ________________________________________ Address: ________________________________________ City, State, Postal Code: ________________________________________ Phone and Fax: ________________________________________ Email: ________________________________________ Registration fee: CHECK ONE: ( ) Euro 50 Registration Fee (Student) ( ) Euro 100 Registration Fee (Regular) PREFERRED METHOD OF PAYMENT : [ ] Bank transfer: Bank CARIGE, Agency 41, ABI:6175, CAB: 1472 c/c DIBE - University of Genoa 5341/90. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (VISA), check or cash at the very beginning of the workshop. From genesee at ego.psych.mcgill.ca Tue Feb 19 11:01:05 2002 From: genesee at ego.psych.mcgill.ca (Fred Genesee) Date: Tue, 19 Feb 2002 11:01:05 -0500 Subject: McGill Job Announcement Message-ID: <200202191600.LAA05931@ego.psych.mcgill.ca> McGill University Department of Psychology Canada Research Chair in Psychology of Language The Department of Psychology of McGill University invites applications from exceptional candidates for a Tier II Canada Research Chair in Psychology of Language. The successful applicant will have a tenure-track appointment at the Assistant or junior Associate Professor level. Consideration will be given to candidates with interests in any domain of scientific language research including acquisition, speech and language perception and processing, neural representation, and language disorders.
The Department has excellent facilities for interdisciplinary research through the Centre for Language, Mind, and Brain which links researchers in related academic units at McGill University (Linguistics, Communication Sciences and Disorders, and Education), the Montreal Neurological Institute, and other universities in Montreal. Applicants are expected to have a doctorate in psychology or a closely related field, a record of significant, externally-funded research, an aptitude for undergraduate and graduate teaching and the ability and interest to work collaboratively in an interdisciplinary research environment. Consideration of applications will begin March 1 and continue until suitable candidates have been identified. Applicants should submit a curriculum vitae, a description of research interests and philosophy, a statement of teaching interests and philosophy, selected reprints of publications, and should arrange for three confidential letters of recommendation to be sent to Chair, Psychology of Language Search Committee Department of Psychology McGill University 1205 Dr. Penfield Avenue Montreal, Quebec, Canada H3A 1B1. All qualified candidates are encouraged to apply; however, Canadians and permanent residents will be given priority. Psychology Department phone: (514) 398-6022 McGill University fax: (514) 398-4896 1205 Docteur Penfield Ave. Montreal, Quebec Canada H3A 1B1 From m.stetter at mchp.siemens.de Tue Feb 19 11:21:58 2002 From: m.stetter at mchp.siemens.de (Martin Stetter) Date: Tue, 19 Feb 2002 17:21:58 +0100 Subject: Book Announcement: Exploration of Cortical Function Message-ID: <3C727BA6.4139505A@mchp.siemens.de> Dear colleagues, I would like to announce my new book: Exploration of Cortical Function Imaging and Modeling Cortical Population Coding Strategies Martin Stetter Exploration of Cortical Function summarizes recent efforts aimed at revealing cortical population coding and signal processing strategies. Topics include optical detection techniques of population activity in the submillimeter range, advanced methods for the statistical analysis of these data, and biologically inspired neuronal models for population activities in the framework of optimal coding, statistical learning theory and meanfield recurrent networks. The book covers one complete branch of population-based brain research, ranging from methods for data acquisition through data analysis to neuronal models for the quantification of functional principles. The volume covers an area which is of great current interest to researchers working on the cerebral cortex. The combination of models and image analysis techniques to examine the activity of large cohorts of neurons is especially intriguing and likely to provoke considerable debate. Readership is aimed at students and researchers from many disciplines including neuroscience, biology, physics and computer science interested in how an interdisciplinary framework from biology, statistics and computational neuroscience can be used to gather a quantitative understanding of cortical function. Experimentalists may gain insight into statistical and neuronal modeling techniques, whereas theoreticians will find an introductory treatment of neuroanatomy, neurophysiology and measurement techniques. Kluwer Academic Publishers 269 pp., 132 illus. ISBN 1-4020-0435-4 (hardcover) ISBN 1-4020-0436-2 (paperback) -- ================================================================== Dr.
Martin Stetter loc : Mch-P 63-418 Siemens AG, CT IC 4 phone : +49-89-636-55734 Corporate Technology fax : +49-89-636-49767 D-81730 Muenchen, Germany mailto: martin.stetter at mchp.siemens.de ================================================================== From nestor at ftnp.ft.uam.es Tue Feb 19 16:13:57 2002 From: nestor at ftnp.ft.uam.es (Nestor Parga Carballeda) Date: Tue, 19 Feb 2002 22:13:57 +0100 (MET) Subject: Systems Neuroscience / Madrid Message-ID: Interested candidates are invited to apply for a "Ramon y Cajal" position for experimental research work in Systems Neuroscience at the group of Computational Neuroscience of the Universidad Autonoma de Madrid, Spain. These are five-year positions co-funded by the Spanish Ministry of Science and Technology and the Universities. This is the second year that these contracts are given in Spain and there will be a third call next year. More information about them can be found at the web address: http://www.mcyt.es/cajal/default.htm Applicants should submit a research proposal (of no more than 2,000 words). They are expected to work in interaction with the theoretical team of the Computational Neuroscience group. The basic idea is to carry out joint work on information processing in the brain by combining theoretical and experimental approaches. This leaves a rather broad field within which the candidates can make their proposals. Details about the current (theoretical) work of the group can be found at the web site: http://ket.ft.uam.es/~neurociencia/ Apart from the research proposal, applicants should also provide a full CV, a list of all publications and a statement of research interests to: Nestor Parga at one of the two following e-mail addresses: nestor at ftnp.ft.uam.es parga at delta.ft.uam.es The Universities should make a decision about the type of projects that they are willing to fund by the end of February. For this reason applications should be sent to the address above preferably before February 25. This is the first stage of the selection process. A formal application and evaluation will be done later this year (see http://www.mcyt.es/cajal/default.htm for details) ------------------------------------------------------------------- | Nestor Parga | | | | Phone : (+34) 91-397-4542 | | Dpto. de Fisica Teorica, C-XI | Fax : (+34) 91-397-3936 | | Universidad Autonoma de Madrid | E-mail: nestor at ftnp.ft.uam.es | | 28049 Madrid, SPAIN | parga at delta.ft.uam.es | | | | http://ket.ft.uam.es/~neurociencia/nestor | ------------------------------------------------------------------- From piuri at fusberta.elet.polimi.it Tue Feb 19 15:35:09 2002 From: piuri at fusberta.elet.polimi.it (Vincenzo Piuri) Date: Tue, 19 Feb 2002 21:35:09 +0100 Subject: CALL FOR PAPERS: DEADLINE EXTENSION TO MARCH 8 !!! VIMS 2002 IN CONJUNCTION WITH IMTC 2002 Message-ID: <5.1.0.14.0.20020219213424.02e23110@pop3.norton.antivirus> VIMS 2002 2002 IEEE INTERNATIONAL SYMPOSIUM ON VIRTUAL AND INTELLIGENT MEASUREMENT SYSTEMS Mt.
Alyeska Resort Hotel (near Anchorage), AK, USA - 19-20 May 2002 Sponsored by IEEE Instrumentation and Measurement Society & IEEE Neural Network Council With the technical cooperation of International Neural Network Society Instrumentation, Systems, and Automation Society VIMS2002 is held in conjunction with IEEE World Congress on Computational Intelligence - WCCI'02, Honolulu, HI, USA, 12-17 May 2002 IEEE Instrumentation and Measurement Technology Conference - IMTC2002, Anchorage, AK, USA, 21-23 May 2002 >>>> PAPER SUBMISSION DEADLINE: EXTENDED TO 8 MARCH 2002 <<<< ALL DETAILED INFORMATION IS AVAILABLE AT http://ewh.ieee.org/soc/im/vims/ Virtual environments have become highly attractive for tackling complex application problems in which simulation plays a relevant role: to model and analyze the behavior of complex systems, to design innovative solutions for industrial production processes and products, and to assess the feasibility and the effectiveness of processes and products. Moreover, realizing virtual systems in computer-based environments allows components to be re-used and their behaviors adapted to the application needs, hence reducing production cost and time. On the other hand, adaptive and evolving solutions are becoming increasingly relevant in applications requiring an adaptable behavior for the system according to the changing needs of the users, the application, and the environment. Intelligent techniques based on soft-computing (i.e., neural networks, fuzzy logic, and genetic algorithms) have proved effective in supporting such adaptation. Nowadays, integration of different components, realized by using heterogeneous computing paradigms, is important to save investment and exploit the features offered by well-assessed algorithmic approaches. Globalization and the need for distributed sensing, monitoring, and control also drive the inclusion of computer science technologies (e.g., distributed networks, agents, cooperative systems, web, mobile systems, micro- and nano-robots) to achieve the appropriate and modular integration in larger and more complex systems. This issue is becoming relevant in environmental monitoring, power distribution, and many other industrial applications. Intelligent distributed virtual environments will therefore play a key role in industry as well as in daily life to support the evolving needs of the users and the economy. Suppliers continually offer more affordable hardware and software components to implement these innovative approaches. Industries, government agencies, and research institutions widely consider and use these techniques. Up to now, analysis and experiments have been performed mainly at a qualitative level by scientists and practitioners, aiming to understand the underlying technologies and methodologies, but without any specific focus on the mandatory need for quantitative assessment and metrological analysis. The VIMS 2002 symposium therefore aims to fill this gap in knowledge and practice, especially by focusing on the quantitative aspect of instrumentation and measurement issues. Sessions will cover all aspects of soft computing technologies and virtual environments related to instrumentation and measurement, from the point of view of both theory and practical applications. General Co-Chairs: Vincenzo Piuri, University of Milan, Italy Enrique H.
Ruspini, SRI International, USA Technical Program Co-Chairs: Cesare Alippi, Politecnico di Milano, Italy Evangelia Micheli-Tzanakou, Rutgers University, USA Mel Siegel, Carnegie Mellon University, USA Symposium Coordinator: Robert Myers, Myers-Smith Inc., USA From edwin at cs.brandeis.edu Tue Feb 19 18:21:18 2002 From: edwin at cs.brandeis.edu (Edwin de Jong) Date: Tue, 19 Feb 2002 18:21:18 -0500 (EST) Subject: CFP: ICML Workshop on Development of Representations Message-ID: [Apologies if you receive multiple copies of this announcement.] -------------------------------------------------------------------------- CALL FOR PAPERS ICML Workshop Development of Representations July 9th, 2002, Sydney, Australia http://www.demo.cs.brandeis.edu/icml02ws/ DESCRIPTION The representation of a learning problem has long been known to be a major factor in learning performance. The nature of appropriate representations and representational change as a part of the learning process have been studied in a variety of forms in a number of subfields within machine learning, artificial intelligence and, more recently, other communities. Despite this fact, representations are typically hand-coded rather than acquired automatically. The goal of this workshop is to explore problems in the area of automated development of representations and to build ties between the various relevant communities. From juergen at idsia.ch Tue Feb 19 10:14:15 2002 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Tue, 19 Feb 2002 16:14:15 +0100 Subject: optimal predictors Message-ID: <3C726BC7.32E774BF@idsia.ch> There is an optimal way of predicting the future, given past observations. Normally we do not know the true conditional probability distribution p(next event | past). But assume we do know that p is in some set P of distributions. Choose a fixed weight w_q for each q in P such that the w_q add up to 1 (for simplicity, let P be countable). Then construct the Bayesmix M(x) = Sum_q w_q q(x), and predict using M instead of the optimal but unknown p. How wrong is it to do that? The recent exciting work of Marcus Hutter (IDSIA) provides general and sharp (!) loss bounds: Let LM(n) and Lp(n) be the total expected losses of the M-predictor and the p-predictor, respectively, for the first n events. Then LM(n)-Lp(n) is at most of the order of sqrt[Lp(n)]. That is, M is not much worse than p. And in general, no other predictor can do better than that! In particular, if p is deterministic, then the M-predictor soon won't make any errors any more. If P contains ALL computable distributions, then M becomes the celebrated enumerable universal prior. That is, after decades of somewhat stagnating research we now have sharp loss bounds for Solomonoff's universal (but incomputable) induction scheme. Similarly, if we replace M by the Speed Prior S - where S(x) is small if x is hard to compute by any method - we obtain appropriate loss bounds for computable S-based induction. Alternatively, reduce M to what you get if you just add up weighted estimated future finance data probabilities generated by 1000 commercial stock-market prediction software packages. If only one of them happens to work fine (but you do not know which) you still should get rich. Note that the approach is much more general than what is normally done in traditional statistical learning theory, where the often quite unrealistic assumption is that the observations are statistically independent. 
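As a toy, finite illustration of the Bayesmix idea (a small Bernoulli model class under log-loss, with made-up weights and data; this is not Hutter's general setting or result), here is a minimal Python sketch of predicting with M = Sum_q w_q q and comparing its cumulative loss to that of the unknown true p:

import numpy as np

rng = np.random.default_rng(1)

# A small model class P: Bernoulli distributions with different head-probabilities.
thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
w = np.full(len(thetas), 1.0 / len(thetas))   # prior weights w_q, summing to 1
true_theta = 0.7                              # the unknown p happens to be in P

T = 2000
x = (rng.random(T) < true_theta).astype(int)  # observed binary sequence

loss_M = loss_p = 0.0   # cumulative log-losses of the M-predictor and the p-predictor
post = w.copy()         # mixture weights, updated by Bayes' rule as data arrive
for t in range(T):
    pM = float(post @ thetas)     # M's prediction of P(x_t = 1 | past)
    pp = true_theta               # p's prediction (we "cheat" and use the truth)
    loss_M += -np.log(pM if x[t] == 1 else 1.0 - pM)
    loss_p += -np.log(pp if x[t] == 1 else 1.0 - pp)
    like = thetas if x[t] == 1 else 1.0 - thetas
    post = post * like
    post /= post.sum()

# For log-loss the regret L_M - L_p is bounded by ln(1/w_p) = ln 5 in this toy
# case; the sqrt bound quoted above is the generalization to arbitrary bounded
# loss functions.
print(f"L_M = {loss_M:.2f}  L_p = {loss_p:.2f}  regret = {loss_M - loss_p:.3f}")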
To learn more, please read Optimality of Universal Bayesian Sequence Prediction for General Loss and Alphabet: ftp://ftp.idsia.ch/pub/techrep/IDSIA-02-02.ps.gz and also check out Hutter's other recent papers at ICML, ECML, NIPS, Int. J. of Foundations of CS: www.idsia.ch/~marcus ------------------------------------------------- Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland juergen at idsia.ch http://www.idsia.ch/~juergen

From terry at salk.edu Wed Feb 20 20:27:16 2002 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 20 Feb 2002 17:27:16 -0800 (PST) Subject: NEURAL COMPUTATION 14:3 In-Reply-To: <200201221744.g0MHiB917659@purkinje.salk.edu> Message-ID: <200202210127.g1L1RGo36508@dax.salk.edu>

Neural Computation - Contents - Volume 14, Number 3 - March 1, 2002

VIEW

What Geometric Visual Hallucinations Tell Us About the Visual Cortex
Paul C. Bressloff, Jack D. Cowan, Martin Golubitsky, Peter J. Thomas, and Matthew C. Wiener

LETTERS

An Amplitude Equation Approach to Contextual Effects in Visual Cortex
Paul C. Bressloff and Jack D. Cowan

Derivation of the Visual Contrast Response Function by Maximizing Information Rate
Allan Gottschalk

A Bayesian Framework for Sensory Adaptation
Norberto M. Grzywacz and Rosario M. Balboa

Analysis of Oscillations in a Reciprocally Inhibitory Network with Synaptic Depression
Adam L. Taylor, Garrison W. Cottrell, William B. Kristan, Jr.

Activity-Dependent Development of Axonal and Dendritic Delays or, Why Synaptic Transmission Should Be Unreliable
Walter Senn, Martin Schneider, and Berthold Ruf

Impact of Geometrical Structures on the Output of Neuronal Models: A Theoretical and Numerical Analysis
Jianfeng Feng and Guibin Li

Sparse On-Line Gaussian Processes
Lehel Csato and Manfred Opper

Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem
Mark Girolami

Natural Discriminant Analysis using Interactive Potts Models
Jiann-Ming Wu

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2002 - VOLUME 14 - 12 ISSUES

                 USA       Canada*    Other Countries
Student/Retired  $60       $64.20     $108
Individual       $88       $94.16     $136
Institution      $506      $451.42    $554

* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu

-----

From K.Branney at elsevier.nl Wed Feb 20 09:01:01 2002 From: K.Branney at elsevier.nl (Branney, Kate (ELS)) Date: Wed, 20 Feb 2002 09:01:01 -0500 Subject: Call for papers: Special issue of NEUROCOMPUTING on Bioinformatics Message-ID: <46414F09B351C64BAA875CE0B37BE07101446312@elsamsvexch02.elsevier.nl> Apologies for cross-postings. CALL FOR PAPERS NEUROCOMPUTING An International Journal published by Elsevier Science B.V., vol. 42-47, 24 issues, in 2002 ISSN 0925-2312, URL: http://www.elsevier.com/locate/neucom Special Issue on Bioinformatics Paper Submission Deadline: July 31st, 2002 Bioinformatics applies -- simply stated -- computational methods to the solution of biological problems. Bioinformatics, genomics, molecular biology, molecular evolution, computational biology, and related fields are at the intersection between two axes: data sequences/physiology and information technology. Sequences include DNA sequences (gene, genome, organization), molecular evolution, protein structure, folding, function, and interaction, metabolic pathways, regulation signaling networks, physiology and cell biology (interspecies, interaction), as well as ecology and environment.
Information technology in this context includes hardware and instrumentation, computation, as well as mathematical and physical models. The intersection between two subfields, one in each axis, generates areas including those known as genome sequencing, proteomics, functional genomics (microarrays, 2D-PAGE, ...), high-tech field ecology, genomic data analysis, statistical genomics, protein structure, prediction, protein dynamics, protein folding and design, data standards, data representations, analytical tools for complex biological data, dynamical systems modeling, as well as computational ecology. Research in these fields comprises property abstraction from the biological system, design and development of data analysis algorithms, as well as of databases and data access web-tools. Genome sequencing and related projects generate vast amounts of data that needs to be analyzed, thus emphasizing the relevance of efficient methods of data analysis and of the whole discipline. The Neurocomputing journal invites original contributions for the forthcoming special issue on Bioinformatics from a broad scope of areas. Some topics relevant to this special issue include, but are not restricted to: -- Theoretical foundations, algorithms, implementations, and complete systems -- Sequence analysis (single, multiple), alignment, annotation, etc. -- Improvements in databases and web-tools for bioinformatics -- Novel metrics and biological data preprocessing for posterior analysis -- Systems biology models and data modeling techniques including statistical inference, stochastic processes, random walks, Markov chains, hidden Markov models, motifs, profiles, dynamic programming, pattern recognition techniques, neural networks, support vector machines, evolutionary models, tree estimation, etc. -- Pathway inference, e.g. to determine where to target a drug using gene expression data and address side effects by providing information on where else a target metabolite appears. -- Key applications in diverse fields including bioinformatics, genomics, molecular biology, molecular evolution, computational biology, drug design, etc. Please send two hardcopies of the manuscript before July 31st, 2002, to: V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130, Pasadena, CA 91116-6130, U.S.A. Street address: 1149 Wotkyns Drive Pasadena, CA 91103, U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net including abstract, keywords, a cover page containing the title and author names, corresponding author name's complete address including telephone, fax, and email address, and clear indication to be a submission to the Special Issue on Bioinformatics. Guest Editors Harvey J. Greenberg Center for Computational Biology University of Colorado at Denver P.O. Box 173364 Denver, CO 80217-3364 Phone: (303) 556-8464 Fax: (303) 556-8550 Email: Harvey.Greenberg at cudenver.edu Lawrence Hunter Center for Computational Pharmacology University of Colorado Health Science Center 4200 E. Ninth Ave. Denver, CO 80262 Phone: (303) 315-1094 Fax: (303) 315-1098 Email: Larry.Hunter at uchsc.edu Satoru Miyano Human Genome Center Institute of Medical Science University of Tokyo 4-6-1 Shirokanedai, Minato-ku, Tokyo 108-8639, Japan. 
Phone: +81-3-5449-5615 Fax: +81-3-5449-5442 Email: miyano at ims.u-tokyo.ac.jp Ralf Zimmer Praktische Informatik und Bioinformatik Institut für Informatik LMU München Theresienstrasse 39 D-80333 München Phone: +49-89-2180-4447 Fax: +49-89-2180-4054 Email: zimmer at bio.informatik.uni-muenchen.de V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130 Pasadena, CA 91116-6130, U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net From Gustavo.Deco at mchp.siemens.de Thu Feb 21 05:02:31 2002 From: Gustavo.Deco at mchp.siemens.de (Gustavo Deco) Date: Thu, 21 Feb 2002 11:02:31 +0100 Subject: New Book Announcement Message-ID: <3C74C5B7.9C642A94@mchp.siemens.de> NEW BOOK ANNOUNCEMENT "Computational Neuroscience of Vision" Edmund T. Rolls University of Oxford, Department of Experimental Psychology and Gustavo Deco Siemens Corporate Technology, Germany Oxford University Press, 2002 588 pages, numerous figures, 238X168mm ISBN 0-19-852489-7 Hardback ISBN 0-19-852488-9 Paperback This exciting new book describes visual information processing in the brain. The book focusses on the visual information processing and computational operations in the visual system that lead to representations of objects in the brain, and on the mechanisms that underlie attentional processes. In addition to visual processing, it also considers how visual inputs reach and are involved in the computations underlying a wide range of behaviour, including short term memory, long term memory and emotion, thus providing a foundation for understanding the operation of a number of different brain systems. This fascinating book will be of value to all those interested in understanding how the brain works, and in understanding vision, attention, memory, emotion, motivation and action. The book combines a neurocomputational approach with neurophysiological, neuropsychological, and neuroimaging approaches. Readership: Neuroscientists, psychologists, and neuropsychologists interested in vision. Computational neuroscientists. Vision scientists. Contents Preface 1 Introduction 2 The primary visual cortex 3 Extrastriate visual areas 4 The parietal cortex 5 Inferior temporal cortical visual areas 6 Visual attentional mechanisms 7 Neural network models 8 Models of invariant object recognition 9 The cortical neurodynamics of visual attention - a model 10 Visual search: Attentional neurodynamics at work 11 A computational approach to the neuropsychology of visual attention 12 Outputs of visual processing 13 Principles and conclusions Appendix A. Introduction to linear algebra for neural networks Appendix B. Information theory References Index The book can be ordered directly from Oxford University Press, http://www.oup.co.uk/isbn/0-19-852488-9, and is available in bookshops. Updates to the publications cited are available at www.cns.ox.ac.uk From gai at gmdh.kiev.ua Thu Feb 21 12:44:10 2002 From: gai at gmdh.kiev.ua (Gregory Ivakhnenko) Date: Thu, 21 Feb 2002 19:44:10 +0200 Subject: ICIM 2002 - Deadline extended Message-ID: <013301c1baff$77155c80$06bc5dc2@niss.gov.ua> Hello, Because of numerous requests for extension, the deadline for submission of papers to the I International Conference on Inductive Modelling (ICIM 2002) is now March 20, 2002.
Information regarding the ICIM'2002 can be found on our website at: http://www.niss.gov.ua/Center/ICIM/index.htm This I International Conference on Inductive Modelling will be held in Lviv, Ukraine, from June 25-28, 2002 and will present the latest results in the growing field of neural networks in data mining and forecasting, pattern recognition and parallel computing. Regards, Gregory Ivakhnenko -- National Institute for Strategic Studies Kyiv, Ukraine http://www.GMDH.net From meesad at okstate.edu Thu Feb 21 15:53:04 2002 From: meesad at okstate.edu (Phayung Meesad) Date: Thu, 21 Feb 2002 14:53:04 -0600 Subject: IJCNN'02: Preliminary Technical Program Message-ID: <003501c1bb19$c1f01dc0$323b4e8b@okstate.edu> Dear prospective participants, We want to update you on some of the activities with regard to the IJCNN'02, which is a part of the 2002 IEEE World Congress on Computational Intelligence (WCCI2002). The preliminary technical programs are now available on the web at http://www.wcci2002.org under the WCCI Program link. These data are preliminary and may be revised, but can offer you a good picture of the diverse range of papers that will be presented at the WCCI2002 meeting. In addition, under the special technical program link, you'll find the information for the plenary and special lectures, and the tutorials that will be presented on Sunday, May 12. Each of the tutorials has an accompanying abstract so you can learn more about it and its presenter. If you registered but didn't sign up for any tutorials and would like to revise your registration to include one or more tutorials, please let me know. If you registered before Feb. 1, 2002, we'll be pleased to offer the early registration prices for any tutorials that you would like to attend. If you have not already made your hotel reservations at the Hilton Hawaiian Village, please do so at your earliest convenience. There is a link from our main web page at http://www.wcci2002.org for hotel reservations. The hotel is reserving space for the WCCI2002 conference participants. We look forward to seeing you in Honolulu. Sincerely, David Fogel, General Chairman, WCCI2002 Gary Yen and Phayung Meesad, IJCNN'02 Publicity From dtl at marr.bsee.swin.edu.au Fri Feb 22 02:35:22 2002 From: dtl at marr.bsee.swin.edu.au (Dr David Liley) Date: Fri, 22 Feb 2002 17:35:22 +1000 Subject: Australia (Melbourne) - Postdoctoral position in Theoretical Neurobiology Message-ID: Post-doctoral Research Fellow, Center for Intelligent Systems and Complex Processes, School of Biophysical Sciences and Electrical Engineering, Swinburne University of Technology, A$ 36,460 - 49,337 Available immediately for 2 1/2 years with the possibility of extension A postdoctoral researcher is required for a 3-year Australian Research Council funded project available from April 2002. The project concerns the experimental validation of a physiologically specific mathematical theory of alpha electroencephalographic (8-13 Hz) activity. This theory has been developed by researchers within the Center for Intelligent Systems and Complex Processes and suggests a novel basis for electroencephalographic rhythmogenesis that depends upon local inhibitory-inhibitory neuronal population interactions. This non-linear theory provides good descriptions of scalp recordable alpha activity in the context of plausible physiological and anatomical parameterization and gives rise to relatively specific predictions regarding the form of evoked electroencephalographic activity.
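As a purely illustrative aside (this is NOT the theory referred to above, just a generic noise-driven damped-oscillator analogue of a linearized neural-mass-style model, with arbitrary parameters), the following short Python sketch shows how an alpha-band spectral peak can arise from such a resonance:

import numpy as np
from scipy.signal import welch

# Toy linearized "alpha" analogue: a damped oscillator with a ~10 Hz
# resonance driven by white noise (all parameters are illustrative).
f0, gamma, dt, T = 10.0, 8.0, 1e-3, 20.0      # Hz, 1/s, s, s
w0 = 2.0 * np.pi * f0
n = int(T / dt)
rng = np.random.default_rng(0)
x = v = 0.0
trace = np.empty(n)
for i in range(n):
    drive = rng.normal(0.0, 1.0) / np.sqrt(dt)          # white-noise input
    v += (-2.0 * gamma * v - w0 ** 2 * x + drive) * dt  # semi-implicit Euler step
    x += v * dt
    trace[i] = x

f, pxx = welch(trace, fs=1.0 / dt, nperseg=4096)        # averaged periodogram
print("spectral peak near %.1f Hz" % f[np.argmax(pxx)]) # expect a peak near f0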
Further information on this theory can be found at http://marr.bsee.swin.edu.au. It is anticipated that many of the anatomical and physiologically specific parameters can be estimated by curve fitting an analytical white noise fluctuation (linear) spectrum, obtained from the theory, to spontaneous alpha EEG activity. Novel methods based on evolutionary and swarm (collective intelligence) directed search strategies will be developed to robustly and efficiently estimate these parameters. The full non-linear theory will then be used to constrain all of the remaining model parameters by solving a set of boundary value non-linear ordinary differential equations. The validity of these parameter sets will then be assessed by using them to predict the latency and amplitude of the middle and late components of the corresponding evoked cortical EEG activity. This project will be based within the School of Biophysical Sciences and Electrical Engineering, which has significant facilities for high performance computing (a 64 node Compaq Alpha cluster), data visualization (including stereoscopic viewing facilities) and high density EEG data collection (64 channel Neuroscan and EGI systems). A strong mathematical and computer modeling background is required in order to parametrically constrain systems of partial differential equations that describe the most pertinent dynamical features of scalp recordable EEG. Practical knowledge of digital signal processing and time series analysis is necessary. Applicants must have a relevant PhD in Physics, Applied Mathematics, Computational Physics or Theoretical Neuroscience. Informal enquiries regarding this position and the project in general should be addressed to the project co-ordinator, Dr David T J Liley, School of Biophysical Sciences and Electrical Engineering, Swinburne University of Technology, Hawthorn VIC 3122, Australia + 61 3 9214 8812, email: dliley at swin.edu.au, WWW: http://marr.bsee.swin.edu.au. Further details regarding this position (position number 22591) and application instructions are available at http://www.swin.edu.au/corporate/hr/posvac/external.htm Closing Date: 22nd March 2002 ----------------------------------------------------------------------- Dr David Liley MBChB, PhD Senior Lecturer in Biophysics School of Biophysical Sciences and Electrical Engineering Swinburne University of Technology P.O. Box 218 Hawthorn VIC 3122 Australia ph: +61 3 9214 8812 fax: +61 3 9819 0856 email: dliley at swin.edu.au WWW: http://marr.bsee.swin.edu.au/~dtl ----------------------------------------------------------------------- From rothschild at cs.haifa.ac.il Fri Feb 22 08:08:39 2002 From: rothschild at cs.haifa.ac.il (Rothschild Institute) Date: Fri, 22 Feb 2002 15:08:39 +0200 (IST) Subject: Postdoc Position and Visitors at the Caesarea Rothschild Institute Message-ID: Please forward to interested candidates. Apologies in advance if you are subscribed to multiple mailing lists. -------------------------------------------------------------------------- The Caesarea Edmond Benjamin de Rothschild Institute for Interdisciplinary Applications of Computer Science invites applications from new postdoctoral graduates and established researchers who wish to visit the University of Haifa for short-term or long-term periods during 2002-04. The Institute was founded in 2001 and has an active program of workshops, seminars and collaborative projects.
Focus areas include Combinatorial and Graph Theoretic Algorithms, Artificial Intelligence, Computational Linguistics and Neural Science, CS Applications in Statistics, Multimedia and Education, Vision and Psychology. On our website http://www.rothschild.haifa.ac.il you will find Calls for Proposals for visitors, workshops, new interdisciplinary courses and other information about our programs, including Research-in-Pairs. The University of Haifa is located on Mt. Carmel overlooking the Carmel forest and the Mediterranean Sea. POST-DOC POSITION in ALGORITHMIC GRAPH THEORY As part of our ongoing research and our focus year on Applications of Graph Theory and Algorithms, outstanding candidates with a recent doctoral degree in computer science and a strong publication record are encouraged to apply. A working knowledge of Hebrew is an asset but is not required. Please send applications to Prof. Martin Golumbic. From nils at nero.uni-bonn.de Fri Feb 22 11:46:09 2002 From: nils at nero.uni-bonn.de (ws-grow@nero.uni-bonn.de) Date: Fri, 22 Feb 2002 17:46:09 +0100 Subject: CFP: SAB'02-Workshop: On Growing up Artifacts that Live Message-ID: <200202221646.g1MGk5u19039@marvin.nero.uni-bonn.de> [Apologies if you receive this message more than once] ########################################################## ### ### ### CALL FOR PAPERS ### ### ### ########################################################## SAB'2002 Workshop ON GROWING UP ARTIFACTS THAT LIVE Basic Principles and Future Trends http://www.nero.uni-bonn.de/ws-grow.html August 10, 2002, Edinburgh, Scotland (UK) To be held in conjunction with SAB'02 Conference http://www.isab.org.uk/sab02 Important dates ========================== 5 April, 2002: Submission of papers, up to 10 pages 3 May, 2002: Notification of acceptance 14 June, 2002: Deadline for camera-ready papers 4-9 August, 2002: SAB'02 Conference 10 August, 2002: Workshop, Edinburgh Call for Participation =========================== One of the most challenging features of living artifacts is the ability to grow. One of the most interesting features of growing is the special capability to grow up. Aim and scope ================= The aim of the workshop is to elucidate the basic principles and fundamental requirements for building artifacts that can grow up. To "grow up" means that the system starts with a basic, pre-structured set of functionalities and develops its individual capabilities during its lifetime in close interaction with the environment. A schedule for temporal development will drive the artefact through a well-defined sequence of stages from the infancy state to an individually matured entity. Along this sequence the artefact will learn with respect to, and in interaction with, the environment, thus piling up experience and leading to new qualitative stages of behaviour. Besides adequate learning and adaptation rules, the organisation of the memory and the modular structure of the system must be designed to enable this ontogenetic process of development. Below you will find a brief summary of theses and principles that are said to lead to a living, up-growing artefact: - One of the most challenging features of living artifacts is the ability to grow. - One of the most interesting features of a growing artefact is the special capability of growing up. - Growing up means the evolution from an infant-like pre-defined state to a fully matured entity. - Growing up requires a special organisational structure of the entire artefact that allows it to grow up.
- Growing up requires interaction with the environment, including the interaction with other "living artifacts". - Growing up requires the capability of learning from the experience acquired in interaction with the environment. - Learning from experience requires a specialised structure of the underlying system. - The specialised structure (e.g. systemic architecture) covers: adaptive structures, learning schemes, organisation of memory and reasoning, ... Fundamentals from psychology, from memory organisation, from theory of learning (machine learning and psychology), underlying systemic architectures enabling the required capabilities, cognitive science and behavioural knowledge, and further principles are within the scope of the workshop. The workshop will cover, but not be limited to, the topics listed below: - Internal models and representation - Architectures for autonomous agents - Behavioural sequencing - Learning and development - Psychology of learning - Motivation and emotion - Emergent structures and behaviours - Evolutionary and co-evolutionary approaches The workshop focuses not only on the state of the art, but also on current and novel ideas and future trends. Unconventional, blue-sky ideas are especially welcome, and will be considered valuable for presentation and discussion within the workshop. Therefore an open and, hopefully, brainstorming discussion will be part of the workshop. The talks and the posters will be on an open basis, encouraging scientists to present even unusual ideas. Paper submission and publication ===================================== Papers not exceeding 10 pages in 10pt, one-column format (Springer LNCS style), should be submitted electronically (PDF or PS) as attachment files to the following email address: ws-grow at nero.uni-bonn.de In case electronic submission causes problems, please contact the organisers. Formatting instructions, including a Latex template: http://www.springer.de/comp/lncs/authors.html All submissions will be reviewed for acceptance as talks or poster presentations by the program committee and the organisers. Authors of selected papers will be asked to submit an extended version after the workshop for publication. Since the topic of the workshop aims beyond state-of-the-art development and involves a variety of different fields, authors are asked to facilitate access to the content of their contribution by including a brief introductory passage at the beginning of the article. Important dates ========================== 5 April, 2002: Submission of papers 3 May, 2002: Notification of acceptance 14 June, 2002: Deadline for camera-ready papers 4-9 August, 2002: SAB'02 Conference 10 August, 2002: Workshop, On Growing up Artifacts that Live Programme / Scientific Committee ====================================== Alois Knoll, Technical University Munich (TUM), Germany Andy M.
Tyrell, The University of York, United Kingdom Horst-Michael Gross, Ilmenau Technical University, Germany Tim Pearce, University of Leicester, United Kingdom Ulrich Rueckert, University of Paderborn, Germany Giulio Sandini, University of Genova, Italy Thomas Christaller, Fraunhofer Institute AiS, Germany Bruno Apolloni, University of Milan, Italy Peter Ross , School of Computing, Napier University, Edinburgh, Scotland (UK) Georg Dorffner, Austrian Research Institute for Artificial Intelligence (OFAI), Austria Erich Prem, Austrian Research Institute for Artificial Intelligence (OFAI), Austria David Willshaw, Institute for Adaptive and Neural Computation, The University of Edinburgh, Scotland (UK) Giovanna Morgavi, Istituto per i Circuiti Elettronici, National Research Council (ICECNR), Italy Nils Goerke, Neuroinformatics, University of Bonn, Germany Organisers ================ Nils Goerke Division of Neuroinformatics (NERO), University of Bonn Roemerstr. 164, D-53117 Bonn, Germany http://www.nero.uni-bonn.de E-Mail: goerke at nero.uni-bonn.de Peter Ross School of Computing, Napier University, Edinburgh, Scotland (UK) http://www.soc.napier.ac.uk Georg Dorffner Erich Prem Austrian Research Institute for Artificial Intelligence (OFAI), Vienna, Austria http://www.ai.univie.ac.at/oefai/oefai.html Giovanna Morgavi Istituto per i Circuiti Elettronici, National Research Council, (ICECNR), Genova, Italy http://www.ge.cnr.it David Willshaw Institute for Adaptive and Neural Computation, The University of Edinburgh, Edinburgh, Scotland (UK) http://www.informatics.ed.ac.uk/research/ianc/ PLEASE DISTRIBUTE THIS CALL FOR PAPERS ============================================= Hope to see you in Edinburgh for the workshop Best regards Nils Goerke From dblazis at mbl.edu Fri Feb 22 10:37:15 2002 From: dblazis at mbl.edu (Diana Blazis) Date: Fri, 22 Feb 2002 10:37:15 -0500 Subject: Methods in Computational Neuroscience Message-ID: <5.1.0.14.0.20020222103625.00a48ec0@mail.mbl.edu> 2002 Marine Biological Laboratory Special Topics Course Methods in Computational Neuroscience August 4 - September 1, 2002 Directors: William Bialek, Princeton University Rob de Ruyter, NEC Research Institute. Financial assistance is available for this course Deadline extended to: March 7, 2002 Animals interact with a complex world, encountering a wide variety of challenges: they must gather data about the environment, discover useful structures in these data, store and recall information about past events, plan and guide actions, learn the consequences of these actions, etc. These are, in part, computational problems that are solved by networks of neurons, from roughly 100 cells in a small worm to 100 billion in humans. Careful study of the natural context for these tasks leads to new mathematical formulations of the problems that brains are solving, and these theoretical approaches in turn suggest new experiments to characterize neurons and networks. This interplay between theory and experiment is the central theme of this course. For more information and application forms please visit http://courses.mbl.edu/ or contact Carol Hamel, Admissions Coordinator at 508/289-7401 or admissions at mbl.edu Diana E.J. Blazis, Ph.D. 
dblazis at mbl.edu Staff Scientist and Director, CASSLS at the Marine Biological Laboratory PH (508) 289-7535 7 MBL Street, Woods Hole, MA 02543 FAX (508) 289-7951 http://www.mbl.edu/CASSLS From cl at andrew.cmu.edu Sun Feb 24 22:40:45 2002 From: cl at andrew.cmu.edu (Christian Lebiere) Date: Sun, 24 Feb 2002 22:40:45 -0500 Subject: postdoctoral positions at CMU Message-ID: <148772.3223579245@[10.0.1.2]> Dear colleagues, I apologize in advance if you received this message more than once. We have several postdoctoral positions available at CMU on the Combined Computational and Behavioral approaches to the study of cognition. Please advise any suitable candidates who might be interested. The application deadline is March 15. The applicants must be US citizens or nationals and should be interested in learning to develop computational models of cognition (or continuing to train in that area). Below is the list of members of the training grant. In addition to contacting a prospective advisor, interested applicants should let me know that they will be sending in an application. Thanks for your help. Sincerely, Lynne Reder Dr. John Anderson Dr. Marlene Behrmann Dr. Patricia Carpenter Dr. Albert Corbett Dr. Bonnie John Dr. Marcel Just Dr. Roberta Klatzky Dr. Kenneth Koedinger Dr. Kenneth Kotovsky Dr. Christian Lebiere Dr. Marsha Lovett Dr. James McClelland Dr. David Plaut Dr. Lynne Reder Dr. Robert Siegler Dr. David Touretzky Dr. Raul Valdes-Perez Lynne M. Reder, Professor Department of Psychology Carnegie Mellon University Pittsburgh, PA 15213 phone: (412)268-3792 fax: (412) 268-2844 email: reder at cmu.edu URL: http://www.andrew.cmu.edu/~reder/reder.html From ericwan at ece.ogi.edu Mon Feb 25 18:38:59 2002 From: ericwan at ece.ogi.edu (Eric Wan) Date: Mon, 25 Feb 2002 15:38:59 -0800 Subject: Postdoc Position in Neural Controls Message-ID: <005f01c1be55$99230030$a85a5f81@ece.ogi.edu> POST-DOCTORAL RESEARCH ASSOCIATE The OGI School of Science and Engineering at OHSU has an opening for a post-doctoral research associate to participate in an interdisciplinary UAV neural controls project. Project overview: This project involves the design and implementation of nonlinear reconfigurable controllers using neural networks that exploit the coupled dynamics between a vehicle model (e.g., helicopter) and adaptive models of the environment. New model-predictive neural control techniques are developed to perform on-line optimization of vehicle control trajectories under dynamic and situational constraints. Now entering its third year, the project's main focus is on 1) increased simulation realism for ship-based VTOL, and 2) demonstration of the approaches using an instrumented RC helicopter. The successful candidate will work closely with an interdisciplinary team of software and control engineers, with specific responsibility for various aspects pertaining to control design, vehicle and aerodynamic modeling, and system integration. Home page: http://www.cse.ogi.edu/PacSoft/projects/sec/ Requirements: Candidate should have a Ph.D. with expertise in nonlinear control and/or a strong background in flight dynamics modeling for rotorcraft, including rotor and airframe aerodynamics. Salary range $45,000 - $55,000 plus benefits. Location: OHSU's OGI School of Science and Engineering campus is in Hillsboro, Oregon, approximately 11 miles west of downtown Portland. Sponsor: DARPA Oregon Health & Science University is an Equal Opportunity Employer. Please send inquiries and background information to Prof.
Eric A. Wan: ericwan at ece.ogi.edu. Eric A. Wan Associate Professor Department of Electrical and Computer Engineering Center for Spoken Language Understanding OGI School of Science and Engineering, OHSU http://www.ece.ogi.edu/~ericwan/ Note: On July 1, 2001, the Oregon Graduate Institute merged with the Oregon Health & Science University, becoming the OGI School of Science and Engineering at OHSU. From juergen at idsia.ch Tue Feb 26 05:23:45 2002 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Tue, 26 Feb 2002 11:23:45 +0100 Subject: PhD fellowship Message-ID: <3C7B6231.9555447B@idsia.ch> We are seeking a PhD student interested in optimal search algorithms & universal learning algorithms & reinforcement learning in partially observable environments. Please see http://www.idsia.ch/~juergen/phd2002.html ------------------------------------------------- Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland juergen at idsia.ch www.idsia.ch/~juergen From mhc27 at cornell.edu Wed Feb 27 11:38:03 2002 From: mhc27 at cornell.edu (Morten H. Christiansen) Date: Wed, 27 Feb 2002 11:38:03 -0500 Subject: No subject Message-ID: Apologies if you receive more than one copy of this announcement. The following book may be of interest to the readers of this list: Christiansen, M.H. & Chater, N. (Eds.) (2001). Connectionist Psycholinguistics. Westport, CT: Ablex. Description: Setting forth the state of the art, leading researchers present a survey on the fast-developing field of Connectionist Psycholinguistics: using connectionist or "neural" networks, which are inspired by brain architecture, to model empirical data on human language processing. Connectionist psycholinguistics has already had a substantial impact on the study of a wide range of aspects of language processing, ranging from inflectional morphology, to word recognition, to parsing and language production. Christiansen and Chater begin with an extended tutorial overview of Connectionist Psycholinguistics which is followed by the latest research by leading figures in each area of research. The book also focuses on the implications and prospects for connectionist models of language, not just for psycholinguistics, but also for computational and linguistic perspectives on natural language. The interdisciplinary approach will be relevant for, and accessible to psychologists, cognitive scientists, linguists, philosophers, and researchers in artificial intelligence. The book is suitable as a text book for advanced courses in connectionist approaches to language processing. It can also be used as a recommended or complementary text book in graduate and advanced undergraduate courses on the psychology of language. Table of Contents: Preface 1. Connectionist Psycholinguistics: The Very Idea Morten H. Christiansen and Nick Chater PART I: THE STATE OF THE ART 2. Connectionist Psycholinguistics in Perspective Morten H. Christiansen and Nick Chater 3. Simulating Parallel Activation in Spoken Word Recognition M. Gareth Gaskell and William D. Marslen-Wilson 4. A Connectionist Model of English Past Tense and Plural Morphology Kim Plunkett and Patrick Juola 5. Finite Models of Infinite Language: A Connectionist Approach to Recursion Morten H. Christiansen and Nick Chater 6. Dynamic Systems for Sentence Processing Whitney Tabor and Michael K. Tanenhaus 7. Connectionist Models of Language Production: Lexical Access and Grammatical Encoding Gary S. Dell, Franklin Chang, and Zenzi M. Griffin 8. 
A Connectionist Approach to Word Reading and Acquired Dyslexia: Extension to Sequential Processing David C. Plaut PART II: FUTURE PROSPECTS 9. Constraint Satisfaction in Language Acquisition and Processing Mark S. Seidenberg and Maryellen C. MacDonald 10. Grammar-based Connectionist Approach to Language Paul Smolensky 11. Connectionist Sentence Processing in Perspective Mark Steedman Index About the Editors and Contributors Connectionist psycholinguistics Edited by Morten H. Christiansen and Nick Chater ISBN: 1-56750-595-3 (pbk.) ISBN: 1-56750-594-5 (hc.) 400 pages, figures, tables Ablex Publishing Best regards, Morten Christiansen -- ------------------------------------------------------------------------ Morten H. Christiansen Assistant Professor Phone: +1 (607) 255-3570 Department of Psychology Fax: +1 (607) 255-8433 Cornell University Email: mhc27 at cornell.edu Ithaca, NY 14853 Office: 240 Uris Hall Web: http://www.psych.cornell.edu/faculty/people/Christiansen_Morten.htm Lab Web Site: http://cnl.psych.cornell.edu ------------------------------------------------------------------------ From ps629 at columbia.edu Thu Feb 28 18:25:28 2002 From: ps629 at columbia.edu (Paul Sajda) Date: Thu, 28 Feb 2002 18:25:28 -0500 Subject: BCI Data Competition Message-ID: <3C7EBC67.B5DDE058@columbia.edu> NIPS 2001: Post Workshop Data Competition In an effort to foster development of machine learning techniques and evaluate different algorithms for brain computer interfaces (BCI), we are announcing a data analysis competition. Datasets are available for download from http://newton.bme.columbia.edu/competition.htm . Participants are asked to follow a few simple rules: 1. All data sets should be evaluated single-trial--you should not average across multiple trials. 2. Report your classification results (i.e. labels) on the test set. You can report results for any or all of the datasets. 3. Please submit a short (one page or less) description of the pattern classifier you used. Please include a description of any pre/post processing that you may have done. 4. Use of these datasets implies that the participant agrees to cite the origin of the data in any publication (e.g. see each dataset description for bibTex entry). ALL SUBMISSIONS ARE DUE JUNE 1st 2002 . Announcement of results will be coordinated with the June 2002 BCI Workshop (in upstate NY). Email submissions to nips-bci at newton.bme.columbia.edu. Questions should be directed to nips-bci at newton.bme.columbia.edu. New information on the competition and datasets will be posted periodically on this site. Good Luck - Paul Sajda, Ph.D. Associate Professor Department of Biomedical Engineering Columbia University 351 Engineering Terrace Building, Mail Code 8904 1210 Amsterdam Avenue New York, NY 10027 tel: (212) 854-5279 fax: (212) 854-8725 email: ps629 at columbia.edu http://newton.bme.columbia.edu
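For readers wondering what a minimal, rule-compliant entry to the competition might look like, here is a rough Python sketch. Everything in it is an assumption for illustration only: the file names and feature layout are hypothetical placeholders (the real datasets on the competition page define their own formats), and the regularized Fisher discriminant is just one common single-trial baseline, not a required or recommended method.

import numpy as np

# Hypothetical layout: one feature vector per single trial (no averaging
# across trials, per rule 1). X_train.npy / y_train.npy / X_test.npy are
# placeholder names, not the competition's actual files.
X = np.load("X_train.npy")          # shape (n_trials, n_features)
y = np.load("y_train.npy")          # binary labels, 0 or 1
X_test = np.load("X_test.npy")

# Regularized Fisher discriminant, a common single-trial EEG baseline.
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
cov = np.cov(X, rowvar=False) + 1e-3 * np.eye(X.shape[1])   # shrinkage for stability
w = np.linalg.solve(cov, mu1 - mu0)
b = -0.5 * float(w @ (mu0 + mu1))

labels = (X_test @ w + b > 0).astype(int)    # one label per test trial (rule 2)
np.savetxt("test_labels.txt", labels, fmt="%d")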