From rsun at cecs.missouri.edu Fri Nov 1 15:42:04 2002
From: rsun at cecs.missouri.edu (rsun@cecs.missouri.edu)
Date: Fri, 1 Nov 2002 14:42:04 -0600
Subject: new issues of Cognitive Systems Research
Message-ID: <200211012042.gA1Kg4a21256@ari1.cecs.missouri.edu>

New issues are now available

COGNITIVE SYSTEMS RESEARCH
Volume 3, Issue 3, Pages 271-554, 2002
===============================================================================

Cognitive Systems Research
Volume 3, Issue 3, Pages 271-554
a special issue on situated and embodied cognition, edited by Tom Ziemke

TABLE OF CONTENTS

Introduction to the special issue on situated and embodied cognition, Pages 271-274
Tom Ziemke
http://www.sciencedirect.com/science/article/B6W6C-46XHBH0-1/1/3fb9b1689e5f826bcf849a75afcff513

Representation in dynamical and embodied cognition, Pages 275-288
Fred Keijzer
http://www.sciencedirect.com/science/article/B6W6C-45H92WR-1/1/7418aabb151835c54aea8d6c8497409c

An ecological approach to embodiment and cognition, Pages 289-299
Naoya Hirose
http://www.sciencedirect.com/science/article/B6W6C-45HWNG0-1/1/29d27bf1b77832a1191b11c2a4613f0c

Semantics, experience and time, Pages 301-337
Stephen E. Robbins
http://www.sciencedirect.com/science/article/B6W6C-45M6B2X-1/1/98a589eed632a2f7fa5f959113d1690f

When is a cognitive system embodied?, Pages 339-348
Alexander Riegler
http://www.sciencedirect.com/science/article/B6W6C-45HFF6Y-2/1/84e99cabef651e9032acc4aaa8056d8b

Cognitive task transformations, Pages 349-359
David de Leon
http://www.sciencedirect.com/science/article/B6W6C-45HFF6Y-3/1/140319f8b44cb3969aa2b4abd87b6d92

Operationalizing situated cognition and learning, Pages 361-383
Steven M. Kemp
http://www.sciencedirect.com/science/article/B6W6C-45KSPCF-1/1/24a1f6377884f0f873a94a99c7eff961

Interfaces of social psychology with situated and embodied cognition, Pages 385-396
Gun R. Semin and Eliot R. Smith
http://www.sciencedirect.com/science/article/B6W6C-45H92WR-2/1/cdbdedfa258d6af8f17c982f8fdba684

From Dave_Touretzky at cs.cmu.edu Fri Nov 1 17:48:16 2002
From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu)
Date: Fri, 01 Nov 2002 17:48:16 -0500
Subject: graduate training: Center for the Neural Basis of Cognition
Message-ID: <3526.1036190896@ammon.boltz.cs.cmu.edu>

Graduate Training with the Center for the Neural Basis of Cognition

The Center for the Neural Basis of Cognition offers an interdisciplinary doctoral training program operated jointly with nine affiliated PhD programs at Carnegie Mellon University and the University of Pittsburgh. Detailed information about this program is available on our web site at http://www.cnbc.cmu.edu

The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine.
CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies.

Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, statistics, or robotics.

For more information about the program, and to obtain application materials, visit our web site at www.cnbc.cmu.edu, or contact us at the following address:

Center for the Neural Basis of Cognition
115 Mellon Institute
4400 Fifth Avenue
Pittsburgh, PA 15213
Tel. (412) 268-4000. Fax: (412) 268-5060
email: cnbc-admissions at cnbc.cmu.edu

The affiliated PhD programs at the two universities are:

Carnegie Mellon                          University of Pittsburgh
Biological Sciences                      Mathematics
Computer Science                         Neuroscience
Computational & Statistical Learning     Psychology
Psychology
Robotics
Statistics

The CNBC training faculty includes:

Eric Ahrens (CMU Biology): MRI studies of the vertebrate nervous system
John Anderson (CMU Psychology): models of human cognition
German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice
Alison Barth (CMU Biology): molecular basis of plasticity in neocortex
Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex
Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving
Cameron S. Carter (Pitt Psychology/Neuroscience): fMRI and PET attention studies
Carson Chow (Pitt Mathematics): spatiotemporal dynamics in neural networks
Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex
Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease
William Eddy (CMU Statistics): analysis of fMRI data
Bard Ermentrout (Pitt Mathematics): oscillations in neural systems
Julie Fiez (Pitt Psychology): fMRI studies of language
Chris Genovese (CMU Statistics): making inferences from scientific data
Lori Holt (CMU Psychology): mechanisms of auditory and speech perception
John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia
Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex
Satish Iyengar (Pitt Statistics): spike train data analysis
Marcel Just (CMU Psychology): visual thinking, language comprehension
Robert Kass (CMU Statistics): transmission of info. by collections of neurons
Roberta Klatzky (CMU Psychology): human perception and cognition
Richard Koerber (Pitt Neurobiology): devel. and plasticity of spinal networks
Tai Sing Lee (CMU Comp. Sci.): primate visual cortex; computer vision
Michael Lewicki (CMU Comp. Sci.): learning and representation
David Lewis (Pitt Neuroscience): anatomy of frontal cortex
Brian MacWhinney (CMU Psychology): models of language acquisition
Yoky Matsuoka (CMU Robotics): human motor control and motor learning
James McClelland (CMU Psychology): connectionist models of cognition
Tom Mitchell (CMU Comp. Sci.): machine learning with application to fMRI
Paula Monaghan-Nichols (Pitt Neurobiology): genetic analysis of verteb. CNS devel.
Carl Olson (CNBC): spatial representations in primate frontal cortex
Charles Perfetti (Pitt Psychology): language and reading processes
David Plaut (CMU Psychology): connectionist models of reading
Michael Pogue-Geile (Pitt Psychology): development of schizophrenia
Lynne Reder (CMU Psychology): models of memory and cognitive processing
Erik Reichle (Pitt Psychology): attention and eye movements in reading
Jonathan Rubin (Pitt Mathematics): analysis of systems of coupled neurons
Walter Schneider (Pitt Psych.): fMRI, models of attention & skill acquisition
Charles Scudder (Pitt Neurobiology): motor learning in cerebellum
Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system
Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex
Peter Strick (Pitt Neurobiology): motor control; basal ganglia and cerebellum
Floh Thiels (Pitt Neuroscience): LTP and LTD in hippocampus
David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning
Nathan Urban (CMU Biology): circuitry of the olfactory bulb
Valerie Ventura (CMU Statistics): structure of neural firing patterns

See http://www.cnbc.cmu.edu for further details.

From inaki at cs.utexas.edu Tue Nov 5 16:29:18 2002
From: inaki at cs.utexas.edu (Faustino J. Gomez)
Date: Tue, 5 Nov 2002 15:29:18 -0600
Subject: Neuroevolution paper, software, and demo announcement
Message-ID: <200211052129.gA5LTIe8020586@laphroaig.cs.utexas.edu>

Dear Connectionists,

Enforced SubPopulations (ESP) version 3.0 is now available. ESP is a method that uses cooperative coevolution to evolve recurrent neural networks for difficult reinforcement learning tasks that require memory. A paper describing the method (abstract below), source code, and an animated demo of the double pole balancing task are all available at:

http://www.cs.utexas.edu/users/nn/pages/research/ne-methods.html#esp

--Faustino J. Gomez and Risto Miikkulainen

Paper:
-----------------------------------------------------------------------
ROBUST NON-LINEAR CONTROL THROUGH NEUROEVOLUTION.
Faustino J. Gomez and Risto Miikkulainen
Department of Computer Sciences, The University of Texas at Austin
Technical Report TR-AI-02-292, Oct 2002.
http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#gomez.tr02-292.ps.gz

Abstract: Many complex control problems require sophisticated solutions that are not amenable to traditional controller design. Not only is it difficult to model real world systems, but often it is unclear what kind of behavior is required to solve the task. Reinforcement learning (RL) approaches have made progress by utilizing direct interaction with the task environment, but have so far not scaled well to large state spaces and environments that are not fully observable. In recent years, neuroevolution, the artificial evolution of neural networks, has had remarkable success in tasks that exhibit these two properties, but, like RL methods, requires solutions to be discovered in simulation and then transferred to the real world. To ensure that transfer is possible, evolved controllers need to be robust enough to cope with discrepancies between these two settings. In this paper, we demonstrate how a method called Enforced SubPopulations (ESP), for evolving recurrent neural network controllers, can facilitate this transfer. The method is first compared to a broad range of reinforcement learning algorithms on very difficult versions of the pole balancing problem that involve large (continuous, high-dimensional) state spaces and hidden state.
ESP is shown to be significantly more efficient and powerful than the other methods on these tasks. We then present a model-based method that allows controllers evolved in a learned model of the environment to successfully transfer to the real world. We test the method on the most difficult version of the pole balancing task, and show that the appropriate use of noise during evolution can improve transfer significantly by compensating for inaccuracy in the model.

Software:
-----------------------------------------------------------------------
ESP 3.0 C++ SOURCE CODE
http://www.cs.utexas.edu/users/nn/pages/software/abstracts.html#esp-cpp
Faustino J. Gomez

The ESP package contains source code implementing the Enforced SubPopulations algorithm and the pole balancing domain. The source code is written in C++, and is designed for easy extensibility to new tasks. Documentation for the code in html is available at: http://www.cs.utexas.edu/users/inaki/espdoc/

Demo:
-----------------------------------------------------------------------
NON-MARKOV DOUBLE POLE BALANCING
http://www.cs.utexas.edu/users/nn/pages/research/espdemo
Faustino Gomez

The page contains links to movies (in avi and Quicktime) showing the evolution of controllers for the non-Markov double pole balancing problem. The best controller from each generation is shown trying to balance the system using only three of the six state variables (no velocities).

From terry at salk.edu Wed Nov 6 19:18:57 2002
From: terry at salk.edu (Terry Sejnowski)
Date: Wed, 6 Nov 2002 16:18:57 -0800 (PST)
Subject: NEURAL COMPUTATION 14:11
In-Reply-To: <200209032319.g83NJJp43344@purkinje.salk.edu>
Message-ID: <200211070018.gA70IvK12809@purkinje.salk.edu>

Neural Computation - Contents - Volume 14, Number 11 - November 1, 2002

ARTICLE

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
Wolfgang Maass, Thomas Natschlaeger and Henry Markram

NOTE

Universal Approximation of Multiple Nonlinear Operators by Neural Networks
Andrew D. Back and Tianping Chen

LETTERS

Long-Term Reward Prediction in TD Models of the Dopamine System
Nathaniel D. Daw and David S. Touretzky

Invariant Object Recognition in the Visual System with Novel Views of 3D Objects
Simon M. Stringer and Edmund T. Rolls

Dynamical Working Memory and Timed Responses: The Role of Reverberating Loops in the Olivo-Cerebellar System
Werner M. Kistler and Chris I. De Zeeuw

Selectively Grouping Neurons in Recurrent Networks of Lateral Inhibition
Xiaohui Xie, Richard H. R. Hahnloser, and H. Sebastian Seung

An Unsupervised Ensemble Learning Method for Nonlinear Dynamic State-Space Models
Harri Valpola and Juha Karhunen

Data-Reusing Recurrent Neural Adaptive Filters
Danilo Mandic

Training A Single Sigmoidal Neuron Is Hard
Jiri Sima

Two Timescale Analysis of Alopex Algorithm for Optimization
P. S. Sastry, M. Magesh, K. P. Unnikrishnan

A New Color 3D SFS Methodology Using Neural-Based Color Reflectance Models and Iterative Recursive Method
Siu-Yeung Cho and Tommy W. S. Chow

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2002 - VOLUME 14 - 12 ISSUES

                 USA     Canada*   Other Countries
Student/Retired  $60     $64.20    $108
Individual       $88     $94.16    $136
Institution      $506    $451.42   $554

* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889
FAX: (617) 577-1545
journals-orders at mit.edu

-----

From smyth at ics.uci.edu Wed Nov 6 23:48:09 2002
From: smyth at ics.uci.edu (Padhraic Smyth)
Date: Wed, 06 Nov 2002 20:48:09 -0800
Subject: new faculty positions at UC Irvine in machine learning and statistics
Message-ID: <3DC9F089.2090402@ics.uci.edu>

Dear connectionists:

The University of California, Irvine currently has open faculty positions in both machine learning and statistics. UCI has a very strong tradition in machine learning and AI and continues to add new faculty in a number of related areas such as bioinformatics, data mining, and computational statistics. Current faculty in Computer Science with interests in these areas include Pierre Baldi, Rina Dechter, David Eppstein, Rick Granger, Dennis Kibler, Rick Lathrop, Eric Mjolsness, Mike Pazzani, and Padhraic Smyth, as well as Hal Stern in the Department of Statistics.

The open faculty positions are:

A. One faculty position in the Department of Information and Computer Science in the area of Large Scale Data Analysis. We encourage applications from a broad range of "data-driven" research areas, such as machine learning, language modeling, information extraction, bioinformatics, computational vision, etc. For application details please see: http://www.ics.uci.edu/about/jobs/faculty.php The appointment may be made at the pre-tenure or tenured level - applicants at both levels are encouraged to apply.

B. Two faculty positions in the newly-formed Department of Statistics, one tenure-track and one tenured. See http://www.stat.uci.edu/ for application information. The department was started this year under its new chair, Professor Hal Stern.

I strongly encourage readers of this list to apply for either position if interested. Please feel free to contact me directly if you have questions about either position. I would also be happy to chat with prospective applicants at NIPS if you would like to find out more about UCI in general - it is a great place to do learning-related research and you will probably like the weather as well :)

all the best

Padhraic Smyth
Information and Computer Science
University of California, Irvine

From jjost at mis.mpg.de Thu Nov 7 04:48:44 2002
From: jjost at mis.mpg.de (Juergen Jost)
Date: Thu, 7 Nov 2002 10:48:44 +0100 (MET)
Subject: Five-year visiting professorship
Message-ID:

Max Planck Institute for Mathematics in the Sciences
Leipzig, Germany

Five-year visiting professorship

The Max Planck Institute invites applications for a distinguished five-year visiting research professorship in the fields of Neural Networks and Mathematical Cognition Theory. Applicants should have demonstrated outstanding research potential and clear evidence of achievement. Applicants should be under the age of 35. The successful applicant is expected to carry out research in cooperation with other interdisciplinary groups at our Institute. The Institute offers excellent research facilities including a large visitor programme, see http://www.mis.mpg.de/ for further information. Salary will be on the German C2/C3 scale (comparable to an Associate Professorship in North America).

Applications should be sent to:

Prof. Dr. Eberhard Zeidler
Max Planck Institute for Mathematics in the Sciences
Inselstrasse 22
D - 04103 Leipzig
Germany.

The deadline for applications is December 31, 2002. Employment will start on October 1, 2003, or at a mutually agreeable date. Handicapped applicants will be given preference in case of equal qualification.
The Max Planck Society as the employer aims at increasing the number of female scientists in fields where they are underrepresented. Therefore, women are particularly encouraged to apply.

From terry at salk.edu Fri Nov 8 21:27:03 2002
From: terry at salk.edu (Terry Sejnowski)
Date: Fri, 8 Nov 2002 18:27:03 -0800 (PST)
Subject: UCSD Computational Neurobiology Training Program
In-Reply-To: <200211070018.gA70IvK12809@purkinje.salk.edu>
Message-ID: <200211090227.gA92R3D15246@purkinje.salk.edu>

DEADLINE: JANUARY 3, 2003

COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM
Department of Biology - University of California, San Diego
http://www.biology.ucsd.edu/grad/other_compneuro.html

The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Financial support for students enrolled in this training program is available through an NSF Integrative Graduate Education and Research Training (IGERT) award. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics.

The three major themes in the training program are:

1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons. Using modern neuroanatomical, behavioral, neuropharmacological and electrophysiological techniques. Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI).

2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA).

3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs. A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control.

Participating Faculty include:

* Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Science at UCSD
* Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology
* Darwin Berg (Neurobiology): Regulation of synaptic components, assembly and localization, function and long-term stability.
* Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms
* Virginia De Sa (Cognitive Science): Computational basis of perception and learning (both human and machine); multi-sensory integration and contextual influences
* Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions.
Director, National Center for Microscopy and Imaging Research
* Marla Feller (Neurobiology): Mechanisms and function of spontaneous activity in the developing nervous system including the retina, spinal cord, hippocampus and neocortex.
* Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation
* Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels
* David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity.
* William Kristan (Neurobiology): Computational Neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion. Director, Neurosciences Graduate Program at UCSD
* Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies
* Scott Makeig (Institute for Neural Computation): Analysis of cognitive event-related brain dynamics and fMRI using time-frequency and Independent Component Analysis
* Javier Movellan (Institute for Neural Computation): Sensory fusion and learning algorithms for continuous stochastic systems
* Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects
* Terrence Sejnowski (Salk Institute/Neurobiology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms. Director, Institute for Neural Computation
* Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques
* Nicholas Spitzer (Neurobiology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of Neurobiology
* Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons
* Jochen Triesch (Cognitive Science): Sensory integration, visual psychophysics, vision systems and robotics, human-robot interaction, cognitive development
* Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters
* Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior
* Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms

Requests for application materials should be sent to the University of California, San Diego, Division of Biological Sciences 0348, Graduate Admissions Office, 9500 Gilman Drive, La Jolla, CA, 92093-0348 or to [gradprog at biomail.ucsd.edu]. The deadline for completed application materials, including letters of recommendation, is January 3, 2003.
For more information about applying to the UCSD Biology Graduate Program: http://www.biology.ucsd.edu/grad/admissions/index.html

From wolpert at hera.ucl.ac.uk Mon Nov 11 04:56:30 2002
From: wolpert at hera.ucl.ac.uk (Daniel Wolpert)
Date: Mon, 11 Nov 2002 09:56:30 -0000
Subject: Research fellow in action decoding
Message-ID: <002401c28968$9c999470$66463ec1@aphrodite>

UNIVERSITY COLLEGE LONDON
Institute of Neurology

We are seeking a postdoctoral research fellow in neuroscience to work with Professor Chris Frith and Professor Daniel Wolpert on a project studying 'interactions between agents'. This work aims to elucidate the physiological and computational mechanisms by which we use observed movements in order to detect other agents and make inferences about their goals and intentions. A variety of techniques will be used including behavioural studies, EEG and TMS. The candidate should have a PhD or equivalent research experience in relevant fields, and experience with programming in Matlab and/or C++ would be advantageous.

The post is available with a starting date of 1 January 2003, or nearest convenient date, for one year in the first instance, with the possibility of renewal for a second year. Starting salary is up to £26,255 pa inclusive, depending on experience. Further particulars of the position are on www.hera.ucl.ac.uk

Applications (CV and names of three referees and a short statement of research interests) should be returned to Miss E Bertram, Assistant Secretary (Personnel), Institute of Neurology, Queen Square, London WC1N 3BG (fax: +44 20 7278 5069, email: e.bertram at ion.ucl.ac.uk) by November 29th, 2002. Informal enquiries to Professor Frith (c.frith at ion.ucl.ac.uk) or Professor Wolpert (wolpert at ion.ucl.ac.uk).

Taking Action for Equality

From David.Cohn at acm.org Wed Nov 13 10:55:15 2002
From: David.Cohn at acm.org (David 'Pablo' Cohn)
Date: Wed, 13 Nov 2002 07:55:15 -0800
Subject: new paper in JMLR: The Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces
Message-ID:

[cross-posted to connectionists at the request of the authors - for information on subscribing to the jmlr-announce mailing list, please visit www.jmlr.org]

The Journal of Machine Learning Research is pleased to announce the availability of a new paper online at http://www.jmlr.org.

----------------------------------------
The Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces
Masashi Sugiyama and Klaus-Robert Muller
JMLR 3(Nov):323-359, 2002

Abstract

A central problem in learning is selection of an appropriate model. This is typically done by estimating the unknown generalization errors of a set of models to be selected from and then choosing the model with minimal generalization error estimate. In this article, we discuss the problem of model selection and generalization error estimation in the context of kernel regression models, e.g., kernel ridge regression, kernel subset regression or Gaussian process regression. Previously, a non-asymptotic generalization error estimator called the subspace information criterion (SIC) was proposed, that could be successfully applied to finite dimensional subspace models. SIC is an unbiased estimator of the generalization error for the finite sample case under the conditions that the learning target function belongs to a specified reproducing kernel Hilbert space (RKHS) H and the reproducing kernels centered on training sample points span the whole space H.
These conditions hold only if dim H < l, where l < infinity is the number of training examples. Therefore, SIC could be applied only to finite dimensional RKHSs. In this paper, we extend the range of applicability of SIC, and show that even if the reproducing kernels centered on training sample points do not span the whole space H, SIC is an unbiased estimator of an essential part of the generalization error. Our extension allows the use of any RKHSs including infinite dimensional ones, i.e., richer function classes commonly used in Gaussian processes, support vector machines or boosting. We further show that when the kernel matrix is invertible, SIC can be expressed in a much simpler form, making its computation highly efficient. In computer simulations on ridge parameter selection with real and artificial data sets, SIC compares favorably with other standard model selection techniques such as leave-one-out cross-validation or an empirical Bayesian method.

----------------------------------------

This is the 13th paper in Volume 3. It, and all previous papers, are available electronically at http://www.jmlr.org/ in PostScript and PDF formats. Many are also available in HTML. The papers of Volume 1 and 2 are also available in hardcopy from the MIT Press; please see http://mitpress.mit.edu/JMLR for details.

-David Cohn, Managing Editor, Journal of Machine Learning Research

From bap at cs.unm.edu Wed Nov 13 18:56:55 2002
From: bap at cs.unm.edu (Barak Pearlmutter)
Date: Wed, 13 Nov 2002 16:56:55 -0700
Subject: NIPS*2002 Workshops Abstracts
Message-ID:

****************************************************************
NIPS*2002 Workshops
December 12-14, 2002, Whistler BC, Canada
http://www.nips.cc
****************************************************************

Workshop Schedule
=================

The NIPS*2002 Workshops will be held at the Westin in Whistler BC, Canada, on Fri Dec 13 and Sat Dec 14, with sessions at 7:30-10:00am and 4:00-7:00pm.

Two Day Workshops: Fri Dec 13 & Sat Dec 14
  Functional Neuroimaging
  Multi-Agent Learning
  Propagation on Cyclic Graphs

One Day Workshops on Fri Dec 13
  Adaptation/Plasticity and Coding
  Bioinformatics
  Independent Component Analysis
  Neuromorphic Engineering
  Spectral Methods
  Statistics for Computational Experiments
  Unreal Data

One Day Workshops on Sat Dec 14
  Learning Invariant Representations
  Learning Rankings
  Negative Results
  On Learning Kernels
  Quantum Neural Computing
  Thalamocortical Processing
  Universal Learning Algorithms

Workshop Descriptions
=====================

TWO DAY WORKSHOPS (Friday & Saturday)

Propagation Algorithms on Graphs with Cycles: Theory and Applications
Shiro Ikeda, Kyushu Institute of Technology, Fukuoka, Japan
Toshiyuki Tanaka, Tokyo Metropolitan University, Tokyo, Japan
Max Welling, University of Toronto, Toronto, Canada

Inference on graphs with cycles (loopy graphs) has drawn much attention in recent years. The problem arises in various fields such as AI, error-correcting codes, statistical physics, and image processing. Although exact inference is often intractable, much progress has been made in solving the problem approximately with local propagation algorithms. The aim of the workshop is to provide an overview of recent developments in methods related to belief propagation. We also encourage discussion of open theoretical problems and new possibilities for applications.

Computational Neuroimaging: Foundations, Concepts & Methods
Stephen J. Hanson, Rutgers University, Newark, NJ, USA
Barak A. Pearlmutter, University of New Mexico, Albuquerque, NM, USA
Stephen Strother, University of Minnesota, Minneapolis, MN, USA
Lars Kai Hansen, Technical University of Denmark, Lyngby, Denmark
Benjamin Martin Bly, Rutgers University, Newark, NJ, USA

This workshop will concentrate on the foundations of neuroimaging, including the relation between neural firing and BOLD, fast fMRI, and diffusion methods. The first day includes speakers on new Methods for Multivariate analysis using fMRI especially as they relate to Neural Modeling (ICA, SVM, or other ML methods), which will slip into the next morning, with cognitive neuroscience talks involving Network and specific Neural Modeling approaches to cognitive function on day two.

Multi-Agent Learning: Theory and Practice
Gerald Tesauro, IBM Research, NY, USA
Michael L. Littman, Rutgers University, New Brunswick, NJ, USA

Machine learning in a multi-agent system, where learning agents interact with other agents that are also simultaneously learning, poses a radically different set of issues from those arising in normal single-agent learning in a stationary environment. This topic is poorly understood theoretically but seems ripe for progress by building upon many recent advances in RL and in Bayesian, game-theoretic, decision-theoretic, and evolutionary learning. At the same time, learning is increasingly vital in fielded applications of multi-agent systems. Many application domains are envisioned in which teams of software agents or robots learn to cooperate to achieve global objectives. Learning may also be essential in many non-cooperative domains such as economics and finance, where classical game-theoretic solutions are either infeasible or inappropriate. This workshop brings together researchers studying multi-agent learning from a variety of perspectives. Our invited speakers include leading AI theorists, applications developers in fields such as robotics and e-commerce, as well as social scientists studying learning in multi-player human-subject experiments. Slots are also available for contributed talks and/or posters.

ONE DAY WORKSHOPS (Friday)

The Role of Adaptation/Plasticity in Neuronal Coding
Garrett B. Stanley, Harvard University, Cambridge, MA, USA
Tai Sing Lee, Carnegie Mellon University, Pittsburgh, PA, USA

A ubiquitous characteristic of neuronal processing is the ability to adapt to an ever changing environment on a variety of different time scales. Although the different forms of adaptation/plasticity have been studied for some time, their role in the encoding process is still not well understood. The most widely utilized measures assume time-invariant encoding dynamics even though mechanisms serving to modify coding properties are continually active in all but the most artificial laboratory conditions. Important questions include: (1) how do encoding dynamics and/or receptive field properties change with time and the statistics of the environment?, (2) what are the underlying sources of these changes?, (3) what are the resulting effects on information transmission and processing in the pathway?, and (4) can the mechanisms of plasticity/adaptation be understood from a behavioral perspective? It is the goal of this workshop to discuss neuronal coding within several different experimental paradigms, in order to explore these issues that have only recently been addressed in the literature.
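The point about time-invariant measures is easy to make concrete. Below is a minimal, self-contained Python sketch (purely illustrative, with synthetic data and invented names; it is not taken from any workshop material): when a cell's linear filter changes partway through a recording, a spike-triggered average computed over the whole recording blurs the two regimes together, while the same estimator applied in separate time windows exposes the change.

    import numpy as np

    # Synthetic recording: the cell's linear filter changes halfway through,
    # standing in for adaptation to a change in stimulus statistics.
    rng = np.random.default_rng(0)
    T, lag = 20000, 15
    stim = rng.normal(size=T)
    t = np.arange(lag)
    filt_early = np.exp(-t / 3.0)                   # filter before "adaptation"
    filt_late = np.exp(-t / 3.0) * np.cos(t / 2.0)  # filter after "adaptation"

    drive = np.zeros(T)
    for i in range(lag, T):
        f = filt_early if i < T // 2 else filt_late
        drive[i] = f @ stim[i - lag:i][::-1]        # f[0] weights the newest bin
    spikes = rng.random(T) < np.where(drive > 1.0, 0.3, 0.01)  # noisy threshold

    def windowed_sta(stim, spikes, lag, start, stop):
        """Spike-triggered average using only spikes in [start, stop)."""
        idx = np.flatnonzero(spikes[start:stop]) + start
        idx = idx[idx >= lag]
        if idx.size == 0:
            return np.zeros(lag)
        return np.mean([stim[i - lag:i][::-1] for i in idx], axis=0)

    sta_whole = windowed_sta(stim, spikes, lag, 0, T)   # mixes both regimes
    sta_first = windowed_sta(stim, spikes, lag, 0, T // 2)
    sta_second = windowed_sta(stim, spikes, lag, T // 2, T)
    # Low correlation between the two windows flags a change in the encoding
    # that the whole-recording estimate sta_whole silently averages away.
    print(np.corrcoef(sta_first, sta_second)[0, 1])

The same windowing idea, with principled choices of window length, is one simple way to ask question (1) above of real data.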
Independent Component Analysis and Beyond
Stefan Harmeling, Fraunhofer FIRST, Berlin, Germany
Luis Borges de Almeida, INESC ID, Lisbon, Portugal
Erkki Oja, HUT, Helsinki, Finland
Dinh-Tuan Pham, LMC-IMAG, Grenoble, France

Independent component analysis (ICA) aims at extracting unknown hidden factors/components from multivariate data using only the assumption that the unknown factors are mutually independent. Since the introduction of ICA concepts in the early 80s in the context of neural networks and array signal processing, many new successful algorithms have been proposed that are now well-established methods. Since then, diverse applications in telecommunications, biomedical data analysis, feature extraction, speech separation, time-series analysis and data mining have been reported. Of special interest for the NIPS community are, first, the application of ICA techniques to process multivariate data from various neuro-physiological recordings and, second, the interesting conceptual parallels to information processing in the brain. Recently, exciting developments have moved the field towards more general nonlinear or nonindependent source separation paradigms. The goal of the planned workshop is to bring together researchers from the different fields of signal processing, machine learning, statistics and applications to explore these new directions.

Spectral Methods in Dimensionality Reduction, Clustering, and Classification
Josh Tenenbaum, M.I.T., Cambridge, MA, USA
Sam Roweis, University of Toronto, Ontario, Canada

Data-driven learning by local or greedy parameter update algorithms is often a painfully slow process fraught with local minima. However, by formulating a learning task as an appropriate algebraic problem, globally optimal solutions may be computed efficiently in closed form via an eigendecomposition. Traditionally, this spectral approach was thought to be applicable only to learning problems with an essentially linear structure, such as principal component analysis or linear discriminant analysis. Recently, researchers in machine learning, statistics, and theoretical computer science have figured out how to cast a number of important nonlinear learning problems in terms amenable to spectral methods. These problems include nonlinear dimensionality reduction, nonparametric clustering, and nonlinear classification with fully or partially labeled data. Spectral approaches to these problems offer the potential for dramatic improvements in efficiency, accuracy, optimality and reproducibility relative to traditional iterative or greedy learning algorithms. Furthermore, numerical methods for spectral computations are extremely mature and well understood, allowing learning algorithms to benefit from a long history of implementation efficiencies in other fields. The goal of this workshop is to bring together researchers working on spectral approaches across this broad range of problem areas, for a series of talks on state-of-the-art research and discussions of common themes and open questions.

Neuromorphic Engineering in the Commercial World
Timothy Horiuchi, University of Maryland, College Park, MD, USA
Giacomo Indiveri, University-ETH Zurich, Zurich, Switzerland
Ralph Etienne-Cummings, University of Maryland, College Park, MD, USA

We propose a one-day workshop to discuss strategies, opportunities and success stories in the commercialization of neuromorphic systems.
Towards this end, we will be inviting speakers from industry and universities with relevant experience in the field. The discussion will cover a broad range of topics, from visual and auditory processing to olfaction and locomotion, focusing specifically on the key elements and ideas for successfully transitioning from neuroscience to commercialization.

Statistical Methods for Computational Experiments in Visual Processing and Computer Vision
Ross Beveridge, Colorado State University, Colorado, USA
Bruce Draper, Colorado State University, Colorado, USA
Geof Givens, Colorado State University, Colorado, USA
Ross J. Micheals, NIST, Maryland, USA
Jonathon Phillips, DARPA & NIST, Maryland, USA

In visual processing and computer vision, computational experiments play a critical role in explaining algorithm and system behavior. Disciplines such as psychophysics and medicine have a long history of designing experiments. Vision researchers are still learning how to use computational experiments to explain how systems behave in complex domains. This workshop will focus on new and better experimental methods in the context of visual processing and computer vision.

Unreal Data: Principles of Modeling Nonvectorial Data
Alexander J. Smola, Australian National Univ., Canberra, Australia
Gunnar Raetsch, Australian National Univ., Canberra, Australia
Zoubin Ghahramani, University College London, London, UK

A large amount of research in machine learning is concerned with classification and regression for real-valued data which can easily be embedded into a Euclidean vector space. This is in stark contrast with many real world problems, where the data is often a highly structured combination of features, a sequence of symbols, a mixture of different modalities, may have missing variables, etc. To address the problem of learning from non-vectorial data, various methods have been proposed, such as embedding the structures in some metric spaces, the extraction and selection of features, proximity based approaches, parameter constraints in Graphical Models, Inductive Logic Programming, Decision Trees, etc. The goal of this workshop is twofold. Firstly, we hope to make the machine learning community aware of the problems arising from domains where non-vectorspace data abounds and to uncover the pitfalls of mapping such data into vector spaces. Secondly, we will try to find a more uniform structure governing methods for dealing with non-vectorial data and to understand what, if any, are the principles underlying the modeling of non-vectorial data.

Machine Learning Techniques for Bioinformatics
Colin Campbell, University of Bristol, UK
Phil Long, Genome Institute of Singapore

This workshop will cover the development and application of machine learning techniques in molecular biology. Contributed papers are welcome on any topic relevant to this theme including, but not limited to, analysis of expression data, promoter analysis, protein structure prediction, protein homology detection, detection of splice junctions, and phylogeny. Contributions are most welcome which propose new algorithms or methods, rather than the use of existing techniques. In addition to contributed papers we expect to have several tutorials covering different areas where machine learning techniques have been successfully applied in this domain.
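The two preceding descriptions meet in one concrete case: symbol sequences such as DNA or protein strings, which have no single natural vector-space embedding. One standard device, shown in the minimal Python sketch below (illustrative only; the sequences are made up), is the k-mer spectrum kernel: two strings are compared through the counts of their length-k substrings, which yields a valid positive semidefinite kernel that any kernel method can consume directly.

    from collections import Counter
    from itertools import product

    def spectrum_features(seq, k=3, alphabet="ACGT"):
        """Counts of every length-k substring (the k-mer 'spectrum' of seq)."""
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        return [counts["".join(kmer)] for kmer in product(alphabet, repeat=k)]

    def spectrum_kernel(s1, s2, k=3):
        """Inner product of k-mer spectra: a p.s.d. kernel on strings."""
        return sum(a * b for a, b in zip(spectrum_features(s1, k),
                                         spectrum_features(s2, k)))

    a = "ACGTACGTGGCA"
    b = "TTACGTACGTCC"   # shares the ACGTACGT motif with a
    c = "GGGGCCCCAAAA"   # unrelated composition
    print(spectrum_kernel(a, b), spectrum_kernel(a, c))  # 11 versus 1

The design choice is the one the Unreal Data blurb warns about: the k-mer map is itself an embedding into a vector space, so it buys convenience at the price of ignoring gaps, order beyond k symbols, and mismatches.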
ONE DAY WORKSHOPS (Saturday)

Thalamocortical Processing in Audition and Vision
Tony Zador, Cold Spring Harbor Lab., Cold Spring Harbor, NY, USA
Shihab Shamma, University of Maryland, College Park, MD, USA

All sensory information (except olfaction) passes through the thalamus before reaching the cortex. Are the principles governing this thalamocortical transformation shared across sensory modalities? This workshop will investigate this question in the context of audition and vision. Questions include: Do the LGN and MGN play analogous roles in the two sensory modalities? Are the cortical representations of sound and light analogous? Specifically, the idea is to talk about cortical processing (as opposed to purely thalamic), how receptive fields are put together in the cortex, and the implications of these ideas for the nature of information being encoded and extracted at the cortex.

Learning of Invariant Representations
Konrad Paul Koerding, ETH/UNI Zuerich, Switzerland
Bruno A. Olshausen, U.C. Davis & RNI, CA, USA

Much work in recent years has shown that the sensory coding strategies employed in the nervous systems of many animals are well matched to the statistics of their natural environment. For example, it has been shown that lateral inhibition occurring in the retina may be understood in terms of a decorrelation or `whitening' strategy (Srinivasan et al., 1982; Atick & Redlich, 1992), and that the receptive properties of cortical neurons may be understood in terms of sparse coding or ICA (Olshausen & Field, 1996; Bell & Sejnowski, 1997; van Hateren & van der Schaaf, 1998). However, most of these models do not address the question of which properties of the environment are interesting or relevant and which others are behaviourally insignificant. The purpose of this workshop is to focus on unsupervised learning models that attempt to represent features of the environment which are invariant or insensitive to variations such as position, size, or other factors.

Quantum Neural Computing
Elizabeth C. Behrman, Wichita State University, Wichita, KS, USA
James E. Steck, Wichita State University, Wichita, KS, USA

Recently there has been a resurgence of interest in quantum computers because of their potential for being very much smaller and very much faster than classical computers, and because of their ability in principle to do heretofore impossible calculations, such as factorization of large numbers in polynomial time. We will explore ways to implement biologically inspired quantum computing in network topologies, thus exploiting both the intrinsic advantages of quantum computing and the adaptability of neural computing. This workshop will follow up on our very successful NIPS 2000 workshop and the IJCNN 2001 Special Session. Aspects/approaches to be explored will include: quantum hardware, e.g., SQUIDs, NMR, trapped ions, quantum dots, and molecular computing; theoretical and practical limits to quantum and quantum neural computing, e.g. noise, error correction, and decoherence; and simulations.

Universal Learning Algorithms and Optimal Search
Juergen Schmidhuber, IDSIA, Manno-Lugano, Switzerland
Marcus Hutter, IDSIA, Manno-Lugano, Switzerland

Recent theoretical and practical advances are currently driving a renaissance in the fields of Universal Learners (rooted in Solomonoff's universal induction scheme, 1964) and Optimal Search (rooted in Levin's universal search algorithm, 1973). Both are closely related to the theory of Kolmogorov complexity.
The new millennium has brought several significant developments including: Sharp expected loss bounds for universal sequence predictors, theoretically optimal reinforcement learners for general computable environments, computable optimal predictions based on natural priors that take algorithm runtime into account, and practical, bias-optimal, incremental, universal search algorithms. Topics will also include: Practical but general MML/MDL/SRM approaches with theoretical foundation, weighted majority approaches, and no free lunch theorems.

On Learning Kernels
Nello Cristianini, U.C. Davis, California, USA
Tommi Jaakkola, M.I.T., Massachusetts, USA
Michael I. Jordan, U.C. Berkeley, California, USA
Gert R.G. Lanckriet, U.C. Berkeley, California, USA

Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel methods in learning systems. For the past five years, a growing community has been meeting at the NIPS workshops to discuss the latest progress in learning with kernels. Recent research in this area addresses the problem of learning the kernel itself from data. This subfield is becoming an active research area, offering a challenging interplay between statistics, advanced convex optimization and information geometry. It presents a number of interesting open problems. The workshop has two goals. First, it aims at discussing state-of-the-art research on 'learning the kernel', as well as giving an introduction to some of the new techniques used in this subfield. Second, it offers a meeting point for a diverse community of researchers working on kernel methods. As such, contributions from ALL subfields in kernel methods are welcome and will be considered for a poster presentation, with priority to very recent results. Furthermore, contributions on the main theme of learning kernels will be considered for oral presentations. Deadline for submissions: Nov 15, 2002.

Negative Results and Open Problems
Isabelle Guyon, Clopinet, California, USA

In mathematics and theoretical computer science, exhibiting counter examples is part of the established scientific method to rule out wrong hypotheses. Yet, negative results and counter examples are seldom reported in experimental papers, although they can be very valuable. Our workshop will be a forum to freely discuss negative results and introduce the community to challenging open problems. This may include reporting experimental results of principled algorithms that obtain poor performance compared to seemingly dumb heuristics; experimental results that falsify an existing theory; counter examples to a generally admitted conjecture; failure to find a solution to a given problem after various attempts; and failure to demonstrate the advantage of a given method after various attempts. If you have interesting negative results (not inconclusive results) or challenging open problems, you may submit an abstract before November 15, 2002.

Beyond Classification and Regression: Learning Rankings, Preferences, Equality Predicates, and Other Structures
Rich Caruana, Cornell University, NY, USA
Thorsten Joachims, Cornell University, NY, USA

Not all supervised learning problems fit the classification/regression function-learning model. Some problems require predictions other than values or classes. For example, sometimes the magnitude of the values predicted for cases is not important, but the ordering these values induce is important.
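A minimal Python sketch of that last point (illustrative only, not workshop material): a standard pairwise logistic ranking loss is built from differences between scores, so shifting every prediction by a constant changes nothing, while any change to the induced ordering does.

    import numpy as np

    def pairwise_ranking_loss(scores, prefs):
        """Logistic loss over ordered pairs; prefs holds (i, j) pairs
        meaning item i should be ranked above item j."""
        return sum(np.log1p(np.exp(-(scores[i] - scores[j]))) for i, j in prefs)

    scores = np.array([2.0, 0.5, 1.0])
    prefs = [(0, 1), (0, 2), (2, 1)]   # item 0 best, then item 2, then item 1
    print(pairwise_ranking_loss(scores, prefs))           # baseline loss
    print(pairwise_ranking_loss(scores + 100.0, prefs))   # identical: only score
                                                          # differences matter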
This workshop addresses supervised learning problems where either the goal of learning or the input to the learner is more complex than in classification and regression. Examples of such problems include learning partial or total orderings, learning equality or match rules, learning to optimize non-standard criteria such as Precision and Recall or ROC Area, using relative preferences as training examples, learning graphs and other structures, and problems that benefit from these approaches (e.g., text retrieval, medical decision making, protein matching). The goal of this one-day workshop is to discuss the current state-of-the-art, and to inspire research on new algorithms and problems. To submit an abstract, see http://www.cs.cornell.edu/People/tj/ranklearn.

More extensive information is available on the NIPS web page http://www.nips.cc, which has links to the pages maintained by each individual workshop. The number of workshop proposals was particularly high this year. Altogether there will be seventeen NIPS*2002 workshops, of which three will last for two days, for a total of twenty workshop-days: a new record. We anticipate a great year not just in the number of workshops and in their quality, but in attendance as well: projections indicate that the workshops may surpass the main conference in total number of participants.

From amari at brain.riken.go.jp Thu Nov 14 02:33:31 2002
From: amari at brain.riken.go.jp (Shun-ichi Amari)
Date: Thu, 14 Nov 2002 16:33:31 +0900
Subject: NBNI-2002
Message-ID: <003601c28bb0$219edec0$220a0a0a@Abaloneamari>

The following is the program of NBNI (Neurobiology and Neuroinformatics workshop). The registration fee of 10,000 yen is payable at the registration desk.

------------------
NBNI-2002
Fourth Japan-Korea-China-India Joint Workshop on Neurobiology and Neuroinformatics
November 25-26, 2002
RIKEN Brain Science Institute, Japan
Ohkouchi Hall
Seminar Room in BSI Central Building

Organized by RIKEN Brain Science Institute
Sponsored by KAIST BSRC, KNIH BBRC, KIST BNRC, Korea; China Society for Neuroscience; National Brain Research Centre, India

Organizers:
Shun-ichi Amari (RIKEN BSI, Japan)
Nobuyuki Nukina (RIKEN BSI, Japan)
Chang-Rak Choi (Biomedical Brain Research Center, Korea)
Soo-Young Lee (Brain Science Research Center, KAIST, Korea)
Tae H. Oh (Brain Neurobiology Research Center, KIST, Korea)
Fanji Gu (Fudan University, China)
Yizheng Wang (Institute of Neuroscience, China)
Vijayalakshmi Ravindranath (National Brain Research Center, India)

------------------
Program

November 25 (Monday)

9:30 - 10:00 Registration (Ohkouchi Hall)

Opening Ceremony: Ohkouchi Hall (Chair Shun-ichi Amari)
10:00 - 10:10 Opening Address Masao Ito (Director, BSI)
10:10 - 10:20 Welcome Address Shun-ichi Amari (Vice director, BSI)
10:20 - 11:00 Overviews of Activities in Participating Countries
Organizers: Shun-ichi Amari, Nobuyuki Nukina, Chang-Rak Choi, Soo-Young Lee, Tae H. Oh, Fanji Gu, Tian-Ming Gao, Vijayalakshmi Ravindranath

Plenary Session I: Ohkouchi Hall (Chair Vijayalakshmi Ravindranath)
11:00 - 11:30 Takao K. Hensch (RIKEN BSI, Japan) “Inhibitory circuit control of critical period plasticity in developing visual cortex”
11:30 - 12:00 Tian-Ming Gao (First Military Medical University, China) “Overactivation of potassium channels mediates hippocampal neuronal death induced by ischemic/hypoxic insult”
12:30 - 13:00 Yun-Hee Kim (College of Medicine, Pochon CHA University, Korea) “Reorganization of motor and cognitive network following human brain lesion investigated by functional neuroimaging”

Luncheon Meeting 13:00 - 14:00 Chair: Shun-ichi Amari, Nobuyuki Nukina

Neurobiology Session I (Ohkouchi Hall; Chair Nobuyuki Nukina)
14:00 - 14:30 Masayuki Miura (RIKEN BSI, Japan) “Genetic pathway of neural cell death and degeneration in Drosophila”
14:30 - 15:00 Shyamala Mani (National Brain Research Center, India) “Patterning of the cerebellum in the GAP-43 knockout mouse”
15:00 - 15:30 Young J. Oh (Yonsei University College of Science, Korea) “Mitochondrial and extramitochondrial apoptotic pathways in experimental model of Parkinson’s disease”
15:30 - 16:00 Coffee Break

Neurobiology Session II (Ohkouchi Hall; Chair Shyamala Mani)
16:00 - 16:30 Zhi-Wang Li (Tongji Medical College of Huazhong, China) “The action of tachykinins on the primary sensory neurons”
16:30 - 17:00 Sang Eun Kim (Samsung Medical Center, Korea) “Effect of chronic nicotine administration on dopaminergic neurotransmission”

Neuroinformatics Session I (BSI Seminar Room; Chair Shiro Usui)
14:00 - 14:30 Michio Sugeno (RIKEN BSI, Japan) “Language-oriented approach to creating the brain”
14:30 - 15:00 Posina Venkata Rayudu (National Brain Research Center, India) “Brain as mathematics”
15:00 - 15:30 Eunjoo Kang (Seoul National University Medical Research Center, Korea) “Cross-modal interactions in speech perception during sentence comprehension: an fMRI study”
15:30 - 16:00 Coffee Break

Neuroinformatics Session II (BSI Seminar Room; Chair Shobini L. Rao)
16:00 - 16:30 Yiyuan Tang (Dalian University of Technology, China) “Understanding the brain function through neuroimaging database for Chinese language processing”
16:30 - 17:00 Seung Kee Han (Chungbuk National University, Korea) “Inferring neural connectivity from multiple spike trains”

Welcome Reception: Second Floor of The First Restaurant 18:00 - 20:00

November 26 (Tuesday)

Plenary Session II: Ohkouchi Hall (Chair Tian-Ming Gao)
10:00 - 10:30 Vijayalakshmi Ravindranath (National Brain Research Center, India) “Towards understanding the pathogenesis of neurodegenerative disorders”
10:30 - 11:00 Tomoki Fukai (Tamagawa University, Japan) “Towards the understanding of biological mechanisms and functional roles of the gamma rhythmic activity”
11:00 - 11:30 Yong-Keun Jung (Kwangju Institute of Science and Technology, Korea) “Ubiquitin conjugating enzyme E2-25K as a novel mediator of amyloid-beta neurotoxicity in Alzheimer’s disease”
11:30 - 12:00 Coffee Break

Plenary Session III: Ohkouchi Hall (Chair Soo-Young Lee)
12:00 - 12:30 Shobini L. Rao (National Brain Research Center, India) “Evidence for brain plasticity-cognitive retraining and functional brain imaging”
12:30 - 13:00 Pei-Ji Liang (Shanghai Jiaotong University, China) “Possible mechanism of synaptic plasticity in retinal graded neurons”

Luncheon Meeting 13:00 - 14:30
Neurobiology Session III (Ohkouchi Hall: Chair Tae H. Oh)
14:30 - 15:00 Takeshi Iwatsubo (University of Tokyo, Japan) “Formation and function of γ-secretase complex”
15:00 - 15:30 Nihar Ranjan Jana (National Brain Research Center, India) “Direct visualization of the expression, selective nuclear accumulation, aggregate formation and possible proteolytic processing of the transgene product in a HD exon1-EGFP transgenic mouse model”
15:30 - 16:00 Coffee Break

Neurobiology Session IV (Ohkouchi Hall: Chair Chang-Rak Choi)
16:00 - 16:30 Rubin Wang (Donghua University, China) “Analysis of dynamics of the phase resetting on the set of the population of neurons”
16:30 - 17:00 Byoung Joo Gwag (Ajou University, Korea) “Mechanisms of selective neuronal death”

Neuroinformatics Session III (BSI Seminar Room: Chair Fanji Gu)
14:30 - 15:00 Aditya Murthy (National Brain Research Center, India) “The role of frontal cortex in overt and covert orienting”
15:00 - 15:30 Lin Xu (Kunming Institute of Zoology, China) “How synaptic plasticity in hippocampus underlies learning and memory”
15:30 - 16:00 Coffee Break

Neuroinformatics Session IV (BSI Seminar Room: Chair Yiyuan Tang)
16:00 - 16:30 Masataka Watanabe (University of Tokyo, Japan) “Prefrontal cortex model of selective attention”
16:30 - 17:00 Soo-Young Lee (Korea Advanced Institute of Science and Technology, Korea) “Modeling Human Auditory Pathway for Artificial Auditory Systems in Real-World Noisy Environments”

Farewell Party: Second Floor of the Hirosawa Club 18:00 - 20:00
Concluding Address Nobuyuki Nukina (RIKEN BSI, Japan)

-----------------------------------
Shun-ichi Amari
Vice director
RIKEN Brain Science Institute
Wako-shi, Hirosawa 2-1, Saitama 351-0198, Japan
tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687
amari at brain.riken.go.jp; www.bsis.brain.riken.go.jp/

From dgw at MIT.EDU Fri Nov 8 14:39:10 2002
From: dgw at MIT.EDU (David Weininger)
Date: Fri, 08 Nov 2002 14:39:10 -0500
Subject: book announcement--Liu
Message-ID: <2002110814391018478@outgoing.mit.edu>

I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/0262122553/ Thank you! Best, David

Analog VLSI Circuits and Principles
Shih-Chii Liu, Jörg Kramer, Giacomo Indiveri, Tobias Delbrück, and Rodney Douglas
foreword by Carver A. Mead

Neuromorphic engineers work to improve the performance of artificial systems through the development of chips and systems that process information collectively using primarily analog circuits. This book presents the central concepts required for the creative and successful design of analog VLSI circuits. The discussion is weighted toward novel circuits that emulate natural signal processing. Unlike most circuits in commercial or industrial applications, these circuits operate mainly in the subthreshold or weak inversion region. Moreover, their functionality is not limited to linear operations, but also encompasses many interesting nonlinear operations similar to those occurring in natural systems. Topics include device physics, linear and nonlinear circuit forms, translinear circuits, photodetectors, floating-gate devices, noise analysis, and process technology.

Shih-Chii Liu, Giacomo Indiveri, and Tobias Delbrück are Assistant Professors at the Institute of Neuroinformatics, Zurich, as was the late Jörg Kramer. Rodney Douglas is Director of the Institute of Neuroinformatics and Professor of Neuroinformatics at the University of Zurich.
6 x 9, 472 pp., cloth, ISBN 0-262-12255-3 A Bradford Book ______________________ David Weininger Associate Publicist The MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617 253 2079 617 253 1709 fax http://mitpress.mit.edu From fukushima at karl.teu.ac.jp Thu Nov 14 20:12:51 2002 From: fukushima at karl.teu.ac.jp (Kunihiko FUKUSHIMA) Date: Fri, 15 Nov 2002 10:12:51 +0900 Subject: Call for Papers: Neural Networks for Vision, KES'2003 Message-ID: <4.2.0.58.J.20021115100812.003f97c8@sv1.karl.teu.ac.jp> ============================================================ Call for Papers: Invited Session on "Neural Networks for Vision --- Biological and Artificial" KES'2003, Oxford, UK ============================================================ 7th International Conference on Knowledge-Based Intelligent Information & Engineering Systems 3, 4 & 5 September 2003, St Anne's College, University of Oxford, U.K. ------------------------------------------ Invited Session on "Neural Networks for Vision --- Biological and Artificial" Modeling neural networks is important both for understanding the biological brain and for obtaining design principles for artificial vision systems of the next generation. This session aims to focus on (1) modeling approaches to uncovering the mechanisms of the biological visual system, and (2) artificial neural networks suggested by the biological visual system. Specific topics of interest include, but are not limited to: * Biological neural network models for the visual system * Artificial neural networks for vision * Visual pattern recognition using neural networks * Object recognition * Active vision * Selective visual attention * Learning and self-organization of neural networks for vision * Stereoscopic vision, binocular vision * Eye movement and foveation * Early vision * Motion analysis with neural networks * Target detection and tracking * Color vision * Segmentation of images or patterns using neural networks * Completion of imperfect patterns (e.g., partly occluded, or contaminated with noise) * Visual illusion ------------------------------------------ Instructions for Authors Only electronic copies of papers, in Microsoft Word, PDF or Postscript form, are acceptable for review purposes and must be sent to the session chair. However, please note that you will be required to send a hard copy of the final version of your paper if it is accepted; electronic submission of final papers is not allowed. Papers must correspond to the requirements detailed in the Instructions to Authors on the Conference Web Site, www.bton.ac.uk/kes/kes2003/, or http://www.hotwolf.f9.co.uk/kes/kes2003/ All papers must be presented by one of the authors, who must pay the conference fees. ------------------------------------------ Publication The Conference Proceedings will be published by a major publisher, for example IOS Press of Amsterdam. Extended versions of selected papers will be considered for publication in the International Journal of Knowledge-Based Intelligent Engineering Systems, www.bton.ac.uk/kes/journal/ ------------------------------------------ Important Dates Deadline for submission intention: December 1, 2002 Deadline for receipt of papers by Session Chair: February 1, 2003 Notification of acceptance: March 1, 2003 Camera-ready papers to session chair by: April 1, 2003 (The Session Chair must send final camera-ready papers to the KES Secretariat by 1 May 2003 or they will not appear in the proceedings).
------------------------------------------ Session Chair: Kunihiko Fukushima Professor, Tokyo University of Technology 1404-1, Katakura, Hachioji, Tokyo 192-0982, Japan e-mail: fukushima at karl.teu.ac.jp ------------------------------------------ From ken at phy.ucsf.edu Fri Nov 15 01:37:05 2002 From: ken at phy.ucsf.edu (Ken Miller) Date: Thu, 14 Nov 2002 22:37:05 -0800 Subject: UCSF Postdoctoral/Graduate Fellowships in Theoretical Neurobiology Message-ID: <15828.38417.948955.208183@coltrane.ucsf.edu> FULL INFO: http://www.sloan.ucsf.edu/sloan/sloan-info.html PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR CONTACT ADDRESSES GIVEN BELOW The Sloan-Swartz Center for Theoretical Neurobiology at UCSF solicits applications for pre- and post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in a quantitative field such as mathematics, theoretical or experimental physics, or computer science, and a commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. The Sloan-Swartz Center offers opportunities to combine theoretical and experimental approaches to understanding the operation of the intact brain. Young scientists with strong theoretical backgrounds will receive scientific training in experimental approaches to understanding the operation of the intact brain. They will learn to integrate their theoretical abilities with these experimental approaches to form a mature research program in integrative neuroscience. The research undertaken by the trainees may be theoretical, experimental, or a combination. Resident faculty of, or frequent visitors to, the Sloan-Swartz Center and their research interests include:
William Bialek (frequent visitor): Information-theoretic and statistical characterization of, and physical limits to, neural coding and representation
Michael Brainard: Mechanisms underlying vocal learning in the songbird; sensorimotor adaptation to alteration of performance-based feedback
Allison Doupe: Development of song recognition and production in songbirds
Loren Frank (joining our faculty in summer 2003): The relationship between behavior and neural activity in the hippocampus and anatomically related cortical areas
Stephen Lisberger: Learning and memory in a simple motor reflex, the vestibulo-ocular reflex, and visual guidance of smooth pursuit eye movements by the cerebral cortex
Michael Merzenich: Experience-dependent plasticity underlying learning in the adult cerebral cortex, and the neurological bases of learning disabilities in children
Kenneth Miller: Circuitry of the cerebral cortex: its structure, self-organization, and computational function (primarily using cat primary visual cortex as a model system)
Philip Sabes: Sensorimotor coordination, adaptation and development of spatially guided behaviors, experience-dependent cortical plasticity
Christoph Schreiner: Cortical mechanisms of perception of complex sounds such as speech in adults, and plasticity of speech recognition in children and adults
Michael Stryker: Mechanisms that guide development of the visual cortex
There are also a number of visiting faculty, including Larry Abbott, Brandeis University; Sebastian Seung, MIT; David Sparks, Baylor University; Steve Zucker, Yale University.
TO APPLY for a POSTDOCTORAL position, please send a curriculum vitae, a statement of previous research and research goals, up to three relevant publications, and have two letters of recommendation sent to us. The application deadline is January 31, 2003. Send applications to: Sloan-Swartz Center 2003 Admissions Sloan-Swartz Center for Theoretical Neurobiology at UCSF Department of Physiology University of California 513 Parnassus Ave. San Francisco, CA 94143-0444 PRE-DOCTORAL applicants with strong theoretical training may seek admission into the UCSF Neuroscience Graduate Program as a first-year student. Applicants seeking such admission must apply by Jan. 3, 2003 to be considered for fall, 2003 admission. Application materials for the UCSF Neuroscience Program may be obtained from http://www.ucsf.edu/neurosc/neuro_admissions.html#application or from Pat Vietch Neuroscience Graduate Program Department of Physiology University of California San Francisco San Francisco, CA 94143-0444 neuroscience at phy.ucsf.edu Be sure to include your surface-mail address. The procedure is: make a normal application to the UCSF Neuroscience program; but also alert the Sloan-Swartz Center of your application, by writing to sloan-info at phy.ucsf.edu. If you need more information: -- Consult the Sloan-Swartz Center WWW Home Page: http://www.sloan.ucsf.edu/sloan -- Send e-mail to sloan-info at phy.ucsf.edu -- See also the home page for the W.M. Keck Foundation Center for Integrative Neuroscience, in which the Sloan-Swartz Center is housed: http://www.keck.ucsf.edu/ From herbert.jaeger at ais.fhg.de Fri Nov 15 08:11:28 2002 From: herbert.jaeger at ais.fhg.de (Herbert Jaeger) Date: Fri, 15 Nov 2002 14:11:28 +0100 Subject: R&D positions for RNN applications Message-ID: <3DD4F280.645D2179@ais.fhg.de> The Fraunhofer Institute for Autonomous Intelligent Systems (http://www.ais.fraunhofer.de/index.en.html) is pleased to announce 3 Research Engineer positions for doing research and development using a novel machine learning technique called Echo-State Networks, or ESNs. The positions are for an initial duration of 1 year (starting January 2003) with a possible 6-month extension. We are seeking candidates with degrees in Engineering, Computer Science, Physics or related fields, preferably with some industrial experience. A good command of English is required; German skills would be an additional asset. ESNs are a novel type of recurrent neural network, developed and patented at AIS, which can be trained extremely efficiently for tasks in nonlinear control, filtering, prediction, pattern recognition and pattern generation. Please consult http://www.ais.fraunhofer.de/INDY/echo_net_research.html for details. The announced positions will aim to apply the ESN technique to a few selected practical problems for eventual commercialization. In this respect, the work will be applied in nature and will be focused on creating demonstrations and working prototypes in a specific application domain. The results of this work will be used to solicit interest from investors and/or technology partners within 18 months. Currently, the following application domains are being investigated more closely: (1) control of strongly nonlinear electrical machines, specifically, fast moving mobile robots and switched reluctance motors, (2) equalization of mildly nonlinear digital communication channels, specifically, high-gain satellite radio transmitters, (3) filtering and prediction of strongly stochastic signals, especially speech signals.
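To give a feel for why ESN training is so cheap: the recurrent "reservoir" weights stay fixed and only a linear readout is fit, so training reduces to a single linear regression. The sketch below is a generic Python illustration of that idea, with all hyperparameters (reservoir size, spectral radius, ridge constant, toy task) assumed; it does not reproduce the patented AIS implementation.

import numpy as np

# Minimal echo-state network sketch: fixed random reservoir,
# ridge-regression readout. All sizes and constants are assumed.
rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale spectral radius below 1 (the usual "echo state" condition).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 60, 0.1)
u = np.sin(t).reshape(-1, 1)
X, y = run_reservoir(u[:-1]), u[1:]

washout = 100                         # discard the initial transient
Xw, yw = X[washout:], y[washout:]
ridge = 1e-6                          # regularization constant (assumed)
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(n_res), Xw.T @ yw)

print("train MSE:", np.mean((Xw @ W_out - yw) ** 2))

Because the only trained parameters are W_out, fitting is one regularized least-squares solve rather than gradient descent through time, which is where the efficiency claim comes from.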
The list of examined types of applications is however open, and candidates with a background from another area of nonlinear systems engineering are explicitly encouraged to apply. Salary will follow the German categories BAT 2a or 1b according to qualification, age and marital status (basic salary ranging from 2000 to 3000 Euro per month, plus various add-ons). Fraunhofer AIS is an equal opportunity employer. Please email a CV and covering letter explaining your background and interests to Dr. Herbert Jaeger (herbert.jaeger at ais.fraunhofer.de, http://ais.fraunhofer.de/INDY/herbert/), with a copy to Stéphane Beauregard (stephane.beauregard at ais.fraunhofer.de). ------------------------------------------------------------------ Dr. Herbert Jaeger Fraunhofer Institute for Autonomous Intelligent Systems AiS.INDY, Schloss Birlinghoven, D-53754 Sankt Augustin, Germany Phone (+49) 2241-14-2253, Fax (+49) 2241-14-2342 email herbert.jaeger at ais.fraunhofer.de http://www.ais.fraunhofer.de/INDY/herbert/ ------------------------------------------------------------------ From jose at psychology.rutgers.edu Fri Nov 15 07:37:09 2002 From: jose at psychology.rutgers.edu (stephen j. hanson) Date: 15 Nov 2002 07:37:09 -0500 Subject: Computational Neuroscience, Learning, Cognitive Modeling--RUTGERS UNIVERSITY-(Newark Campus) Message-ID: <1037363834.2694.2.camel@vaio> RUTGERS UNIVERSITY- (Newark Campus). PSYCHOLOGY DEPARTMENT The Department of Psychology anticipates making one tenure track, Assistant or Associate Professor level appointment in the area of COGNITIVE SCIENCE. In particular we are seeking individuals from any one of the following THREE areas: LEARNING (Cognitive Modeling), COMPUTATIONAL NEUROSCIENCE, or SOCIAL COGNITION (interests in NEUROIMAGING in any of these areas would also be a plus, since the Department in conjunction with UMDNJ has recently acquired a 3T Neuroimaging Center; see http://www.newark.rutgers.edu/fmri/). The successful candidate is expected to develop and maintain an active, externally funded research program, and to teach at both the graduate and undergraduate levels. Review of applications will begin JANUARY 30th 2003, pending final budgetary approval from the administration. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are encouraged to apply. Please send a CV, a statement of current and future research interests, and three letters of recommendation to COGNITIVE SCIENCE SEARCH COMMITTEE, Department of Psychology, Rutgers University, Newark, NJ 07102. Email enquiries can be made to cogsci at psychology.rutgers.edu. From arno at salk.edu Fri Nov 15 17:52:08 2002 From: arno at salk.edu (Arnaud Delorme) Date: Fri, 15 Nov 2002 14:52:08 -0800 Subject: EEGLAB Toolbox released Message-ID: <3DD57A98.2090905@salk.edu> EEGLAB - Tools for advanced EEG data analysis under Matlab using ICA and time/frequency methods - has been released under the GNU public license for download from: http://sccn.ucsd.edu/eeglab/ EEGLAB is an integrated toolbox of 250 Matlab routines for analyzing and visualizing event-related EEG (or MEG) brain data. EEG, event, and channel location data can be read in a variety of formats. A graphic user interface allows users to explore their data interactively, while global data, event, and channel location structures, plus a command history mechanism, ease the transition to writing custom analysis scripts.
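For readers curious about the algorithm underneath: EEGLAB's ICA decomposition is based on the infomax rule of Bell & Sejnowski (1995). The toy Python snippet below is an independent sketch, not EEGLAB code (EEGLAB itself is Matlab), applying the natural-gradient infomax update to a synthetic two-channel mixture; all sizes and the learning rate are assumed.

import numpy as np

# Infomax ICA on a toy two-channel mixture, using the natural-gradient
# update dW = lr * (I + (1 - 2*g(u)) u^T) W with logistic g.
rng = np.random.default_rng(1)
T = 5000
s = rng.laplace(size=(2, T))          # two super-Gaussian sources
A = rng.normal(size=(2, 2))           # unknown mixing matrix
x = A @ s                             # observed "channels"

W = np.eye(2)                         # unmixing matrix to be learned
lr = 1e-3
for _ in range(500):
    u = W @ x                         # current source estimates
    y = 1.0 / (1.0 + np.exp(-u))      # logistic nonlinearity
    dW = (np.eye(2) + ((1.0 - 2.0 * y) @ u.T) / T) @ W
    W += lr * dW

print("W @ A (ideally close to a scaled permutation):\n", W @ A)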
An extensive .html tutorial and help messages allow users to learn to use all parts of the system. Matlab and binary routines for performing infomax and extended-infomax ICA are included, as is the sample EEG data set used throughout the tutorial. Principal authors: Arnaud Delorme & Scott Makeig Swartz Center for Computational Neuroscience Institute for Neural Computation University of California San Diego eeglab at sccn.ucsd.edu From cmbishop at microsoft.com Fri Nov 15 19:38:40 2002 From: cmbishop at microsoft.com (Christopher Bishop) Date: Sat, 16 Nov 2002 00:38:40 -0000 Subject: AI Statistics conference: FINAL CALL FOR PARTICIPATION Message-ID: <6EDEB53BA6EA96458F3CEC96BB0282D2019DA3A0@tvp-msg-03.europe.corp.microsoft.com> Ninth International Conference on Artificial Intelligence and Statistics January 3-6, 2003, Hyatt Hotel, Key West, Florida http://research.microsoft.com/conferences/AIStats2003/ Deadline for early registration is 1 December 2002. This is the ninth in a series of workshops which have brought together researchers in Artificial Intelligence and in Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. The beginning of January is a very popular time of year for visitors to Key West, and so you are strongly urged to register early and to reserve your accommodation at the substantially reduced conference rate. Invited Speakers: Andrew Blake (Microsoft Research, Cambridge) Bill Freeman (MIT) Zoubin Ghahramani (Gatsby Computational Neuroscience Unit) David Haussler (UCSC) Geoffrey Hinton (University of Toronto) Tommi Jaakkola (MIT) Lawrence Saul (University of Pennsylvania) There has been a record number of submissions to this conference, and after a rigorous review process we have been able to accept 15% of the submissions for oral presentation and 27% for poster presentation. A full programme is available on the web site. Key West provides a superb location for this workshop, and the weather at Key West in January is expected to be very pleasant. The workshop timetable will focus on morning and early evening sessions, allowing ample free time in the afternoons for scientific discussions or to take advantage of local attractions such as scuba diving, snorkelling, wave runners, parasailing, fishing, sailing and golf. Chris Bishop Brendan Frey From tcp1 at leicester.ac.uk Fri Nov 15 06:16:14 2002 From: tcp1 at leicester.ac.uk (Tim Pearce) Date: Fri, 15 Nov 2002 11:16:14 -0000 Subject: PhD Positions In-Reply-To: Message-ID: <001701c28c98$77e773d0$cd6bd28f@rothko> PhD Studentship in Biologically Inspired Robotics A postgraduate researcher is required for an EC-funded project available immediately. The project concerns the development of neuronal models to control an unmanned aerial vehicle (UAV) robot to perform stereotypical moth-like chemotaxis (chemical search) behaviour. The project will develop biologically-inspired sensor, information processing and control systems for a c(hemosensing) UAV. The cUAV will identify and track volatile compounds of different chemical composition in outdoor environments. Its olfactory and sensory-motor systems are to be inspired by the moth, and their design will be supported by computational neuroscience model development. This development continues our research in artificial and biological olfaction, sensory processing and analysis, neuronal models of learning, real-time behavioural control, and robotics.
Further details on the project and the research teams can be found at http://www.le.ac.uk/eg/tcp1/amoth/ The project includes significant funding and opportunities for travel within Europe to visit the laboratories of the participating consortia (in Switzerland, France, and Sweden) and outside Europe to attend international scientific meetings. Applicants should have a strong analytical background, a keen interest in neuroscience, and a good honours degree (at the 2(i) level or higher) in engineering, mathematics or physics. The student will be responsible for development of the experimental set-up for assessing chemical search strategies applied to robots within unsteady laminar/turbulent flow - which will involve programming, simulation, numerical and electronics development. Applicants should have a demonstrated interest in one or more of the following: neuroscience, robotics, and/or artificial intelligence. Some experience of fluid dynamics would be an advantage. Good team skills are essential. The studentship includes a stipend of £12,000 per year for 3 years and full provision for academic fees. Both EU and non-EU nationals may apply. PhD Studentship in Neuroengineering/Computational Neuroscience A postgraduate research position is available on an EC-funded project immediately. The position is to support the EU Network of Excellence in Neuroinformatics - nEUro-IT (details of the network are under construction at http://www.neuro-it.net). The project includes funding and opportunities for travel within Europe to visit educational establishments conducting research related to the interests of the network. Applicants should have a strong analytical background, an interest in neuroscience, and a good honours degree (at the 2(i) level or higher) in engineering, mathematics or physics. As part of their commitment to the Network of Excellence, the student will be responsible for development of a database of educational material related to neuroinformatics and neuroengineering within Europe. In addition the student is expected to carry out research in any topic of their choice related to the research of the laboratory (see http://www.le.ac.uk/eg/tcp1/neurolab/ for details), which is expected to lead to the award of a PhD. Good team skills are essential. The studentship includes a stipend of £12,000 per year for 3 years and full provision for academic fees. Only EU nationals may apply. Further details on the research activities carried out in this laboratory can be found at http://www.le.ac.uk/eg/tcp1/neurolab/ The Engineering Department was rated 5A in the Research Assessment Exercise, 2001. Initial enquiries and requests for details of the application process should be addressed to the EU Project Assistant, Mr. John Harrison, Department of Engineering, University of Leicester, Leicester LE1 7RH, United Kingdom, +44 116 252 5384, jlh36 at le.ac.uk Both positions are available immediately - please indicate which position you are interested in when applying. Deadline for applications is 8th December, 2002 with an expected start date early in 2003.
--
T.C. Pearce, PhD
Lecturer in Bioengineering, Department of Engineering
University of Leicester, Leicester LE1 7RH
E-mail: t.c.pearce at leicester.ac.uk
Tel: +44 (0)116 223 1290 Fax: +44 (0)116 252 2619
URL: http://www.leicester.ac.uk/eg/tcp1/neurolab/
From pli at richmond.edu Sat Nov 16 03:20:52 2002 From: pli at richmond.edu (pli) Date: Sat, 16 Nov 2002 03:20:52 -0500 Subject: Postdoc fellowship Message-ID: <3DCD3F0F@webmail.richmond.edu> Dear Colleagues, "Postdoctoral Position in Neural Network Models of Language" Qualified individuals are invited to apply for a postdoctoral fellowship in connectionist modeling of language processing. The fellowship is supported by the National Science Foundation (USA), and provides an annual stipend of around $38,000-$41,000 for a maximum of 3 years. A qualified candidate should hold a Ph.D. degree in an area of cognitive sciences (computer science, psychology, or linguistics) and have experience in neural networks and natural language processing. Technical experience with C/C++ and the Unix/Linux operating systems is necessary. Familiarity with MatLab is desirable. The successful candidate will join the PI's research team to work on self-organizing models of language, with particular reference to the acquisition, processing, and disorders of the mental lexicon (see the NSF homepage for a summary of the project: https://www.fastlane.nsf.gov/servlet/showaward?award=0131829). The project will be carried out at the Cognitive Science Laboratory (http://cogsci.richmond.edu/lab.html) in the Department of Psychology at the University of Richmond, where the cognitive area includes faculty in neuroscience, memory and aging, spatial cognition, and psycholinguistics. UR is a highly selective, private university located six miles west of Richmond on a beautiful 350-acre campus (1 hour west of Williamsburg and east of Shenandoah National Park, and 2 hours south of Washington DC). It has been consistently rated as one of America's best universities by US News and World Report. With its over $1-billion endowment and progressive program enhancements, UR provides a congenial research environment. The target date for the start of the position is May 1, 2003. Consideration of applications will begin as soon as possible. Applicants should send a curriculum vitae, a cover letter, and two letters of recommendation via email to pli at richmond.edu. The University of Richmond is an Equal Opportunity Employer. Women and minority candidates are especially encouraged to apply. For ICONIP '02 Participants: Please see me at session ThuPmRM158 ("Self-organizing feature maps and vector quantization III") or send me an email note. Ping Li, Ph.D. Department of Psychology University of Richmond, Virginia 23173, USA. Email: pli at richmond.edu Phone: (804) 289-8125 (O), 287-1236 (lab) http://cogsci.richmond.edu/ or http://www.richmond.edu/~pli/ Currently on sabbatical leave at: Division of Speech and Hearing Sciences Faculty of Education, University of Hong Kong, SAR, PRC.
Email: liping at hku.hk From Sebastian_Thrun at heaven.learning.cs.cmu.edu Sun Nov 17 11:44:59 2002 From: Sebastian_Thrun at heaven.learning.cs.cmu.edu (Sebastian Thrun) Date: Sun, 17 Nov 2002 11:44:59 -0500 Subject: NIPS*2002 Preproceedings now online Message-ID: The NIPS*2002 Preproceedings are now online at http://nips.cc (follow the link "online preproceedings") The NIPS*2002 preproceedings contain preliminary drafts of most presentations. The final proceedings will be published after the conference, as in previous years. Sebastian Thrun NIPS*2002 Program Chair From d.polani at herts.ac.uk Mon Nov 18 18:46:29 2002 From: d.polani at herts.ac.uk (Daniel Polani) Date: Tue, 19 Nov 2002 00:46:29 +0100 Subject: CfP EVOLVABILITY AND SENSOR EVOLUTION SYMPOSIUM Message-ID: <15833.31701.234159.97257@perm.feis.herts.ac.uk> Please accept our apologies should you receive this call repeatedly. This is a short version of the call. For more information, see http://www.cs.bham.ac.uk/~jfm/evol-sensor.htm or contact the chairs, Julian Miller (j.miller at cs.bham.ac.uk) or Daniel Polani (d.polani at herts.ac.uk) //////////////////////////////////////////////////////////////////////// Call for Papers & Participation: EPSRC Network on Evolvability in Biological & Software Systems EVOLVABILITY AND SENSOR EVOLUTION SYMPOSIUM ------------------------------------------- sponsored by The Natural Computation Research Group (Univ. of Birmingham) The University of Hertfordshire Adaptive Systems Research Group EPSRC 24-25 April 2003 (Thursday-Friday), University of Birmingham, U.K. //////////////////////////////////////////////////////////////////////// SYMPOSIUM AIMS -------------- This EPSRC symposium follows upon the growing awareness from academia, industry, and research communities of the importance of evolvability, tentatively defined as the capacity of populations to exhibit adaptive heritable variation. In particular, the symposium focuses on the relation between evolvability and sensor and effector evolution. The symposium aims to encourage a dialogue between various workers in areas that might benefit from a possible common framework addressing evolvability and sensor/effector evolution. The symposium addresses two aspects that are believed to be central in understanding fundamental biological mechanisms, like information discovery, acquisition, processing and transmission, both on the level of populations and individuals: these mechanisms are evolvability and sensor evolution.

Evolvability
------------
Darwinian evolution characterized by heritable variation and selection is not by itself sufficient to account for the capacity to vary and inherit phenotypic expressions of fitness. Rigidity of genotype-phenotype mappings, as often used in evolutionary computation, constrains the dynamics of evolution to a small space of possible biological or artificial systems. Open-ended evolution is not possible under such constraints. Evolution, by itself, cannot fully explain the advent of genetic systems, flexible genotype-phenotype mappings, and heritable fitness. This presents a challenge both to biologists seeking to understand the capacity of life to evolve and to computer scientists who seek to harness biological-like robustness and openness in the evolution of artificial systems.

Sensor Evolution
----------------
In natural evolution one finds impressive examples of the principle of exploiting and creating new sensory channels and the information they carry. Olfactory, tactile, auditory and visual, but also e.g.
electrical and even magnetic senses have evolved in a multitude of variants, often utilizing organs not originally "intended" for the purpose they serve at present. Biologically evolving systems are able to adaptively construct their own hardware and software. The new sensors create new ways of giving meaning to and interpreting the world. Many biological sensors reach a degree of structural and functional complexity and of efficiency which is envied by engineers creating man-made sensors. Sensors enable animals to survive in dynamic and unstructured environments, to perceive and react appropriately to features in the biotic and abiotic environment, including members of their own species as well as predators and prey. Synthesizing artificial sensors for hardware or software systems suggests a similar approach to that taken for generating life-like behaviour, namely using evolutionary techniques to explore design spaces and generate sensors which are specifically adapted with respect to environmental and other fitness-related constraints. The creation of channels of sensory input and effectory output leads to higher evolvability, as new relevance criteria are developed that confer a survival advantage on future offspring. CALL FOR CONTRIBUTIONS ---------------------- We solicit abstracts for poster or oral presentation (approx. 25-30 minute talk) reporting work in this exciting area. Talks should address an interdisciplinary audience, but may nevertheless deal with issues at the cutting edge of research. Send submissions in plain text (ASCII) format only to j.miller at cs.bham.ac.uk. The submission should show author name(s), full addresses, submission title, and an abstract of not more than 500 words. Submissions should include a statement of the preferred mode of presentation: poster / oral. PROGRAM CHAIRS -------------- Julian Miller (University of Birmingham) Daniel Polani (University of Hertfordshire, UK) CO-ORGANIZERS ------------- Chrystopher Nehaniv (University of Hertfordshire) PARTICIPATION ------------- Participation is open to all students, researchers, or industry representatives with interests in evolvability in biological and software systems. Please register by sending an e-mail to j.miller at cs.bham.ac.uk giving your name and affiliation. There is no registration fee. Participation is limited to about 60 participants. Non-presenters are welcome to participate if places remain, so please register your interest as early as possible. IMPORTANT DATES --------------- 20 February 2003: Symposium Abstract Submissions Due 7 March 2003: Notification to Authors 24-25 April 2003: Symposium From stefan.wermter at sunderland.ac.uk Tue Nov 19 12:42:20 2002 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Tue, 19 Nov 2002 17:42:20 +0000 Subject: Stipends for MSc Intelligent Systems Message-ID: <3DDA77FC.658842C2@sunderland.ac.uk> Stipends available for MSc Intelligent Systems ---------------------------------- We are pleased to announce that we have obtained funding to offer a bursary for our new MSc Intelligent Systems worth up to 6000 pounds or about 14,000 Euro as fee waiver and stipend for eligible EU students. Please forward to students who may be interested. The School of Computing and Technology, University of Sunderland is delighted to announce the launch of its new MSc Intelligent Systems programme for 24th February.
Building on the School's leading-edge research in intelligent systems, this masters programme will be funded via the ESF scheme (see below). Intelligent Systems is an exciting field of study for science and industry, since existing computing systems still fall short of human performance in many respects. "Intelligent Systems" is a term describing software systems and methods that simulate aspects of intelligent behaviour. The intention is to learn from nature and human performance, drawing on cognitive science, neuroscience, biology, engineering, and linguistics, in order to build more powerful computational system architectures. In this programme a wide variety of novel and exciting techniques will be taught, including neural networks, intelligent robotics, machine learning, natural language processing, vision, evolutionary genetic computing, data mining, information retrieval, Bayesian computing, knowledge-based systems, fuzzy methods, and hybrid intelligent architectures. Programme Structure -------------- The following lectures/modules are available: Neural Networks Intelligent Systems Architectures Learning Agents Evolutionary Computation Cognitive Neural Science Knowledge Based Systems and Data Mining Bayesian Computation Vision and Intelligent Robots Natural Language Processing Dynamics of Adaptive Systems Intelligent Systems Programming Funding up to 6000 pounds (about 14,000 Euro) for eligible students ------------------------------ The Bursary Scheme applies to this Masters programme commencing February 2003, and we have obtained funding through the European Social Fund (ESF). ESF support enables the University to waive the normal tuition fee and provide a bursary of £75 per week for 45 weeks for eligible EU students, together worth up to 6000 pounds or 14,000 Euro. For further information in the first instance please see: http://osiris.sund.ac.uk/webedit/allweb/courses/progmode.php?prog=G550A&mode=FT&mode2=&dmode=C For information on applications and start dates contact: gillian.potts at sunderland.ac.uk Tel: 0191 515 2758 For academic information about the programme contact: alfredo.moscardini at sunderland.ac.uk *************************************** Professor Stefan Wermter Chair for Intelligent Systems Informatics Centre School of Computing and Technology University of Sunderland St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ From yaochu.jin at hre-ftr.f.rd.honda.co.jp Tue Nov 19 04:18:21 2002 From: yaochu.jin at hre-ftr.f.rd.honda.co.jp (Yaochu Jin) Date: Tue, 19 Nov 2002 10:18:21 +0100 Subject: Book Announcement Message-ID: <3DDA01DD.43E25C2C@hre-ftr.f.rd.honda.co.jp> A new book titled "Advanced Fuzzy Systems Design and Applications", published by Springer/Physica Verlag (ISBN: 3-7908-1537-3), is coming out on December 16, 2002. Abstract Fuzzy rule systems have found a wide range of applications in many fields of science and technology. Traditionally, fuzzy rules are generated from human expert knowledge or human heuristics for relatively simple systems. In the last few years, data-driven fuzzy rule generation has been very active. Compared to heuristic fuzzy rules, fuzzy rules generated from data are able to extract more profound knowledge for more complex systems.
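To make "interpretable fuzzy rules" concrete before the abstract continues, here is a toy illustration, entirely invented rather than taken from the book: three linguistic rules with triangular membership functions, evaluated by zero-order Takagi-Sugeno (singleton-consequent) inference in Python.

import numpy as np

# Toy interpretable fuzzy rule base (invented example, not from the
# book): map temperature (deg C) to fan speed (%) with three rules,
# IF temp IS cold/warm/hot THEN speed IS low/medium/high.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fan_speed(t):
    # Rule firing strengths = membership degrees of the antecedents
    # (assumes t lies within the range covered by the three terms).
    firing = np.array([tri(t, -10.0, 0.0, 15.0),    # cold
                       tri(t, 10.0, 20.0, 30.0),    # warm
                       tri(t, 25.0, 40.0, 55.0)])   # hot
    levels = np.array([10.0, 50.0, 95.0])           # low, medium, high (%)
    # Zero-order Takagi-Sugeno defuzzification: firing-weighted average.
    return float(firing @ levels / firing.sum())

print(fan_speed(22.0))   # fires mostly "warm", so close to 50%

The appeal of such rules, which is the book's theme, is that each one reads as a human-comprehensible sentence, whether it was written by an expert or extracted from data.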
This book presents a number of approaches to the generation of fuzzy rules from data, ranging from direct fuzzy-inference-based methods to neural-network-based and evolutionary-algorithm-based fuzzy rule generation. Besides the approximation accuracy, special attention has been paid to the interpretability of the extracted fuzzy rules. In other words, the fuzzy rules generated from data are supposed to be as comprehensible to human beings as those generated from human heuristics. To this end, many aspects of interpretability of fuzzy systems have been discussed, which must be taken into account in data-driven fuzzy rule generation. In this way, fuzzy rules generated from data are intelligible to human users and, therefore, knowledge about unknown systems can be extracted. The other direction of knowledge extraction from data in terms of interpretable fuzzy rules is the incorporation of human knowledge into learning and evolutionary systems with the help of fuzzy logic. In this book, methods are introduced for embedding human knowledge, represented either by fuzzy rules or by fuzzy preference models, into neural network learning and evolutionary multiobjective optimization. Thus, neural networks and evolutionary algorithms are able to take advantage of data as well as human knowledge. In this book, fuzzy rules are designed mainly for modeling, control and optimization. Along with the discussion of the methods, several real-world application examples in the above fields, including robotics, process control and intelligent vehicle systems, are described. Illustrative figures are also given to accompany the most important methods and concepts. To make the book self-contained, fundamental theories as well as a few selected advanced topics about fuzzy systems, neural networks and evolutionary algorithms have been provided. Therefore, this book is a valuable reference for researchers, practitioners and students in many fields of science and engineering. Publisher Website: http://www.springer.de/cgi-bin/search_book.pl?isbn=3-7908-1537-3 Amazon: http://www.amazon.com/exec/obidos/tg/detail/-/3790815373/qid=1034592300/sr=8-2/ref=sr_8_2/002-8946315-4188058?v=glance&n=507846 Main Contents
Preface
Chapter 1 Fuzzy Sets and Fuzzy Systems
Chapter 2 Evolutionary Algorithms
Chapter 3 Artificial Neural Networks
Chapter 4 Conventional Data-driven Fuzzy Systems Design
Chapter 5 Neural Network Based Fuzzy Systems Design
Chapter 6 Evolutionary Design of Fuzzy Systems
Chapter 7 Knowledge Discovery by Extracting Interpretable Fuzzy Rules
Chapter 8 Fuzzy Knowledge Incorporation into Neural Networks
Chapter 9 Fuzzy Preferences Incorporation into Multiobjective Optimization
-- -------------------------------------------------- Dr. Yaochu Jin Future Technology Research Honda R&D Europe (D) Carl-Legien-Str. 30 63073 Offenbach/Main GERMANY Tel: +49 69 89011735 Fax: +49 69 89011749 Email: yaochu.jin at hre-ftr.f.rd.honda.co.jp From se37 at cornell.edu Tue Nov 19 16:12:06 2002 From: se37 at cornell.edu (Shimon Edelman) Date: Tue, 19 Nov 2002 16:12:06 -0500 Subject: Cornell Cognitive Studies Program Message-ID: DEADLINE: JANUARY 1, 2003 CORNELL COGNITIVE STUDIES PROGRAM Cornell University, Ithaca, New York http://www.cogstud.cornell.edu Cornell professor Ulric Neisser introduced the term "Cognitive Psychology" in 1967, with a book that gave the name to the field and helped launch the cognitive revolution.
According to Neisser, cognitive psychology is the study of how people learn, structure, store and use information. The Cognitive Studies Program, which provides the framework for research into human information processing at Cornell, extends this concept beyond psychology to teaching students, both graduate and undergraduate, the basics and the latest developments in the brain/mind sciences. Faculty members affiliated with the program belong to more than a dozen departments, including Communication, Computer Science, Design and Environmental Analysis, Economics, Education, Human Development, Linguistics, Management, Mathematics, Neurobiology and Behavior, Philosophy, Psychology, and Sociology. Expanding programs in information science, human-computer interaction, computational linguistics and vision, and other related fields are linked to the research activities in Cognitive Studies. In addition to the spontaneous interactions growing out of common interests in the nature of the mind, there are more formally structured aspects to Cognitive Studies at Cornell. These include campus-wide coordination of cognitive studies activities, and a range of courses, seminars, and specially organized and funded symposia and workshops in cognitive sciences. STUDENTS We invite inquiries from candidates from a wide range of backgrounds, including Psychology, Biology, Computer Science, Linguistics and Philosophy. As a standard Cornell requirement, every doctoral student must have two minors, at least one of which must be in an outside graduate field. The Cognitive Studies Program does not have its own Ph.D. program - it encourages students to register in an existing graduate field - but offers a minor that enables individual students to shape programs of interdisciplinary study in conjunction with their major fields. The goal is to give the students much more than a superficial exposure to the goals and methodologies of disciplines other than their own, while recognizing that it is difficult for an individual to acquire deep expertise in all areas. Each program of study in Cognitive Studies is therefore based upon depth in one discipline coupled with an informed appreciation of ideas and tools selected from other disciplines. Students should apply to the Cornell Graduate School for admission into one of the participating departments. Applications can be submitted online (see http://www.gradschool.cornell.edu/grad/app-request.html). FINANCIAL SUPPORT University-sponsored fellowships, typically awarded on the basis of scholastic ability and promise of achievement, are available through many of the graduate fields. These fellowships usually cover full tuition and student health insurance, and provide a nine- or twelve-month living stipend between $13,000 and $20,000. Subsequent multi-year support is often guaranteed through assistantships and/or fellowships. Cornell fellowships are received by 36 percent of entering doctoral students, 13 percent of entering M.A. and M.S. students, and 9 percent of the entering students in professional master's programs. The application for university-sponsored fellowships is part of the application for admission; no additional form is needed. Another source of funding is through faculty members' research grants. Most Graduate Research Assistants at Cornell receive a stipend, a full tuition fellowship, and health insurance, through Cornell's Student Health Insurance Plan (SHIP).
External fellowships, such as from the Hughes Foundation or the National Science Foundation also are available to entering graduate students. Additional information on the application processes can be found at http://www.gradschool.cornell.edu/grad/fellowships/exfellow.html PARTICIPATING FACULTY (Listings include department and graduate field of each member's primary appointment(s), followed by other graduate field memberships) * Kaushik Basu (Economics) - Political economy; knowledge and rationality; labor markets in developing economies; game theory * Lawrence Blume (Economics) - Evolutionary processes in markets and games * John Bowers (Linguistics) - Syntax and semantics of natural language and the relationship between the two * Richard Boyd (Philosophy; Science and Technology Studies) - Philosophy of science; philosophy of psychology; epistemology; philosophy of language; philosophy of mind * Claire Cardie (Computer Science) - Developing corpus-based techniques for understanding and extracting information from natural language texts * Marianella Casasola (Human Development, Latino Studies) - Aspects of infant cognitive development and early word learning and in particular, the interaction between cognition and early language learning * Stephen Ceci (Human Development, Psychology) - Theories of intelligence; cognitive development; children and the law; children's testimonial competence * Morten Christiansen (Psychology) - Statistical learning of complex sequential structure; language acquisition and processing; neural network models of language and statistical learning; neurophysiological (ERP) measures of statistical learning and language; language evolution * Abigail Cohn (Linguistics, Asian Studies, Romance Studies) - Phonetics and phonology, and their interaction * Christopher Collins (Linguistics) - The syntax of African languages; the syntax of English; general issues in syntactic theory * Robert Constable (Computer Science; Dean for Computing and Information Science) - Type theory and automated reasoning * James Cutting (Psychology) - Perception of motion, depth, and layout; event perception; perception of art, cinema and pictures; structural and functional analyses of perceptual stimuli * Richard Darlington (Psychology, Education, Human Development, Public Affairs) - Psychometric theory and behavioral statistics; differential psychology * Timothy DeVoogd (Psychology; Neurobiology and Behavior) - Neural plasticity; neurobiology of avian learning; sex differences in neuroanatomy and behavior; brain evolution * Molly Diesing (Linguistics) - Syntax, and the interface between syntax and semantics * James Dunn (Education) - Human learning and memory; cognitive psychology; alternative educational systems; seniors and adult education; innovative technology transfer; history and systems of psychology * David Dunning (Psychology) - Social cognition: accuracy and error in self and social judgment, motivated reasoning, tacit inference processes in attitudes and stereotypes; psychology and the law: eyewitness identification * David Easley (Economics) - Economics of information; learning from endogenous data; market microstructure; evolution in games and markets * Shimon Edelman (Psychology, Computer Science; Director of Cognitive Studies Program) - Computational theories of visual representation and recognition; Empiricist theories of language; bridging theoretical, behavioral and neurobiological approaches to the study of the brain * Melissa Ferguson (Psychology) - Automatic attitudes, 
including their sensitivity and flexibility across situations and their impact on subsequent judgment and behavior; the interface of affect, knowledge accessibility, and motivation; social hypothesis testing and decision-making * David Field (Psychology) - Theories and models of sensory coding and visual processing; visual perception; emphasis on understanding the relations between the structure of the natural environment and the representation of that environment by sensory systems * Barbara Finlay (Psychology; Neurobiology and Behavior) - Development and evolution of the nervous system * James Gair (Professor Emeritus of Linguistics) - Linguistic universals and typology, particularly as they relate to universal grammar and linguistic (and mental) representations * Geraldine Gay (Communication, Education) - Cognitive and social issues for the design and use of interactive communication technologies * Thomas Gilovich (Psychology) - Everyday judgment and decision making; critical thinking and belief; egocentrism; optimism, pessimism, satisfaction, and regret; behavioral economics; gambling * Carl Ginet (Philosophy) - Metaphysics; epistemology; philosophy of mind; philosophy of language * Delia Graff (Philosophy) - Philosophy of language, and related areas, such as logic, metaphysics, epistemology, and the philosophy of mind * Bruce Halpern (Psychology; Neurobiology and Behavior) - Human olfaction; human taste and smell; effects of aging on chemosensory psychophysics * Joseph Halpern (Computer Science, Applied Mathematics) - Reasoning about knowledge and uncertainty; qualitative reasoning; (fault-tolerant) distributed computing; logic; game theory * Wayne Harbert (Linguistics, Germanic Studies, Medieval Studies) - Syntactic structures of the Germanic languages and Celtic languages and what they can reveal about the principles of syntactic organization operating in natural language * Ronald Harris-Warrick (Neurobiology and Behavior; Physiology) - Neuromodulation of neural networks; gene cloning of K+ channels * Alan Hedge (Design and Environmental Analysis; Environmental Toxicology) - Human factors and ergonomics; workplace design; indoor environmental quality (IEQ); intelligent buildings * Benjamin Hellie (Philosophy) - Consciousness; perception; predication; and the overlap between these phenomena * Harold Hodes (Philosophy) - Logic; metaphysics; philosophy of language, of mathematics, and of logic * Howard Howland (Neurobiology and Behavior; Physiology; Psychology; Zoology) - Photorefractive methods of determining focusing ability of infants and young children; high-order aberrations of the eye; and physiological optics in various species, particularly myopia and eye growth in chickens * Ronald Hoy (Neurobiology and Behavior; Entomology) - Animal communication; behavior genetics of invertebrates; regeneration and development in invertebrate nervous systems * Daniel Huttenlocher (Computer Science) - Computer vision, computational geometry, interactive document systems, electronic trading systems, and software development methodologies * Alice M. 
Isen (Management, Psychology) - Affect and cognition * Scott Johnson (Psychology, Human Development) - Visual perception; visual and cognitive development, especially in infancy; computational models of developmental processes; the nativist/empiricist debate, as it pertains to early cognitive and perceptual skills * Robert Johnston (Psychology; Neurobiology and Behavior) - Neural mechanisms of social recognition and memory (i.e., individual, kin, species, etc.); animal communication and social behavior; olfaction, chemical communication and pheromones; comparative cognition/cognitive ethology; evolution of human and animal behavior; hormones and behavior * Barbara Koslowski (Human Development, Psychology) - Cognitive development; scientific reasoning; conceptual development; problem solving and reasoning * Carol Krumhansl (Psychology, Music) - Human perception and cognition; cognitive processes in music perception and memory; experimental, computational, and neuropsychological approaches; music theory * Lillian Lee (Computer Science) - Natural language processing and machine learning * Christiane Linster (Neurobiology and Behavior; Biomedical Engineering) - Neural basis of sensory information processing, using olfaction as a model system * Barbara Lust (Human Development, Asian Studies, Linguistics, Psychology) - Language and mind, especially first language acquisition; linguistic theory of universal grammar; cognitive development * Michael Macy (Sociology) - Collective action; evolutionary game theory; deviance and social control; social psychology; social exchange theory; rational choice * Sally McConnell-Ginet (Linguistics; Feminist, Gender, and Sexuality Studies) - Formal approaches to natural language meaning, especially the syntax-semantics and the semantics-pragmatics interfaces; also work on language, gender, and sexuality interactions * Helene Mialet (Science and Technology Studies) - Sociology and anthropology of science; continental philosophy of science; cognition; notions of subjectivity; self-fashioning; relations between humans and machines; and processes of innovation, discovery and creativity in science/industry * Amanda Miller-Ockhuizen (Linguistics) - Phonetics; phonetics-phonology interface; African languages * Ulric Neisser (Emeritus Professor of Psychology) - Memory (especially recall of life events); and intelligence (especially IQ tests and their social significance) * Anil Nerode (Mathematics, Applied Mathematics, Computer Science) - Logic; recursive functions and computability; theoretical computer science; hybrid systems; multiple agent autonomous control theory * Kathleen O'Connor (Management) - Negotiation; effects of individual cognition and social context on negotiation performance; work group conflicts and decision-making * Michael Owren (Psychology; Neurobiology and Behavior) - Evolutionary psychology of sound, voice, and speech; nonhuman primate vocal communication; speech evolution * H. 
Kern Reeve (Neurobiology and Behavior) - Developing and testing biologically realistic models of the evolution of cooperation and conflict in animal societies * Elizabeth Adkins Regan (Psychology; Neurobiology and Behavior; Physiology) - Animal social behavior; hormones and behavior; neuroendocrine mechanisms of avian behavior; mate choice and preference * Richard Ripple (Education) - Educational psychology; psychology of adolescence; adult learning and development; the educational psychology of creativity * Steven Robertson (Human Development) - Understanding the emergence and transformation of behavioral organization in early development, its underlying mechanisms, and its functional significance for the fetus and infant * Mats Rooth (Linguistics) - Computational linguistics and natural language semantics * Carol Rosen (Linguistics, Romance Studies) - Helping to build a theory of universal grammar on a broad database; finding out what kinds of formalism can best reveal the regularities in languages * J. Edward Russo (Management) - Marketing; decision-making and decision aiding; consumer behavior; advertising; behavioral science in management * Dawn Schrader (Education; Feminist, Gender, and Sexuality Studies) - Lifespan developmental psychology, especially metacognition; moral, self and intellectual development in late adolescence and adulthood; the relationship between cognition and action; moral education * Bart Selman (Computer Science, Applied Mathematics, Systems Engineering) - Knowledge representation; reasoning and search; algorithms and complexity; planning; machine learning; cognitive science; software agents; connections between computational complexity and statistical physics * Yasuhiro Shirai (Asian Studies, East Asian Literature, Linguistics) - Crosslinguistic study of the acquisition of tense-aspect morphology, particularly of Japanese; typological study of tense-aspect systems; cognitive models of L2 acquisition and use, particularly the connectionist model * Sydney Shoemaker (Philosophy) - Metaphysics and the philosophy of mind * Richard Shore (Mathematics) - Analyzing the structures of relative complexity of computation of functions on the natural numbers * Michael Spivey (Psychology; Human Development; Neurobiology and Behavior) - Information integration, both within and between perceptual/cognitive systems; experimental and computational approaches to: visuolinguistic processing, language comprehension and acquisition, eye movements, visual attention * Zoltan Gendler Szabo (Philosophy) - Philosophy of language; metaphysics; formal semantics; pragmatics * Elise Temple (Human Development, Psychology) - Developmental cognitive neuroscience; exploring the brain mechanisms underlying cognition in the developing brain; focus on the brain mechanisms of reading and language using functional MRI * Francisco Valero-Cuevas (Aerospace Engineering, Biomedical Engineering, Mechanical Engineering) - Neuromuscular biomechanics and control; human and robotic manipulation; surgery simulation * Qi Wang (Human Development) - Development of autobiographical memory, self, and emotion knowledge, as well as their interactions * Elaine Wethington (Human Development, Sociology) - Stress and the protective mechanisms of social support * Jennifer Whiting (Philosophy, Classics) - Personal identity and concepts of the self (both ancient and modern), with special reference to moral psychology and psychopathology * John Whitman (Linguistics, Asian Studies, East Asian Literature) - The problem of 
language variation: its limits (how much specific subsystems can vary across languages) and predictors (what typological features co-occur systematically) * Stephen Wicker (Electrical and Computer Engineering; Applied Mathematics) - Wireless information networks; artificial intelligence; error control coding * Wendy Williams (Human Development) - Practical intelligence and tacit knowledge in children and adults; educational policy issues; creativity training * Ramin Zabih (Computer Science) - Computer vision; medical imaging * Draga Zec (Linguistics) - Phonological theory; a study of the principles that govern the patterning of sound in individual languages, as well as cross-linguistically Associate Members: * Richard Canfield (Nutritional Science) - Cognitive development and neurotoxicology in human infants and children * Susan Hertz (Linguistics) - Speech synthesis, both as an end in and of itself and as a vehicle to learn more about various aspects of speech, including timing patterns, language universals, perception, intonation, and the phonology-phonetics interface -------- Application materials can be obtained from the Cornell Graduate School (http://www.gradschool.cornell.edu/grad/app-request.html). The deadlines for completed application materials vary by field, the earliest being January 1, 2003. See individual field listings at the Graduate School web site for other dates. For more information about applying to the Cornell Graduate Program: http://www.gradschool.cornell.edu/grad/default.html. For more information about graduate study in Cornell's Cognitive Studies Program, please contact the Director of Graduate Studies and Director of the Program, Shimon Edelman (se37 at cornell.edu), or the Program Coordinator, Linda LeVan (cogst at cornell.edu). ----------------------------------------------------------------------- Shimon Edelman Professor, Dept. of Psychology, 232 Uris Hall Director, Cornell Cognitive Studies Program Cornell University, Ithaca, NY 14853-7601, USA Web: http://kybele.psych.cornell.edu/~edelman Rationalists do it by the rules. Empiricists do it to the rules. From terry at salk.edu Thu Nov 21 19:21:39 2002 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 21 Nov 2002 16:21:39 -0800 (PST) Subject: NEURAL COMPUTATION 14:12 In-Reply-To: <200211070018.gA70IvK12809@purkinje.salk.edu> Message-ID: <200211220021.gAM0LdU25878@purkinje.salk.edu> Neural Computation - Contents - Volume 14, Number 12 - December 1, 2002 REVIEW On Different Facets of Regularization Theory Zhe Chen and Simon Haykin NOTE Notes on Bell-Sejnowski PDF-Matching Neuron Simone Fiori LETTERS Biophysiologically Plausible Implementations of the Maximum Operation Angela J. Yu, Martin A. Giese and Tomaso A. Poggio Self-Regulation Mechanism of Temporally Asymmetric Hebbian Plasticity Narihisa Matsumoto and Masato Okada Associative Memory with Dynamic Synapses Lovorka Pantic, Joaquin J. Torres, Hilbert J. Kappen, Stan C.A.M. Gielen Adaptive Spatiotemporal Receptive Field Estimation in the Visual Pathway Garrett B. 
Stanley Global Convergence Rate of Recurrently Connected Neural Networks Tianping Chen, Wenlian Lu and Shun-ichi Amari Locality of Global Stochastic Interaction in Directed Acyclic Networks Nihat Ay On Unique Representations of Certain Dynamical Systems Produced by Continuous-Time Recurrent Neural Networks Masahiro Kimura Descartes' Rule of Signs for Radial Basis Function Neural Networks Michael Schmitt Approximation Bounds for Some Sparse Kernel Regression Algorithms Tong Zhang ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $95 $101.65 $143 Institution $590 $631.30 $638 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From harris at cnel.ufl.edu Thu Nov 21 03:33:28 2002 From: harris at cnel.ufl.edu (John G. Harris) Date: Thu, 21 Nov 2002 03:33:28 -0500 Subject: Faculty Positions in Neural Engineering at the University of Florida In-Reply-To: <15833.31701.234159.97257@perm.feis.herts.ac.uk> Message-ID: Faculty Positions in Neural Engineering University of Florida Gainesville, FL The newly formed Biomedical Engineering Department at the University of Florida invites applications and nominations for faculty candidates at all levels starting Fall 2003. We are particularly interested in candidates in neural engineering including computational neuroscience, neural imaging and recording, medical image/signal processing and rehabilitative engineering. BME faculty will likely develop collaborations with researchers and clinicians at the McKnight Brain Institute (www.mbi.ufl.edu), the UF College of Medicine, Shands Hospital at UF (the primary teaching hospital for the College of Medicine) and the Malcolm Randall VA Medical Center. For more information about the department and the available positions, please visit: www.bme.ufl.edu. Candidates should send curriculum vitae with the names of at least four references to: Dr. Frank Bova, Chair of Search Committee, Biomedical Engineering Department, University of Florida, P.O. Box 116131, Gainesville, Florida 32611-6131; e-mail: search at bme.ufl.edu; telephone: 352-392-9790. The University of Florida is an Affirmative Action, Equal Opportunity Employer and women and minorities are encouraged to apply. -- John G. Harris Computational NeuroEngineering Lab (www.cnel.ufl.edu) University of Florida P.O. Box 116130 Gainesville, FL 32611-6130 harris at cnel.ufl.edu Phone: (352) 392-2652 From jgama at liacc.up.pt Thu Nov 21 05:21:16 2002 From: jgama at liacc.up.pt (João Gama) Date: Thu, 21 Nov 2002 10:21:16 +0000 Subject: CFP: IDA Special Issue on Adaptive Learning Systems References: <3DDA77FC.658842C2@sunderland.ac.uk> Message-ID: <3DDCB39C.B0B6C5AE@liacc.up.pt> ***************************************************************** CALL FOR PAPERS Intelligent Data Analysis - IOS Press SPECIAL ISSUE on INCREMENTAL LEARNING SYSTEMS CAPABLE OF DEALING WITH CONCEPT DRIFT ***************************************************************** Please distribute this announcement to all interested parties. Special issue Editors: Miroslav Kubat, University of Miami, USA João Gama, University of Porto, Portugal Paul Utgoff, University of Massachusetts, USA Suppose the existence of a concept description that has been induced from a set, T, of training examples. Suppose that later another set, T', of examples becomes available.
What is the most effective way to modify the concept so as to reflect the examples from T'? In many real-world learning problems the data flows continuously and learning algorithms should be able to respond to this circumstance. The first requirement of such algorithms is thus incrementality, the ability to incorporate new information. If the process is not strictly stationary, the target concept could gradually change over time, a fact that should be reflected also by the current version of the induced concept description. The ability to react to concept drift can thus be viewed as a natural extension of incremental learning systems. These techniques can be useful for scaling up learning algorithms to very large datasets. Other types of problems where these techniques could be potentially useful include: user-modelling, control in dynamic environments, web-mining, time series, etc. Most evaluation methods for machine learning (e.g. cross-validation) assume that examples are independent and identically distributed. This assumption is clearly unrealistic in the presence of concept drift. How can we estimate the performance of learning systems under these constraints? The objective of the special issue is to present the current status of algorithms, applications, and evaluation methods for these problems. Relevant techniques include the following (but are not limited to): 1. Incremental, online, real-time, and any-time learning algorithms 2. Algorithms that learn in the presence of concept drift 3. Evaluation Methods for dynamic instance distributions 4. Real world applications that involve online learning 5. Theory on learning under concept drift. Submission Details: We are expecting full papers to describe original, previously unpublished research, be written in English, and not be simultaneously submitted for publication elsewhere (previous publication of partial results at workshops with informal proceedings is allowed). We could also consider the publication of high-quality surveys on these topics. Please submit a PostScript or PDF file of your paper to: jgama at liacc.up.pt Important Dates: Submission Deadline: 1 February 2003 Author Notification: 1 July 2003 Final Paper Deadline: 1 September 2003 Special Issue: _______________________________________________
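To make the setting described in this call concrete, the sketch below (illustrative only; the learner, the drifting stream, and all constants are invented for the example) evaluates a windowed incremental learner test-then-train on a stream whose concept flips halfway through - exactly the situation where the i.i.d. assumption behind cross-validation breaks down:

    import numpy as np

    rng = np.random.default_rng(0)

    class WindowedCentroidClassifier:
        """Toy incremental learner: nearest class centroid over a sliding
        window. The window bounds memory and lets old examples expire,
        the simplest way to track a drifting concept."""
        def __init__(self, window=200):
            self.window = window
            self.X, self.y = [], []

        def predict(self, x):
            if not self.X:
                return 0
            X, y = np.array(self.X), np.array(self.y)
            # nearest centroid among classes seen in the current window
            cents = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
            return min(cents, key=lambda c: np.linalg.norm(x - cents[c]))

        def partial_fit(self, x, label):
            self.X.append(x); self.y.append(label)
            if len(self.X) > self.window:      # forget the oldest example
                self.X.pop(0); self.y.pop(0)

    # Prequential (test-then-train) evaluation on a stream whose
    # concept reverses abruptly at t = 1000.
    clf, correct = WindowedCentroidClassifier(), 0
    for t in range(2000):
        x = rng.normal(size=2)
        label = int(x[0] > 0) if t < 1000 else int(x[0] < 0)
        correct += (clf.predict(x) == label)
        clf.partial_fit(x, label)
    print("prequential accuracy:", correct / 2000)

The window size trades stability against reaction speed after drift, which is the central tension behind the evaluation questions the call raises.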
From baolshausen at ucdavis.edu Fri Nov 22 01:14:27 2002 From: baolshausen at ucdavis.edu (Bruno Olshausen) Date: Thu, 21 Nov 2002 22:14:27 -0800 Subject: postdocs at RNI Message-ID: <3DDDCB43.E6ED1FEC@ucdavis.edu> Postdoctoral Fellowships in Theoretical Neuroscience Redwood Neuroscience Institute Menlo Park, California The Redwood Neuroscience Institute has several immediate openings for postdoctoral fellows with expertise in theoretical neuroscience. Areas of research include large-scale associative memory architectures, temporal sequence learning and prediction, models of thalamo-cortical and cortico-cortical feedback loops, and models of sensory representation. Postdoctoral fellows will work in collaboration with one or more members of the scientific staff at RNI. Candidates should send a CV and a 1-2 page statement of research interests, along with representative publications to jobs at rni.org. RNI is a nonprofit research organization devoted to studying neural models of cognition and perception. PI's include Pentti Kanerva, Bruno Olshausen, Tony Bell, and Fritz Sommer. For further information, please visit our website at http://www.rni.org, or arrange to speak with Bruno or Tony at the upcoming NIPS meeting (email: bolshausen or tbell @rni.org). -- Bruno A. Olshausen (650) 321-8282 x233 Redwood Neuroscience Institute (650) 321-8585 (fax) 1010 El Camino Real http://www.rni.org Menlo Park, CA 94025 & Center for Neuroscience (530) 757-8749 UC Davis (530) 757-8827 (fax) 1544 Newton Ct. baolshausen at ucdavis.edu Davis, CA 95616 http://redwood.ucdavis.edu/bruno From bogus@does.not.exist.com Fri Nov 22 11:17:11 2002 From: bogus@does.not.exist.com () Date: Fri, 22 Nov 2002 16:17:11 -0000 Subject: Postdoctoral Research Fellowship, Cambridge, U.K. Message-ID: <6EDEB53BA6EA96458F3CEC96BB0282D2021DE1A2@tvp-msg-03.europe.corp.microsoft.com> A non-text attachment was scrubbed... Name: not available Type: multipart/mixed Size: 0 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/b347dd6c/attachment.bin From mhb0 at Lehigh.EDU Sat Nov 23 15:39:37 2002 From: mhb0 at Lehigh.EDU (Mark H. Bickhard) Date: Sat, 23 Nov 2002 15:39:37 -0500 Subject: ISI 2003 Second CFP Message-ID: <3DDFE788.67035A8C@lehigh.edu> Interactivist Summer Institute 2003 July 22-26, 2003 Botanical Auditorium Copenhagen, Denmark Join us in exploring the frontiers of understanding of life, mind, and cognition. There is a growing recognition - across many disciplines - that phenomena of life and mind, including cognition and representation, are emergents of far-from-equilibrium, interactive, autonomous systems. Mind and biology, mind and agent, are being re-united. The classical treatment of cognition and representation within a formalist framework of encodingist assumptions is widely recognized as a fruitless maze of blind alleys. From neurobiology to robotics, from cognitive science to philosophy of mind and language, dynamic and interactive alternatives are being explored. Dynamic systems approaches and autonomous agent research join in the effort. The interactivist model offers a theoretical approach to matters of life and mind, ranging from evolutionary- and neuro-biology - including the emergence of biological function - through representation, perception, motivation, memory, learning and development, emotions, consciousness, language, rationality, sociality, personality and psychopathology. This work has developed interfaces with studies of central nervous system functioning, the ontology of process, autonomous agents, philosophy of science, and all areas of psychology, philosophy, and cognitive science that address the person. The conference will involve both tutorials addressing central parts and aspects of the interactive model, and papers addressing current work of relevance to this general approach. This will be our second Summer Institute; the first was in 2001 at Lehigh University, Bethlehem, PA, USA. The intention is for this Summer Institute to become a traditional biennial meeting where those sharing the core ideas of interactivism will meet and discuss their work, try to reconstruct its historical roots, put forward current research in different fields that fits the interactivist framework, and define research topics for prospective graduate students.
People working in philosophy of mind, linguistics, social sciences, artificial intelligence, cognitive robotics, theoretical biology, and other fields related to the sciences of mind are invited to send their paper submission or statement of interest for participation to the organizers. http://www.lehigh.edu/~interact/isi2003/isi2003.html Mark -- Mark H. Bickhard Cognitive Science 17 Memorial Drive East Lehigh University Bethlehem, PA 18015 610-758-3633 mhb0 at lehigh.edu mark.bickhard at lehigh.edu http://www.lehigh.edu/~mhb0/mhb0.html From mhb0 at Lehigh.EDU Sat Nov 23 15:53:36 2002 From: mhb0 at Lehigh.EDU (Mark H. Bickhard) Date: Sat, 23 Nov 2002 15:53:36 -0500 Subject: CFP Epigenetic Robotics Message-ID: <3DDFEACF.1927F45C@lehigh.edu> EPIROB2003--EPIROB2003-EPIROB2003-EPIROB2003 Call for Papers EPIROB2003 EPIROB2003 Third International Workshop on Epigenetic Robotics: EPIROB2003 Modeling Cognitive Development in Robotic Systems EPIROB2003 EPIROB2003 Organizing Committee: EPIROB2003 Luc Berthouze, Christopher G. Prince EPIROB2003 Christian Balkenius, Daniel Bullock, Hideki Kozima, Georgi Stojanov, EPIROB2003 EPIROB2003 er2003 at epigenetic-robotics.org EPIROB2003 EPIROB2003 http://www.epigenetic-robotics.org EPIROB2003 EPIROB2003 August 4th and 5th 2003 EPIROB2003 EPIROB2003--EPIROB2003-EPIROB2003-EPIROB2003 Call for Papers Location: Boston, MA, USA (held after the Cognitive Science Society meeting) **** Deadline for Submission of Papers & Posters: 14 March 2003 **** This workshop focuses on combining developmental psychology and robotics and generally on: (a) the embodiment of the system; (b) its situatedness in a physical and social environment; (c) a prolonged developmental process through which varied and complex cognitive and perceptual structures emerge as a result of an embodied system interacting with a physical and social environment. Invited Speakers György Gergely (Institute for Psychological Research, Hungarian Academy of Sciences, Budapest, Hungary) Rod Grupen (Laboratory for Perceptual Robotics, University of Massachusetts Amherst, MA, USA) Deb Roy (Media Lab, MIT, USA) Submissions Papers not exceeding eight (8) pages should be submitted electronically (PDF or Postscript) as attachment files to Luc Berthouze (Luc.Berthouze at aist.go.jp). Extended abstracts (maximum two pages) can also be submitted, and will be presented as posters (extended abstracts should also be submitted in PDF or Postscript as attachments to Luc Berthouze (Luc.Berthouze at aist.go.jp)). Further instructions to authors will be posted on the workshop web page: http://www.epigenetic-robotics.org Publication of Papers Papers will be published in a proceedings, and archived at CogPrints.
While it is generally recognised that hardware implementations could, through performance (and other) advantages, greatly increase the use of neural networks, in the past the relatively high cost of developing ASICs has meant that only a small number of hardware neural-computing devices has gone beyond the research-prototype stage. Now, however, with the appearance of large, dense, highly parallel FPGA circuits, it has become possible to envisage the realization in hardware of large-scale neural networks, to get high performance at low cost. Nevertheless, the many opportunities offered by FPGAs also come with many challenges. These range from the choice of data representation, to the implementation of specialized functions, through to the realization of massively parallel neural networks; and accompanying these are important secondary issues, such as benchmarking, development tools and technology transfer. All these issues are currently being investigated by a large number of researchers. The proposed book aims to capture the state of the art of this research. TOPICS Contributions, covering both original research and expository work, are invited on the following topics, in the context of neural networks realized in FPGAs. (Submissions on other closely related topics are also welcome.) Architectures (systolic arrays, SIMD, etc.) Neurocomputers (complete systems) Hardware accelerators Embedded systems Input/output Hybrid systems Reliability Benchmarking and metrics Massive parallelism Interconnection-network topologies Scalability Algorithm-to-architecture mapping Evolutionary computing Novel hardware algorithms Implementation of activation functions Data-representation formats Applications (biometrics, speech, imaging, information-retrieval, control, biomedical, bioinformatics, etc.) Development tools Technology transfer Case studies SUBMISSIONS AND SCHEDULE Submissions should be made to either of the editors, by 15 Feb 2003. They should be in either .ps or .pdf form and must be formatted, as book chapters, according to the publisher's style files, which will be found at http://www.wkap.nl/authors/bookstylefiles. Authors of accepted contributions will be expected to make final submissions by 1 May 2003. Prospective authors are encouraged to indicate their intent before 30 Dec 2002. EDITORS Amos Omondi School of Informatics and Engineering Flinders University Bedford Park, SA 5042 AUSTRALIA e-mail: amos at infoeng.flinders.edu.au Jagath Rajapakse School of Computer Engineering Nanyang Technological University SINGAPORE 639798 e-mail: asjagath at ntu.edu.sg
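To make the data-representation and activation-function challenges mentioned in this call concrete, here is a small sketch in plain Python/NumPy (not FPGA code; the Q4.12 format and 256-entry table size are illustrative choices, not recommendations) of the kind of quantized, table-based sigmoid an FPGA design might keep in on-chip RAM:

    import numpy as np

    FRAC_BITS = 12                 # Q4.12: 4 integer bits, 12 fractional bits
    SCALE = 1 << FRAC_BITS
    LUT_SIZE = 256
    X_MIN, X_MAX = -8.0, 8.0       # sigmoid is essentially 0/1 outside this range

    # Precompute the table once (this is what would sit in block RAM).
    xs = np.linspace(X_MIN, X_MAX, LUT_SIZE)
    SIGMOID_LUT = np.round(SCALE / (1.0 + np.exp(-xs))).astype(np.int32)

    def sigmoid_fixed(x_fixed: int) -> int:
        """Approximate sigmoid on a Q4.12 input, returning a Q4.12 output.
        Inputs outside [X_MIN, X_MAX] saturate to the table ends."""
        x = x_fixed / SCALE
        idx = int((x - X_MIN) * (LUT_SIZE - 1) / (X_MAX - X_MIN))
        idx = max(0, min(LUT_SIZE - 1, idx))          # saturate the index
        return int(SIGMOID_LUT[idx])

    # Worst-case quantization error over the table's input range:
    approx = np.array([sigmoid_fixed(int(v * SCALE)) for v in xs]) / SCALE
    exact = 1.0 / (1.0 + np.exp(-xs))
    print("max abs error:", np.abs(approx - exact).max())

The table size and fixed-point width trade accuracy against memory and logic, which is precisely the benchmarking question several of the topics above address.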
From xmatumo at brain.riken.go.jp Thu Nov 28 01:39:22 2002 From: xmatumo at brain.riken.go.jp (Narihisa MATSUMOTO) Date: Thu, 28 Nov 2002 15:39:22 +0900 Subject: paper available: Self-Regulation Mechanism of TAH Message-ID: <4.3.2-J.20021128151303.04406ae8@smtp.brain.riken.go.jp> Apologies if you receive this e-mail multiple times. Dear colleagues, I would like to announce the following paper available on the web site: http://www.mns.brain.riken.go.jp/~xmatumo/paper/NComp02.pdf ``Self-Regulation Mechanism of Temporally Asymmetric Hebbian Plasticity'' by N. Matsumoto & M. Okada Neural Computation, vol. 14, no. 12, pp. 2883-2902, 2002 Abstract--------------------------------------------------------- Recent biological experimental findings have shown that synaptic plasticity depends on the relative timing of the pre- and postsynaptic spikes. This determines whether long-term potentiation (LTP) or long-term depression (LTD) is induced. This synaptic plasticity has been called temporally asymmetric Hebbian plasticity (TAH). Many authors have numerically demonstrated that neural networks are capable of storing spatiotemporal patterns. However, the mathematical mechanism of the storage of spatiotemporal patterns is still unknown, and the effect of LTD is particularly unknown. In this article, we employ a simple neural network model and show that interference between LTP and LTD disappears in a sparse coding scheme. On the other hand, the covariance learning rule is known to be indispensable for the storage of sparse patterns. We also show that TAH has the same qualitative effect as the covariance rule when spatiotemporal patterns are embedded in the network. ---------------------------------------------------------------- A shorter version is in Advances in Neural Information Processing Systems 14, pp. 245-252. Sincerely Yours, *********************************************************** Narihisa MATSUMOTO Junior Research Associate Lab. for Mathematical Neuroscience, RIKEN Brain Science Institute, Japan e-mail: xmatumo at brain.riken.go.jp URL: http://www.mns.brain.riken.go.jp/~xmatumo/index.html ***********************************************************
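For readers who want the shape of a TAH rule in symbols: a commonly used exponential window (in the style of Song, Miller and Abbott, 2000 - not necessarily the exact rule analyzed in the paper above, and the amplitudes and time constants below are illustrative values only) can be written in a few lines:

    import numpy as np

    def stdp_dw(dt, a_plus=0.005, a_minus=0.00525, tau_plus=20.0, tau_minus=20.0):
        """Temporally asymmetric Hebbian weight change for a spike pair.
        dt = t_post - t_pre in ms: pre-before-post (dt > 0) gives LTP,
        post-before-pre (dt < 0) gives LTD."""
        dt = np.asarray(dt, dtype=float)
        return np.where(dt >= 0,
                        a_plus * np.exp(-dt / tau_plus),     # potentiation
                        -a_minus * np.exp(dt / tau_minus))   # depression

    # The window is asymmetric about dt = 0:
    for dt in (-40, -10, 10, 40):
        print(f"dt = {dt:+4d} ms  ->  dw = {float(stdp_dw(dt)):+.5f}")

Making the depression side slightly stronger than the potentiation side, as in these illustrative constants, is one common way such rules self-regulate total synaptic weight.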
From Nada.Lavrac at ijs.si Thu Nov 28 14:15:10 2002 From: Nada.Lavrac at ijs.si (Nada Lavrac) Date: Thu, 28 Nov 2002 20:15:10 +0100 Subject: DMLL: ML journal Special issue on Data Mining Lessons Learned Message-ID: <3DE66B3E.2FDD@ijs.si> Machine Learning Journal: Special Issue on Data Mining Lessons Learned http://www.hpl.hp.com/personal/Tom_Fawcett/DMLL-MLJ-CFP.html Guest editors: Nada Lavrac, Hiroshi Motoda and Tom Fawcett Submission deadline: Monday, 7 April, 2003. Call for Papers Data mining is concerned with finding interesting or valuable patterns in data. Many techniques have emerged for analyzing and visualizing large volumes of data, and what we see in the technical literature are mostly success stories of these techniques. We rarely hear of steps leading to success, failed attempts, or critical representation choices made; and rarely do papers include expert evaluations of achieved results. Insightful analyses of successful and unsuccessful applications are crucial for increasing our understanding of machine learning techniques and their limitations. Challenge problems (such as the KDD Cup, COIL and PTE challenges) have become popular in recent years and have attracted numerous participants. These challenge problems usually involve a single difficult problem domain, and participants are evaluated by how well their entries satisfy a domain expert. The results of such challenges can be a useful source of feedback to the research community. At ICML-2002 a workshop on Data Mining Lessons Learned (http://www.hpl.hp.com/personal/Tom_Fawcett/DMLL-workshop.html) was held and was well attended. This special issue of the Machine Learning journal follows the main goals of that workshop, which are to gather experience from successful and unsuccessful data mining endeavors, and to extract the lessons learned from them. Goals The aim of this special issue is to collect the experience gained from data mining applications and challenge competitions. We are interested in lessons learned both from successes and from failures. Authors are invited to report on experiences with challenge problems, experiences in engineering representations for practical problems, and in interacting with experts evaluating solutions. We are also interested in why some particular solutions - despite good performance - were not used in practice, or required additional treatment before they could be used. An ideal contribution to this special issue would describe in sufficient detail one problem domain, either an application or a challenge problem. Contributions not desired for this special issue would be papers that report on marginal improvement over existing methods using artificial synthetic data or UCI data involving no expert evaluation. We offer the following content guidelines to authors. 1. For applications studies, we expect a description of the attempts that succeeded or failed, an analysis of the success or failure, and any steps that had to be taken to make the results practically useful (if they were). Ideally an article should support lessons with evidence, experimental or otherwise; and the lessons should generalize to a class of problems. 2. For challenge problems, we will accept either experiences preparing an individual entry or an analysis of a collection of entries. A collective study might analyze factors such as the features of successful approaches that made them appealing to experts. As with applications studies, such articles should support lessons with evidence, and preferably should generalize to a class of problems. Analyses should preferably shed light on why a certain class of method is best applicable to the type of problem addressed. 3. A submission may analyze methodological aspects from individual developments, or may analyze a subfield of machine learning or a set of data mining methods to uncover important and unknown properties of a class of methods or a field as a whole. Again, a paper should support lessons learned with appropriate evidence. We emphasize that articles to appear in this special issue must satisfy the high standards of the Machine Learning journal. Submissions will be evaluated on the following criteria: Novelty: How original is this lesson? Is this the first time this observation has been made, or has it appeared before? Generality: How widely applicable are the observations or conclusions made by this paper? Are they specific to a single project, a single domain, a class of domains, or much of data mining? Significance: How important are the lessons learned? Are they actionable? To what extent could they influence the directions of work in data mining? Support: How strong is the experimental evidence? Are the lessons drawn from a single project, a group of projects, or a thread of work in the community? Clarity: How clear is the paper? How clearly are the lessons expressed? The criteria for novelty, significance and clarity apply not only to the lessons but also to the paper as a whole. Submission Instructions Manuscripts for submission should be prepared according to the instructions at http://www.cs.ualberta.ca/~holte/mlj/ In preparing submissions, authors should follow the standard instructions for the Machine Learning journal at http://www.cs.ualberta.ca/~holte/mlj/initialsubmission.pdf Submissions should be sent via email to Hiroshi Motoda (motoda at ar.sanken.osaka-u.ac.jp), as well as to Kluwer Academic Publishers (jml at wkap.com). In the email please state very clearly that the submission is for the special issue on Data Mining Lessons Learned.
From Johan.Suykens at esat.kuleuven.ac.be Fri Nov 29 09:12:13 2002 From: Johan.Suykens at esat.kuleuven.ac.be (Johan Suykens) Date: Fri, 29 Nov 2002 15:12:13 +0100 Subject: LS-SVMs: book announcement Message-ID: <3DE775BD.1080903@esat.kuleuven.ac.be> We are glad to announce the publication of a new book ************************************************************************* J.A.K. Suykens, T. Van Gestel, J. De Brabanter, B. De Moor, J. Vandewalle, Least Squares Support Vector Machines, World Scientific Pub. Co., Singapore, 2002 (ISBN 981-238-151-1) http://www.esat.kuleuven.ac.be/sista/lssvmlab/book.html ************************************************************************* This book focuses on Least Squares Support Vector Machines (LS-SVMs), which are reformulations of standard SVMs. LS-SVMs are closely related to regularization networks and Gaussian processes but additionally emphasize and exploit primal-dual interpretations from optimization theory. The authors explain the natural links between LS-SVM classifiers and kernel Fisher discriminant analysis. Bayesian inference of LS-SVM models is discussed, together with methods for imposing sparseness and employing robust statistics. The framework is further extended towards unsupervised learning by considering PCA and its kernel version as a one-class modelling problem. This leads to new primal-dual support vector machine formulations for kernel PCA and kernel CCA analysis. Furthermore, LS-SVM formulations are given for recurrent networks and control. In general, support vector machines may pose heavy computational challenges for large data sets. For this purpose, a method of fixed size LS-SVM is proposed where the estimation is done in the primal space in relation to a Nyström sampling with active selection of support vectors. The methods are illustrated with several examples. Contents: Introduction Support vector machines Least squares support vector machines, links with Gaussian processes, regularization networks, and kernel FDA Bayesian inference for LS-SVM models Weighted versions and robust statistics Large scale problems: Nyström sampling, reduced set methods, basis formation and Fixed size LS-SVM LS-SVM for unsupervised learning: support vector machine formulations for kernel PCA. Related methods of kernel CCA. LS-SVM for recurrent networks and control Illustrations and applications Readership: Graduate students and researchers in neural networks; machine learning; data-mining; signal processing; circuit, systems and control theory; pattern recognition; and statistics. Info: 308pp., Publication date: Nov.
2002, ISBN 981-238-151-1 Order information: World Scientific http://www.wspc.com/books/compsci/5089.html http://www.esat.kuleuven.ac.be/sista/lssvmlab/book.html Freely available LS-SVMlab software http://www.esat.kuleuven.ac.be/sista/lssvmlab/ under GNU General Public License [we apologize if you receive multiple copies of this message] From Johan.Suykens at esat.kuleuven.ac.be Fri Nov 29 10:06:14 2002 From: Johan.Suykens at esat.kuleuven.ac.be (Johan Suykens) Date: Fri, 29 Nov 2002 16:06:14 +0100 Subject: LS-SVMlab announcement Message-ID: <3DE78266.2040101@esat.kuleuven.ac.be> We are glad to announce ******************************************************** LS-SVMlab: Least Squares - Support Vector Machines Matlab/C Toolbox ******************************************************** Website: http://www.esat.kuleuven.ac.be/sista/lssvmlab/ Toolbox: Matlab LS-SVMlab1.4 - Linux and Windows Matlab/C code Basic and advanced versions Functional and object oriented interface Tutorial User's Guide (100pp.): Examples and demos Matlab functions with help Solving and handling: Classification, Regression Tuning, cross-validation, fast loo, receiver operating characteristic (ROC) curves Small and unbalanced data sets High dimensional input data Bayesian framework with three levels of inference Probabilistic interpretations, error bars, hyperparameter selection, automatic relevance determination (ARD), input selection, model comparison Multi-class encoding/decoding Sparseness Robustness, robust weighting, robust cross-validation Time series prediction Fixed size LS-SVM, Nyström method, kernel principal component analysis (kPCA), ridge regression Unsupervised learning Large scale problems Related links, publications, presentations and book: http://www.esat.kuleuven.ac.be/sista/lssvmlab/ Contact: LS-SVMlab at esat.kuleuven.ac.be GNU General Public License: The LS-SVMlab software is made available for research purposes only under the GNU General Public License. LS-SVMlab software may not be used for commercial purposes without explicit written permission after contacting LS-SVMlab at esat.kuleuven.ac.be.
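As a rough computational picture of what the two announcements above describe (a sketch assuming the standard LS-SVM regression formulation from the literature, not code from LS-SVMlab): training reduces to solving one square linear system in the dual variables alpha and the bias b,

    [ 0   1^T         ] [ b     ]   [ 0 ]
    [ 1   K + I/gamma ] [ alpha ] = [ y ]

which a few lines of NumPy make explicit:

    import numpy as np

    def lssvm_train(X, y, gamma=10.0, sigma=1.0):
        """LS-SVM regression via the dual KKT system, with an RBF kernel
        K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
        n = len(y)
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-d2 / (2 * sigma ** 2))
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        b, alpha = sol[0], sol[1:]
        def predict(Xt):
            d2t = ((Xt[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2t / (2 * sigma ** 2)) @ alpha + b
        return predict

    # Tiny smoke test on a noisy sine.
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
    f = lssvm_train(X, y)
    print("mean abs training error:", np.abs(f(X) - y).mean())

Note that every training point ends up with a nonzero dual variable in this formulation, which is why the book and toolbox put so much emphasis on sparseness, robust weighting, and fixed-size approximations for large data sets.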
From inaki at cs.utexas.edu Tue Nov 5 16:29:18 2002 From: inaki at cs.utexas.edu (Faustino J. Gomez) Date: Tue, 5 Nov 2002 15:29:18 -0600 Subject: Neuroevolution paper, software, and demo announcement Message-ID: <200211052129.gA5LTIe8020586@laphroaig.cs.utexas.edu> Dear Connectionists, Enforced SubPopulations (ESP) version 3.0 is now available. ESP is a method that uses cooperative coevolution to evolve recurrent neural networks for difficult reinforcement learning tasks that require memory. A paper describing the method (abstract below), source code, and an animated demo in the double pole balancing task are all available at: http://www.cs.utexas.edu/users/nn/pages/research/ne-methods.html#esp --Faustino J. Gomez and Risto Miikkulainen Paper: ----------------------------------------------------------------------- ROBUST NON-LINEAR CONTROL THROUGH NEUROEVOLUTION. Faustino J. Gomez and Risto Miikkulainen Department of Computer Sciences, The University of Texas at Austin Technical Report TR-AI-02-292, Oct 2002. http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#gomez.tr02-292.ps.gz Abstract: Many complex control problems require sophisticated solutions that are not amenable to traditional controller design. Not only is it difficult to model real world systems, but often it is unclear what kind of behavior is required to solve the task. Reinforcement learning (RL) approaches have made progress by utilizing direct interaction with the task environment, but have so far not scaled well to large state spaces and environments that are not fully observable. In recent years, neuroevolution, the artificial evolution of neural networks, has had remarkable success in tasks that exhibit these two properties, but, like RL methods, requires solutions to be discovered in simulation and then transferred to the real world. To ensure that transfer is possible, evolved controllers need to be robust enough to cope with discrepancies between these two settings. In this paper, we demonstrate how a method called Enforced SubPopulations (ESP), for evolving recurrent neural network controllers, can facilitate this transfer. The method is first compared to a broad range of reinforcement learning algorithms on very difficult versions of the pole balancing problem that involve large (continuous, high-dimensional) state spaces and hidden state. ESP is shown to be significantly more efficient and powerful than the other methods on these tasks. We then present a model-based method that allows controllers evolved in a learned model of the environment to successfully transfer to the real world. We test the method on the most difficult version of the pole balancing task, and show that the appropriate use of noise during evolution can improve transfer significantly by compensating for inaccuracy in the model. Software: ----------------------------------------------------------------------- ESP 3.0 C++ SOURCE CODE http://www.cs.utexas.edu/users/nn/pages/software/abstracts.html#esp-cpp Faustino J. Gomez The ESP package contains source code implementing the Enforced SubPopulations algorithm and the pole balancing domain. The source code is written in C++, and is designed for easy extensibility to new tasks.
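To give a feel for the subpopulation idea in the abstract above, here is a deliberately tiny toy rendition (Python, feed-forward, supervised target; the real ESP package is C++, evolves recurrent controllers from task reward, and differs in many details): one subpopulation per hidden unit, networks assembled by sampling one neuron from each subpopulation, and fitness credited back to the neurons that participated.

    import numpy as np

    rng = np.random.default_rng(2)
    N_HIDDEN, POP, GENS, TRIALS = 3, 20, 30, 10
    IN, OUT = 2, 1

    def target(x):                  # invented stand-in task: fit a function
        return np.tanh(2 * x[0] - x[1])

    # One subpopulation of weight vectors per hidden unit.
    subpops = [rng.normal(size=(POP, IN + OUT)) for _ in range(N_HIDDEN)]

    def evaluate(neurons):          # assemble a net from one neuron per subpop
        xs = rng.uniform(-1, 1, size=(20, IN))
        h = np.tanh(xs @ np.array([n[:IN] for n in neurons]).T)
        out = h @ np.array([n[IN:] for n in neurons])
        err = np.mean((out[:, 0] - np.array([target(x) for x in xs])) ** 2)
        return -err                 # higher fitness = lower error

    for gen in range(GENS):
        fit = [np.zeros(POP) for _ in range(N_HIDDEN)]
        cnt = [np.zeros(POP) for _ in range(N_HIDDEN)]
        for _ in range(POP * TRIALS):
            idx = [rng.integers(POP) for _ in range(N_HIDDEN)]
            f = evaluate([subpops[i][j] for i, j in enumerate(idx)])
            for i, j in enumerate(idx):     # credit fitness to participants
                fit[i][j] += f; cnt[i][j] += 1
        for i in range(N_HIDDEN):           # keep the best half, mutate it
            order = np.argsort(-(fit[i] / np.maximum(cnt[i], 1)))
            elite = subpops[i][order[:POP // 2]]
            subpops[i] = np.vstack([elite,
                                    elite + 0.1 * rng.normal(size=elite.shape)])

    print("final fitness (= -MSE):", evaluate([sp[0] for sp in subpops]))

The point of segregating neurons into subpopulations is that each one is pushed toward a distinct, complementary role, rather than the whole population converging on one network.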
Documentation for the code in html is available at: http://www.cs.utexas.edu/users/inaki/espdoc/ Demo: ----------------------------------------------------------------------- NON-MARKOV DOUBLE POLE BALANCING http://www.cs.utexas.edu/users/nn/pages/research/espdemo Faustino Gomez The page contains links to movies (in avi and Quicktime) showing the evolution of controllers for the non-Markov double pole balancing problem. The best controller from each generation is shown trying to balance the system using only three of the six state variables (no velocities). From terry at salk.edu Wed Nov 6 19:18:57 2002 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 6 Nov 2002 16:18:57 -0800 (PST) Subject: NEURAL COMPUTATION 14:11 In-Reply-To: <200209032319.g83NJJp43344@purkinje.salk.edu> Message-ID: <200211070018.gA70IvK12809@purkinje.salk.edu> Neural Computation - Contents - Volume 14, Number 11 - November 1, 2002 ARTICLE Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations Wolfgang Maass, Thomas Natschlaeger and Henry Markram NOTE Universal Approximation of Multiple Nonlinear Operators by Neural Networks Andrew D. Back and Tianping Chen LETTERS Long-Term Reward Prediction in TD Models of the Dopamine System Nathaniel D. Daw and David S. Touretzky Invariant Object Recognition in the Visual System with Novel Views of 3D Objects Simon M. Stringer and Edmund T. Rolls Dynamical Working Memory and Timed Responses: The Role of Reverberating Loops in the Olivo-Cerebellar System Werner M. Kistler and Chris I. De Zeeuw Selectively Grouping Neurons in Recurrent Networks of Lateral Inhibition Xiaohui Xie, Richard H. R. Hahnloser, and H. Sebastian Seung An Unsupervised Ensemble Learning Method for Nonlinear Dynamic State-Space Models Harri Valpola and Juha Karhunen Data-Reusing Recurrent Neural Adaptive Filters Danilo Mandic Training A Single Sigmoidal Neuron Is Hard Jiri Sima Two Timescale Analysis of Alopex Algorithm for Optimization P. S. Sastry, M. Magesh, K. P. Unnikrishnan A New Color 3D SFS Methodology Using Neural-Based Color Reflectance Models and Iterative Recursive Method Siu-Yeung Cho and Tommy W. S. Chow ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2002 - VOLUME 14 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $506 $541.42 $554 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From smyth at ics.uci.edu Wed Nov 6 23:48:09 2002 From: smyth at ics.uci.edu (Padhraic Smyth) Date: Wed, 06 Nov 2002 20:48:09 -0800 Subject: new faculty positions at UC Irvine in machine learning and statistics Message-ID: <3DC9F089.2090402@ics.uci.edu> Dear connectionists: The University of California, Irvine currently has open faculty positions in both machine learning and statistics. UCI has a very strong tradition in machine learning and AI and continues to add new faculty in a number of related areas such as bioinformatics, data mining, and computational statistics. Current faculty in Computer Science with interests in these areas include Pierre Baldi, Rina Dechter, David Eppstein, Rick Granger, Dennis Kibler, Rick Lathrop, Eric Mjolsness, Mike Pazzani, and Padhraic Smyth, as well as Hal Stern in the Department of Statistics. The open faculty positions are: A. One faculty position in the Department of Information and Computer Science in the area of Large Scale Data Analysis.
We encourage applications from a broad range of "data-driven" research areas, such as machine learning, language modeling, information extraction, bioinformatics, computational vision, etc. For application details please see: http://www.ics.uci.edu/about/jobs/faculty.php The appointment may be made at the pre-tenure or tenured level - applicants at both levels are encouraged to apply. B. Two faculty positions in the newly-formed Department of Statistics, one tenure-track and one tenured. See http://www.stat.uci.edu/ for application information. The department was started this year under its new chair, Professor Hal Stern. I strongly encourage readers of this list to apply for either position if interested. Please feel free to contact me directly if you have questions about either position. I would also be happy to chat with prospective applicants at NIPS if you would like to find out more about UCI in general - it is a great place to do learning-related research and you will probably like the weather as well :) all the best Padhraic Smyth Information and Computer Science University of California, Irvine From jjost at mis.mpg.de Thu Nov 7 04:48:44 2002 From: jjost at mis.mpg.de (Juergen Jost) Date: Thu, 7 Nov 2002 10:48:44 +0100 (MET) Subject: Five-year visiting professorship Message-ID: Max Planck Institute for Mathematics in the Sciences Leipzig, Germany Five-year visiting professorship The Max Planck Institute invites applications for a distinguished five-year visiting research professorship in the fields of Neural Networks and Mathematical Cognition Theory. Applicants should have demonstrated outstanding research potential and clear evidence of achievement. Applicants should be under the age of 35. The successful applicant is expected to carry out research in cooperation with other interdisciplinary groups at our Institute. The Institute offers excellent research facilities including a large visitor programme, see http://www.mis.mpg.de/ for further information. Salary will be on the German C2/C3 scale (comparable to an Associate Professorship in North America). Applications should be sent to: Prof. Dr. Eberhard Zeidler Max Planck Institute for Mathematics in the Sciences Inselstrasse 22 D - 04103 Leipzig Germany. The deadline for applications is December 31, 2002. Employment will start on October 1, 2003, or at a mutually agreeable date. Handicapped applicants will be given preference in case of equal qualification. The Max Planck Society as the employer aims at increasing the number of female scientists in fields where they are underrepresented. Therefore, women are particularly encouraged to apply. From terry at salk.edu Fri Nov 8 21:27:03 2002 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 8 Nov 2002 18:27:03 -0800 (PST) Subject: UCSD Computational Neurobiology Training Program In-Reply-To: <200211070018.gA70IvK12809@purkinje.salk.edu> Message-ID: <200211090227.gA92R3D15246@purkinje.salk.edu> DEADLINE: JANUARY 3, 2003 COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM Department of Biology - University of California, San Diego http://www.biology.ucsd.edu/grad/other_compneuro.html The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function.
Financial support for students enrolled in this training program is available through an NSF Integrative Graduate Education and Research Training (IGERT) award. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. The three major themes in the training program are: 1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons. Using modern neuroanatomical, behavioral, neuropharmacological and electrophysiological techniques. Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI). 2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA). 3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs. A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control. Participating Faculty include: * Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Science at UCSD * Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology * Darwin Berg (Neurobiology): Regulation of synaptic components, assembly and localization, function and long-term stability. * Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms * Virginia De Sa (Cognitive Science): Computational basis of perception and learning (both human and machine); multi-sensory integration and contextual influences * Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research * Marla Feller (Neurobiology): Mechanisms and function of spontaneous activity in the developing nervous system including the retina, spinal cord, hippocampus and neocortex. * Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation * Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels * David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity. * William Kristan (Neurobiology): Computational Neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion.
Director, Neurosciences Graduate Program at UCSD * Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies * Scott Makeig (Institute for Neural Computation): Analysis of cognitive event-related brain dynamics and fMRI using time-frequency and Independent Component Analysis * Javier Movellan (Institute for Neural Computation): Sensory fusion and learning algorithms for continuous stochastic systems * Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects * Terrence Sejnowski (Salk Institute/Neurobiology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms. Director, Institute for Neural Computation * Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques * Nicholas Spitzer (Neurobiology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of Neurobiology * Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons * Jochen Triesch (Cognitive Science): Sensory integration, visual psychophysics, vision systems and robotics, human-robot interaction, cognitive development * Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters * Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior * Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms Requests for application materials should be sent to the University of California, San Diego, Division of Biological Sciences 0348, Graduate Admissions Office, 9500 Gilman Drive, La Jolla, CA, 92093-0348 or to [gradprog at biomail.ucsd.edu]. The deadline for completed application materials, including letters of recommendation, is January 3, 2003. For more information about applying to the UCSD Biology Graduate Program: http://www.biology.ucsd.edu/grad/admissions/index.html From wolpert at hera.ucl.ac.uk Mon Nov 11 04:56:30 2002 From: wolpert at hera.ucl.ac.uk (Daniel Wolpert) Date: Mon, 11 Nov 2002 09:56:30 -0000 Subject: Research fellow in action decoding Message-ID: <002401c28968$9c999470$66463ec1@aphrodite> UNIVERSITY COLLEGE LONDON Institute of Neurology We are seeking a postdoctoral research fellow in neuroscience to work with Professor Chris Frith and Professor Daniel Wolpert on a project studying 'interactions between agents'. This work aims to elucidate the physiological and computational mechanisms by which we use observed movements in order to detect other agents and make inferences about their goals and intentions. A variety of techniques will be used including behavioural studies, EEG and TMS. The candidate should have a PhD or equivalent research experience in relevant fields; experience with programming in Matlab and/or C++ would be advantageous.
The post is available with a starting date of 1 January 2003, or nearest convenient date, for one year in the first instance, with the possibility of renewal for a second year. Starting salary is up to £26,255 pa inclusive, depending on experience. Further particulars of the position are on www.hera.ucl.ac.uk. Applications (CV and names of three referees and a short statement of research interests) should be returned to Miss E Bertram, Assistant Secretary (Personnel), Institute of Neurology, Queen Square, London WC1N 3BG (fax: +44 20 7278 5069, email: e.bertram at ion.ucl.ac.uk) by November 29th, 2002. Informal enquiries to Professor Frith (c.frith at ion.ucl.ac.uk) or Professor Wolpert (wolpert at ion.ucl.ac.uk). Taking Action for Equality From David.Cohn at acm.org Wed Nov 13 10:55:15 2002 From: David.Cohn at acm.org (David 'Pablo' Cohn) Date: Wed, 13 Nov 2002 07:55:15 -0800 Subject: new paper in JMLR: The Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces Message-ID: [cross-posted to connectionists at the request of the authors - for information on subscribing to the jmlr-announce mailing list, please visit www.jmlr.org] The Journal of Machine Learning Research is pleased to announce the availability of a new paper online at http://www.jmlr.org. ---------------------------------------- The Subspace Information Criterion for Infinite Dimensional Hypothesis Spaces Masashi Sugiyama and Klaus-Robert Muller JMLR 3(Nov):323-359, 2002 Abstract A central problem in learning is selection of an appropriate model. This is typically done by estimating the unknown generalization errors of a set of models to be selected from and then choosing the model with minimal generalization error estimate. In this article, we discuss the problem of model selection and generalization error estimation in the context of kernel regression models, e.g., kernel ridge regression, kernel subset regression or Gaussian process regression. Previously, a non-asymptotic generalization error estimator called the subspace information criterion (SIC) was proposed, which could be successfully applied to finite dimensional subspace models. SIC is an unbiased estimator of the generalization error for the finite sample case under the conditions that the learning target function belongs to a specified reproducing kernel Hilbert space (RKHS) H and the reproducing kernels centered on training sample points span the whole space H. These conditions hold only if dim H < l, where l < infinity is the number of training examples. Therefore, SIC could be applied only to finite dimensional RKHSs. In this paper, we extend the range of applicability of SIC, and show that even if the reproducing kernels centered on training sample points do not span the whole space H, SIC is an unbiased estimator of an essential part of the generalization error. Our extension allows the use of any RKHSs including infinite dimensional ones, i.e., richer function classes commonly used in Gaussian processes, support vector machines or boosting. We further show that when the kernel matrix is invertible, SIC can be expressed in a much simpler form, making its computation highly efficient. In computer simulations on ridge parameter selection with real and artificial data sets, SIC is compared favorably with other standard model selection techniques, for instance, leave-one-out cross-validation or an empirical Bayesian method. ---------------------------------------- This is the 13th paper in Volume 3.
This paper, and all previous papers, are available electronically at http://www.jmlr.org/ in PostScript and PDF formats. Many are also available in HTML. The papers of Volumes 1 and 2 are also available in hardcopy from the MIT Press; please see http://mitpress.mit.edu/JMLR for details. -David Cohn, Managing Editor, Journal of Machine Learning Research From bap at cs.unm.edu Wed Nov 13 18:56:55 2002 From: bap at cs.unm.edu (Barak Pearlmutter) Date: Wed, 13 Nov 2002 16:56:55 -0700 Subject: NIPS*2002 Workshops Abstracts Message-ID: **************************************************************** NIPS*2002 Workshops December 12-14, 2002, Whistler BC, Canada http://www.nips.cc **************************************************************** Workshop Schedule ================= The NIPS*2002 Workshops will be held at the Westin in Whistler BC, Canada, on Fri Dec 13 and Sat Dec 14, with sessions at 7:30-10:00am and 4:00-7:00pm. Two Day Workshops: Fri Dec 13 & Sat Dec 14 Functional Neuroimaging Multi-Agent Learning Propagation on Cyclic Graphs One Day Workshops on Fri Dec 13 Adaptation/Plasticity and Coding Bioinformatics Independent Component Analysis Neuromorphic Engineering Spectral Methods Statistics for Computational Experiments Unreal Data One Day Workshops on Sat Dec 14 Learning Invariant Representations Learning Rankings Negative Results On Learning Kernels Quantum Neural Computing Thalamocortical Processing Universal Learning Algorithms Workshop Descriptions ===================== TWO DAY WORKSHOPS (Friday & Saturday) Propagation Algorithms on Graphs with Cycles: Theory and Applications Shiro Ikeda, Kyushu Institute of Technology, Fukuoka, Japan Toshiyuki Tanaka, Tokyo Metropolitan University, Tokyo, Japan Max Welling, University of Toronto, Toronto, Canada Inference on graphs with cycles (loopy graphs) has drawn much attention in recent years. The problem arises in various fields such as AI, error-correcting codes, statistical physics, and image processing. Although exact inference is often intractable, much progress has been made in solving the problem approximately with local propagation algorithms. The aim of the workshop is to provide an overview of recent developments in methods related to belief propagation. We also encourage discussion of open theoretical problems and new possibilities for applications. Computational Neuroimaging: Foundations, Concepts & Methods Stephen J. Hanson, Rutgers University, Newark, NJ, USA Barak A. Pearlmutter, University of New Mexico, Albuquerque, NM, USA Stephen Strother, University of Minnesota, Minneapolis, MN, USA Lars Kai Hansen, Technical University of Denmark, Lyngby, Denmark Benjamin Martin Bly, Rutgers University, Newark, NJ, USA This workshop will concentrate on the foundations of neuroimaging, including the relation between neural firing and BOLD, fast fMRI, and diffusion methods. The first day includes speakers on new methods for multivariate analysis using fMRI, especially as they relate to neural modeling (ICA, SVM, or other ML methods), which will spill over into the next morning; cognitive neuroscience talks involving network and specific neural modeling approaches to cognitive function follow on day two. Multi-Agent Learning: Theory and Practice Gerald Tesauro, IBM Research, NY, USA Michael L.
Littman, Rutgers University, New Brunswick, NJ, USA Machine learning in a multi-agent system, where learning agents interact with other agents that are also simultaneously learning, poses a radically different set of issues from those arising in normal single-agent learning in a stationary environment. This topic is poorly understood theoretically but seems ripe for progress by building upon many recent advances in RL and in Bayesian, game-theoretic, decision-theoretic, and evolutionary learning. At the same time, learning is increasingly vital in fielded applications of multi-agent systems. Many application domains are envisioned in which teams of software agents or robots learn to cooperate to achieve global objectives. Learning may also be essential in many non-cooperative domains such as economics and finance, where classical game-theoretic solutions are either infeasible or inappropriate. This workshop brings together researchers studying multi-agent learning from a variety of perspectives. Our invited speakers include leading AI theorists, applications developers in fields such as robotics and e-commerce, as well as social scientists studying learning in multi-player human-subject experiments. Slots are also available for contributed talks and/or posters. ONE DAY WORKSHOPS (Friday) The Role of Adaptation/Plasticity in Neuronal Coding Garrett B. Stanley, Harvard University, Cambridge, MA, USA Tai Sing Lee, Carnegie Mellon University, Pittsburgh, PA, USA A ubiquitous characteristic of neuronal processing is the ability to adapt to an ever-changing environment on a variety of different time scales. Although the different forms of adaptation/plasticity have been studied for some time, their role in the encoding process is still not well understood. The most widely utilized measures assume time-invariant encoding dynamics, even though mechanisms serving to modify coding properties are continually active in all but the most artificial laboratory conditions. Important questions include: (1) how do encoding dynamics and/or receptive field properties change with time and the statistics of the environment?, (2) what are the underlying sources of these changes?, (3) what are the resulting effects on information transmission and processing in the pathway?, and (4) can the mechanisms of plasticity/adaptation be understood from a behavioral perspective? It is the goal of this workshop to discuss neuronal coding within several different experimental paradigms, in order to explore these issues, which have only recently been addressed in the literature. Independent Component Analysis and Beyond Stefan Harmeling, Fraunhofer FIRST, Berlin, Germany Luis Borges de Almeida, INESC ID, Lisbon, Portugal Erkki Oja, HUT, Helsinki, Finland Dinh-Tuan Pham, LMC-IMAG, Grenoble, France Independent component analysis (ICA) aims at extracting unknown hidden factors/components from multivariate data using only the assumption that the unknown factors are mutually independent. Since the introduction of ICA concepts in the early 80s in the context of neural networks and array signal processing, many new successful algorithms have been proposed that are now well-established methods. Since then, diverse applications in telecommunications, biomedical data analysis, feature extraction, speech separation, time-series analysis and data mining have been reported.
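As a toy illustration of the blind source separation setting just described (a sketch only, assuming scikit-learn's FastICA as a generic stand-in for the many ICA algorithms the workshop covers; the sources and mixing matrix are invented):

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Three (approximately) mutually independent sources:
# a sine wave, a square wave, and a sawtooth.
S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t)), 2 * (t % 1) - 1]
S += 0.05 * rng.standard_normal(S.shape)

A = np.array([[1.0, 0.5, 1.5],   # unknown mixing matrix
              [1.0, 2.0, 1.0],
              [1.5, 1.0, 2.0]])
X = S @ A.T                      # only the mixtures are observed

# Recover the sources up to permutation and scaling, using nothing
# but the assumption that they are mutually independent.
S_est = FastICA(n_components=3, random_state=0).fit_transform(X)
print(S_est.shape)               # (2000, 3)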
Notably of special interest for the NIPS community are, first, the application of ICA techniques to process multivariate data from various neuro-physiological recordings and, second, the interesting conceptual parallels to information processing in the brain. Recently, exciting developments have moved the field towards more general nonlinear or nonindependent source separation paradigms. The goal of the planned workshop is to bring together researchers from the different fields of signal processing, machine learning, statistics and applications to explore these new directions. Spectral Methods in Dimensionality Reduction, Clustering, and Classification Josh Tenenbaum, M.I.T., Cambridge, MA, USA Sam Roweis, University of Toronto, Ontario, Canada Data-driven learning by local or greedy parameter update algorithms is often a painfully slow process fraught with local minima. However, by formulating a learning task as an appropriate algebraic problem, globally optimal solutions may be computed efficiently in closed form via an eigendecomposition. Traditionally, this spectral approach was thought to be applicable only to learning problems with an essentially linear structure, such as principal component analysis or linear discriminant analysis. Recently, researchers in machine learning, statistics, and theoretical computer science have figured out how to cast a number of important nonlinear learning problems in terms amenable to spectral methods. These problems include nonlinear dimensionality reduction, nonparametric clustering, and nonlinear classification with fully or partially labeled data. Spectral approaches to these problems offer the potential for dramatic improvements in efficiency, accuracy, optimality and reproducibility relative to traditional iterative or greedy learning algorithms. Furthermore, numerical methods for spectral computations are extremely mature and well understood, allowing learning algorithms to benefit from a long history of implementation efficiencies in other fields. The goal of this workshop is to bring together researchers working on spectral approaches across this broad range of problem areas, for a series of talks on state-of-the-art research and discussions of common themes and open questions. Neuromorphic Engineering in the Commercial World Timothy Horiuchi, University of Maryland, College Park, MD, USA Giacomo Indiveri, University/ETH Zurich, Zurich, Switzerland Ralph Etienne-Cummings, University of Maryland, College Park, MD, USA We propose a one-day workshop to discuss strategies, opportunities and success stories in the commercialization of neuromorphic systems. Towards this end, we will be inviting speakers from industry and universities with relevant experience in the field. The discussion will cover a broad range of topics, from visual and auditory processing to olfaction and locomotion, focusing specifically on the key elements and ideas for successfully transitioning from neuroscience to commercialization. Statistical Methods for Computational Experiments in Visual Processing and Computer Vision Ross Beveridge, Colorado State University, Colorado, USA Bruce Draper, Colorado State University, Colorado, USA Geof Givens, Colorado State University, Colorado, USA Ross J. Micheals, NIST, Maryland, USA Jonathon Phillips, DARPA & NIST, Maryland, USA In visual processing and computer vision, computational experiments play a critical role in explaining algorithm and system behavior.
Disciplines such as psychophysics and medicine have a long history of designing experiments. Vision researchers are still learning how to use computational experiments to explain how systems behave in complex domains. This workshop will focus on new and better experimental methods in the context of visual processing and computer vision. Unreal Data: Principles of Modeling Nonvectorial Data Alexander J. Smola, Australian National Univ., Canberra, Australia Gunnar Raetsch, Australian National Univ., Canberra, Australia Zoubin Ghahramani, University College London, London, UK A large amount of research in machine learning is concerned with classification and regression for real-valued data which can easily be embedded into a Euclidean vector space. This is in stark contrast with many real-world problems, where the data is often a highly structured combination of features, a sequence of symbols, a mixture of different modalities, may have missing variables, etc. To address the problem of learning from non-vectorial data, various methods have been proposed, such as embedding the structures in some metric spaces, the extraction and selection of features, proximity-based approaches, parameter constraints in graphical models, Inductive Logic Programming, decision trees, etc. The goal of this workshop is twofold. Firstly, we hope to make the machine learning community aware of the problems arising from domains where non-vectorspace data abounds and to uncover the pitfalls of mapping such data into vector spaces. Secondly, we will try to find a more uniform structure governing methods for dealing with non-vectorial data and to understand what, if any, are the principles underlying the modeling of non-vectorial data. Machine Learning Techniques for Bioinformatics Colin Campbell, University of Bristol, UK Phil Long, Genome Institute of Singapore This workshop will cover the development and application of machine learning techniques in molecular biology. Contributed papers are welcome on any topic relevant to this theme including, but not limited to, analysis of expression data, promoter analysis, protein structure prediction, protein homology detection, detection of splice junctions, and phylogeny. Contributions are most welcome which propose new algorithms or methods, rather than the use of existing techniques. In addition to contributed papers, we expect to have several tutorials covering different areas where machine learning techniques have been successfully applied in this domain. ONE DAY WORKSHOPS (Saturday) Thalamocortical Processing in Audition and Vision Tony Zador, Cold Spring Harbor Lab., Cold Spring Harbor, NY, USA Shihab Shamma, University of Maryland, College Park, MD, USA All sensory information (except olfactory) passes through the thalamus before reaching the cortex. Are the principles governing this thalamocortical transformation shared across sensory modalities? This workshop will investigate this question in the context of audition and vision. Questions include: Do the LGN and MGN play analogous roles in the two sensory modalities? Are the cortical representations of sound and light analogous? Specifically, the idea is to talk about cortical processing (as opposed to purely thalamic), how receptive fields are put together in the cortex, and the implications of these ideas for the nature of information being encoded and extracted at the cortex. Learning of Invariant Representations Konrad Paul Koerding, ETH/UNI Zuerich, Switzerland Bruno
A. Olshausen, U.C. Davis & RNI, CA, USA Much work in recent years has shown that the sensory coding strategies employed in the nervous systems of many animals are well matched to the statistics of their natural environment. For example, it has been shown that lateral inhibition occurring in the retina may be understood in terms of a decorrelation or `whitening' strategy (Srinivasan et al., 1982; Atick & Redlich, 1992), and that the receptive field properties of cortical neurons may be understood in terms of sparse coding or ICA (Olshausen & Field, 1996; Bell & Sejnowski, 1997; van Hateren & van der Schaaf, 1998). However, most of these models do not address the question of which properties of the environment are interesting or relevant and which others are behaviourally insignificant. The purpose of this workshop is to focus on unsupervised learning models that attempt to represent features of the environment which are invariant or insensitive to variations such as position, size, or other factors. Quantum Neural Computing Elizabeth C. Behrman, Wichita State University, Wichita, KS, USA James E. Steck, Wichita State University, Wichita, KS, USA Recently there has been a resurgence of interest in quantum computers because of their potential for being very much smaller and very much faster than classical computers, and because of their ability in principle to do heretofore impossible calculations, such as factorization of large numbers in polynomial time. We will explore ways to implement biologically inspired quantum computing in network topologies, thus exploiting both the intrinsic advantages of quantum computing and the adaptability of neural computing. This workshop will follow up on our very successful NIPS 2000 workshop and the IJCNN 2001 Special Session. Aspects/approaches to be explored will include: quantum hardware, e.g., SQUIDs, NMR, trapped ions, quantum dots, and molecular computing; theoretical and practical limits to quantum and quantum neural computing, e.g. noise, error correction, and decoherence; and simulations. Universal Learning Algorithms and Optimal Search Juergen Schmidhuber, IDSIA, Manno-Lugano, Switzerland Marcus Hutter, IDSIA, Manno-Lugano, Switzerland Recent theoretical and practical advances are currently driving a renaissance in the fields of Universal Learners (rooted in Solomonoff's universal induction scheme, 1964) and Optimal Search (rooted in Levin's universal search algorithm, 1973). Both are closely related to the theory of Kolmogorov complexity. The new millennium has brought several significant developments including: sharp expected loss bounds for universal sequence predictors, theoretically optimal reinforcement learners for general computable environments, computable optimal predictions based on natural priors that take algorithm runtime into account, and practical, bias-optimal, incremental, universal search algorithms. Topics will also include: practical but general MML/MDL/SRM approaches with theoretical foundation, weighted majority approaches, and no-free-lunch theorems. On Learning Kernels Nello Cristianini, U.C. Davis, California, USA Tommi Jaakkola, M.I.T., Massachusetts, USA Michael I. Jordan, U.C. Berkeley, California, USA Gert R.G. Lanckriet, U.C. Berkeley, California, USA Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel methods in learning systems. For the past five years, a growing community has been meeting at the NIPS workshops to discuss the latest progress in learning with kernels.
Recent research in this area addresses the problem of learning the kernel itself from data. This subfield is becoming an active research area, offering a challenging interplay between statistics, advanced convex optimization and information geometry. It presents a number of interesting open problems. The workshop has two goals. First, it aims at discussing state-of-the-art research on 'learning the kernel', as well as giving an introduction to some of the new techniques used in this subfield. Second, it offers a meeting point for a diverse community of researchers working on kernel methods. As such, contributions from ALL subfields in kernel methods are welcome and will be considered for a poster presentation, with priority to very recent results. Furthermore, contributions on the main theme of learning kernels will be considered for oral presentations. Deadline for submissions: Nov 15, 2002. Negative Results and Open Problems Isabelle Guyon, Clopinet, California, USA In mathematics and theoretical computer science, exhibiting counterexamples is part of the established scientific method to rule out wrong hypotheses. Yet, negative results and counterexamples are seldom reported in experimental papers, although they can be very valuable. Our workshop will be a forum to freely discuss negative results and introduce the community to challenging open problems. This may include reporting experimental results of principled algorithms that obtain poor performance compared to seemingly dumb heuristics; experimental results that falsify an existing theory; counterexamples to a generally admitted conjecture; failure to find a solution to a given problem after various attempts; and failure to demonstrate the advantage of a given method after various attempts. If you have interesting negative results (not inconclusive results) or challenging open problems, you may submit an abstract before November 15, 2002. Beyond Classification and Regression: Learning Rankings, Preferences, Equality Predicates, and Other Structures Rich Caruana, Cornell University, NY, USA Thorsten Joachims, Cornell University, NY, USA Not all supervised learning problems fit the classification/regression function-learning model. Some problems require predictions other than values or classes. For example, sometimes the magnitude of the values predicted for cases is not important, but the ordering these values induce is. This workshop addresses supervised learning problems where either the goal of learning or the input to the learner is more complex than in classification and regression. Examples of such problems include learning partial or total orderings, learning equality or match rules, learning to optimize non-standard criteria such as Precision and Recall or ROC Area, using relative preferences as training examples, learning graphs and other structures, and problems that benefit from these approaches (e.g., text retrieval, medical decision making, protein matching). The goal of this one-day workshop is to discuss the current state of the art, and to inspire research on new algorithms and problems. To submit an abstract, see http://www.cs.cornell.edu/People/tj/ranklearn. More extensive information is available on the NIPS web page http://www.nips.cc, which has links to the pages maintained by each individual workshop. The number of workshop proposals was particularly high this year.
Altogether there will be seventeen NIPS*2002 workshops, of which three will last for two days, for a total of twenty workshop-days: a new record. We anticipate a great year not just in the number of workshops and in their quality, but in attendance as well: projections indicate that the workshops may surpass the main conference in total number of participants. From amari at brain.riken.go.jp Thu Nov 14 02:33:31 2002 From: amari at brain.riken.go.jp (Shun-ichi Amari) Date: Thu, 14 Nov 2002 16:33:31 +0900 Subject: NBNI-2002 Message-ID: <003601c28bb0$219edec0$220a0a0a@Abaloneamari> The following is the program of NBNI (Neurobiology and Neuroinformatics workshop). The registration fee of 10,000 yen is payable at the registration desk. ------------------ NBNI-2002 Fourth Japan-Korea-China-India Joint Workshop on Neurobiology and Neuroinformatics November 25-26, 2002 RIKEN Brain Science Institute, Japan Ohkouchi Hall Seminar Room in BSI Central Building Organized by RIKEN Brain Science Institute Sponsored by KAIST BSRC, KNIH BBRC, KIST BNRC, Korea China Society for Neuroscience National Brain Research Centre, India Organizers: Shun-ichi Amari (RIKEN BSI, Japan) Nobuyuki Nukina (RIKEN BSI, Japan) Chang-Rak Choi (Biomedical Brain Research Center, Korea) Soo-Young Lee (Brain Science Research Center, KAIST, Korea) Tae H. Oh (Brain Neurobiology Research Center, KIST, Korea) Fanji Gu (Fudan University, China) Yizheng Wang (Institute of Neuroscience, China) Vijayalakshmi Ravindranath (National Brain Research Center, India) ------------------ Program November 25 (Monday) 9:30 - 10:00 Registration (Ohkouchi Hall) Opening Ceremony: Ohkouchi Hall (Chair Shun-ichi Amari) 10:00 - 10:10 Opening Address Masao Ito (Director, BSI) 10:10 - 10:20 Welcome Address Shun-ichi Amari (Vice director, BSI) 10:20 - 11:00 Overviews of Activities in Participating Countries Organizers: Shun-ichi Amari, Nobuyuki Nukina, Chang-Rak Choi, Soo-Young Lee, Tae H. Oh, Fanji Gu, Tian-Ming Gao, Vijayalakshmi Ravindranath Plenary Session I: Ohkouchi Hall (Chair Vijayalakshmi Ravindranath) 11:00 - 11:30 Takao K. Hensch (RIKEN BSI, Japan) “Inhibitory circuit control of critical period plasticity in developing visual cortex” 11:30 - 12:00 Tian-Ming Gao (First Military Medical University, China) “Overactivation of potassium channels mediates hippocampal neuronal death induced by ischemic/hypoxic insult” 12:30 - 13:00 Yun-Hee Kim (College of Medicine, Pochon CHA University, Korea) “Reorganization of motor and cognitive network following human brain lesion investigated by functional neuroimaging” Luncheon Meeting 13:00 - 14:00 Chair: Shun-ichi Amari, Nobuyuki Nukina Neurobiology Session I (Ohkouchi Hall; Chair Nobuyuki Nukina) 14:00 - 14:30 Masayuki Miura (RIKEN BSI, Japan) “Genetic pathway of neural cell death and degeneration in Drosophila” 14:30 - 15:00 Shyamala Mani (National Brain Research Center, India) “Patterning of the cerebellum in the GAP-43 knockout mouse” 15:00 - 15:30 Young J.
Oh (Yonsei University College of Science, Korea) “Mitochondrial and extramitochondrial apoptotic pathways in experimental model of Parkinson’s disease” 15:30 - 16:00 Coffee Break Neurobiology Session II (Ohkouchi Hall; Chair Shyamala Mani) 16:00 - 16:30 Zhi-Wang Li (Tongji Medical College of Huazhong, China) “The action of tachykinins on the primary sensory neurons” 16:30 - 17:00 Sang Eun Kim (Samsung Medical Center, Korea) “Effect of chronic nicotine administration on dopaminergic neurotransmission” Neuroinformatics Session I (BSI Seminar Room; Chair Shiro Usui) 14:00 - 14:30 Michio Sugeno (RIKEN BSI, Japan) “Language-oriented approach to creating the brain” 14:30 - 15:00 Posina Venkata Rayudu (National Brain Research Center, India) “Brain as mathematics” 15:00 - 15:30 Eunjoo Kang (Seoul National University Medical Research Center, Korea) “Cross-modal interactions in speech perception during sentence comprehension: an fMRI study” 15:30 - 16:00 Coffee Break Neuroinformatics Session II (BSI Seminar Room; Chair Shobini L. Rao) 16:00 - 16:30 Yiyuan Tang (Dalian University of Technology, China) “Understanding the brain function through neuroimaging database for Chinese language processing” 16:30 - 17:00 Seung Kee Han (Chungbuk National University, Korea) “Inferring neural connectivity from multiple spike trains” Welcome Reception: Second Floor of The First Restaurant 18:00 - 20:00 November 26 (Tuesday) Plenary Session II: Ohkouchi Hall (Chair Tian-Ming Gao) 10:00 - 10:30 Vijayalakshmi Ravindranath (National Brain Research Center, India) “Towards understanding the pathogenesis of neurodegenerative disorders” 10:30 - 11:00 Tomoki Fukai (Tamagawa University, Japan) “Towards the understanding of biological mechanisms and functional roles of the gamma rhythmic activity” 11:00 - 11:30 Yong-Keun Jung (Kwangju Institute of Science and Technology, Korea) “Ubiquitin conjugating enzyme E2-25K as a novel mediator of amyloid-beta neurotoxicity in Alzheimer’s disease” 11:30 - 12:00 Coffee Break Plenary Session III: Ohkouchi Hall (Chair Soo-Young Lee) 12:00 - 12:30 Shobini L. Rao (National Brain Research Center, India) “Evidence for brain plasticity-cognitive retraining and functional brain imaging” 12:30 - 13:00 Pei-Ji Liang (Shanghai Jiaotong University, China) “Possible mechanism of synaptic plasticity in retinal graded neurons” Luncheon Meeting 13:00 - 14:30 Neurobiology Session III (Ohkouchi Hall: Chair Tae H. 
Oh) 14:30 - 15:00 Takeshi Iwatsubo (University of Tokyo, Japan) “Formation and function of the γ-secretase complex” 15:00 - 15:30 Nihar Ranjan Jana (National Brain Research Center, India) “Direct visualization of the expression, selective nuclear accumulation, aggregate formation and possible proteolytic processing of the transgene product in an HD exon1-EGFP transgenic mouse model” 15:30 - 16:00 Coffee Break Neurobiology Session IV (Ohkouchi Hall: Chair Chang-Rak Choi) 16:00 - 16:30 Rubin Wang (Donghua University, China) “Analysis of dynamics of the phase resetting on the set of the population of neurons” 16:30 - 17:00 Byoung Joo Gwag (Ajou University, Korea) “Mechanisms of selective neuronal death” Neuroinformatics Session III (BSI Seminar Room: Chair Fanji Gu) 14:30 - 15:00 Aditya Murthy (National Brain Research Center, India) “The role of frontal cortex in overt and covert orienting” 15:00 - 15:30 Lin Xu (Kunming Institute of Zoology, China) “How synaptic plasticity in hippocampus underlies learning and memory” 15:30 - 16:00 Coffee Break Neuroinformatics Session IV (BSI Seminar Room: Chair Yiyuan Tang) 16:00 - 16:30 Masataka Watanabe (University of Tokyo, Japan) “Prefrontal cortex model of selective attention” 16:30 - 17:00 Soo-Young Lee (Korea Advanced Institute of Science and Technology, Korea) “Modeling Human Auditory Pathway for Artificial Auditory Systems in Real-World Noisy Environments” Farewell Party: Second Floor of the Hirosawa Club 18:00 - 20:00 Concluding Address Nobuyuki Nukina (RIKEN BSI, Japan) ----------------------------------- Shun-ichi Amari Vice director RIKEN Brain Science Institute Wako-shi, Hirosawa 2-1, Saitama 351-0198, Japan tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687 amari at brain.riken.go.jp; www.bsis.brain.riken.go.jp/ From dgw at MIT.EDU Fri Nov 8 14:39:10 2002 From: dgw at MIT.EDU (David Weininger) Date: Fri, 08 Nov 2002 14:39:10 -0500 Subject: book announcement--Liu Message-ID: <2002110814391018478@outgoing.mit.edu> I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/0262122553/ Thank you! Best, David Analog VLSI Circuits and Principles Shih-Chii Liu, Jörg Kramer, Giacomo Indiveri, Tobias Delbrück, and Rodney Douglas foreword by Carver A. Mead Neuromorphic engineers work to improve the performance of artificial systems through the development of chips and systems that process information collectively using primarily analog circuits. This book presents the central concepts required for the creative and successful design of analog VLSI circuits. The discussion is weighted toward novel circuits that emulate natural signal processing. Unlike most circuits in commercial or industrial applications, these circuits operate mainly in the subthreshold or weak inversion region. Moreover, their functionality is not limited to linear operations, but also encompasses many interesting nonlinear operations similar to those occurring in natural systems. Topics include device physics, linear and nonlinear circuit forms, translinear circuits, photodetectors, floating-gate devices, noise analysis, and process technology. Shih-Chii Liu, Giacomo Indiveri, and Tobias Delbrück are Assistant Professors at the Institute of Neuroinformatics, Zurich, as was the late Jörg Kramer. Rodney Douglas is Director of the Institute of Neuroinformatics and Professor of Neuroinformatics at the University of Zurich.
6 x 9, 472 pp., cloth, ISBN 0-262-12255-3 A Bradford Book ______________________ David Weininger Associate Publicist The MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617 253 2079 617 253 1709 fax http://mitpress.mit.edu From fukushima at karl.teu.ac.jp Thu Nov 14 20:12:51 2002 From: fukushima at karl.teu.ac.jp (Kunihiko FUKUSHIMA) Date: Fri, 15 Nov 2002 10:12:51 +0900 Subject: Call for Papers: Neural Networks for Vision, KES'2003 Message-ID: <4.2.0.58.J.20021115100812.003f97c8@sv1.karl.teu.ac.jp> ============================================================ Call for Papers: Invited Session on "Neural Networks for Vision --- Biological and Artificial" KES'2003, Oxford, UK ============================================================ 7th International Conference on Knowledge-Based Intelligent Information & Engineering Systems 3, 4 & 5 September 2003, St Anne's College, University of Oxford, U.K. ------------------------------------------ Invited Session on "Neural Networks for Vision --- Biological and Artificial" Modeling neural networks is important both for understanding the biological brain and for obtaining design principles for artificial vision systems of the next generation. This session aims to focus on (1) the modeling approach to uncovering the mechanisms of the biological visual system, and (2) artificial neural networks suggested by the biological visual system. Specific topics of interest include, but are not limited to: * Biological neural network models for the visual system * Artificial neural networks for vision * Visual pattern recognition using neural networks * Object recognition * Active vision * Selective visual attention * Learning and self-organization of neural networks for vision * Stereoscopic vision, binocular vision * Eye movement and foveation * Early vision * Motion analysis with neural networks * Target detection and tracking * Color vision * Segmentation of images or patterns using neural networks * Completion of imperfect patterns (e.g., partly occluded, or contaminated with noise) * Visual illusion ------------------------------------------ Instructions for Authors Only electronic copies of the papers in Microsoft Word, PDF or Postscript form are acceptable for review purposes and must be sent to the session chair. However, please note that you will be required to send a hard copy of the final version of your paper if it is accepted; electronic submission of final papers is not allowed. Papers must correspond to the requirements detailed in the Instructions to Authors, which are placed on the Conference Web Site, www.bton.ac.uk/kes/kes2003/, or http://www.hotwolf.f9.co.uk/kes/kes2003/ All papers must be presented by one of the authors, who must pay the conference fees. ------------------------------------------ Publication The Conference Proceedings will be published by a major publisher, for example IOS Press of Amsterdam. Extended versions of selected papers will be considered for publication in the International Journal of Knowledge-Based Intelligent Engineering Systems, www.bton.ac.uk/kes/journal/ ------------------------------------------ Important Dates Deadline for submission intention: December 1, 2002 Deadline for receipt of papers by Session Chair: February 1, 2003 Notification of acceptance: March 1, 2003 Camera-ready papers to session chair by: April 1, 2003 (the Session Chair must send final camera-ready papers to reach the KES Secretariat by 1 May 2003 or they will not appear in the proceedings).
------------------------------------------ Session Chair: Kunihiko Fukushima Professor Tokyo University of Technology 1404-1, Katakura, Hachioji, Tokyo 192-0982, Japan e-mail: fukushima at karl.teu.ac.jp ------------------------------------------ From ken at phy.ucsf.edu Fri Nov 15 01:37:05 2002 From: ken at phy.ucsf.edu (Ken Miller) Date: Thu, 14 Nov 2002 22:37:05 -0800 Subject: UCSF Postdoctoral/Graduate Fellowships in Theoretical Neurobiology Message-ID: <15828.38417.948955.208183@coltrane.ucsf.edu> FULL INFO: http://www.sloan.ucsf.edu/sloan/sloan-info.html PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR CONTACT ADDRESSES GIVEN BELOW The Sloan-Swartz Center for Theoretical Neurobiology at UCSF solicits applications for pre- and post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in a quantitative field such as mathematics, theoretical or experimental physics, or computer science, and a commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. The Sloan-Swartz Center offers opportunities to combine theoretical and experimental approaches to understanding the operation of the intact brain. Young scientists with strong theoretical backgrounds will receive scientific training in experimental approaches to understanding the operation of the intact brain. They will learn to integrate their theoretical abilities with these experimental approaches to form a mature research program in integrative neuroscience. The research undertaken by the trainees may be theoretical, experimental, or a combination. Resident faculty of, or frequent visitors to, the Sloan-Swartz Center, and their research interests, include: William Bialek (frequent visitor): Information-theoretic and statistical characterization of, and physical limits to, neural coding and representation Michael Brainard: Mechanisms underlying vocal learning in the songbird; sensorimotor adaptation to alteration of performance-based feedback Allison Doupe: Development of song recognition and production in songbirds Loren Frank (joining our faculty in summer 2003): The relationship between behavior and neural activity in the hippocampus and anatomically related cortical areas Stephen Lisberger: Learning and memory in a simple motor reflex, the vestibulo-ocular reflex, and visual guidance of smooth pursuit eye movements by the cerebral cortex Michael Merzenich: Experience-dependent plasticity underlying learning in the adult cerebral cortex, and the neurological bases of learning disabilities in children Kenneth Miller: Circuitry of the cerebral cortex: its structure, self-organization, and computational function (primarily using cat primary visual cortex as a model system) Philip Sabes: Sensorimotor coordination, adaptation and development of spatially guided behaviors, experience-dependent cortical plasticity Christoph Schreiner: Cortical mechanisms of perception of complex sounds such as speech in adults, and plasticity of speech recognition in children and adults Michael Stryker: Mechanisms that guide development of the visual cortex There are also a number of visiting faculty, including Larry Abbott, Brandeis University; Sebastian Seung, MIT; David Sparks, Baylor University; Steve Zucker, Yale University.
TO APPLY for a POSTDOCTORAL position, please send a curriculum vitae, a statement of previous research and research goals, up to three relevant publications, and have two letters of recommendation sent to us. The application deadline is January 31, 2003. Send applications to: Sloan-Swartz Center 2003 Admissions Sloan-Swartz Center for Theoretical Neurobiology at UCSF Department of Physiology University of California 513 Parnassus Ave. San Francisco, CA 94143-0444 PRE-DOCTORAL applicants with strong theoretical training may seek admission into the UCSF Neuroscience Graduate Program as a first-year student. Applicants seeking such admission must apply by Jan. 3, 2003 to be considered for fall 2003 admission. Application materials for the UCSF Neuroscience Program may be obtained from http://www.ucsf.edu/neurosc/neuro_admissions.html#application or from Pat Vietch Neuroscience Graduate Program Department of Physiology University of California San Francisco San Francisco, CA 94143-0444 neuroscience at phy.ucsf.edu Be sure to include your surface-mail address. The procedure is: make a normal application to the UCSF Neuroscience program, but also alert the Sloan-Swartz Center of your application by writing to sloan-info at phy.ucsf.edu. If you need more information: -- Consult the Sloan-Swartz Center WWW Home Page: http://www.sloan.ucsf.edu/sloan -- Send e-mail to sloan-info at phy.ucsf.edu -- See also the home page for the W.M. Keck Foundation Center for Integrative Neuroscience, in which the Sloan-Swartz Center is housed: http://www.keck.ucsf.edu/ From herbert.jaeger at ais.fhg.de Fri Nov 15 08:11:28 2002 From: herbert.jaeger at ais.fhg.de (Herbert Jaeger) Date: Fri, 15 Nov 2002 14:11:28 +0100 Subject: R&D positions for RNN applications Message-ID: <3DD4F280.645D2179@ais.fhg.de> The Fraunhofer Institute for Autonomous Intelligent Systems (http://www.ais.fraunhofer.de/index.en.html) is pleased to announce 3 Research Engineer positions for doing research and development using a novel machine learning technique called Echo-State Networks, or ESNs. The positions are for an initial duration of 1 year (starting January 2003) with a possible 6-month extension. We are seeking candidates with degrees in Engineering, Computer Science, Physics or related fields, preferably with some industrial experience. A good command of English is required; German skills would be an additional asset. ESNs are a novel type of recurrent neural network developed and patented at AIS which can be trained extremely efficiently for tasks in nonlinear control, filtering, prediction, pattern recognition and pattern generation. Please consult http://www.ais.fraunhofer.de/INDY/echo_net_research.html for details. The announced positions will aim to apply the ESN technique to a few selected practical problems for eventual commercialization. In this respect, the work will be applied in nature and will be focused on creating demonstrations and working prototypes in a specific application domain. The results of this work will be used to solicit interest from investors and/or technology partners within 18 months. Currently, the following application domains are being investigated more closely: (1) control of strongly nonlinear electrical machines, specifically, fast-moving mobile robots and switched reluctance motors, (2) equalization of mildly nonlinear digital communication channels, specifically, high-gain satellite radio transmitters, (3) filtering and prediction of strongly stochastic signals, especially speech signals.
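For readers unfamiliar with the technique, the echo-state idea can be sketched in a few lines of Python/NumPy (a rough, generic reservoir-computing illustration under common assumptions, not the patented AIS implementation; all parameter values are invented):

import numpy as np

rng = np.random.default_rng(1)
n_res, washout, ridge = 200, 100, 1e-6

# Fixed random reservoir, rescaled so its spectral radius is below 1
# (a commonly used condition aimed at the echo-state property).
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=n_res)

# Toy task: one-step-ahead prediction of a noisy sine wave.
u = np.sin(0.2 * np.arange(1200)) + 0.01 * rng.standard_normal(1200)

x = np.zeros(n_res)
states = []
for t in range(len(u) - 1):
    x = np.tanh(W @ x + W_in * u[t])   # reservoir update (never trained)
    states.append(x.copy())
states = np.array(states[washout:])    # discard initial transient
targets = u[washout + 1:]

# Only the linear readout is trained, in closed form by ridge regression;
# this cheap final step is what makes ESN training so efficient.
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ targets)
print("train MSE:", np.mean((states @ W_out - targets) ** 2))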
The list of examined types of applications is however open, and candidates with a background in another area of nonlinear systems engineering are explicitly encouraged to apply. Salary will follow the German categories BAT 2a or 1b according to qualification, age and marital status (basic salary ranging from 2000 to 3000 Euro per month, plus various add-ons). Fraunhofer AIS is an equal opportunity employer. Please email a CV and covering letter explaining your background and interests to Dr. Herbert Jaeger (herbert.jaeger at ais.fraunhofer.de, http://ais.fraunhofer.de/INDY/herbert/), with a copy to Stéphane Beauregard (stephane.beauregard at ais.fraunhofer.de). ------------------------------------------------------------------ Dr. Herbert Jaeger Fraunhofer Institute for Autonomous Intelligent Systems AiS.INDY, Schloss Birlinghoven, D-53754 Sankt Augustin, Germany Phone (+49) 2241-14-2253, Fax (+49) 2241-14-2342 email herbert.jaeger at ais.fraunhofer.de http://www.ais.fraunhofer.de/INDY/herbert/ ------------------------------------------------------------------ From jose at psychology.rutgers.edu Fri Nov 15 07:37:09 2002 From: jose at psychology.rutgers.edu (stephen j. hanson) Date: 15 Nov 2002 07:37:09 -0500 Subject: Computational Neuroscience, Learning, Cognitive Modeling--RUTGERS UNIVERSITY-(Newark Campus) Message-ID: <1037363834.2694.2.camel@vaio> RUTGERS UNIVERSITY- (Newark Campus). PSYCHOLOGY DEPARTMENT The Department of Psychology anticipates making one tenure-track, Assistant or Associate Professor level appointment in the area of COGNITIVE SCIENCE. In particular, we are seeking individuals from any of the following THREE areas: LEARNING (Cognitive Modeling), COMPUTATIONAL NEUROSCIENCE, or SOCIAL COGNITION (interests in NEUROIMAGING in any of these areas would also be a plus, since the Department, in conjunction with UMDNJ, has recently acquired a 3T Neuroimaging Center; see http://www.newark.rutgers.edu/fmri/). The successful candidate is expected to develop and maintain an active, externally funded research program, and to teach at both the graduate and undergraduate levels. Review of applications will begin JANUARY 30th, 2003, pending final budgetary approval from the administration. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are encouraged to apply. Please send a CV, a statement of current and future research interests, and three letters of recommendation to COGNITIVE SCIENCE SEARCH COMMITTEE, Department of Psychology, Rutgers University, Newark, NJ 07102. Email enquiries can be made to cogsci at psychology.rutgers.edu. From arno at salk.edu Fri Nov 15 17:52:08 2002 From: arno at salk.edu (Arnaud Delorme) Date: Fri, 15 Nov 2002 14:52:08 -0800 Subject: EEGLAB Toolbox released Message-ID: <3DD57A98.2090905@salk.edu> EEGLAB - Tools for advanced EEG data analysis under Matlab using ICA and time/frequency methods - has been released under the GNU public license for download from: http://sccn.ucsd.edu/eeglab/ EEGLAB is an integrated toolbox of 250 Matlab routines for analyzing and visualizing event-related EEG (or MEG) brain data. EEG, event, and channel location data can be read in a variety of formats. A graphic user interface allows users to explore their data interactively, while global data, event, and channel location structures, plus a command history mechanism, ease the transition to writing custom analysis scripts.
An extensive .html tutorial and help messages allow users to learn to use all parts of the system. Matlab and binary routines for performing infomax and extended-infomax ICA are included, as is the sample EEG data set used throughout the tutorial. Principal authors: Arnaud Delorme & Scott Makeig Swartz Center for Computational Neuroscience Institute for Neural Computation University of California San Diego eeglab at sccn.ucsd.edu From cmbishop at microsoft.com Fri Nov 15 19:38:40 2002 From: cmbishop at microsoft.com (Christopher Bishop) Date: Sat, 16 Nov 2002 00:38:40 -0000 Subject: AI Statistics conference: FINAL CALL FOR PARTICIPATION Message-ID: <6EDEB53BA6EA96458F3CEC96BB0282D2019DA3A0@tvp-msg-03.europe.corp.microsoft.com> Ninth International Conference on Artificial Intelligence and Statistics January 3-6, 2003, Hyatt Hotel, Key West, Florida http://research.microsoft.com/conferences/AIStats2003/ Deadline for early registration is 1 December 2002. This is the ninth in a series of workshops which have brought together researchers in Artificial Intelligence and in Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. The beginning of January is a very popular time of year for visitors to Key West, and so you are strongly urged to register early and to reserve your accommodation at the substantially reduced conference rate. Invited Speakers: Andrew Blake (Microsoft Research, Cambridge) Bill Freeman (MIT) Zoubin Ghahramani (Gatsby Computational Neuroscience Unit) David Haussler (UCSC) Geoffrey Hinton (University of Toronto) Tommi Jaakkola (MIT) Lawrence Saul (University of Pennsylvania) There has been a record number of submissions to this conference, and after a rigorous review process we have been able to accept 15% of the submissions for oral presentation and 27% for poster presentation. A full programme is available on the web site. Key West provides a superb location for this workshop, and the weather at Key West in January is expected to be very pleasant. The workshop timetable will focus on morning and early evening sessions, allowing ample free time in the afternoons for scientific discussions or to take advantage of local attractions such as scuba diving, snorkelling, wave runners, parasailing, fishing, sailing and golf. Chris Bishop Brendan Frey From tcp1 at leicester.ac.uk Fri Nov 15 06:16:14 2002 From: tcp1 at leicester.ac.uk (Tim Pearce) Date: Fri, 15 Nov 2002 11:16:14 -0000 Subject: PhD Positions In-Reply-To: Message-ID: <001701c28c98$77e773d0$cd6bd28f@rothko> PhD Studentship in Biologically Inspired Robotics A postgraduate researcher is required for an EC-funded project, available immediately. The project concerns the development of neuronal models to control an unmanned aerial vehicle (UAV) robot to perform stereotypical moth-like chemotaxis (chemical search) behaviour. The project will develop biologically-inspired sensor, information processing and control systems for a c(hemosensing) UAV. The cUAV will identify and track volatile compounds of different chemical composition in outdoor environments. Its olfactory and sensory-motor systems are to be inspired by the moth, and will be supported by computational neuroscience model development. This development continues our research in artificial and biological olfaction, sensory processing and analysis, neuronal models of learning, real-time behavioural control, and robotics.
Further details on the project and the research teams can be found at http://www.le.ac.uk/eg/tcp1/amoth/ The project includes significant funding and opportunities for travel within Europe to visit the laboratories of the participating consortia (in Switzerland, France, and Sweden) and outside Europe to attend international scientific meetings. Applicants should have a strong analytical background, a keen interest in neuroscience, and a good honours degree (at the 2(i) level or higher) in engineering, mathematics or physics. The student will be responsible for development of the experimental set-up for assessing chemical search strategies applied to robots within unsteady laminar/turbulent flow, which will involve programming, simulation, numerical and electronics development. Applicants should have a demonstrated interest in one or more of the following: neuroscience, robotics, and/or artificial intelligence. Some experience of fluid dynamics would be an advantage. Good team skills are essential. The studentship includes a stipend of £12,000 per year for 3 years and includes full provision for academic fees. Both EU and non-EU nationals may apply. PhD Studentship in Neuroengineering/Computational Neuroscience A postgraduate research position is available on an EC-funded project immediately. The position is to support the EU Network of Excellence in Neuroinformatics - nEUro-IT (details of the network are under construction at http://www.neuro-it.net). The project includes funding and opportunities for travel within Europe to visit educational establishments conducting research related to the interests of the network. Applicants should have a strong analytical background, an interest in neuroscience, and a good honours degree (at the 2(i) level or higher) in engineering, mathematics or physics. As part of their commitment to the Network of Excellence, the student will be responsible for development of a database of educational material related to neuroinformatics and neuroengineering within Europe. In addition, the student is expected to carry out research in any topic of their choice related to the research of the laboratory (see http://www.le.ac.uk/eg/tcp1/neurolab/ for details), which will be expected to lead to the award of a PhD. Good team skills are essential. The studentship includes a stipend of £12,000 per year for 3 years and includes full provision for academic fees. Only EU nationals may apply. Further details on the research activities carried out in this laboratory can be found at http://www.le.ac.uk/eg/tcp1/neurolab/ The Engineering Department was rated 5A in the Research Assessment Exercise, 2001. Initial enquiries and requests for details of the application process should be addressed to the EU Project Assistant, Mr. John Harrison, Department of Engineering, University of Leicester, Leicester LE1 7RH, United Kingdom, +44 116 252 5384, jlh36 at le.ac.uk Both positions are available immediately - please indicate which position you are interested in when applying. The deadline for applications is 8th December, 2002, with an expected start date early in 2003. -- T.C.
Pearce, PhD URL: http://www.leicester.ac.uk/eg/tcp1/neurolab/ Lecturer in Bioengineering E-mail: t.c.pearce at leicester.ac.uk Department of Engineering Tel: +44 (0)116 223 1290 University of Leicester Fax: +44 (0)116 252 2619 Leicester LE1 7RH From pli at richmond.edu Sat Nov 16 03:20:52 2002 From: pli at richmond.edu (pli) Date: Sat, 16 Nov 2002 03:20:52 -0500 Subject: Postdoc fellowship Message-ID: <3DCD3F0F@webmail.richmond.edu> Dear Colleagues, "Postdoctoral Position in Neural Network Models of Language" Qualified individuals are invited to apply for a postdoctoral fellowship in connectionist modeling of language processing. The fellowship is supported by the National Science Foundation (USA), and provides an annual stipend of around $38,000-$41,000 for a maximum of 3 years. A qualified candidate should hold a Ph.D. degree in an area of cognitive science (computer science, psychology, or linguistics) and have experience in neural networks and natural language processing. Technical experience with C/C++ and the Unix/Linux operating systems is necessary. Familiarity with MatLab is desirable. The successful candidate will join the PI's research team to work on self-organizing models of language, with particular reference to the acquisition, processing, and disorders of the mental lexicon (see the NSF homepage for a summary of the project: https://www.fastlane.nsf.gov/servlet/showaward?award=0131829). The project will be carried out at the Cognitive Science Laboratory (http://cogsci.richmond.edu/lab.html) in the Department of Psychology at the University of Richmond, where the cognitive area includes faculty in neuroscience, memory and aging, spatial cognition, and psycholinguistics. UR is a highly selective, private university located six miles west of Richmond on a beautiful 350-acre campus (1 hour west of Williamsburg and east of Shenandoah National Park, and 2 hours south of Washington DC). It has been consistently rated as one of America's best universities by US News and World Report. With its over $1-billion endowment and progressive program enhancements, UR provides a congenial research environment. The target date for the start of the position is May 1, 2003. Consideration of applications will begin as soon as possible. Applicants should send a curriculum vitae, a cover letter, and two letters of recommendation via email to pli at richmond.edu. The University of Richmond is an Equal Opportunity Employer. Women and minority candidates are especially encouraged to apply. For ICONIP '02 participants: Please see me at session ThuPmRM158 ("Self-organizing feature maps and vector quantization III") or send me an email note. Ping Li, Ph.D. Department of Psychology University of Richmond, Virginia 23173, USA. Email: pli at richmond.edu Phone: (804) 289-8125 (O), 287-1236 (lab) http://cogsci.richmond.edu/ or http://www.richmond.edu/~pli/ Currently on sabbatical leave at: Division of Speech and Hearing Sciences Faculty of Education, University of Hong Kong, SAR, PRC.
Email: liping at hku.hk From Sebastian_Thrun at heaven.learning.cs.cmu.edu Sun Nov 17 11:44:59 2002 From: Sebastian_Thrun at heaven.learning.cs.cmu.edu (Sebastian Thrun) Date: Sun, 17 Nov 2002 11:44:59 -0500 Subject: NIPS*2002 Preproceedings now online Message-ID: The NIPS*2002 Preproceedings are now online at http://nips.cc (follow the link "online preproceedings") The NIPS*2002 preproceedings contain preliminary drafts of most presentations. The final proceedings will be published after the conference, as in previous years. Sebastian Thrun NIPS*2002 Program Chair From d.polani at herts.ac.uk Mon Nov 18 18:46:29 2002 From: d.polani at herts.ac.uk (Daniel Polani) Date: Tue, 19 Nov 2002 00:46:29 +0100 Subject: CfP EVOLVABILITY AND SENSOR EVOLUTION SYMPOSIUM Message-ID: <15833.31701.234159.97257@perm.feis.herts.ac.uk> Please accept our apologies should you receive this call repeatedly. This is a short version of the call. For more information, see http://www.cs.bham.ac.uk/~jfm/evol-sensor.htm or contact the chairs, Julian Miller (j.miller at cs.bham.ac.uk) or Daniel Polani (d.polani at herts.ac.uk) //////////////////////////////////////////////////////////////////////// Call for Papers & Participation: EPSRC Network on Evolvability in Biological & Software Systems EVOLVABILITY AND SENSOR EVOLUTION SYMPOSIUM ------------------------------------------- sponsored by The Natural Computation Research Group (Univ. of Birmingham) The University of Hertfordshire Adaptive Systems Research Group EPSRC 24-25 April 2003 (Thursday-Friday), University of Birmingham, U.K. //////////////////////////////////////////////////////////////////////// SYMPOSIUM AIMS -------------- This EPSRC symposium follows upon the growing awareness from academia, industry, and research communities of the importance of evolvability, tentatively defined as the capacity of populations to exhibit adaptive heritable variation. In particular, the symposium focuses on the relation between evolvability and sensor and effector evolution. The symposium aims to encourage a dialogue between various workers in areas that might benefit from a possible common framework addressing evolvability and sensor/effector evolution. The symposium addresses two aspects that are believed to be central in understanding fundamental biological mechanisms, like information discovery, acquisition, processing and transmission, both on the level of populations and individuals: these mechanisms are evolvability and sensor evolution. Evolvability ------------ Darwinian evolution characterized by heritable variation and selection is not by itself sufficient to account for the capacity to vary and to inherit phenotypic expressions of fitness. Rigidity of genotype-phenotype mappings, as often used in evolutionary computation, constrains the dynamics of evolution to a small space of possible biological or artificial systems. Open-ended evolution is not possible under such constraints. Evolution, by itself, cannot fully explain the advent of genetic systems, flexible genotype-phenotype mappings, and heritable fitness. This presents a challenge both to biologists seeking to understand the capacity of life to evolve and to computer scientists who seek to harness biological-like robustness and openness in the evolution of artificial systems. Sensor Evolution ---------------- In natural evolution one finds impressive examples of the principle of exploiting and creating new sensory channels and the information they carry. Olfactory, tactile, auditory and visual, but also e.g.
electrical and even magnetic senses have evolved in a multitude of variants, often utilizing organs not originally "intended" for the purpose they serve at present. Biologically evolving systems are able to adaptively construct their own hardware and software. The new sensors create new ways of giving meaning to and interpreting the world. Many biological sensors reach a degree of structural and functional complexity and of efficiency which is envied by engineers creating man-made sensors. Sensors enable animals to survive in dynamic and unstructured environments, to perceive and react appropriately to features in the biotic and abiotic environment, including members of their own species as well as predators and prey. Synthesizing artificial sensors for hardware or software systems suggests an approach similar to that taken for generating life-like behaviour, namely using evolutionary techniques in order to explore design spaces and generate sensors which are specifically adapted with respect to environmental and other fitness-related constraints. The creation of channels of sensory input and effector output leads to higher evolvability, as new relevance criteria are developed that confer a survival advantage on future offspring. CALL FOR CONTRIBUTIONS ---------------------- We solicit abstracts for poster or oral presentation (approx. 25-30 minute talk) reporting work in this exciting area. Talks should address an interdisciplinary audience, but may nevertheless deal with issues at the cutting edge of research. Send submissions in plain text (ASCII) format only to j.miller at cs.bham.ac.uk. The submission should show author name(s), full addresses, submission title, and an abstract of not more than 500 words. Submissions should include a statement of the preferred mode of presentation: poster / oral. PROGRAM CHAIRS -------------- Julian Miller (University of Birmingham) Daniel Polani (University of Hertfordshire, UK) CO-ORGANIZERS ------------- Chrystopher Nehaniv (University of Hertfordshire) PARTICIPATION ------------- Participation is open to all students, researchers, or industry representatives with interests in evolvability in biological and software systems. Please register by sending an e-mail to j.miller at cs.bham.ac.uk giving your name and affiliation. There is no registration fee. Participation is limited to about 60 participants. Non-presenters are welcome to participate if places remain, so please register your interest as early as possible. IMPORTANT DATES --------------- 20 February 2003: Symposium Abstract Submissions Due 7 March 2003: Notification to Authors 24-25 April 2003: Symposium From stefan.wermter at sunderland.ac.uk Tue Nov 19 12:42:20 2002 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Tue, 19 Nov 2002 17:42:20 +0000 Subject: Stipends for MSc Intelligent Systems Message-ID: <3DDA77FC.658842C2@sunderland.ac.uk> Stipends available for MSc Intelligent Systems ---------------------------------- We are pleased to announce that we have obtained funding to offer a bursary for our new MSc Intelligent Systems, worth up to 6000 pounds (about 14,000 euros) as a fee waiver and stipend for eligible EU students. Please forward this to students who may be interested. The School of Computing and Technology, University of Sunderland is delighted to announce the launch of its new MSc Intelligent Systems programme for 24th February.
Building on the School's leading-edge research in intelligent systems, this masters programme will be funded via the ESF scheme (see below). Intelligent Systems is an exciting field of study for science and industry, since existing computing systems still fall short of human performance in many respects. "Intelligent Systems" is a term describing software systems and methods that simulate aspects of intelligent behaviour. The intention is to learn from nature and human performance in order to build more powerful computing systems, drawing on cognitive science, neuroscience, biology, engineering, and linguistics to build more powerful computational system architectures. In this programme a wide variety of novel and exciting techniques will be taught, including neural networks, intelligent robotics, machine learning, natural language processing, vision, evolutionary genetic computing, data mining, information retrieval, Bayesian computing, knowledge-based systems, fuzzy methods, and hybrid intelligent architectures. Programme Structure ------------------- The following lectures/modules are available: Neural Networks Intelligent Systems Architectures Learning Agents Evolutionary Computation Cognitive Neural Science Knowledge Based Systems and Data Mining Bayesian Computation Vision and Intelligent Robots Natural Language Processing Dynamics of Adaptive Systems Intelligent Systems Programming Funding up to 6000 pounds (about 14,000 Euro) for eligible students ------------------------------ The Bursary Scheme applies to this Masters programme commencing February 2003, and we have obtained funding through the European Social Fund (ESF). ESF support enables the University to waive the normal tuition fee and provide a bursary of 75 pounds per week for 45 weeks for eligible EU students, together worth up to 6000 pounds or 14,000 Euro. For further information in the first instance please see: http://osiris.sund.ac.uk/webedit/allweb/courses/progmode.php?prog=G550A&mode=FT&mode2=&dmode=C For information on applications and start dates contact: gillian.potts at sunderland.ac.uk Tel: 0191 515 2758 For academic information about the programme contact: alfredo.moscardini at sunderland.ac.uk *************************************** Professor Stefan Wermter Chair for Intelligent Systems Informatics Centre School of Computing and Technology University of Sunderland St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ From yaochu.jin at hre-ftr.f.rd.honda.co.jp Tue Nov 19 04:18:21 2002 From: yaochu.jin at hre-ftr.f.rd.honda.co.jp (Yaochu Jin) Date: Tue, 19 Nov 2002 10:18:21 +0100 Subject: Book Announcement Message-ID: <3DDA01DD.43E25C2C@hre-ftr.f.rd.honda.co.jp> A new book titled "Advanced Fuzzy Systems Design and Applications", published by Springer/Physica Verlag (ISBN: 3-7908-1537-3), is coming out on December 16, 2002. Abstract Fuzzy rule systems have found a wide range of applications in many fields of science and technology. Traditionally, fuzzy rules are generated from human expert knowledge or human heuristics for relatively simple systems. In the last few years, research on data-driven fuzzy rule generation has been very active. Compared to heuristic fuzzy rules, fuzzy rules generated from data are able to extract more profound knowledge for more complex systems.
This book presents a number of approaches to the generation of fuzzy rules from data, ranging from direct fuzzy-inference-based methods to fuzzy rule generation based on neural networks and evolutionary algorithms. Besides approximation accuracy, special attention has been paid to the interpretability of the extracted fuzzy rules. In other words, the fuzzy rules generated from data are supposed to be as comprehensible to human beings as those generated from human heuristics. To this end, many aspects of the interpretability of fuzzy systems are discussed that must be taken into account in data-driven fuzzy rule generation. In this way, fuzzy rules generated from data are intelligible to human users and, therefore, knowledge about unknown systems can be extracted. The other direction of knowledge extraction from data in terms of interpretable fuzzy rules is the incorporation of human knowledge into learning and evolutionary systems with the help of fuzzy logic. In this book, methods for embedding human knowledge, which can be represented either by fuzzy rules or fuzzy preference models, into neural network learning and evolutionary multiobjective optimization are introduced. Thus, neural networks and evolutionary algorithms are able to take advantage of data as well as human knowledge. In this book, fuzzy rules are designed mainly for modeling, control and optimization. Along with the discussion of the methods, several real-world application examples in the above fields, including robotics, process control and intelligent vehicle systems, are described. Illustrative figures are also given to accompany the most important methods and concepts. To make the book self-contained, fundamental theories as well as a few selected advanced topics about fuzzy systems, neural networks and evolutionary algorithms are provided. Therefore, this book is a valuable reference for researchers, practitioners and students in many fields of science and engineering. Publisher Website: http://www.springer.de/cgi-bin/search_book.pl?isbn=3-7908-1537-3 Amazon: http://www.amazon.com/exec/obidos/tg/detail/-/3790815373/qid=1034592300/sr=8-2/ref=sr_8_2/002-8946315-4188058?v=glance&n=507846 Main Contents Preface Chapter 1 Fuzzy Sets and Fuzzy Systems Chapter 2 Evolutionary Algorithms Chapter 3 Artificial Neural Networks Chapter 4 Conventional Data-driven Fuzzy Systems Design Chapter 5 Neural Network Based Fuzzy Systems Design Chapter 6 Evolutionary Design of Fuzzy Systems Chapter 7 Knowledge Discovery by Extracting Interpretable Fuzzy Rules Chapter 8 Fuzzy Knowledge Incorporation into Neural Networks Chapter 9 Fuzzy Preferences Incorporation into Multiobjective Optimization -- -------------------------------------------------- Dr. Yaochu Jin Future Technology Research Honda R&D Europe (D) Carl-Legien-Str. 30 63073 Offenbach/Main GERMANY Tel: +49 69 89011735 Fax: +49 69 89011749 Email: yaochu.jin at hre-ftr.f.rd.honda.co.jp From se37 at cornell.edu Tue Nov 19 16:12:06 2002 From: se37 at cornell.edu (Shimon Edelman) Date: Tue, 19 Nov 2002 16:12:06 -0500 Subject: Cornell Cognitive Studies Program Message-ID: DEADLINE: JANUARY 1, 2003 CORNELL COGNITIVE STUDIES PROGRAM Cornell University, Ithaca, New York http://www.cogstud.cornell.edu Cornell professor Ulric Neisser introduced the term "Cognitive Psychology" in 1967, with a book that gave the name to the field and helped launch the cognitive revolution.
According to Neisser, cognitive psychology is the study of how people learn, structure, store and use information. The Cognitive Studies Program, which provides the framework for research into human information processing at Cornell, extends this concept beyond psychology to teaching students, both graduate and undergraduate, the basics and the latest developments in the brain/mind sciences. Faculty members affiliated with the program belong to more than a dozen departments, including Communication, Computer Science, Design and Environmental Analysis, Economics, Education, Human Development, Linguistics, Management, Mathematics, Neurobiology and Behavior, Philosophy, Psychology, and Sociology. Expanding programs in information science, human-computer interaction, computational linguistics and vision, and other related fields are linked to the research activities in Cognitive Studies. In addition to the spontaneous interactions growing out of common interests in the nature of the mind, there are more formally structured aspects to Cognitive Studies at Cornell. These include campus-wide coordination of cognitive studies activities, and a range of courses, seminars, and specially organized and funded symposia and workshops in cognitive sciences. STUDENTS We invite inquiries from candidates from a wide range of backgrounds, including Psychology, Biology, Computer Science, Linguistics and Philosophy. As a standard Cornell requirement, every doctoral student must have two minors, at least one of which must be in an outside graduate field. The Cognitive Studies Program does not have its own Ph.D. program - it encourages students to register in an existing graduate field - but offers a minor that enables individual students to shape programs of interdisciplinary study in conjunction with their major fields. The goal is to give students much more than a superficial exposure to the goals and methodologies of disciplines other than their own, while recognizing that it is difficult for an individual to acquire deep expertise in all areas. Each program of study in Cognitive Studies is therefore based upon depth in one discipline coupled with an informed appreciation of ideas and tools selected from other disciplines. Students should apply to the Cornell Graduate School for admission into one of the participating departments. Applications can be submitted online (see http://www.gradschool.cornell.edu/grad/app-request.html). FINANCIAL SUPPORT University-sponsored fellowships, typically awarded on the basis of scholastic ability and promise of achievement, are available through many of the graduate fields. These fellowships usually cover full tuition and student health insurance, and provide a nine- or twelve-month living stipend of between $13,000 and $20,000. Subsequent multi-year support is often guaranteed through assistantships and/or fellowships. Cornell fellowships are received by 36 percent of entering doctoral students, 13 percent of entering M.A. and M.S. students, and 9 percent of entering students in professional master's programs. The application for university-sponsored fellowships is part of the application for admission; no additional form is needed. Another source of funding is through faculty members' research grants. Most Graduate Research Assistants at Cornell receive a stipend, a full tuition fellowship, and health insurance through Cornell's Student Health Insurance Plan (SHIP).
External fellowships, such as those from the Hughes Foundation or the National Science Foundation, are also available to entering graduate students. Additional information on the application processes can be found at http://www.gradschool.cornell.edu/grad/fellowships/exfellow.html PARTICIPATING FACULTY (Listings include department and graduate field of each member's primary appointment(s), followed by other graduate field memberships) * Kaushik Basu (Economics) - Political economy; knowledge and rationality; labor markets in developing economies; game theory * Lawrence Blume (Economics) - Evolutionary processes in markets and games * John Bowers (Linguistics) - Syntax and semantics of natural language and the relationship between the two * Richard Boyd (Philosophy; Science and Technology Studies) - Philosophy of science; philosophy of psychology; epistemology; philosophy of language; philosophy of mind * Claire Cardie (Computer Science) - Developing corpus-based techniques for understanding and extracting information from natural language texts * Marianella Casasola (Human Development, Latino Studies) - Aspects of infant cognitive development and early word learning and, in particular, the interaction between cognition and early language learning * Stephen Ceci (Human Development, Psychology) - Theories of intelligence; cognitive development; children and the law; children's testimonial competence * Morten Christiansen (Psychology) - Statistical learning of complex sequential structure; language acquisition and processing; neural network models of language and statistical learning; neurophysiological (ERP) measures of statistical learning and language; language evolution * Abigail Cohn (Linguistics, Asian Studies, Romance Studies) - Phonetics and phonology, and their interaction * Christopher Collins (Linguistics) - The syntax of African languages; the syntax of English; general issues in syntactic theory * Robert Constable (Computer Science; Dean for Computing and Information Science) - Type theory and automated reasoning * James Cutting (Psychology) - Perception of motion, depth, and layout; event perception; perception of art, cinema and pictures; structural and functional analyses of perceptual stimuli * Richard Darlington (Psychology, Education, Human Development, Public Affairs) - Psychometric theory and behavioral statistics; differential psychology * Timothy DeVoogd (Psychology; Neurobiology and Behavior) - Neural plasticity; neurobiology of avian learning; sex differences in neuroanatomy and behavior; brain evolution * Molly Diesing (Linguistics) - Syntax, and the interface between syntax and semantics * James Dunn (Education) - Human learning and memory; cognitive psychology; alternative educational systems; seniors and adult education; innovative technology transfer; history and systems of psychology * David Dunning (Psychology) - Social cognition: accuracy and error in self and social judgment, motivated reasoning, tacit inference processes in attitudes and stereotypes; psychology and the law: eyewitness identification * David Easley (Economics) - Economics of information; learning from endogenous data; market microstructure; evolution in games and markets * Shimon Edelman (Psychology, Computer Science; Director of Cognitive Studies Program) - Computational theories of visual representation and recognition; empiricist theories of language; bridging theoretical, behavioral and neurobiological approaches to the study of the brain * Melissa Ferguson (Psychology) - Automatic attitudes,
including their sensitivity and flexibility across situations and their impact on subsequent judgment and behavior; the interface of affect, knowledge accessibility, and motivation; social hypothesis testing and decision-making * David Field (Psychology) - Theories and models of sensory coding and visual processing; visual perception; emphasis on understanding the relations between the structure of the natural environment and the representation of that environment by sensory systems * Barbara Finlay (Psychology; Neurobiology and Behavior) - Development and evolution of the nervous system * James Gair (Professor Emeritus of Linguistics) - Linguistic universals and typology, particularly as they relate to universal grammar and linguistic (and mental) representations * Geraldine Gay (Communication, Education) - Cognitive and social issues for the design and use of interactive communication technologies * Thomas Gilovich (Psychology) - Everyday judgment and decision making; critical thinking and belief; egocentrism; optimism, pessimism, satisfaction, and regret; behavioral economics; gambling * Carl Ginet (Philosophy) - Metaphysics; epistemology; philosophy of mind; philosophy of language * Delia Graff (Philosophy) - Philosophy of language, and related areas, such as logic, metaphysics, epistemology, and the philosophy of mind * Bruce Halpern (Psychology; Neurobiology and Behavior) - Human olfaction; human taste and smell; effects of aging on chemosensory psychophysics * Joseph Halpern (Computer Science, Applied Mathematics) - Reasoning about knowledge and uncertainty; qualitative reasoning; (fault-tolerant) distributed computing; logic; game theory * Wayne Harbert (Linguistics, Germanic Studies, Medieval Studies) - Syntactic structures of the Germanic languages and Celtic languages and what they can reveal about the principles of syntactic organization operating in natural language * Ronald Harris-Warrick (Neurobiology and Behavior; Physiology) - Neuromodulation of neural networks; gene cloning of K+ channels * Alan Hedge (Design and Environmental Analysis; Environmental Toxicology) - Human factors and ergonomics; workplace design; indoor environmental quality (IEQ); intelligent buildings * Benjamin Hellie (Philosophy) - Consciousness; perception; predication; and the overlap between these phenomena * Harold Hodes (Philosophy) - Logic; metaphysics; philosophy of language, of mathematics, and of logic * Howard Howland (Neurobiology and Behavior; Physiology; Psychology; Zoology) - Photorefractive methods of determining focusing ability of infants and young children; high-order aberrations of the eye; and physiological optics in various species, particularly myopia and eye growth in chickens * Ronald Hoy (Neurobiology and Behavior; Entomology) - Animal communication; behavior genetics of invertebrates; regeneration and development in invertebrate nervous systems * Daniel Huttenlocher (Computer Science) - Computer vision, computational geometry, interactive document systems, electronic trading systems, and software development methodologies * Alice M. 
Isen (Management, Psychology) - Affect and cognition * Scott Johnson (Psychology, Human Development) - Visual perception; visual and cognitive development, especially in infancy; computational models of developmental processes; the nativist/empiricist debate, as it pertains to early cognitive and perceptual skills * Robert Johnston (Psychology; Neurobiology and Behavior) - Neural mechanisms of social recognition and memory (i.e., individual, kin, species, etc.); animal communication and social behavior; olfaction, chemical communication and pheromones; comparative cognition/cognitive ethology; evolution of human and animal behavior; hormones and behavior * Barbara Koslowski (Human Development, Psychology) - Cognitive development; scientific reasoning; conceptual development; problem solving and reasoning * Carol Krumhansl (Psychology, Music) - Human perception and cognition; cognitive processes in music perception and memory; experimental, computational, and neuropsychological approaches; music theory * Lillian Lee (Computer Science) - Natural language processing and machine learning * Christiane Linster (Neurobiology and Behavior; Biomedical Engineering) - Neural basis of sensory information processing, using olfaction as a model system * Barbara Lust (Human Development, Asian Studies, Linguistics, Psychology) - Language and mind, especially first language acquisition; linguistic theory of universal grammar; cognitive development * Michael Macy (Sociology) - Collective action; evolutionary game theory; deviance and social control; social psychology; social exchange theory; rational choice * Sally McConnell-Ginet (Linguistics; Feminist, Gender, and Sexuality Studies) - Formal approaches to natural language meaning, especially the syntax-semantics and the semantics-pragmatics interfaces; also work on language, gender, and sexuality interactions * Helene Mialet (Science and Technology Studies) - Sociology and anthropology of science; continental philosophy of science; cognition; notions of subjectivity; self-fashioning; relations between humans and machines; and processes of innovation, discovery and creativity in science/industry * Amanda Miller-Ockhuizen (Linguistics) - Phonetics; phonetics-phonology interface; African languages * Ulric Neisser (Emeritus Professor of Psychology) - Memory (especially recall of life events); and intelligence (especially IQ tests and their social significance) * Anil Nerode (Mathematics, Applied Mathematics, Computer Science) - Logic; recursive functions and computability; theoretical computer science; hybrid systems; multiple agent autonomous control theory * Kathleen O'Connor (Management) - Negotiation; effects of individual cognition and social context on negotiation performance; work group conflicts and decision-making * Michael Owren (Psychology; Neurobiology and Behavior) - Evolutionary psychology of sound, voice, and speech; nonhuman primate vocal communication; speech evolution * H. 
Kern Reeve (Neurobiology and Behavior) - Developing and testing biologically realistic models of the evolution of cooperation and conflict in animal societies * Elizabeth Adkins Regan (Psychology; Neurobiology and Behavior; Physiology) - Animal social behavior; hormones and behavior; neuroendocrine mechanisms of avian behavior; mate choice and preference * Richard Ripple (Education) - Educational psychology; psychology of adolescence; adult learning and development; the educational psychology of creativity * Steven Robertson (Human Development) - Understanding the emergence and transformation of behavioral organization in early development, its underlying mechanisms, and its functional significance for the fetus and infant * Mats Rooth (Linguistics) - Computational linguistics and natural language semantics * Carol Rosen (Linguistics, Romance Studies) - Helping to build a theory of universal grammar on a broad database; finding out what kinds of formalism can best reveal the regularities in languages * J. Edward Russo (Management) - Marketing; decision-making and decision aiding; consumer behavior; advertising; behavioral science in management * Dawn Schrader (Education; Feminist, Gender, and Sexuality Studies) - Lifespan developmental psychology, especially metacognition; moral, self and intellectual development in late adolescence and adulthood; the relationship between cognition and action; moral education * Bart Selman (Computer Science, Applied Mathematics, Systems Engineering) - Knowledge representation; reasoning and search; algorithms and complexity; planning; machine learning; cognitive science; software agents; connections between computational complexity and statistical physics * Yasuhiro Shirai (Asian Studies, East Asian Literature, Linguistics) - Crosslinguistic study of the acquisition of tense-aspect morphology, particularly of Japanese; typological study of tense-aspect systems; cognitive models of L2 acquisition and use, particularly the connectionist model * Sydney Shoemaker (Philosophy) - Metaphysics and the philosophy of mind * Richard Shore (Mathematics) - Analyzing the structures of relative complexity of computation of functions on the natural numbers * Michael Spivey (Psychology; Human Development; Neurobiology and Behavior) - Information integration, both within and between perceptual/cognitive systems; experimental and computational approaches to: visuolinguistic processing, language comprehension and acquisition, eye movements, visual attention * Zoltan Gendler Szabo (Philosophy) - Philosophy of language; metaphysics; formal semantics; pragmatics * Elise Temple (Human Development, Psychology) - Developmental cognitive neuroscience; exploring the brain mechanisms underlying cognition in the developing brain; focus on the brain mechanisms of reading and language using functional MRI * Francisco Valero-Cuevas (Aerospace Engineering, Biomedical Engineering, Mechanical Engineering) - Neuromuscular biomechanics and control; human and robotic manipulation; surgery simulation * Qi Wang (Human Development) - Development of autobiographical memory, self, and emotion knowledge, as well as their interactions * Elaine Wethington (Human Development, Sociology) - Stress and the protective mechanisms of social support * Jennifer Whiting (Philosophy, Classics) - Personal identity and concepts of the self (both ancient and modern), with special reference to moral psychology and psychopathology * John Whitman (Linguistics, Asian Studies, East Asian Literature) - The problem of 
language variation: its limits (how much specific subsystems can vary across languages) and predictors (what typological features co-occur systematically) * Stephen Wicker (Electrical and Computer Engineering; Applied Mathematics) - Wireless information networks; artificial intelligence; error control coding * Wendy Williams (Human Development) - Practical intelligence and tacit knowledge in children and adults; educational policy issues; creativity training * Ramin Zabih (Computer Science) - Computer vision; medical imaging * Draga Zec (Linguistics) - Phonological theory; a study of the principles that govern the patterning of sound in individual languages, as well as cross-linguistically Associate Members: * Richard Canfield (Nutritional Science) - Cognitive development and neurotoxicology in human infants and children * Susan Hertz (Linguistics) - Speech synthesis, both as an end in and of itself and as a vehicle to learn more about various aspects of speech, including timing patterns, language universals, perception, intonation, and the phonology-phonetics interface -------- Application materials can be obtained from the Cornell Graduate School (http://www.gradschool.cornell.edu/grad/app-request.html). The deadlines for completed application materials vary by field, the earliest being January 1, 2003. See individual field listings at the Graduate School web site for other dates. For more information about applying to the Cornell Graduate Program: http://www.gradschool.cornell.edu/grad/default.html. For more information about graduate study in Cornell's Cognitive Studies Program, please contact the Director of Graduate Studies and Director of the Program, Shimon Edelman (se37 at cornell.edu), or the Program Coordinator, Linda LeVan (cogst at cornell.edu). ----------------------------------------------------------------------- Shimon Edelman Professor, Dept. of Psychology, 232 Uris Hall Director, Cornell Cognitive Studies Program Cornell University, Ithaca, NY 14853-7601, USA Web: http://kybele.psych.cornell.edu/~edelman Rationalists do it by the rules. Empiricists do it to the rules. From terry at salk.edu Thu Nov 21 19:21:39 2002 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 21 Nov 2002 16:21:39 -0800 (PST) Subject: NEURAL COMPUTATION 14:12 In-Reply-To: <200211070018.gA70IvK12809@purkinje.salk.edu> Message-ID: <200211220021.gAM0LdU25878@purkinje.salk.edu> Neural Computation - Contents - Volume 14, Number 12 - December 1, 2002 REVIEW On Different Facets of Regularization Theory Zhe Chen and Simon Haykin NOTE Notes on Bell-Sejnowski PDF-Matching Neuron Simone Fiori LETTERS Biophysiologically Plausible Implementations of the Maximum Operation Angela J. Yu, Martin A. Giese and Tomaso A. Poggio Self-Regulation Mechanism of Temporally Asymmetric Hebbian Plasticity Narihisa Matsumoto and Masato Okada Associative Memory with Dynamic Synapses Lovorka Pantic, Joaquin J. Torres, Hilbert J. Kappen, Stan C.A.M. Gielen Adaptive Spatiotemporal Receptive Field Estimation in the Visual Pathway Garrett B. 
Stanley Global Convergence Rate of Recurrently Connected Neural Networks Tianping Chen, Wenlian Lu and Shun-ichi Amari Locality of Global Stochastic Interaction in Directed Acyclic Networks Nihat Ay On Unique Representations of Certain Dynamical Systems Produced by Continuous-Time Recurrent Neural Networks Masahiro Kimura Descartes' Rule of Signs for Radial Basis Function Neural Networks Michael Schmitt Approximation Bounds for Some Sparse Kernel Regression Algorithms Tong Zhang ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES
                 USA     Canada*    Other Countries
Student/Retired  $60     $64.20     $108
Individual       $95     $101.65    $143
Institution      $590    $631.30    $638
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From harris at cnel.ufl.edu Thu Nov 21 03:33:28 2002 From: harris at cnel.ufl.edu (John G. Harris) Date: Thu, 21 Nov 2002 03:33:28 -0500 Subject: Faculty Positions in Neural Engineering at the University of Florida In-Reply-To: <15833.31701.234159.97257@perm.feis.herts.ac.uk> Message-ID: Faculty Positions in Neural Engineering University of Florida Gainesville, FL The newly formed Biomedical Engineering Department at the University of Florida invites applications and nominations for faculty candidates at all levels starting Fall 2003. We are particularly interested in candidates in neural engineering, including computational neuroscience, neural imaging and recording, medical image/signal processing and rehabilitative engineering. BME faculty will likely develop collaborations with researchers and clinicians at the McKnight Brain Institute (www.mbi.ufl.edu), the UF College of Medicine, Shands Hospital at UF (the primary teaching hospital for the College of Medicine) and the Malcolm Randall VA Medical Center. For more information about the department and the available positions, please visit: www.bme.ufl.edu. Candidates should send curriculum vitae with the names of at least four references to: Dr. Frank Bova, Chair of Search Committee, Biomedical Engineering Department, University of Florida, P.O. Box 116131, Gainesville, Florida 32611-6131; e-mail: search at bme.ufl.edu; telephone: 352-392-9790. The University of Florida is an Affirmative Action, Equal Opportunity Employer, and women and minorities are encouraged to apply. -- John G. Harris Computational NeuroEngineering Lab (www.cnel.ufl.edu) University of Florida P.O. Box 116130 Gainesville, FL 32611-6130 harris at cnel.ufl.edu Phone: (352) 392-2652 From jgama at liacc.up.pt Thu Nov 21 05:21:16 2002 From: jgama at liacc.up.pt (João Gama) Date: Thu, 21 Nov 2002 10:21:16 +0000 Subject: CFP: IDA Special Issue on Adaptive Learning Systems References: <3DDA77FC.658842C2@sunderland.ac.uk> Message-ID: <3DDCB39C.B0B6C5AE@liacc.up.pt> ***************************************************************** CALL FOR PAPERS Intelligent Data Analysis - IOS Press SPECIAL ISSUE on INCREMENTAL LEARNING SYSTEMS CAPABLE OF DEALING WITH CONCEPT DRIFT ***************************************************************** Please distribute this announcement to all interested parties. Special issue Editors: Miroslav Kubat, University of Miami, USA João Gama, University of Porto, Portugal Paul Utgoff, University of Massachusetts, USA Suppose the existence of a concept description that has been induced from a set, T, of training examples. Suppose that later another set, T', of examples becomes available.
What is the most effective way to modify the concept so as to reflect the examples from T'? In many real-world learning problems, the data flows continuously and learning algorithms should be able to respond to this circumstance. The first requirement of such algorithms is thus incrementality, the ability to incorporate new information. If the process is not strictly stationary, the target concept could gradually change over time, a fact that should also be reflected in the current version of the induced concept description. The ability to react to concept drift can thus be viewed as a natural extension of incremental learning systems. These techniques can be useful for scaling up learning algorithms to very large datasets. Other types of problems where these techniques could be potentially useful include: user-modelling, control in dynamic environments, web-mining, time series, etc. Most evaluation methods for machine learning (e.g. cross-validation) assume that examples are independent and identically distributed. This assumption is clearly unrealistic in the presence of concept drift. How can we estimate the performance of learning systems under these constraints? The objective of the special issue is to present the current status of algorithms, applications, and evaluation methods for these problems. Relevant techniques include the following (but are not limited to): 1. Incremental, online, real-time, and any-time learning algorithms 2. Algorithms that learn in the presence of concept drift 3. Evaluation methods for dynamic instance distributions 4. Real-world applications that involve online learning 5. Theory on learning under concept drift. Submission Details: We expect full papers to describe original, previously unpublished research, be written in English, and not be simultaneously submitted for publication elsewhere (previous publication of partial results at workshops with informal proceedings is allowed). We may also consider the publication of high-quality surveys on these topics. Please submit a PostScript or PDF file of your paper to: jgama at liacc.up.pt Important Dates: Submission Deadline: 1 February 2003 Author Notification: 1 July 2003 Final Paper Deadline: 1 September 2003 Special Issue: _______________________________________________ From baolshausen at ucdavis.edu Fri Nov 22 01:14:27 2002 From: baolshausen at ucdavis.edu (Bruno Olshausen) Date: Thu, 21 Nov 2002 22:14:27 -0800 Subject: postdocs at RNI Message-ID: <3DDDCB43.E6ED1FEC@ucdavis.edu> Postdoctoral Fellowships in Theoretical Neuroscience Redwood Neuroscience Institute Menlo Park, California The Redwood Neuroscience Institute has several immediate openings for postdoctoral fellows with expertise in theoretical neuroscience. Areas of research include large-scale associative memory architectures, temporal sequence learning and prediction, models of thalamo-cortical and cortico-cortical feedback loops, and models of sensory representation. Postdoctoral fellows will work in collaboration with one or more members of the scientific staff at RNI. Candidates should send a CV and a 1-2 page statement of research interests, along with representative publications, to jobs at rni.org. RNI is a nonprofit research organization devoted to studying neural models of cognition and perception. PIs include Pentti Kanerva, Bruno Olshausen, Tony Bell, and Fritz Sommer.
For further information, please visit our website at http://www.rni.org, or arrange to speak with Bruno or Tony at the upcoming NIPS meeting (email: bolshausen or tbell @rni.org). -- Bruno A. Olshausen (650) 321-8282 x233 Redwood Neuroscience Institute (650) 321-8585 (fax) 1010 El Camino Real http://www.rni.org Menlo Park, CA 94025 & Center for Neuroscience (530) 757-8749 UC Davis (530) 757-8827 (fax) 1544 Newton Ct. baolshausen at ucdavis.edu Davis, CA 95616 http://redwood.ucdavis.edu/bruno From mhb0 at Lehigh.EDU Sat Nov 23 15:39:37 2002 From: mhb0 at Lehigh.EDU (Mark H. Bickhard) Date: Sat, 23 Nov 2002 15:39:37 -0500 Subject: ISI 2003 Second CFP Message-ID: <3DDFE788.67035A8C@lehigh.edu> Interactivist Summer Institute 2003 July 22-26, 2003 Botanical Auditorium Copenhagen, Denmark Join us in exploring the frontiers of understanding of life, mind, and cognition. There is a growing recognition - across many disciplines - that phenomena of life and mind, including cognition and representation, are emergents of far-from-equilibrium, interactive, autonomous systems. Mind and biology, mind and agent, are being re-united. The classical treatment of cognition and representation within a formalist framework of encodingist assumptions is widely recognized as a fruitless maze of blind alleys. From neurobiology to robotics, from cognitive science to philosophy of mind and language, dynamic and interactive alternatives are being explored. Dynamic systems approaches and autonomous agent research join in the effort. The interactivist model offers a theoretical approach to matters of life and mind, ranging from evolutionary- and neuro-biology - including the emergence of biological function - through representation, perception, motivation, memory, learning and development, emotions, consciousness, language, rationality, sociality, personality and psychopathology. This work has developed interfaces with studies of central nervous system functioning, the ontology of process, autonomous agents, philosophy of science, and all areas of psychology, philosophy, and cognitive science that address the person. The conference will involve both tutorials addressing central parts and aspects of the interactive model, and papers addressing current work of relevance to this general approach. This will be our second Summer Institute; the first was in 2001 at Lehigh University, Bethlehem, PA, USA. The intention is for this Summer Institute to become a traditional biennial meeting where those sharing the core ideas of interactivism will meet and discuss their work, try to reconstruct its historical roots, put forward current research in different fields that fits the interactivist framework, and define research topics for prospective graduate students.
People working in philosophy of mind, linguistics, social sciences, artificial intelligence, cognitive robotics, theoretical biology, and other fields related to the sciences of mind are invited to send their paper submission or statement of interest for participation to the organizers. http://www.lehigh.edu/~interact/isi2003/isi2003.html Mark -- Mark H. Bickhard Cognitive Science 17 Memorial Drive East Lehigh University Bethlehem, PA 18015 610-758-3633 mhb0 at lehigh.edu mark.bickhard at lehigh.edu http://www.lehigh.edu/~mhb0/mhb0.html From mhb0 at Lehigh.EDU Sat Nov 23 15:53:36 2002 From: mhb0 at Lehigh.EDU (Mark H. Bickhard) Date: Sat, 23 Nov 2002 15:53:36 -0500 Subject: CFP Epigenetic Robotics Message-ID: <3DDFEACF.1927F45C@lehigh.edu> EPIROB2003 -- Call for Papers Third International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems Organizing Committee: Luc Berthouze, Christopher G. Prince, Christian Balkenius, Daniel Bullock, Hideki Kozima, Georgi Stojanov er2003 at epigenetic-robotics.org http://www.epigenetic-robotics.org August 4th and 5th, 2003 Location: Boston, MA, USA (held after the Cognitive Science Society meeting) **** Deadline for Submission of Papers & Posters: 14 March 2003 **** This workshop focuses on combining developmental psychology and robotics and generally on: (a) the embodiment of the system; (b) its situatedness in a physical and social environment; (c) a prolonged developmental process through which varied and complex cognitive and perceptual structures emerge as a result of an embodied system interacting with a physical and social environment. Invited Speakers György Gergely (Institute for Psychological Research, Hungarian Academy of Sciences, Budapest, Hungary) Rod Grupen (Laboratory for Perceptual Robotics, University of Massachusetts Amherst, MA, USA) Deb Roy (Media Lab, MIT, USA) Submissions Papers not exceeding eight (8) pages should be submitted electronically (PDF or Postscript) as attachment files to Luc Berthouze (Luc.Berthouze at aist.go.jp). Extended abstracts (maximum two pages) can also be submitted, and will be presented as posters (extended abstracts should also be submitted in PDF or Postscript as attachments to Luc Berthouze (Luc.Berthouze at aist.go.jp)). Further instructions to authors will be posted on the workshop web page: http://www.epigenetic-robotics.org Publication of Papers Papers will be published in a proceedings and archived at CogPrints. From amos at infoeng.flinders.edu.au Mon Nov 25 00:41:11 2002 From: amos at infoeng.flinders.edu.au (Amos Omondi) Date: Mon, 25 Nov 2002 16:11:11 +1030 Subject: call for book chapters: Neural Net FPGAs Message-ID: <5.1.0.14.0.20021125161044.00b1fd48@mail.infoeng.flinders.edu.au> CALL FOR BOOK CHAPTERS FPGA Implementations of Neural Networks (Kluwer Academic Publishers, Boston, 2003) The development of neural networks has now reached the stage where they are employed in a large variety of practical contexts. However, to date the majority of such implementations have been in software.
While it is generally recognised that hardware implementations could, through performance (and other) advantages, greatly increase the use of neural networks, in the past the relatively high cost of developing ASICs has meant that only a small number of hardware neural-computing devices has gone beyond the research-prototype stage. Now, however, with the appearance of large, dense, highly parallel FPGA circuits, it has become possible to envisage the realization in hardware of large-scale neural networks, yielding high performance at low cost. Nevertheless, the many opportunities offered by FPGAs also come with many challenges. These range from the choice of data representation, to the implementation of specialized functions, through to the realization of massively parallel neural networks; and accompanying these are important secondary issues, such as benchmarking, development tools and technology transfer. All these issues are currently being investigated by a large number of researchers. The proposed book aims to capture the state of the art in this research. TOPICS Contributions, covering both original research and expository work, are invited on the following topics, in the context of neural networks realized in FPGAs. (Submissions on other closely related topics are also welcome.) Architectures (systolic arrays, SIMD, etc.) Neurocomputers (complete systems) Hardware accelerators Embedded systems Input/output Hybrid systems Reliability Benchmarking and metrics Massive parallelism Interconnection-network topologies Scalability Algorithm-to-architecture mapping Evolutionary computing Novel hardware algorithms Implementation of activation functions Data-representation formats Applications (biometrics, speech, imaging, information-retrieval, control, biomedical, bioinformatics, etc.) Development tools Technology transfer Case studies SUBMISSIONS AND SCHEDULE Submissions should be made to either of the editors, by 15 Feb 2003. They should be in either .ps or .pdf form and must be formatted, as book chapters, according to the publisher's style files, which will be found at http://www.wkap.nl/authors/bookstylefiles. Authors of accepted contributions will be expected to make final submissions by 1 May 2003. Prospective authors are encouraged to indicate their intent before 30 Dec 2002. EDITORS Amos Omondi School of Informatics and Engineering Flinders University Bedford Park, SA 5042 AUSTRALIA e-mail: amos at infoeng.flinders.edu.au Jagath Rajapakse School of Computer Engineering Nanyang Technological University SINGAPORE 639798 e-mail: asjagath at ntu.edu.sg From xmatumo at brain.riken.go.jp Thu Nov 28 01:39:22 2002 From: xmatumo at brain.riken.go.jp (Narihisa MATSUMOTO) Date: Thu, 28 Nov 2002 15:39:22 +0900 Subject: paper available: Self-Regulation Mechanism of TAH Message-ID: <4.3.2-J.20021128151303.04406ae8@smtp.brain.riken.go.jp> Apologies if you receive this e-mail multiple times. Dear colleagues, I would like to announce that the following paper is available on the web site: http://www.mns.brain.riken.go.jp/~xmatumo/paper/NComp02.pdf ``Self-Regulation Mechanism of Temporally Asymmetric Hebbian Plasticity'' by N. Matsumoto & M. Okada Neural Computation, vol. 14, no. 12, pp. 2883-2902, 2002 Abstract--------------------------------------------------------- Recent biological experimental findings have shown that synaptic plasticity depends on the relative timing of the pre- and postsynaptic spikes.
This determines whether long-term potentiation (LTP) or long-term depression (LTD) is induced. This synaptic plasticity has been called temporally asymmetric Hebbian plasticity (TAH). Many authors have numerically demonstrated that neural networks are capable of storing spatiotemporal patterns. However, the mathematical mechanism of the storage of spatiotemporal patterns is still unknown, and the effect of LTD in particular is poorly understood. In this article, we employ a simple neural network model and show that interference between LTP and LTD disappears in a sparse coding scheme. On the other hand, the covariance learning rule is known to be indispensable for the storage of sparse patterns. We also show that TAH has the same qualitative effect as the covariance rule when spatiotemporal patterns are embedded in the network. ---------------------------------------------------------------- A shorter version appears in Advances in Neural Information Processing Systems 14, pp. 245-252. Sincerely Yours, *********************************************************** Narihisa MATSUMOTO Junior Research Associate Lab. for Mathematical Neuroscience, RIKEN Brain Science Institute, Japan e-mail: xmatumo at brain.riken.go.jp URL: http://www.mns.brain.riken.go.jp/~xmatumo/index.html *********************************************************** From Nada.Lavrac at ijs.si Thu Nov 28 14:15:10 2002 From: Nada.Lavrac at ijs.si (Nada Lavrac) Date: Thu, 28 Nov 2002 20:15:10 +0100 Subject: DMLL: ML journal Special issue on Data Mining Lessons Learned Message-ID: <3DE66B3E.2FDD@ijs.si> Machine Learning Journal: Special Issue on Data Mining Lessons Learned http://www.hpl.hp.com/personal/Tom_Fawcett/DMLL-MLJ-CFP.html Guest editors: Nada Lavrac, Hiroshi Motoda and Tom Fawcett Submission deadline: Monday, 7 April, 2003. Call for Papers Data mining is concerned with finding interesting or valuable patterns in data. Many techniques have emerged for analyzing and visualizing large volumes of data, and what we see in the technical literature are mostly success stories of these techniques. We rarely hear of the steps leading to success, failed attempts, or critical representation choices made; and rarely do papers include expert evaluations of achieved results. Insightful analyses of successful and unsuccessful applications are crucial for increasing our understanding of machine learning techniques and their limitations. Challenge problems (such as the KDD Cup, COIL and PTE challenges) have become popular in recent years and have attracted numerous participants. These challenge problems usually involve a single difficult problem domain, and participants are evaluated by how well their entries satisfy a domain expert. The results of such challenges can be a useful source of feedback to the research community. At ICML-2002 a workshop on Data Mining Lessons Learned (http://www.hpl.hp.com/personal/Tom_Fawcett/DMLL-workshop.html) was held and was well attended. This special issue of the Machine Learning journal follows the main goals of that workshop, which are to gather experience from successful and unsuccessful data mining endeavors, and to extract the lessons learned from them. Goals The aim of this special issue is to collect the experience gained from data mining applications and challenge competitions. We are interested in lessons learned both from successes and from failures.
Authors are invited to report on experiences with challenge problems, on experiences in engineering representations for practical problems, and on interacting with experts evaluating solutions. We are also interested in why some particular solutions - despite good performance - were not used in practice, or required additional treatment before they could be used. An ideal contribution to this special issue would describe in sufficient detail one problem domain, either an application or a challenge problem. Contributions not desired for this special issue are papers that report marginal improvements over existing methods on synthetic or UCI data, with no expert evaluation. We offer the following content guidelines to authors. 1. For applications studies, we expect a description of the attempts that succeeded or failed, an analysis of the success or failure, and any steps that had to be taken to make the results practically useful (if they were). Ideally an article should support lessons with evidence, experimental or otherwise; and the lessons should generalize to a class of problems. 2. For challenge problems, we will accept either experiences preparing an individual entry or an analysis of a collection of entries. A collective study might analyze factors such as the features of successful approaches that made them appealing to experts. As with applications studies, such articles should support lessons with evidence, and preferably should generalize to a class of problems. Analyses should preferably shed light on why a certain class of method is best applicable to the type of problem addressed. 3. A submission may analyze methodological aspects from individual developments, or may analyze a subfield of machine learning or a set of data mining methods to uncover important and unknown properties of a class of methods or a field as a whole. Again, a paper should support lessons learned with appropriate evidence. We emphasize that articles to appear in this special issue must satisfy the high standards of the Machine Learning journal. Submissions will be evaluated on the following criteria: Novelty: How original is this lesson? Is this the first time this observation has been made, or has it appeared before? Generality: How widely applicable are the observations or conclusions made by this paper? Are they specific to a single project, a single domain, a class of domains, or much of data mining? Significance: How important are the lessons learned? Are they actionable? To what extent could they influence the directions of work in data mining? Support: How strong is the experimental evidence? Are the lessons drawn from a single project, a group of projects, or a thread of work in the community? Clarity: How clear is the paper? How clearly are the lessons expressed? The criteria for novelty, significance and clarity apply not only to the lessons but also to the paper as a whole. Submission Instructions Manuscripts for submission should be prepared according to the instructions at http://www.cs.ualberta.ca/~holte/mlj/ In preparing submissions, authors should follow the standard instructions for the Machine Learning journal at http://www.cs.ualberta.ca/~holte/mlj/initialsubmission.pdf Submissions should be sent via email to Hiroshi Motoda (motoda at ar.sanken.osaka-u.ac.jp), as well as to Kluwer Academic Publishers (jml at wkap.com). In the email please state very clearly that the submission is for the special issue on Data Mining Lessons Learned.
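[A note on evaluation, relating to the two calls above. Both touch on how learning systems are judged in practice; the IDA special issue earlier in this digest notes in particular that cross-validation's i.i.d. assumption is clearly unrealistic under concept drift. As a minimal, purely illustrative sketch of an evaluation protocol that does respect the temporal order of a data stream - each example is first used to test the current model and only then to train it ("test-then-train") - consider the following Python fragment. The perceptron learner, the windowed drift heuristic, and all names in it are illustrative assumptions, not part of either call:

    from collections import deque

    class OnlinePerceptron:
        """Tiny incremental linear classifier (labels +1/-1)."""
        def __init__(self, n_features, lr=0.1):
            self.w = [0.0] * n_features
            self.b = 0.0
            self.lr = lr

        def predict(self, x):
            s = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
            return 1 if s >= 0 else -1

        def update(self, x, y):
            # mistake-driven update: only change weights on an error
            if self.predict(x) != y:
                for i, xi in enumerate(x):
                    self.w[i] += self.lr * y * xi
                self.b += self.lr * y

    def prequential_run(stream, model, window=100, drift_factor=2.0):
        """Test-then-train over (x, y) pairs; flag possible drift when
        the recent windowed error is much worse than the long-run error
        (a crude heuristic, for illustration only)."""
        recent = deque(maxlen=window)
        errors = 0
        seen = 0
        for x, y in stream:
            mistake = int(model.predict(x) != y)   # test first ...
            model.update(x, y)                     # ... then train
            errors += mistake
            seen += 1
            recent.append(mistake)
            long_run = errors / seen
            if (len(recent) == window and long_run > 0
                    and sum(recent) / window > drift_factor * long_run):
                print("possible concept drift after %d examples" % seen)
        return errors / seen

Feeding the loop a stream whose labels flip halfway through will drive the recent-window error above the long-run error and trigger the drift warning, which is exactly the kind of non-i.i.d. behaviour the IDA call asks evaluation methods to handle.]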
From Johan.Suykens at esat.kuleuven.ac.be Fri Nov 29 09:12:13 2002 From: Johan.Suykens at esat.kuleuven.ac.be (Johan Suykens) Date: Fri, 29 Nov 2002 15:12:13 +0100 Subject: LS-SVMs: book announcement Message-ID: <3DE775BD.1080903@esat.kuleuven.ac.be> We are glad to announce the publication of a new book ************************************************************************* J.A.K. Suykens, T. Van Gestel, J. De Brabanter, B. De Moor, J. Vandewalle, Least Squares Support Vector Machines, World Scientific Pub. Co., Singapore, 2002 (ISBN 981-238-151-1) http://www.esat.kuleuven.ac.be/sista/lssvmlab/book.html ************************************************************************* This book focuses on Least Squares Support Vector Machines (LS-SVMs), which are reformulations of standard SVMs. LS-SVMs are closely related to regularization networks and Gaussian processes but additionally emphasize and exploit primal-dual interpretations from optimization theory. The authors explain the natural links between LS-SVM classifiers and kernel Fisher discriminant analysis. Bayesian inference of LS-SVM models is discussed, together with methods for imposing sparseness and employing robust statistics. The framework is further extended towards unsupervised learning by considering PCA and its kernel version as a one-class modelling problem. This leads to new primal-dual support vector machine formulations for kernel PCA and kernel CCA. Furthermore, LS-SVM formulations are given for recurrent networks and control. In general, support vector machines may pose heavy computational challenges for large data sets. For this purpose, a method of fixed size LS-SVM is proposed, where the estimation is done in the primal space in relation to Nyström sampling with active selection of support vectors. The methods are illustrated with several examples. Contents: Introduction Support vector machines Least squares support vector machines, links with Gaussian processes, regularization networks, and kernel FDA Bayesian inference for LS-SVM models Weighted versions and robust statistics Large scale problems: Nyström sampling, reduced set methods, basis formation and fixed size LS-SVM LS-SVM for unsupervised learning: support vector machine formulations for kernel PCA. Related methods of kernel CCA. LS-SVM for recurrent networks and control Illustrations and applications Readership: Graduate students and researchers in neural networks; machine learning; data-mining; signal processing; circuit, systems and control theory; pattern recognition; and statistics. Info: 308pp., Publication date: Nov.
2002, ISBN 981-238-151-1 Order information: World Scientific http://www.wspc.com/books/compsci/5089.html http://www.esat.kuleuven.ac.be/sista/lssvmlab/book.html Freely available LS-SVMlab software http://www.esat.kuleuven.ac.be/sista/lssvmlab/ under GNU General Public License [we apologize if you receive multiple copies of this message] From Johan.Suykens at esat.kuleuven.ac.be Fri Nov 29 10:06:14 2002 From: Johan.Suykens at esat.kuleuven.ac.be (Johan Suykens) Date: Fri, 29 Nov 2002 16:06:14 +0100 Subject: LS-SVMlab announcement Message-ID: <3DE78266.2040101@esat.kuleuven.ac.be> We are glad to announce ******************************************************** LS-SVMlab: Least Squares - Support Vector Machines Matlab/C Toolbox ******************************************************** Website: http://www.esat.kuleuven.ac.be/sista/lssvmlab/ Toolbox: Matlab LS-SVMlab1.4 - Linux and Windows Matlab/C code Basic and advanced versions Functional and object oriented interface Tutorial User's Guide (100pp.): Examples and demos Matlab functions with help Solving and handling: Classification, Regression Tuning, cross-validation, fast loo, receiver operating characteristic (ROC) curves Small and unbalanced data sets High dimensional input data Bayesian framework with three levels of inference Probabilistic interpretations, error bars hyperparameter selection, automatic relevance determination (ARD) input selection, model comparison Multi-class encoding/decoding Sparseness Robustness, robust weighting, robust cross-validation Time series prediction Fixed size LS-SVM, Nyström method, kernel principal component analysis (kPCA), ridge regression Unsupervised learning Large scale problems Related links, publications, presentations and book: http://www.esat.kuleuven.ac.be/sista/lssvmlab/ Contact: LS-SVMlab at esat.kuleuven.ac.be GNU General Public License: The LS-SVMlab software is made available for research purposes only under the GNU General Public License. LS-SVMlab software may not be used for commercial purposes without explicit written permission after contacting LS-SVMlab at esat.kuleuven.ac.be.
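[As an illustration of the core computation that the book and toolbox above are built around - training an LS-SVM classifier amounts to solving a single linear system (the KKT conditions) rather than a quadratic program - here is a minimal sketch in Python/numpy. It is emphatically not the LS-SVMlab API (the toolbox is Matlab/C); the function names, the RBF kernel choice, and the parameter values are all assumptions for illustration:

    import numpy as np

    def rbf_kernel(X1, X2, sigma=1.0):
        # Gaussian (RBF) kernel matrix between two sets of points
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_train(X, y, gamma=10.0, sigma=1.0):
        # Solve [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
        # where Omega_kl = y_k y_l K(x_k, x_l)
        n = len(y)
        Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = y
        A[1:, 0] = y
        A[1:, 1:] = Omega + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], np.ones(n)))
        sol = np.linalg.solve(A, rhs)
        return sol[0], sol[1:]       # bias b and support values alpha

    def lssvm_predict(X, y, alpha, b, Xnew, sigma=1.0):
        # y(x) = sign( sum_k alpha_k y_k K(x, x_k) + b )
        K = rbf_kernel(np.asarray(Xnew), np.asarray(X), sigma)
        return np.sign(K @ (alpha * y) + b)

    # Illustrative usage on two separable blobs:
    # X = np.vstack([np.random.randn(20, 2) - 2, np.random.randn(20, 2) + 2])
    # y = np.concatenate([-np.ones(20), np.ones(20)])
    # b, alpha = lssvm_train(X, y)
    # print(lssvm_predict(X, y, alpha, b, X))

The appeal of the least-squares formulation is visible here: one call to np.linalg.solve on an (n+1)x(n+1) system replaces the QP solver a standard SVM needs, at the price of losing sparseness (every training point receives a nonzero alpha) - which is exactly why the book devotes chapters to sparseness-imposing methods and fixed-size LS-SVM for large data sets.]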