From terry at salk.edu Thu Dec 1 17:02:35 2005
From: terry at salk.edu (Terry Sejnowski)
Date: Thu, 01 Dec 2005 14:02:35 -0800
Subject: Connectionists: UCSD Computational Neurobiology Graduate Training
Message-ID: 

DEADLINE: DECEMBER 15, 2005

COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM
Department of Biology - University of California, San Diego
http://www.biology.ucsd.edu/grad/CN_overview.html

The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics.

The three major themes in the training program are:

1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons. Using modern neuroanatomical, behavioral, neuropharmacological and electrophysiological techniques. Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI).

2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA).

3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs.
A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control.

Participating Faculty include:

* Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Science at UCSD
* Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology
* Darwin Berg (Neurobiology): Regulation of synaptic components: assembly and localization, function and long-term stability.
* Geoffrey Boynton (Salk Institute): Visual psychophysics; fMRI recordings from visual cortex.
* Gert Cauwenberghs (Biology): Neuromorphic engineering; analog VLSI chips; wireless recording and nanoscale instrumentation for neural systems; large-scale cortical modeling.
* EJ Chichilnisky (Salk Institute): Retinal multielectrode recording; neural coding; visual perception.
* Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms
* Virginia De Sa (Cognitive Science): Computational basis of perception and learning (both human and machine); multi-sensory integration and contextual influences
* Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research
* Dan Feldman (Biology): Cortical plasticity; spike-time-dependent synaptic plasticity; sensory coding in the whisker system.
* Marla Feller (Neurobiology): Mechanisms and function of spontaneous activity in the developing nervous system, including the retina, spinal cord, hippocampus and neocortex.
* Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation
* Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels
* David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity.
* William Kristan (Neurobiology): Computational neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion. Director, Neurosciences Graduate Program at UCSD
* Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies
* Scott Makeig (Institute for Neural Computation): Analysis of cognitive event-related brain dynamics and fMRI using time-frequency and Independent Component Analysis
* Javier Movellan (Institute for Neural Computation): Sensory fusion and learning algorithms for continuous stochastic systems
* Mikhail Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antennal lobe of insects
* Pamela Reinagel (Biology): Sensory and neural coding; natural scene statistics; recordings from the visual system of cats and rodents.
* Massimo Scanziani (Biology): Neural circuits in the somatosensory cortex; physiology of synaptic transmission; inhibitory mechanisms.
* Terrence Sejnowski (Salk Institute/Neurobiology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms.
Director, Institute for Neural Computation
* Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques
* Nicholas Spitzer (Neurobiology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of Neurobiology
* Charles Stevens (Salk Institute): Synaptic physiology; theoretical models of neuroanatomical scaling.
* Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters
* Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior
* Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms

Requests for application materials should be sent to the University of California, San Diego, Division of Biological Sciences 0348, Graduate Admissions Office, 9500 Gilman Drive, La Jolla, CA, 92093-0348, or to [gradprog at biomail.ucsd.edu]. The deadline for completed application materials, including letters of recommendation, is December 15, 2005. A preapplication is not required for the Computational Neurobiology Program. For more information about applying to the UCSD Biology Graduate Program, see:
http://www.biology.ucsd.edu/grad/admissions/index.html

From juergen at idsia.ch Thu Dec 1 11:31:40 2005
From: juergen at idsia.ch (Juergen Schmidhuber)
Date: Thu, 1 Dec 2005 17:31:40 +0100
Subject: Connectionists: postdoc @ IDSIA, Switzerland
Message-ID: <01c72f931e55e5957a3e6a9054f3140e@idsia.ch>

We are seeking an outstanding postdoc with experience or interest in topics such as sequence learning algorithms, adaptive robotics, recurrent neural networks (RNN), sequential active vision, hidden Markov models, dynamic Bayes nets and other Bayesian approaches, universal learning machines, Kolmogorov complexity / algorithmic information theory, artificial evolution (in particular RNN evolution), support vector machines (especially recurrent ones), reinforcement learning, and curiosity-driven learning. Goal: to advance the state of the art in sequence learning in general, and to build vision-based robots and other agents that learn to solve challenging tasks.

More: http://www.idsia.ch/~juergen/postdoc2006.html

JS

From maass at igi.tu-graz.ac.at Thu Dec 1 12:00:39 2005
From: maass at igi.tu-graz.ac.at (Wolfgang Maass)
Date: Thu, 01 Dec 2005 18:00:39 +0100
Subject: Connectionists: Feedback in neural circuits (OR: How to expand liquid computing)
Message-ID: <438F2C37.6010006@igi.tu-graz.ac.at>

The paper

Computational Aspects of Feedback in Neural Circuits
by Wolfgang Maass, Prashant Joshi, and Eduardo Sontag

is now available from the homepages of the authors. There will be a talk and poster on it at NIPS 2005 (under the title "Principles of real-time computing with feedback applied to cortical microcircuit models").

Abstract: It had previously been shown that generic cortical microcircuit models can perform complex real-time computations on continuous input streams, provided that these computations can be carried out with a rapidly fading memory.
We investigate in this article the computational capability of such circuits in the more realistic case where not only readout neurons, but in addition a few neurons within the circuit have been trained for specific tasks. This is essentially equivalent to the case where the output of trained readout neurons is fed back into the circuit. We show that this new model overcomes the limitation of a rapidly fading memory. In fact, we prove that in the idealized case without noise it can carry out any conceivable digital or analog computation on time-varying inputs. But even with noise the resulting computational model can perform a large class of biologically relevant real-time computations that require a non-fading memory. We demonstrate these computational implications of feedback both theoretically and through computer simulations of detailed cortical microcircuit models. We show that the application of simple learning procedures (such as linear regression or perceptron learning) enables such circuits, in spite of their complex inherent dynamics, to represent time over behaviorally relevant long time spans, to integrate evidence from incoming spike trains over longer periods of time, and to process new information contained in such spike trains in diverse ways according to the current internal state of the circuit. In particular we show that such generic cortical microcircuits with feedback provide a new model for working memory that is consistent with a large set of biological constraints. Although this article examines primarily the computational role of feedback in circuits of neurons, the mathematical principles on which its analysis is based apply to a large variety of dynamical systems. Hence they may also throw new light on the computational role of feedback in other complex biological dynamical systems, such as for example genetic regulatory networks. 
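The mechanism the abstract describes, feeding a trained readout back into a recurrent circuit so that a fading memory becomes a persistent one, can be illustrated with a toy echo-state-style simulation. This is not the authors' cortical microcircuit model; the network size, gains, and the constant-target task below are all illustrative assumptions.

```python
import numpy as np

# Toy illustration (not the authors' model): a random recurrent
# network in the fading-memory regime whose trained linear readout
# is fed back into the circuit, so it can hold a bit long after
# the input pulse has ended. All sizes and gains are made up.
rng = np.random.default_rng(0)
N = 200
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
w_in = rng.normal(0.0, 1.0, N)   # input weights
w_fb = rng.normal(0.0, 1.0, N)   # feedback weights

u = lambda t: 1.0 if t < 5 else 0.0   # brief input pulse
T = 300
target = np.ones(T)                   # task: hold +1 after the pulse

# "Teacher forcing": drive the feedback channel with the target
# signal, then fit the readout by plain linear regression (one of
# the simple learning procedures the abstract mentions).
x, X = np.zeros(N), []
for t in range(T):
    x = np.tanh(W @ x + w_in * u(t) + w_fb * target[t])
    X.append(x.copy())
X = np.array(X)
w_out = np.linalg.lstsq(X, target, rcond=None)[0]

# Closed loop: the readout output itself is now fed back.
x, ys = np.zeros(N), []
for t in range(T):
    y = w_out @ x
    x = np.tanh(W @ x + w_in * u(t) + w_fb * y)
    ys.append(float(w_out @ x))
print(np.mean(ys[-50:]))  # mean readout over the last 50 steps
```

Without the feedback term the state would relax back toward zero once the pulse ends; with it, the trained fixed point can persist, which is the non-fading memory the paper analyzes rigorously.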
-- 
Wolfgang Maass
Professor of Computer Science
Technische Universitaet Graz
http://www.igi.tugraz.at/maass/

From t.heskes at science.ru.nl Thu Dec 1 14:41:36 2005
From: t.heskes at science.ru.nl (Tom Heskes)
Date: Thu, 01 Dec 2005 20:41:36 +0100
Subject: Connectionists: Neurocomputing volume 69 (issues 4-6)
Message-ID: <438F51F0.6020607@science.ru.nl>

Neurocomputing Volume 69, Issues 4-6 (January 2006)

-------

FULL LENGTH PAPERS

Bifurcations in Morris-Lecar neuron model
Kunichika Tsumoto, Hiroyuki Kitajima, Tetsuya Yoshinaga, Kazuyuki Aihara and Hiroshi Kawakami

Using temporal binding for hierarchical recruitment of conjunctive concepts over delayed lines
Cengiz Günay and Anthony S. Maida

Adaptive conjugate gradient algorithm for perceptron training
G. Nagaraja and R.P. Jagadeesh Chandra Bose

Tumor tissue identification based on gene expression data using DWT feature extraction and PNN classifier
Guangmin Sun, Xiaoying Dong and Guandong Xu

Mathematical modeling and computational analysis of neuronal cell images: Application to dendritic arborization of Golgi-impregnated neurons in dorsal horns of the rat spinal cord
D. Ristanović, B.D. Stefanović, N.T. Milošević, M. Grgurević and J.B. Stanković

Exponential stability and periodic oscillatory of bi-directional associative memory neural network involving delays
Hongyong Zhao

Time-series prediction using a local linear wavelet neural network
Yuehui Chen, Bo Yang and Jiwen Dong

Locally recurrent neural networks for long-term wind speed and power prediction
T.G. Barbounis and J.B. Theocharis

Separation of water artifacts in 2D NOESY protein spectra using congruent matrix pencils
K. Stadlthanner, A.M. Tomé, F.J. Theis, E.W. Lang, W. Gronwald and H.R.
Kalbitzer

Dynamic temperature modeling of continuous annealing furnace using GGAP-RBF neural network
Shaoyuan Li, Qing Chen and Guang-Bin Huang

Biologically motivated vergence control system using human-like selective attention model
Sang-Bok Choi, Bum-Soo Jung, Sang-Woo Ban, Hirotaka Niitsuma and Minho Lee

Local regularization assisted orthogonal least squares regression
S. Chen

A new approach to fuzzy classifier systems and its application in self-generating neuro-fuzzy systems
Mu-Chun Su, Chien-Hsing Chou, Eugene Lai and Jonathan Lee

-------

JOURNAL SITE: http://www.elsevier.com/locate/neucom
SCIENCE DIRECT: http://www.sciencedirect.com/science/issue/5660-2006-999309995-612478

From baolshausen at berkeley.edu Fri Dec 2 15:28:05 2005
From: baolshausen at berkeley.edu (Bruno Olshausen)
Date: Fri, 02 Dec 2005 12:28:05 -0800
Subject: Connectionists: Graduate program in neuroscience - UC Berkeley
Message-ID: <4390AE55.60102@berkeley.edu>

GRADUATE PROGRAM IN NEUROSCIENCE - UC BERKELEY

** Application deadline: December 15, 2005 **

The Graduate Program in Neuroscience at UC Berkeley is currently accepting applications for admission for the 2006-2007 academic year. There are numerous opportunities for students interested in focusing on computational and theoretical approaches within the context of an interdisciplinary neuroscience training program.
Faculty supporting this area include:

Martin Banks - Visual space perception, psychophysics, virtual reality
Jose Carmena - Brain-machine interfaces, sensorimotor control, learning
Yang Dan - Information processing in thalamus and cortex
Jack Gallant - Neural mechanisms of visual form perception and attention
Tom Griffiths - Computational models of cognition
Stanley Klein - Computational models of spatial vision, psychophysics
Harold Lecar - Theoretical biophysics, network models
Bruno Olshausen - Models of visual cortex, scene analysis
Fritz Sommer - Network models of associative memory and learning
Frederic Theunissen - Neural mechanisms of complex sound recognition
Frank Werblin - Information processing in the retina

Faculty in other programs also pursuing computational/theoretical approaches to neuroscience questions include:

Michael Gastpar, EECS - Neural coding, information theory
Jitendra Malik, EECS - Models of early vision and object recognition
Alva Noe, Philosophy - Theories of perception and sensorimotor loops
Lokendra Shastri, ICSI - Models of episodic memory in hippocampus
Bin Yu, Statistics - Neural coding, image statistics

In addition, the newly established Redwood Center for Theoretical Neuroscience provides a central workspace for theoreticians, organizes a weekly seminar series and workshops, and hosts visiting scholars. See http://redwood.berkeley.edu

For further information and details of the application process see http://neuroscience.berkeley.edu/grad.php and please note the application deadline above.

-- Bruno A.
Olshausen
Director, Redwood Center for Theoretical Neuroscience and
Associate Professor, Helen Wills Neuroscience Institute and School of Optometry, UC Berkeley
132 Barker Hall, #3190, Berkeley, CA 94720-3190
(510) 643-1472 / 4952 (fax)
http://redwood.berkeley.edu

From ccchow at pitt.edu Fri Dec 2 10:42:59 2005
From: ccchow at pitt.edu (Carson Chow)
Date: Fri, 2 Dec 2005 10:42:59 -0500
Subject: Connectionists: Positions at NIH
Message-ID: <5c7190cc1c09fed9fd20c51a9a9835c1@pitt.edu>

The National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institutes of Health (NIH), Department of Health and Human Services, invites applications for tenured or tenure-track positions in the Laboratory of Biological Modeling. The Laboratory is currently comprised of scientists who use computational approaches to understand cell biological and physiological systems. Specific areas of research interest will include mathematical modeling at the subcellular, cellular, tissue and system levels. Excellent computational facilities and resources for rapid achievement of research goals are available. LBM is in close proximity in particular to the NIDDK Computational Chemistry Core Facility, engaged in molecular modeling. The position offers unparalleled opportunities for interdisciplinary collaboration within NIDDK and throughout NIH. For further information about NIDDK, see http://www.niddk.nih.gov. Candidates must have a Ph.D., M.D., or equivalent degree in the physical or biomedical sciences. He or she should have an outstanding record of research accomplishments in mathematical modeling and will be expected to propose and pursue an innovative and independent research program. Applicants should send a curriculum vitae and list of publications, copies of three major publications, a plan for future research, and three letters of reference to Dr.
Robert Tycko, Chair of the Search Committee, Laboratory of Chemical Physics, Building 5, Rm 112, 5 Memorial Drive, NIH, Bethesda, MD 20892-0520, tel: 301-402-8272, fax: 301-496-0825, email: tycko at helix.nih.gov. (A closing date has not been set, but it would be best to apply before the end of January, 2006.)

HHS and NIH are Equal Opportunity Employers.

Position Description:

The successful candidate will establish an independent group with research interests focused on mathematical modeling at the subcellular, cellular, tissue, or organism levels. Other members of the Laboratory of Biological Modeling (http://lbm.niddk.nih.gov) conduct basic research on a wide variety of topics including insulin secretion (A. Sherman), insulin action (A. Sherman, C. Chow), metabolism (K. Hall), adipocyte differentiation (V. Periwal), calcium homeostasis (A. Sherman), and neuroscience (C. Chow), all relevant to diabetes and obesity. Interaction is expected with experimental laboratories or other computational groups in NIDDK or other NIH institutes.

From Dave_Touretzky at cs.cmu.edu Sat Dec 3 02:27:07 2005
From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu)
Date: Sat, 03 Dec 2005 02:27:07 -0500
Subject: Connectionists: graduate training in the neural basis of cognition
Message-ID: <405.1133594827@ammon.boltz.cs.cmu.edu>

Graduate Training at the Center for the Neural Basis of Cognition

The Center for the Neural Basis of Cognition offers an interdisciplinary doctoral training program operated jointly with eleven affiliated PhD programs at Carnegie Mellon University and the University of Pittsburgh.
Detailed information about this program is available on our web site at http://www.cnbc.cmu.edu The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine. CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies. Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, statistics, or robotics. For more information about the program, and to obtain application materials, visit our web site at www.cnbc.cmu.edu, or contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. 
Fax: (412) 268-5060
email: cnbc-admissions at cnbc.cmu.edu
Web: http://www.cnbc.cmu.edu

The affiliated PhD programs at the two universities are:

Carnegie Mellon: Biological Sciences, Biomedical Engineering, Computer Science, Computational & Statistical Learning, Psychology, Robotics, Statistics
University of Pittsburgh: BioEngineering, Mathematics, Neuroscience, Psychology

The CNBC training faculty includes:

Eric Ahrens (CMU Biology): MRI studies of the vertebrate nervous system
Susan Amara (Pitt Neurobiology): neurotransmitter transport and binding
John Anderson (CMU Psychology): models of human cognition
Galia Avidan (CMU Psychology): fMRI studies of object and face recognition
German Barrionuevo (Pitt Neuroscience): hippocampus and prefrontal cortex
Alison Barth (CMU Biology): molecular basis of plasticity in neocortex
Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex
Guoqiang Bi (Pitt Neuroscience): activity-dependent synaptic modification
J. Patrick Card (Pitt Neuroscience): transneuronal tracing of neural circuits
Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving
Carol Colby (Pitt Neuroscience): spatial representations in primate parietal cortex
Justin Crowley (CMU Biology): development of visual cortex
Tracy Cui (Pitt BioEngineering): biosensors, neural microelectrode arrays
Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease
William Eddy (CMU Statistics): analysis of fMRI data
Bard Ermentrout (Pitt Mathematics): oscillations in neural systems
Julie Fiez (Pitt Psychology): fMRI studies of language
Neeraj Gandhi (Pitt Neuroscience): neural control of movement
Chris Genovese (CMU Statistics): making inferences from scientific data
Lori Holt (CMU Psychology): mechanisms of auditory and speech perception
John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia
Satish Iyengar (Pitt Statistics): spike train data analysis
Jon Johnson (Pitt Neuroscience): ligand-gated ion channels; NMDA receptor
Marcel Just (CMU Psychology): visual thinking, language comprehension
Karl Kandler (Pitt Neurobiology): neural development; inhibitory pathways
Robert Kass (CMU Statistics): transmission of information by collections of neurons
Seong-Gi Kim (Pitt Neurobiology): technology and biophysics of fMRI
Roberta Klatzky (CMU Psychology): human perception and cognition
Richard Koerber (Pitt Neurobiology): development and plasticity of spinal networks
Tai Sing Lee (CMU Computer Science): primate visual cortex; computer vision
Michael Lewicki (CMU Computer Science): learning and representation
David Lewis (Pitt Neuroscience): anatomy of frontal cortex
Beatriz Luna (Pitt Psychology): developmental psychology and fMRI
Brian MacWhinney (CMU Psychology): models of language acquisition
Yoky Matsuoka (CMU Robotics): human motor control and motor learning
James McClelland (CMU Psychology): connectionist models of cognition
Steve Meriney (Pitt Neuroscience): mechanisms of synaptic plasticity
Nancy Minshew (Pitt Neurobiology): cognitive and neural basis of autism
Tom Mitchell (CMU Computer Science): machine learning with application to fMRI
Bita Moghaddam (Pitt Neuroscience): prefrontal cortex and psychiatric disorders
Paula Monaghan-Nichols (Pitt Neurobiology): genetic analysis of vertebrate CNS development
Carl Olson (CNBC): spatial representations in primate frontal cortex
Charles Perfetti (Pitt Psychology): language and reading processes
David Plaut (CMU Psychology): connectionist models of reading
Michael Pogue-Geile (Pitt Psychology): development of schizophrenia
Lynne Reder (CMU Psychology): models of memory and cognitive processing
Erik Reichle (Pitt Psychology): attention and eye movements in reading
Jonathan Rubin (Pitt Mathematics): analysis of systems of coupled neurons
Walter Schneider (Pitt Psychology): fMRI, models of attention & skill acquisition
Andrew Schwartz (Pitt BioEngineering): motor control, neural prostheses
Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system
Greg Siegle (Pitt Psychology): emotion and cognition; cognitive modeling
Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex
Marc Sommer (Pitt Neuroscience): neural circuitry controlling eye movements
Peter Strick (Pitt Neurobiology): motor control; basal ganglia and cerebellum
Floh Thiels (Pitt Neuroscience): LTP and LTD in hippocampus
Erik Thiessen (Pitt Psychology): child language development
Natasha Tokowicz (Pitt Psychology): language learning; bilingualism
David Touretzky (CMU Computer Science): hippocampal modeling, cognitive robotics
Nathan Urban (CMU Biology): circuitry of the olfactory bulb
Valerie Ventura (CMU Statistics): structure of neural firing patterns
Mark Wheeler (Pitt Psychology): fMRI studies of memory and cognition
Nick Yeung (CMU Psychology): neural mechanisms of attention

Please see http://www.cnbc.cmu.edu for further details.
From matthias.hein at tuebingen.mpg.de Sat Dec 3 11:30:58 2005
From: matthias.hein at tuebingen.mpg.de (Matthias Hein)
Date: Sat, 3 Dec 2005 17:30:58 +0100
Subject: Connectionists: PhD-studentship in Learning Theory
Message-ID: <008301c5f826$f0e2ac80$1d28260a@kongo>

PhD studentship in Learning Theory / Learning Algorithms at the MPI for Biological Cybernetics

A position for a PhD studentship in Learning Theory / Learning Algorithms is available in B. Schölkopf's Empirical Inference department at the Max Planck Institute in Tuebingen, Germany (see http://www.kyb.tuebingen.mpg.de/bs). We invite applications from candidates with an outstanding academic record, in particular a strong mathematical background. Max Planck Institutes are publicly funded research labs with an emphasis on excellence in basic research. Tuebingen is a university town in southern Germany; see http://www.tuebingen.de/kultur/english/index.html for some pictures.

If you are interested and you are attending NIPS 2005, please contact Matthias Hein (mh at tuebingen.mpg.de) so that we can arrange an informal interview there. Otherwise please send inquiries and applications, including a CV (with complete lists of marks, copies of transcripts, etc.) and a short statement of research interests, to sabrina.nielebock at tuebingen.mpg.de or

Sabrina Nielebock
Max Planck Institute for Biological Cybernetics
Spemannstr. 38
72076 Tuebingen
Germany
Tel. +49 7071 601 551
Fax +49 7071 601 552

In addition, please arrange for two letters of reference to be sent directly to the address above. Applications will be considered immediately and until the position is filled.
*************************************************
Matthias Hein
Max-Planck-Institute for Biological Cybernetics
Spemannstrasse 38
72076 Tübingen
Tel: 07071 - 601 559
Fax: 07071 - 601 552
mail: matthias.hein at tuebingen.mpg.de
*************************************************

From jdc at Princeton.EDU Tue Dec 6 20:22:55 2005
From: jdc at Princeton.EDU (Jonathan D. Cohen)
Date: Tue, 6 Dec 2005 20:22:55 -0500
Subject: Connectionists: Faculty Position at new Princeton University Institute
Message-ID: <824EDA80-CEF4-4A19-8C0F-3CCB48DE47EA@princeton.edu>

Princeton University is seeking to make the first of several anticipated new faculty appointments in neuroscience, as part of its new Institute in this area and its growing focus on quantitative approaches to understanding neural coding and dynamics at the systems level. The position is for an Assistant Professor, to begin in September 2006, for a theorist in systems and/or cognitive neuroscience. The appointment will be joint between the Institute and a department appropriate to the individual's background and interests, with possibilities including (but not limited to) Psychology, Molecular Biology, Mathematics, Physics, Electrical Engineering or Computer Science. Applicants should be prepared to teach both an undergraduate and a graduate level course in neuroscience.

Please send a curriculum vitae, a one-page research description, and three letters of recommendation to the Search Committee, Neuroscience Institute, Princeton University, Princeton, NJ 08544, or by email to search at neuroscience.princeton.edu. Materials should be submitted as soon as possible. Applications will be considered on a rolling basis, and the search will remain open until the position is filled. Princeton is an equal opportunity, affirmative action employer.
For information about applying to Princeton and how to self-identify, please link to http://web.princeton.edu/sites/dof/ApplicantsInfo.htm

From paul.cisek at umontreal.ca Wed Dec 7 11:47:36 2005
From: paul.cisek at umontreal.ca (Paul Cisek)
Date: Wed, 7 Dec 2005 11:47:36 -0500
Subject: Connectionists: International Symposium on Computational Neuroscience - Montreal, Canada, May 8-9, 2006
Message-ID: <006a01c5fb4d$ed9b03e0$2de4cc84@Engram>

FIRST ANNOUNCEMENT AND CALL FOR POSTERS
-------------------------------------------------------------------
XXVIIIth International Symposium
COMPUTATIONAL NEUROSCIENCE: From theory to neurons and back again
May 8-9, 2006
University of Montréal
Montréal, Québec, Canada
-------------------------------------------------------------------

The 28th International Symposium of the Groupe de recherche sur le système nerveux central et le Centre de recherche en sciences neurologiques will be held on May 8-9, 2006, at the University of Montréal. The objectives of this symposium are to illustrate the power and utility of computational approaches to address fundamental issues of brain function, from the level of single cells to that of large systems, as well as to discuss how computational and more traditional physiological methods complement one another. The symposium will include presentations on computational models of sensory and motor systems, learning processes, and information coding.

Registration is now open. Please visit http://www.grsnc.umontreal.ca/XXVIIIs/ for information.

Submissions are invited for a limited number of poster presentations. Authors of select posters will be invited to contribute a short chapter to a special issue of the book series Progress in Brain Research. Deadline for poster submissions: Friday, March 31, 2006.
Conference speakers:
Larry Abbott, Yoshua Bengio, Catherine Carr, Paul Cisek, Simon Giszter, Sten Grillner, Stephen Grossberg, Geoffrey Hinton, Len Maler, Eve Marder, James McClelland, David McCrea, Bruce McNaughton, Alexandre Pouget, Stephen Scott, Michael Shadlen, Reza Shadmehr, Robert Shapley, Daniel Wolpert

Sponsors:
Canadian Institute for Advanced Research (CIAR)
Canadian Institutes of Health Research (CIHR)
Groupe de recherche sur le système nerveux central (GRSNC)
Fonds de la recherche en santé du Québec (FRSQ)
Université de Montréal (CEDAR)

From rsun at rpi.edu Sat Dec 3 11:04:52 2005
From: rsun at rpi.edu (Professor Ron Sun)
Date: Sat, 3 Dec 2005 11:04:52 -0500
Subject: Connectionists: Ph.D program in Cognitive Science at RPI
Message-ID: <8D291EEE-03D7-4372-88FB-BEBD7C1E72C3@rpi.edu>

I am looking for a few Ph.D students. The Ph.D program of the Cognitive Science department at RPI is accepting applications. Graduate assistantships and other forms of financial support for graduate students are available. Prospective graduate students with interests in Cognitive Science, especially in learning and skill acquisition and in the relationship between cognition and sociality, are encouraged to apply. Prospective applicants should have a background in computer science (the equivalent of a BS in computer science) and some prior exposure to psychology, artificial intelligence, connectionist models (neural networks), multi-agent systems, and other related areas. Students who have already completed a Master's degree are preferred.

RPI is a top-tier research university. The CogSci department has identified the Ph.D program and research as its primary missions. The department is conducting research in a number of areas: cognitive modeling, human and machine learning, multi-agent interactions and social simulation, neural networks and connectionist models, human and machine reasoning, cognitive engineering, and so on.
See the Web page below regarding my research: http://www.cogsci.rpi.edu/~rsun For the application procedure, see http://www.cogsci.rpi.edu/ The application deadline is Jan. 15, 2006. If you decide to apply, follow the official procedure as outlined on the Web page. Send me a short email (in plain text) AFTER you have completed the application. ======================================================== Professor Ron Sun Cognitive Science Department Rensselaer Polytechnic Institute 110 Eighth Street, Carnegie 302A Troy, NY 12180, USA phone: 518-276-3409 fax: 518-276-3017 email: rsun at rpi.edu web: http://www.cogsci.rpi.edu/~rsun ======================================================= From sethu.vijayakumar at ed.ac.uk Wed Dec 7 09:28:25 2005 From: sethu.vijayakumar at ed.ac.uk (Sethu Vijayakumar) Date: Wed, 07 Dec 2005 14:28:25 +0000 Subject: Connectionists: Preprint: Incremental Online Learning in High Dimensions Message-ID: <4396F189.5030106@ed.ac.uk> The following paper is available for download from: http://homepages.inf.ed.ac.uk/svijayak/publications/vijayakumar-NeuCom2005.pdf or as a featured (free) article from the MIT Press website: http://mitpress.mit.edu/catalog/item/default.asp?sid=22065875-6E38-4AAA-BB11-E00879BDE665&ttype=4&tid=31 Incremental Online Learning in High Dimensions, Neural Computation, vol. 17, no. 12, pp. 2602-2634 (2005) Locally weighted projection regression (LWPR) is a new algorithm for incremental nonlinear function approximation in high-dimensional spaces with redundant and irrelevant input dimensions. At its core, it employs nonparametric regression with locally linear models. In order to stay computationally efficient and numerically robust, each local model performs the regression analysis with a small number of univariate regressions in selected directions in input space, in the spirit of partial least squares regression. 
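The local-model idea sketched in the abstract above — a linear fit whose influence is weighted by a Gaussian receptive field around a centre — can be illustrated in a few lines. This is a toy sketch of locally weighted linear regression with made-up data and a single local model, not the authors' LWPR implementation (which adds PLS projections, incremental updates and kernel adaptation):

```python
import numpy as np

def locally_weighted_predict(X, y, x_query, center, width=1.0):
    """Predict at x_query with one local linear model whose Gaussian
    receptive field is centred at `center`. Toy sketch, not LWPR itself."""
    # Gaussian weights: points near the centre dominate the local fit.
    d2 = np.sum((X - center) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / width ** 2)
    # Weighted least squares for the local linear model (with bias term).
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    W = np.diag(w)
    beta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
    return np.append(x_query, 1.0) @ beta

# Assumed toy 1-D data: y = sin(x) + noise
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
pred = locally_weighted_predict(X, y, np.array([0.5]), center=np.array([0.5]), width=0.5)
# pred is close to sin(0.5), even though the global function is nonlinear
```

The Gaussian weighting is what lets a purely linear model approximate a nonlinear function: each local model only has to be right inside its own receptive field.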
We discuss when and how local learning techniques can successfully work in high-dimensional spaces, and review the various techniques for local dimensionality reduction before finally deriving the LWPR algorithm. The properties of LWPR are that it (1) learns rapidly with second-order learning methods based on incremental training, (2) uses statistically sound stochastic leave-one-out cross validation for learning without the need to memorize training data, (3) adjusts its weighting kernels based only on local information in order to minimize the danger of negative interference during incremental learning, (4) has a computational complexity that is linear in the number of inputs, and (5) can deal with a large number of (possibly redundant) inputs, as shown in various empirical evaluations with up to 90-dimensional data sets. For a probabilistic interpretation, predictive variance and confidence intervals are derived. To our knowledge, LWPR is the first truly incremental spatially localized learning method that can successfully and efficiently operate in very high-dimensional spaces. A software (MATLAB/C++) implementation of the LWPR algorithm can be found at: http://homepages.inf.ed.ac.uk/svijayak/software/LWPR/ -- ------------------------------------------------------------------ Sethu Vijayakumar, Ph.D. 
Assistant Professor (UK Lecturer) Director, IPAB, School of Informatics, The University of Edinburgh 2107F JCMB, The King's Buildings, Edinburgh EH9 3JZ, United Kingdom URL: http://homepages.inf.ed.ac.uk/svijayak Ph: +44(0)131 651 3444 SLMC Research Group URL: http://www.ipab.informatics.ed.ac.uk/slmc ------------------------------------------------------------------ Adjunct Assistant Professor, Department of Computer Science, University of Southern California ------------------------------------------------------------------ From te at ecs.soton.ac.uk Tue Dec 6 03:36:10 2005 From: te at ecs.soton.ac.uk (Terry Elliott) Date: Tue, 6 Dec 2005 08:36:10 +0000 (GMT) Subject: Connectionists: PhD studentship Message-ID: University of Southampton School of Electronics and Computer Science PhD Studentship in Computational Neurobiology Applications are invited for a PhD studentship in the general area of Computational Neurobiology, with particular emphasis on mathematical and computational models of synaptic plasticity and neuronal development, under the supervision of Dr Terry Elliott. The successful applicant will join the newly established Science and Engineering of Natural Systems (SENSe) group within the School. The studentship is funded for UK and EU students at the rate of £12,000 per annum plus fees. The School is the largest of its kind in the UK, and was rated 5* (the top rating) for both Computer Science and Electronics in the last Research Assessment Exercise. The School hosts several major national and European research centres and projects, with funding from numerous sources, including EPSRC, DTI, EU and industry. Applicants should hold, or expect to obtain, a first class or upper second class degree in a highly numerate discipline such as mathematics or physics, although strong applicants from other appropriate subjects will also be considered. 
Initial, informal inquiries may be addressed to Dr Terry Elliott, e-mail: te at ecs dot soton dot ac dot uk Further details, including application forms, can be found at: www.ecs.soton.ac.uk/admissions/ or can be obtained by contacting: The Postgraduate Admissions Tutor School of Electronics and Computer Science University of Southampton Highfield Southampton SO17 1BJ UK. E-mail: PhD-Admissions at ecs.soton.ac.uk Tel: +44 (0) 23 8059 2882 Completed application forms should be returned by 30 April, 2006 for a preferred starting date of 1 October, 2006. From chenyu6 at gmail.com Thu Dec 8 21:30:16 2005 From: chenyu6 at gmail.com (Chen Yu) Date: Thu, 8 Dec 2005 21:30:16 -0500 Subject: Connectionists: Postdoc position at Indiana University Message-ID: Ad #1: POSTDOCTORAL POSITION IN MACHINE LEARNING AND COMPUTER VISION The following postdoc position in machine learning and computer vision, as applied to visual expertise and visual perception, is available at Indiana University, Program in Cognitive Science. Job Title: Postdoctoral Research Associate Job Location: Department of Psychological and Brain Sciences Indiana University Bloomington, IN Closing Date: Application review will begin January 15th. Applications will be considered until the position is filled. The focus of this project will be on using probabilistic modeling and machine learning techniques applied to visual data to infer the processes underlying the development of visual expertise. The successful applicant will have excellent programming skills, experience with C++ and Matlab, and a background in computer vision or machine learning. This project is part of a collaboration between Dr. Richard Shiffrin, Dr. Thomas Busey and Dr. Chen Yu at Indiana University, and is funded through the National Institute of Justice and the National Institutes of Health. This position is available for two years. 
Candidates should send a letter, curriculum vitae, reprints, and names of three referees to (electronic submission preferred): Thomas Busey, PhD Associate Professor Department of Psychological and Brain Sciences, and Program in Cognitive Science Indiana University, Bloomington 1101 E. 10th St Bloomington, IN, 47405 (812) 855-4261 busey at indiana.edu www.indiana.edu/~busey Indiana University is an Affirmative Action/Equal Opportunity employer. Applicants need not be US citizens, and women and minority candidates are especially encouraged to apply. Ad #2: POSTDOCTORAL POSITION IN MACHINE LEARNING AND COMPUTER VISION The following postdoc position in machine learning and computer vision, as applied to expertise in fingerprint identification, is available at Indiana University, Program in Cognitive Science. Job Title: Postdoctoral Research Associate Job Location: Department of Psychological and Brain Sciences Indiana University Bloomington, IN Closing Date: Application review will begin January 15th. Applications will be considered until the position is filled. The focus of this project will be on using probabilistic modeling and machine learning techniques applied to visual data to infer the processes underlying the development of visual expertise in latent print examiners. The successful applicant will have excellent programming skills, experience with C++ and Matlab, and a strong background in computer vision and machine learning. This project is part of a collaboration between Dr. Thomas Busey and Dr. Chen Yu at Indiana University, and is funded through the National Institute of Justice. This position is available for two years. Candidates should send a letter, curriculum vitae, reprints, and names of three referees to (electronic submission preferred): Thomas Busey, PhD Associate Professor Department of Psychological and Brain Sciences, and Program in Cognitive Science Indiana University, Bloomington 1101 E. 
10th St Bloomington, IN, 47405 (812) 855-4261 busey at indiana.edu www.indiana.edu/~busey Indiana University is an Affirmative Action/Equal Opportunity employer. Applicants need not be US citizens, and women and minority candidates are especially encouraged to apply. From netta at comp.leeds.ac.uk Thu Dec 8 08:54:13 2005 From: netta at comp.leeds.ac.uk (N Cohen) Date: Thu, 8 Dec 2005 13:54:13 +0000 (GMT) Subject: Connectionists: Academic Fellowship in the area of Modelling, Imaging and Design Message-ID: Dear colleagues, below please find an announcement for a prestigious faculty position at the School of Computing, University of Leeds in the UK. This is a five-year fellowship, with the expectation of leading to a permanent academic position. Leeds University has a high concentration of high-quality research, and the School of Computing hosts thriving research spanning theory of computing (Algorithms and Complexity, Program Analysis and Logic Programming), AI (including computer vision, NLP, knowledge representation and learning) and multidisciplinary informatics (including scientific computing, grid computing, biosystems/computational neuroscience, and Visualization and Virtual Reality). The city of Leeds has acquired a much-deserved reputation as a cultural, commercial and social hub of the north of England, with internationally respected theatre, opera, sporting and other activities within easy reach. ===================================================================== Academic Fellowship in the area of Modelling, Imaging and Design A prestigious appointment is available in an internationally leading research group within the School of Computing. Applications are invited from individuals with post-doctoral (or equivalent) experience who have established their potential for excellence in scholarship and research. The Fellowship is five years in length and, subject to satisfactory completion of probation, guarantees an established academic post. 
In the early stages there will be a strong emphasis on research. The post is based in the School of Computing; topics include Computational Modelling and Simulation; Vision and Imaging; and System Architecture Design for Internet Computing. Further details are available on the Academic Research Fellowship (http://www.comp.leeds.ac.uk/vacancies/20060114arf-fp.shtml) and the School of Computing (http://www.comp.leeds.ac.uk/vacancies/20060114arf-fp2.shtml). Enquiries specific to the research area may be made to Professor Peter Jimack (0113) 343 5464, email p.k.jimack at leeds.ac.uk Further information on the Academic Fellowship Scheme is available at http://www.rcuk.ac.uk/acfellow/. The rules for the scheme indicate that people already in permanent employment will not normally be eligible for appointment. Fellows will normally be appointed to Research Staff Grade IA or II (£19,460 - £29,128) or (£27,116 - £35,883) depending on experience. (More senior appointments than this may, however, be considered for particularly outstanding candidates.) The University is introducing a new reward framework which will facilitate the recruitment, retention and motivation of world-class staff. Informal enquiries may be made to Margaret Smith (0113) 343 2001 email m.a.smith at leeds.ac.uk To apply online please visit http://www.leeds.ac.uk and select 'jobs'. 
Application packs are also available via email recruitment at adm.leeds.ac.uk or tel (0113) 343 5771 Closing date is 14 January 2006 From r.w.clowes at sussex.ac.uk Fri Dec 9 05:46:45 2005 From: r.w.clowes at sussex.ac.uk (Robert Clowes) Date: Fri, 9 Dec 2005 10:46:45 -0000 Subject: Connectionists: CFP for Integrative Approaches to Machine Consciousness 2006 Message-ID: <005201c5fcad$db591f80$3931b88b@rn.informatics.scitech.susx.ac.uk> 1st CFP for: Integrative Approaches to Machine Consciousness April 5th-6th 2006 part of AISB'06: Adaptation in Artificial and Biological Systems University of Bristol, Bristol, England In April 2006 there will be a continuation of the 2005 Machine Consciousness Symposium as part of the AISB'06 convention. Submissions: Integrative Approaches to Machine Consciousness. Abstract submission by: Jan 21st 2006 Machine Consciousness (MC) concerns itself with the study and creation of artefacts which have mental characteristics typically associated with consciousness, such as (self-)awareness, emotion, affect, phenomenal states, imagination, etc. Recently, developments in AI and robotics, especially through the prisms of behavioural and epigenetic robotics, have stressed the embodied, interactive and developmental nature of intelligent agents, which are now regarded by many as essential to engineering human-level intelligence. Some recent work has suggested that giving robots imaginative or simulation capabilities might be a big step towards achieving MC. Other studies have emphasized 'second person' issues such as intersubjectivity and empathy as a substrate for human consciousness. Alongside this, the infant-caregiver relationship has been recognised as essential to the development of consciousness in its specifically human form. 
Bringing these issues to centre stage in the study of artificial consciousness was the focus of last year's AISB symposium, Next Generation Approaches to Machine Consciousness: Imagination, Development, Intersubjectivity, and Embodiment. This symposium seeks to continue examination of many of these themes in relation to MC, but with a new focus on attempts to treat the synthesis or fusion of central components of MC in integrated models. We would also be interested in models which show or treat the emergence of processes or systems underlying these core themes. The website for the earlier symposium is at http://www.sussex.ac.uk/cogs/mc, and the online version of the proceedings can be found at http://www.aisb.org.uk/publications/proceedings/aisb05/7_MachConsc_Final.pdf . An article introducing and contextualising some of this work can also be found here: ftp://ftp.informatics.sussex.ac.uk/pub/reports/csrp/csrp574.pdf. Submissions are especially invited on the following topics in their relation to MC: * Imagination * Development * Emotion * Enactive / Embodied Approaches * Heterophenomenology * Synthetic Phenomenology * Intersubjectivity * Narrative * General aspects (techniques, theories, constraints) We especially welcome attempts to study the way these different areas might be related. Preference will be given to submissions that are: * Relevant: closely related to the themes of the symposium * Implemented: based on working robotic or other implemented systems * Novel: not previously presented elsewhere * Integrative: models that examine the integration or synthesis of core aspects of machine consciousness (especially two or more of the above topics), or their emergence from more basic cognitive functions. However, it is not expected that all accepted submissions will meet all four criteria of preference. Submissions should be in the form of papers of up to 6,000 words (6-8 pages) OR abstracts (around 2 pages) based around more speculative ideas. 
The latter will be invited to give shorter presentations based on 4-page papers which will appear in the final proceedings. We also aim to publish a selection of the best articles in a special issue of a journal, which we are currently negotiating. Poster submissions are also welcome. Formatting: Papers should be in PDF format, formatted according to the Springer LNCS guidelines (see 'Proceedings and Other Multiauthor Volumes') at http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-164-2-72376-0,00.html Joint organisers: Rob Clowes, Ron Chrisley & Steve Torrance Program Committee: Igor Aleksander, Giovanna Colombetti, Rodney Cotterill, Frédéric Kaplan, Pentti Haikonen, Germund Hesslow, Owen Holland, Takashi Ikegami, Miguel Salichs, Ricardo Sanz, Murray Shanahan, Jun Tani, Tom Ziemke The conference website and submission information can be found at http://www.informatics.sussex.ac.uk/research/paics/machineconsciousness/ Important Dates Submission of papers by: Jan 21st 2006 Notification of decision: Feb 4th 2006 Camera-ready copies by: February 20th 2006 From BerndPorr at f2s.com Sun Dec 11 18:14:33 2005 From: BerndPorr at f2s.com (Bernd Porr) Date: Sun, 11 Dec 2005 23:14:33 +0000 Subject: Connectionists: RunBot Message-ID: <439CB2D9.8010106@f2s.com> At the NIPS conference we presented the RunBot in the demo track and at the poster session. There has been interest in the detailed design of the robot, for example to implement the controller on an (analogue) VLSI chip. I'm pleased to announce a more detailed article about the RunBot, called "Coupling of Neural Computation with Physical Computation for Stable Dynamic Biped Walking Control", which will appear in Neural Computation. 
A final draft of the article can be downloaded here: http://www.berndporr.me.uk/geng_et_al2005/ http://www.cn.stir.ac.uk/~faw1/Publications/papers/geng_etal_nc2005.pdf Please direct any technical questions to Tao Geng: http://www.cn.stir.ac.uk/~tgeng/ Regards /Bernd Porr -- www: http://www.berndporr.me.uk/ http://www.linux-usb-daq.co.uk/ Mobile: +44 (0)7840 340069 Work: +44 (0)141 330 5237 University of Glasgow Department of Electronics & Electrical Engineering Room 519, Rankine Building, Oakfield Avenue, Glasgow, G12 8LT From conrad.sanderson at anu.edu.au Mon Dec 12 21:03:03 2005 From: conrad.sanderson at anu.edu.au (conrad sanderson) Date: Tue, 13 Dec 2005 13:03:03 +1100 Subject: Connectionists: CFP: "Beyond Patches" CVPR 2006 workshop Message-ID: <200512131303.03517.conrad.sanderson@anu.edu.au> Call for Papers: "Beyond Patches" workshop, in conjunction with the CVPR 2006 conference. http://prost.cv.ri.cmu.edu/~slucey/BP-CVPR06/ Submission deadline: 24 March 2006 The concept of an image "patch" in computer vision has many similarities to work within the field of structural pattern recognition. The structural approach takes the view that a pattern is composed of simpler subpatterns which, in turn, are built from even simpler subpatterns. Recently, many inroads have been made into novel areas of computer vision through the employment of patch-based representations with machine learning and pattern recognition techniques. In this workshop, we are soliciting papers from the computer vision and machine learning communities that expand and explore the boundaries of patch representations in computer vision applications. Relevant topics to the workshop include (but are not limited to): * Novel methods for identifying (e.g. SIFT, DoGs, Harris detector) and employing salient patches. * Techniques that explore criteria for deciding the size and shape of a patch based on image content and the application. 
* Approaches that explore the employment of multiple and/or heterogeneous patch sizes and shapes during analysis. * Applications that explore how important relative patch position is, and whether there are advantages in allowing those patches to move freely or in a constrained fashion. * Novel methods that explore and extend the concept of patches to video (e.g. space-time patches). * Approaches that draw upon previous work in structural pattern recognition in order to improve current patch-based computer vision algorithms. * Novel applications that extend the concept of patch-based analysis to other, hitherto non-conventional, areas of computer vision. * Novel techniques for estimating dependencies between patches in the same image (e.g. 3D rotations) to improve matching/correspondence. Submissions: Papers in PDF format are required by midnight 24 March 2006 EST. Papers should not exceed 8 double-column pages. The paper format must follow the standard IEEE 2-column format of single-spaced text in 10 point Times Roman, with 12 point interline space. All paper submissions must be anonymous. All submissions will be peer-reviewed by members of the program committee. 
Workshop site: http://prost.cv.ri.cmu.edu/~slucey/BP-CVPR06/ CVPR 2006 site: http://www.cvpr.org/2006/ From dayan at gatsby.ucl.ac.uk Mon Dec 12 08:45:05 2005 From: dayan at gatsby.ucl.ac.uk (Peter Dayan) Date: Mon, 12 Dec 2005 13:45:05 +0000 Subject: Connectionists: Gatsby PhD Programme In-Reply-To: <20050608123601.GE16153@flies.gatsby.ucl.ac.uk> References: <50907.193.217.174.139.1114196535.squirrel@webmail.uio.no> <20050422195120.GA28336@flies.gatsby.ucl.ac.uk> <20050607155808.GB5355@flies.gatsby.ucl.ac.uk> <50732.193.217.174.139.1118220211.squirrel@webmail.uio.no> <20050608085329.GB16153@flies.gatsby.ucl.ac.uk> <20050608123601.GE16153@flies.gatsby.ucl.ac.uk> Message-ID: <20051212134505.GA26158@flies.gatsby.ucl.ac.uk> Gatsby Computational Neuroscience Unit 4 year PhD Programme The Gatsby Unit is a world-class centre for theoretical neuroscience and machine learning, focusing on unsupervised learning, reinforcement learning, neural dynamics, population coding, interpretation of neural data and perceptual processing. It provides a unique opportunity for a critical mass of theoreticians to interact closely with each other, and with other world-class research groups in related departments at University College London, including Anatomy, Computer Science, Functional Imaging Laboratory, Physics, Physiology, Psychology, Neurology, Ophthalmology, and Statistics, and also with other Universities, notably Cambridge. The Unit always has openings for exceptional PhD candidates. Applicants should have a strong analytical background, a keen interest in neuroscience and a relevant first degree, for example in Computer Science, Engineering, Mathematics, Neuroscience, Physics, Psychology or Statistics. The PhD programme lasts four years, including a first year of intensive instruction in techniques and research in theoretical neuroscience and machine learning. 
It is described at http://www.gatsby.ucl.ac.uk/teaching/phd/ A number of competitive fully-funded studentships are available each year and the Unit also welcomes students with pre-secured funding or with other scholarship/studentship applications in progress. In the first instance, applicants are encouraged to apply informally by sending, in plain text format, a CV, a statement of research interests, and the names and addresses of three referees to admissions at gatsby.ucl.ac.uk. General enquiries should also be directed to this e-mail address. For further details of research interests please see http://www.gatsby.ucl.ac.uk/research.html Applications to begin the programme in September 2006 should be received by the 1st March 2006. From dgw at MIT.EDU Tue Dec 13 16:02:04 2005 From: dgw at MIT.EDU (David Weininger) Date: Tue, 13 Dec 2005 16:02:04 -0500 Subject: Connectionists: Book announcement - Rasmussen Message-ID: <6.2.1.2.2.20051213160202.041b10b0@po14.mit.edu> Hi all: I thought that Connectionists readers might be interested in the following new title from MIT Press. More information about the book is available at http://mitpress.mit.edu/promotions/books/SP2006026218253X. Thanks! Best, David Gaussian Processes for Machine Learning Carl Edward Rasmussen and Christopher K. I. Williams Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increasing attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. 
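As a minimal illustration of the GP regression setting the book treats, here is a sketch of the standard posterior mean and variance computation with a squared-exponential (RBF) kernel. The toy data and hyperparameter values are assumptions for illustration only; this is not code from the book:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential covariance between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, X_star, noise=0.1):
    """GP regression posterior mean and pointwise variance at test inputs X_star."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))   # noisy training covariance
    K_s = rbf_kernel(X, X_star)                        # train/test cross-covariance
    K_ss = rbf_kernel(X_star, X_star)                  # test covariance
    mean = K_s.T @ np.linalg.solve(K, y)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.diag(cov)

# Assumed toy data: noisy samples of sin(x)
X = np.linspace(-3, 3, 25)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).standard_normal(25)
mean, var = gp_posterior(X, y, np.array([[0.0]]))
# mean[0] is close to sin(0) = 0, and var[0] is small near the training data
```

The posterior variance shrinking near observed inputs is the probabilistic behaviour that distinguishes GPs from plain kernel regression.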
A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes. Carl Edward Rasmussen is a Research Scientist at the Department of Empirical Inference for Machine Learning and Perception at the Max Planck Institute for Biological Cybernetics, Tübingen. Christopher K. I. Williams is Professor of Machine Learning and Director of the Institute for Adaptive and Neural Computation in the School of Informatics, University of Edinburgh. 8 x 10, 272 pp., cloth, ISBN 0-262-18253-X David Weininger Associate Publicist MIT Press 55 Hayward Street Cambridge, MA 02142-1315 617.253.2079 617.253.1709 fax dgw at mit.edu Check out the new MIT Press Log http://mitpress.mit.edu/presslog From M.Denham at plymouth.ac.uk Tue Dec 13 08:56:22 2005 From: M.Denham at plymouth.ac.uk (Mike Denham) Date: Tue, 13 Dec 2005 13:56:22 -0000 Subject: Connectionists: submission Message-ID: <52A8091888A23F47A013223014B6E9FE078D7CD1@03-CSEXCH.uopnet.plymouth.ac.uk> Centre for Theoretical and Computational Neuroscience, University of Plymouth, UK Postdoctoral Research Fellow (salary range £23,643 - £26,671 per annum) Applications are invited for a post of Postdoctoral Research Fellow in the Centre for Theoretical and Computational Neuroscience at the University of Plymouth, UK. 
Applicants must have a PhD in the area of neuroscience and possess a good knowledge and understanding of the mathematical methods and computational tools for modelling cortical neural networks at a biologically realistic level. The work of the Research Fellow will be specifically concerned with the development and investigation of a model of the laminar microcircuitry of the primary visual cortex, making use of distributed processing tools on an 80-processor Linux cluster simulation facility. The project will draw on neurobiological experimental and modelling results from several of the major neuroscience research labs in Europe which are collaborators in this research programme, and there will be opportunities for travel to and close interaction with these labs. The Centre for Theoretical and Computational Neuroscience is one of the main UK labs specialising in theoretical and modelling approaches to understanding brain function (visit www.plymneuro.org.uk). It has research groups in vision, audition, sensorimotor control, mathematical neuroscience, biophysics of temporal brain dynamics, and neural computation. It actively collaborates with several UK, US and European labs and participates in a number of major UK research council and EU funded research projects. The research fellow post is available immediately and an appointment will be made as soon as possible. The appointment will initially be for a fixed term of three years, and will be subject to a probationary period of six months. Informal enquiries, ideally including a CV/résumé, should be made in the first instance by email to the Head of the Centre for Theoretical and Computational Neuroscience, Professor Mike Denham: mdenham at plymouth.ac.uk. 
From jaakko.sarela at tkk.fi Mon Dec 12 03:25:26 2005 From: jaakko.sarela at tkk.fi (Jaakko Sarela) Date: Mon, 12 Dec 2005 10:25:26 +0200 Subject: Connectionists: DSS MATLAB package v1-0 Message-ID: <20051212082525.GA17917@mail.cis.hut.fi> Announcement of the stable release (v1-0) of the DSS MATLAB package We have recently introduced a general framework for source separation called denoising source separation (DSS, [1]), where source separation is constructed around denoising procedures. The DSS algorithms may vary from almost blind (ICA) to detailed algorithms in special settings, allowing prior information to guide the separation. The framework has already been applied in several fields (neuroinformatics [1], climate analysis [2], CDMA signal acquisition [3], nonlinear ICA for separation of real-life image mixtures [4], etc.). The DSS MATLAB package, developed under the GNU GPL, has reached stable release v1-0. The package is highly customizable and there is a wide collection of denoising functions readily available. The package includes a command-line version as well as a graphical user interface. The package is available via http://www.cis.hut.fi/projects/dss/package/. Best regards Jaakko Särelä and Harri Valpola References: [1] Denoising source separation. J. Särelä and H. Valpola. Journal of Machine Learning Research, 6:233-272, 2005. http://www.cis.hut.fi/projects/dss/publications/#sarela05jmlr [2] Frequency-Based Separation of Climate Signals. A. Ilin and H. Valpola. In the proceedings of the 9th European Conference on Principles and Practice of Knowledge Discovery in Databases (PKDD 2005), Porto, Portugal, pp. 519-526, 2005. http://www.cis.hut.fi/projects/dss/publications/#ilin05pkdd [3] A denoising source separation based approach to interference cancellation for DS-CDMA array systems. K. Raju and J. Särelä. In Proceedings of the 38th Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, pp. 1111-1114, 2004. 
http://www.cis.hut.fi/projects/dss/publications/#raju04asilomar [4] Separation of nonlinear image mixtures by denoising source separation. M.S.C. Almeida, H. Valpola and J. Särelä. In Proceedings of the 6th International Conference on Independent Component Analysis and Blind Source Separation, ICA 2006, Charleston, SC, USA, accepted. http://www.cis.hut.fi/projects/dss/publications/#almeida06ica From niki at cse.ohio-state.edu Thu Dec 15 12:42:21 2005 From: niki at cse.ohio-state.edu (Nicoleta Roman) Date: Thu, 15 Dec 2005 12:42:21 -0500 Subject: Connectionists: Ph.D. dissertation announcement: Sound Source Segregation Message-ID: <43A1AAFD.2040107@cse.ohio-state.edu> Dear list members: I would like to bring to your attention my recently completed Ph.D. dissertation, entitled "Auditory-based algorithms for sound segregation in multisource and reverberant environments". An electronic version of the thesis is available at: http://www.ohiolink.edu/etd/view.cgi?osu1124370749 Please find the abstract below. Sincerely, Nicoleta Roman -------- ABSTRACT -------- At a cocktail party, we can selectively attend to a single voice and filter out other interferences. This perceptual ability has motivated a new field of study known as computational auditory scene analysis (CASA), which aims to build speech separation systems that incorporate auditory principles. The psychological process of figure-ground segregation suggests that the target signal should be segregated as foreground while the remaining stimuli are treated as background. Accordingly, the computational goal of CASA should be to estimate an ideal time-frequency (T-F) binary mask, which selects the target if it is stronger than the interference in a local T-F unit. This dissertation investigates four aspects of CASA processing: location-based speech segregation, binaural tracking of multiple moving sources, binaural sound segregation in reverberation, and monaural segregation of reverberant speech. 
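The ideal T-F binary mask described in the abstract can be sketched directly: keep a time-frequency unit when the target's local energy exceeds the interference's by some local criterion. The spectrogram values below are random stand-ins; this is a toy sketch of the mask definition, not the dissertation's segregation system:

```python
import numpy as np

def ideal_binary_mask(target_mag, interf_mag, lc_db=0.0):
    """Ideal binary mask: 1 in T-F units where the local target-to-interference
    ratio exceeds the local criterion lc_db (in dB), else 0. Inputs are
    magnitude spectrograms (freq x time)."""
    eps = 1e-12  # avoid log(0) in silent units
    local_snr_db = 20.0 * np.log10((target_mag + eps) / (interf_mag + eps))
    return (local_snr_db > lc_db).astype(int)

# Random stand-ins for target/interference magnitude spectrograms (freq x time).
rng = np.random.default_rng(2)
target = rng.random((4, 4))
interf = rng.random((4, 4))
mask = ideal_binary_mask(target, interf)
segregated = mask * (target + interf)  # keep only target-dominant units of the mixture
```

With the criterion at 0 dB, the mask simply selects the units where the target is stronger than the interference, which is exactly the computational goal stated in the abstract.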
For localization, the auditory system utilizes the interaural time difference (ITD) and interaural intensity difference (IID) between the ears. We observe that within a narrow frequency band, modifications to the relative strength of the target source with respect to the interference trigger systematic changes in ITD and IID, resulting in a characteristic clustering. Consequently, we propose a supervised learning approach to estimate the ideal binary mask. A systematic evaluation shows that the resulting system produces masks very close to the ideal binary ones and yields large speech intelligibility improvements. In realistic environments, source motion requires consideration. Binaural cues are strongly correlated with locations in T-F units dominated by one source, resulting in channel-dependent conditional probabilities. Consequently, we propose a multi-channel integration method for these probabilities in order to compute the likelihood function in a target space. Finally, a hidden Markov model is employed for forming continuous tracks and automatically detecting the number of active sources. Reverberation affects the ITD and IID cues. We therefore propose a binaural segregation system that combines target cancellation through adaptive filtering and a binary decision rule to estimate the ideal binary mask. A major advantage of the proposed system is that it imposes no restrictions on the interfering sources. Quantitative evaluations show that our system outperforms related beamforming approaches. Psychoacoustic evidence suggests that monaural processing plays a vital role in segregation. It is known that reverberation smears the harmonicity of speech signals. We therefore propose a two-stage separation system that combines inverse filtering of the target room impulse response with pitch-based segregation. 
As a result of the first stage, the harmonicity of a signal arriving from target direction is partially restored while signals arriving from other locations are further smeared, and this leads to improved segregation and considerable signal-to-noise ratio gains. -------------- -------------- From qobi at purdue.edu Thu Dec 15 12:08:19 2005 From: qobi at purdue.edu (Jeffrey Mark Siskind) Date: Thu, 15 Dec 2005 12:08:19 -0500 Subject: Connectionists: CFP: The Fifth IEEE Computer Society Workshop on Perceptual Organization in Computer Vision Message-ID: <200512151708.jBFH8Jf06626@tlamachilistli.ecn.purdue.edu> FIRST CALL FOR PAPERS: POCV 2006 The Fifth IEEE Computer Society Workshop on Perceptual Organization in Computer Vision New York City June 22, 2006, In Conjunction with IEEE CVPR 2006 http://elderlab.yorku.ca/pocv IMPORTANT DATES: * Submission deadline: 11:59pm EST, March 17, 2006 * Notification: April 17, 2006 * Final versions of accepted papers due: April 24, 2006 **Please note that biological vision researchers working in the field of perceptual organization are encouraged to submit work that may stimulate new directions of research in the computer vision community. THEME: Perceptual Organization is the process of establishing a meaningful relational structure over raw visual data, where the extracted relations correspond to the physical structure of the scene. A driving motivation behind perceptual organization research in computer vision is to deliver representations needed for higher-level visual tasks such as object detection, object recognition, activity recognition and scene reconstruction. Because of its wide applicability, the potential payoff from perceptual organization research is enormous. The 5th IEEE POCV Workshop, to be held in conjunction with CVPR 2006 (New York), will bring together experts in perceptual organization and related areas to report on recent research results and to provide ideas for future directions. 
PREVIOUS IEEE POCV WORKSHOPS: * 2004 CVPR (Washington, DC) * 2001 ICCV (Vancouver, Canada) * 1999 ICCV (Crete, Greece) * 1998 CVPR (Santa Barbara, CA) SCOPE: Papers are solicited in all areas of perceptual organization, including but not limited to: * image segmentation * feature grouping * texture segmentation * contour completion * spatiotemporal/motion segmentation * figure-ground discrimination * integration of top-down and bottom-up methods * perceptual organization for object or activity detection/recognition * unification of segmentation, detection and recognition * biologically-motivated methods * neural basis for perceptual organization * learning in perceptual organization * graphical methods * natural scene statistics * evaluation methods ALGORITHM EVALUATION: Research progress in perceptual organization depends in part on quantitative evaluation and comparison of algorithms. Authors reporting results of new algorithms are strongly encouraged to objectively quantify performance and compare against at least one competing approach. BROADER ISSUES: Perceptual organization research faces a number of challenges. One is defining what the precise goal of perceptual organization algorithms should be. What kind of representation should they deliver? What databases should be used for evaluation? How can we quantify performance to allow objective evaluation and comparison between algorithms? How do we know when we’ve succeeded? To try to meet these challenges, we particularly encourage contributions of a more general nature that attempt to address one or more of these questions. These may include definitional papers, theoretical frameworks that might apply to multiple different perceptual organization problems, establishment of useful databases, modeling of underlying natural scene statistics, evaluation methodologies, etc. 
BIOLOGICAL MOTIVATION: Much of the current work in perceptual organization in computer vision has its roots in qualitative principles established by the Gestalt Psychologists nearly a century ago, and this link between computational and biological research continues to this day. Following this tradition, we specifically invite biological vision researchers working in the field of perceptual organization to submit work that may stimulate new directions of research in the computer vision community. WORKSHOP OUTPUT: All accepted papers will be included in the Electronic Proceedings of CVPR, distributed on DVD at the conference, and will be indexed by IEEE Xplore. We are also exploring the possibility of a special journal issue on perceptual organization in computer vision, with a separate call for papers. PAPER SUBMISSION: Submission is electronic, and must be in PDF format. Papers must not exceed 8 double-column pages. Submissions must follow the standard IEEE 2-column format of single-spaced text in 10 point Times Roman, with 12 point interline space. All submissions must be anonymous. Please use the IEEE Computer Society CVPR format kit. Stay tuned for exact details on how to submit. In submitting a paper to the POCV Workshop, authors acknowledge that no paper of substantially similar content has been or will be submitted to another conference or workshop during the POCV review period. 
For further details and updates, please see the workshop website: http://elderlab.yorku.ca/pocv WORKSHOP CHAIRS: James Elder, York University jelder at yorku.ca Jeffrey Mark Siskind, Purdue University qobi at purdue.edu PROGRAM COMMITTEE: Ronen Basri, Weizmann Institute, Israel Kim Boyer, Ohio State University, USA James Coughlan, Smith-Kettlewell Institute, USA Sven Dickinson, University of Toronto, Canada Anthony Hoogs, GE Global Research, USA David Jacobs, University of Maryland, USA Ian Jermyn, INRIA, France Benjamin Kimia, Brown University, USA Norbert Kruger, Aalborg University, Denmark Michael Lindenbaum, Technion, Israel Zili Liu, University of California, Los Angeles, USA David Martin, Boston College, USA Gerard Medioni, University of Southern California, USA Zygmunt Pizlo, Purdue University, USA Sudeep Sarkar, University of South Florida, USA Eric Saund, Palo Alto Research Centre, USA Kaleem Siddiqi, McGill University, Canada Manish Singh, Rutgers University, USA Shimon Ullman, Weizmann Institute, Israel Johan Wagemans, University of Leuven, Belgium Song Wang, University of South Carolina, USA Rich Zemel, University of Toronto, Canada Song-Chun Zhu, University of California, Los Angeles, USA Steve Zucker, Yale University, USA From ted.carnevale at yale.edu Wed Dec 14 10:38:37 2005 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Wed, 14 Dec 2005 10:38:37 -0500 Subject: Connectionists: Announcement: The NEURON Book Message-ID: <43A03C7D.60503@yale.edu> Cambridge University Press has announced that distribution of The NEURON Book will begin in January 2006 http://www.cambridge.org/us/catalogue/catalogue.asp?isbn=0521843219 --Ted Carnevale The NEURON Book N.T. Carnevale and M.L. Hines ISBN-10: 0521843219 The authoritative reference on NEURON, the simulation environment for modeling biological neurons and neural networks that enjoys wide use in the experimental and computational neuroscience communities. 
This book will show you how to use NEURON to construct and apply empirically based models. Written primarily for neuroscience investigators, teachers, and students, it assumes no previous knowledge of computer programming or numerical methods. Readers with a background in the physical sciences or mathematics, who have some knowledge about brain cells and circuits and are interested in computational modeling, will also find it helpful. The NEURON Book covers material that ranges from the inner workings of this program to practical considerations involved in specifying the anatomical and biophysical properties that are to be represented in models. It uses a problem-solving approach, with many working examples that readers can try for themselves. Nicholas T. Carnevale is a Senior Research Scientist in the Department of Psychology at Yale University. He directs the NEURON courses at the annual meetings of the Society for Neuroscience, and the NEURON Summer Courses at the University of California, San Diego, and University of Minnesota, Minneapolis. Michael L. Hines is a Research Scientist in the Department of Computer Science at Yale University. He created NEURON in collaboration with John W. Moore at Duke University, Durham NC, and is the principal investigator and chief software architect on the project that continues to support and extend it. From terry at salk.edu Thu Dec 15 15:30:34 2005 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 15 Dec 2005 12:30:34 -0800 Subject: Connectionists: NEURAL COMPUTATION 18:2 In-Reply-To: Message-ID: Neural Computation - Contents - Volume 18, Number 2 - February 1, 2006 Article Polychronization: Computation With Spikes Eugene M. Izhikevich Letters Making Working Memory Work: A Computational Model of Learning in the Prefrontal Cortex and Basal Ganglia Randall C. O'Reilly and Michael J. 
Frank Identification of Multiple-Input Systems with Highly Coupled Inputs: Application to EMG Prediction from Multiple Intracortical Electrodes David T. Westwick, Eric A. Pohlmeyer, Sara A. Solla, Lee E. Miller and Eric J. Perreault Oscillatory Networks: Pattern Recognition Without a Superposition Catastrophe Thomas Burwick Topographic Product Models Applied to Natural Scene Statistics Simon Osindero, Max Welling and Geoffrey E. Hinton A Simple Hebbian/Anti-Hebbian Network Learns the Sparse, Independent Components of Natural Images Michael S. Falconbridge, Robert L. Stamps and David R. Badcock Differential Log Likelihood for Evaluating and Learning Gaussian Mixtures Marc M. Van Hulle Magnification Control in Self-Organizing Maps and Neural Gas Thomas Villmann and Jens Christian Claussen Enhancing Density-Based Data Reduction Using Entropy D. Huang and Tommy W. S. Chow ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2006 - VOLUME 18 - 12 ISSUES
                                                   Electronic only
                   USA      Canada*   Others      USA      Canada*
Student/Retired    $60      $64.20    $114        $54      $57.78
Individual         $100     $107.00   $154        $90      $96.30
Institution        $730     $781.10   $784        $657     $702.99
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From cindy at bu.edu Fri Dec 16 10:55:01 2005 From: cindy at bu.edu (Cynthia Bradford) Date: Fri, 16 Dec 2005 10:55:01 -0500 Subject: Connectionists: Neural Networks 19(1) 2006 Message-ID: <200512161555.jBGFt1B9015326@kenmore.bu.edu> NEURAL NETWORKS 19(1) Contents - Volume 19, Number 1 - 2006 ------------------------------------------------------------------ EDITORIAL: Another year of exciting Special Issues! NEURAL NETWORKS REFEREES USED IN 2005 ***** Psychology and Cognitive Science ***** J. Molina Vilaplana and J. Lopez Coronado A neural network model for coordination of hand gesture during reach to grasp ***** Neuroscience and Neuropsychology ***** Tony J. Prescott, Fernando M. 
Montes Gonzalez, Kevin Gurney, Mark D. Humphries, and Peter Redgrave A robot model of the basal ganglia: Behavior and intrinsic processing ***** Mathematical and Computational Analysis ***** Kazunori Iwata, Kazushi Ikeda, and Hideaki Sakai The asymptotic equipartition property in reinforcement learning and its relation to return maximization Shengyuan Xu and James Lam A new approach to exponential stability analysis of neural networks with time-varying delays Arindam Choudhury, Prasanth B. Nair, and Andy J. Keane Constructing a speculative kernel machine for pattern classification ***** Engineering and Design ***** Shen Furao and Osamu Hasegawa An incremental network for on-line supervised classification and topology learning CURRENT EVENTS ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. 
Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type        INNS            ENNS       JNNS
----------------------------------------------------------------------------
membership with        $80 (regular)   SEK 660    Y 13,000 (plus Y 2,000
Neural Networks                                   enrollment fee)
                       $20 (student)   SEK 460    Y 11,000 (plus Y 2,000
                                                  enrollment fee)
----------------------------------------------------------------------------
membership without     $30             SEK 200    not available to non-student
Neural Networks                                   (subscribe through another
                                                  society); Y 5,000 student
                                                  (plus Y 2,000 enrollment fee)
----------------------------------------------------------------------------
Name: ______________________________________________________
Title: ______________________________________________________
Address: ______________________________________________________
Phone: ______________________________________________________
Fax: ______________________________________________________
Email: ______________________________________________________
Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number _______________________________ expiration date _____________________________ INNS Membership 2810 Crossroads Drive, Suite 3800 Madison WI 53718 USA 608 443 2461, ext. 138 (phone) 608 443 2474 (fax) srees at reesgroupinc.com http://www.inns.org ENNS Membership University of Skovde P.O. 
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership JNNS Secretariat c/o Fuzzy Logic Systems Institute 680-41 Kawazu, Iizuka Fukuoka 820-0067 Japan 81 948 24 2771 (phone) 81 948 24 3002 (fax) jnns at flsi.cird.or.jp http://www.jnns.org/ ---------------------------------------------------------------------------- From Martin.Riedmiller at uos.de Fri Dec 16 10:36:16 2005 From: Martin.Riedmiller at uos.de (Martin Riedmiller) Date: Fri, 16 Dec 2005 16:36:16 +0100 Subject: Connectionists: Learning soccer robots - source code release Message-ID: <43A2DEF0.7000908@uos.de> The Brainstormers, current World Champion in RoboCup Soccer Simulation league 2D, proudly announce the release of major parts of their source code. The aim of the Brainstormers project at the Neuroinformatics Group at the University of Osnabrueck is to demonstrate the successful application of machine learning techniques (in particular Neural Reinforcement Learning methods) in competition. The released source code therefore contains a large number of examples of learned skills and team behaviours, such as NeuroIntercept, NeuroGo2Position, NeuroKick, NeuroAttack etc. The code release is mainly meant to provide a good starting point for new teams in RoboCup but might also provide useful stimulation for more advanced teams (in particular concerning the learnt modules) and for researchers in Artificial Intelligence/Machine Learning. Links: Brainstormers home page: www.ni.uos.de/brainstormers Download: www.ni.uos.de/index.php?id=880 Have fun, Martin Riedmiller and Thomas Gabel, Neuroinformatics Group, Univ. 
of Osnabrueck, www.ni.uos.de From dayan at gatsby.ucl.ac.uk Tue Dec 20 05:56:44 2005 From: dayan at gatsby.ucl.ac.uk (Peter Dayan) Date: Tue, 20 Dec 2005 10:56:44 +0000 Subject: Connectionists: Gatsby PhD Programme: 15th January 2006 closing date In-Reply-To: <20051212134505.GA26158@flies.gatsby.ucl.ac.uk> References: <50907.193.217.174.139.1114196535.squirrel@webmail.uio.no> <20050422195120.GA28336@flies.gatsby.ucl.ac.uk> <20050607155808.GB5355@flies.gatsby.ucl.ac.uk> <50732.193.217.174.139.1118220211.squirrel@webmail.uio.no> <20050608085329.GB16153@flies.gatsby.ucl.ac.uk> <20050608123601.GE16153@flies.gatsby.ucl.ac.uk> <20051212134505.GA26158@flies.gatsby.ucl.ac.uk> Message-ID: <20051220105633.GA17388@flies.gatsby.ucl.ac.uk> I would like to apologise for some erroneous information in my recent posting about the Gatsby Unit's 4 year PhD programme in theoretical neuroscience and machine learning (http://www.gatsby.ucl.ac.uk/teaching/phd/). The closing date for applications (to admissions at gatsby.ucl.ac.uk) is actually 15th January 2006. Peter Dayan From h.jaeger at iu-bremen.de Tue Dec 20 11:44:01 2005 From: h.jaeger at iu-bremen.de (Herbert Jaeger) Date: Tue, 20 Dec 2005 17:44:01 +0100 Subject: Connectionists: CFP Neural Networks Special Issue on ESNs and LSMs Message-ID: <43A834D1.7010302@iu-bremen.de> CALL FOR PAPERS: Neural Networks 2007 Special Issue "Echo State Networks and Liquid State Machines" Guest Co-Editors: Dr. Herbert Jaeger, International University Bremen, h.jaeger at iu-bremen.de Dr. Wolfgang Maass, Technische Universitaet Graz, maass at igi.tugraz.at Dr. Jose C. Principe, University of Florida, principe at cnel.ufl.edu A new approach to analyzing and training recurrent neural networks (RNNs) has emerged over the last few years. 
The central idea is to regard an RNN as a nonlinear, excitable medium, which is driven by input signals or fed-back output signals. From the excited response signals inside the medium, simple (typically linear), trainable readout mechanisms distil the desired output signals. The medium consists of a large, randomly connected network, which is not adapted during learning. It is variously referred to as a dynamical reservoir or liquid. There are currently two main flavours of such networks. Echo state networks were developed from a mathematical and engineering background and are composed of simple sigmoid units, updated in discrete time. Liquid state machines were conceived from a mathematical and computational neuroscience perspective and usually are made of biologically more plausible, spiking neurons with a continuous-time dynamics. These approaches have quickly gained popularity because of their simplicity, expressiveness, ease of training and biological appeal. This Special Issue aims at establishing a first comprehensive overview of this newly emerging area, demonstrating the versatility of the approach, its mathematical foundations and also its limitations. Submissions are solicited that contribute to this area of research with respect to -- mathematical and algorithmic analysis, -- biological and cognitive modelling, -- engineering applications, -- toolboxes and hardware implementations. One of the main questions in current research in this field concerns the structure of the dynamical reservoir / liquid. Submissions are especially welcome which investigate the relationship between the excitable medium's topology and algebraic properties and the resulting modeling capacity, or methods for pre-adapting the medium by unsupervised or evolutionary mechanisms, or including special-purpose subnetworks (for instance, feature detectors) into the medium. 
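The echo state recipe described above (fixed random reservoir, trainable linear readout) can be illustrated in a few lines. This is a minimal toy sketch, not a reference implementation from the special issue; the reservoir size, scaling factors, and the next-step sine prediction task are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed, randomly connected reservoir (the "dynamical reservoir"/"liquid"):
# generated once and never adapted during learning.
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Toy task: predict u(t+1) from the reservoir state driven by u(t).
T = 500
u = np.sin(0.2 * np.arange(T + 1))[:, None]

# Run the driven reservoir (discrete-time tanh units) and collect states.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Train only the linear readout by least squares, after a washout period.
washout = 100
W_out, *_ = np.linalg.lstsq(states[washout:], u[washout + 1:], rcond=None)

pred = states[washout:] @ W_out
rmse = np.sqrt(np.mean((pred - u[washout + 1:]) ** 2))
```

Note that learning touches only `W_out`; the input and reservoir weights stay fixed, which is exactly what makes training these networks so simple compared with full RNN gradient methods.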
Submission of Manuscript Manuscripts should be prepared according to the format of Neural Networks and electronically submitted to one of the Guest Editors. The review will take place within 3 months and only very minor revisions will be accepted. For any further questions, please contact the Guest Editors. DEADLINE FOR SUBMISSION: June 1, 2006. ------------------------------------------------------------------ Dr. Herbert Jaeger Professor for Computational Science International University Bremen Campus Ring 12 28759 Bremen, Germany Phone (+49) 421 200 3215 Fax (+49) 421 200 49 3215 email h.jaeger at iu-bremen.de http://www.faculty.iu-bremen.de/hjaeger/ ------------------------------------------------------------------ From A.Cangelosi at plymouth.ac.uk Wed Dec 21 08:55:32 2005 From: A.Cangelosi at plymouth.ac.uk (Angelo Cangelosi) Date: Wed, 21 Dec 2005 13:55:32 -0000 Subject: Connectionists: PhD position in Computational Neuroscience and Interactive Intelligent Systems Message-ID: <64997DB783F0FD4EB5550AD0D550E2290501DDA1@03-CSEXCH.uopnet.plymouth.ac.uk> Centre for Theoretical and Computational Neuroscience (CTCN) Centre for Interactive Intelligent Systems (CIIS) University of Plymouth PhD Studentship in Computational Neuroscience and Interactive Intelligent Systems The University of Plymouth invites applications for a PhD Studentship (stipend to cover living expenses plus UK/EU fees) in the areas of Computational Neuroscience and/or Interactive Intelligent Systems. There are about twelve academic staff in the two Centres, and their work was awarded a rating of 5 (International Excellence) in the 2001 UK Research Assessment Exercise. 
The primary areas of interest and expertise within the CTCN and CIIS include: - Audition - Biophysics and modelling of temporal brain dynamics - Mathematical neuroscience - Neural computation - Sensorimotor control - Vision - Artificial life models of cognition - Interactive robotics - Information visualisation - Computer music - Semantic web Applicants should have, or expect to obtain, a high-grade Bachelors or Masters degree in computing, neuroscience, psychology/cognitive science, physics, mathematics or an allied discipline. The candidate should ideally possess good computational skills and must have a strong motivation for research. For more information on the activity of the CTCN and CIIS, visit: http://www.plymneuro.org.uk/ http://neuromusic.soc.plymouth.ac.uk/ciis.html For informal enquiries contact Professor Mike Denham (m.denham at plymouth.ac.uk) or Dr. Angelo Cangelosi (a.cangelosi at plymouth.ac.uk). Applications should be sent via email to Mrs. Carole Watson (c.watson at plymouth.ac.uk; tel. +44 1752 233329), Senior Research Administrator, Faculty of Technology, University of Plymouth. Closing deadline for applications is March 20th, 2006. Each application should include (1) a detailed CV, (2) a cover letter, and (3) the application form. 
The PhD application form can be downloaded here: http://www.plymouth.ac.uk/pages/view.asp?page=5731 ---------------- Angelo Cangelosi, PhD ---------------- Reader in Artificial Intelligence and Cognition Adaptive Behaviour and Cognition Research Group School of Computing, Communications & Electronics University of Plymouth Portland Square Building (A316) Plymouth PL4 8AA (UK) E-mail: acangelosi at plymouth.ac.uk http://www.tech.plym.ac.uk/soc/staff/angelo (tel) +44 1752 232559 (fax) +44 1752 232540 From wambamm at gmail.com Wed Dec 21 17:12:53 2005 From: wambamm at gmail.com (Michael L Edwards) Date: Wed, 21 Dec 2005 16:12:53 -0600 Subject: Connectionists: WAM-BAMM*06 Announcement Message-ID: <756e5af30512211412x3f6344cm37293fcf257b7d56@mail.gmail.com> The Second Annual World Association of Modelers (WAM) Biologically Accurate Modeling Meeting (BAMM) WAM-BAMM*06 March 23rd - March 25th San Antonio, Texas http://wam-bamm.org The second annual meeting devoted to the promotion and extension of biologically accurate modeling and simulation will be held in San Antonio, Texas, March 23rd - March 25th. Last year's meeting (http://wam-bamm.org/05_links.htm) attracted more than 100 participants from around the world and was rated by users as 4.5 out of 5.0 (outstanding) with respect to venue, organization, and overall value. This year's meeting will be better still. The meeting's primary objective is to promote communication and collaboration between users and others involved in realistic biological modeling and to also provide an introduction to other scientists interested in realistic biological modeling. This year's meeting will also feature two pre-meetings, one on modeling within the olfactory system, and a second on computational approaches to understanding data in molecular and cellular biology (see website for details). 
Subjects considered: Modeling results, modeling as a base for understanding biological data, modeling-inspired biological experimentation, world modeling community coordination, modeling techniques, simulator design. All computational biologists are invited to present scientific as well as technical work. The meeting encourages participation by modelers using GENESIS, NEURON, any other simulation system, or writing their own code. We also encourage participation by experimental biologists interested in knowing more about biologically accurate modeling techniques. Supplementary travel grants will be available for students presenting work at the meeting. THE PROGRAM Unique in its structure, this meeting will combine introductory, intermediate, and advanced tutorials in realistic modeling techniques with a full agenda of scientific presentations. TUTORIALS Updated versions of most of the tutorials from WAM-BAMM*05 have been published in article form (both in browseable HTML and downloadable PDF format) in the November 2005 special issue on Realistic Neural Modeling in the free electronic journal Brains, Minds, and Media. (http://www.brains-minds-media.org/current/) Currently Scheduled Tutorials: Introduction to realistic neural modeling David Beeman, University of Colorado Boulder Large scale parallel network simulations using NEURON Michael Hines, Yale University How to make the best hand-tuned single-cell model you can Dieter Jaeger, Emory University XML for Model Specification Workshop Sharon Crook, Arizona State University, and Padraig Gleeson, University College London Biochemical kinetics modeling with Kinetikit and MOOSE Upinder S. Bhalla, NCBS, Bangalore GENESIS simulations with hsolve Hugo Cornelis, UTHSCSA We also encourage meeting participants to suggest tutorials. SCIENTIFIC MEETING Thursday and Friday will be devoted to oral and poster presentations by meeting participants and invited speakers. 
For additional meeting information please visit http://www.wam-bamm.org. Important Dates: --------------- Deadline for proposed research presentations: January 15, 2006 Submission form is available on the WAM-BAMM web site: www.wam-bamm.org Student registration deadline for travel grants: February 1, 2006 Travel grants for student presenters (see web site). Deadline for early registration: February 1, 2006 Advance registration $99 for graduate students, $149 for all others (30% increase after deadline) Deadline for guaranteed housing at the conference rate: February 27, 2006. The meeting will be held at the historic Menger Hotel in Downtown San Antonio, next to the Alamo and the famous San Antonio River Walk. Room rates $109 (single or double), $119 (3-4). Arrival date for the meeting: March 22, 2006 Last event: The (in)famous WAM-BAM Country Western Banquet: Saturday, March 25, 2006 Depart from San Antonio: Sunday, March 26, 2006 Registration is now open for WAM-BAMM*06 at http://www.WAM-BAMM.org. Travel funds will be available for students presenting papers. The first annual meeting of the World Association of Modelers (WAM) Biologically Accurate Modeling Meeting (BAMM), in association with the second GENESIS Users Meeting GUM*05, was held March 31st - April 2nd, 2005, in beautiful San Antonio, Texas. For further information please visit http://www.wam-bamm.org/ or email us at wam-bamm at wam-bamm.org. Jim Bower Dave Beeman -- James M. Bower Ph.D. 
Research Imaging Center University of Texas Health Science Center at San Antonio 7703 Floyd Curl Drive San Antonio, TX 78284-6240 Cajal Neuroscience Center University of Texas San Antonio Phone: 210 567 8080 From h.jaeger at iu-bremen.de Wed Dec 21 05:15:18 2005 From: h.jaeger at iu-bremen.de (Herbert Jaeger) Date: Wed, 21 Dec 2005 11:15:18 +0100 Subject: connectionists: CFP Interdisciplinary College IK2006 Message-ID: <43A92B36.7000704@iu-bremen.de> Call for Participation: ======================= Interdisciplinary College IK2006 held at Guenne, Germany, March 10-17, 2006 An interdisciplinary spring school on neurobiology, neural computation, cognitive science/psychology, and artificial intelligence. Focus Theme: Learning Quick Link and Registration: http://www.ik2006.de/ ================================================== Dates: Friday March 10th to Friday March 17th, 2006 Location: Heinrich-Luebke-Haus, Guenne am Moehnesee, Germany Early Registration Deadline: January 15th, 2006 Late Registration Deadline: February 17th, 2006 Chairs: Rainer Malaka (EML Heidelberg), Manfred Spitzer (University of Ulm) Organizations: Main: Gesellschaft fuer Informatik (GI) supporting Organizations: GK, PASCAL NoE Details: ======== The Interdisciplinary College (Interdisziplinäres Kolleg, IK) is an annual one-week spring school which offers a dense, intensive and state-of-the-art course program in neurobiology, neural computation, cognitive science/psychology, artificial intelligence, robotics and philosophy. It is aimed at students, postgraduates and researchers from academia and industry. By combining humanities, science and technology, the IK endeavours to intensify dialogue and connectedness across the various disciplines. Participants come from various European countries, lecturers from all over the world. All courses are taught in English. 
The course program starts out with several basic and methodological courses providing an up-to-date introduction to the four main fields of the IK. In the second part of the week, special courses present in-depth discussions of (state-of-the-art research on) specific topics. Additionally, the IK is a unique social event. Participants may enjoy the very special atmosphere: minds meet, music is played and friends are made in long evening and night sessions at the welcoming conference site on the Möhne lake.

Focus Theme: Learning
The focus of IK 2006 will be learning. What is learning? As long as nobody asks, we know the answer. Neuroscientists refer to synaptic change, educators to insight, developmental psychologists to phases and stages, cognitive psychologists to categories and rules, modellers and computer scientists to statistics and data-driven inference. Learning is surely one of the most intensely studied subjects in neurobiology, cognitive science, artificial intelligence, and neuroinformatics. And with life-long learning becoming ever more important, with the PISA study demonstrating mediocre learning practices in schools, and with the economy depending upon the learning brains of the next generation as its only resource, we need to take learning seriously. As at previous IKs, we want to tackle the theme at issue from various viewpoints: from the synapse to systems, from animals to algorithms, from organisms to automata, and from theory to practical applications. Developmental aspects (the borderland between maturation and learning), modifying factors (age, emotion, motivation), and storage systems (memory in its various forms) will be discussed, as well as computational learning theories. The IK will aim in particular at bridging the gap between disciplines. Thus we will discuss how computational approaches such as reinforcement learning are related to neurobiological and cognitive insights.
This multidisciplinary approach can help to establish new learning paradigms and algorithms for artificial cognitive systems and facilitate our understanding of the nature of learning.

Courses/lecturers:
==================
Basic Courses
- Artificial Intelligence (Wolfram Burgard, Freiburg)
- Neurobiology (Ansgar Büschges, Cologne)
- Cognitive Science (Hanspeter Mallot, Tübingen)
- Machine Learning and Neural Networks (Herbert Jaeger, Bremen)

Methodological Courses
- Introduction to Kernel Methods (Matthias Seeger, Tübingen)
- How to measure learning and memory: lessons from psychology (Thomas Kammer & Markus Kiefer, Ulm)
- Functional imaging (Thomas Wolbers, Hamburg)

Special Courses: Mechanisms of Learning
- Neuroplasticity (Hubert Dinse, Bochum)
- Learning and Sleep (Lisa Marshall, Lübeck)
- Encoding of prediction errors and microeconomic reward terms by dopamine neurons during Pavlovian conditioning (Philippe Tobler, Cambridge, UK)
- Learning as knowledge acquisition (Gerhard Strube, Freiburg)

Special Courses: Computational Models of Knowledge and Learning
- Reinforcement Learning (Martin Riedmiller, Osnabrück)
- Ontology Learning and Ontology Mapping (Steffen Staab, Koblenz-Landau)
- Neural-symbolic learning and reasoning (Pascal Hitzler, Karlsruhe & Sebastian Bader, Dresden)
- The emergent ontology: knowledge collectives and conceptual design patterns (Aldo Gangemi, Rome, Italy)

Special Courses: Learning by Machines and Robots
- A Neural Theory of Language Learning and Use (Srini Narayanan, Berkeley, USA)
- From Sensorimotor Sequence to Grammatical Construction: Insights from Neurophysiology, Simulation and Robotics (Peter Dominey, Lyon, France)
- The Recruitment Theory of Language Origins (Luc Steels, Brussels/Paris)
- Cognitive Developmental Robotics (Minoru Asada, Osaka, Japan)

Special Courses: Development, Evolution and Neuropsychology
- Developmental psychology: insights from the baby lab (NN)
- Psychopathology in Adolescence (Matthias Weisbrod, Heidelberg)
- Learning and problem solving in monkeys and apes (Josep Call, Leipzig)
- The evolution of cognition and learning (Peter Gärdenfors, Lund, Sweden)

A limited number of travel and registration support grants are available. For more information, including registration, see http://www.ik2006.de

------------------------------------------------------------------
Dr. Herbert Jaeger
Professor for Computational Science
International University Bremen
Campus Ring 12, 28759 Bremen, Germany
Phone (+49) 421 200 3215, Fax (+49) 421 200 49 3215
email h.jaeger at iu-bremen.de
http://www.faculty.iu-bremen.de/hjaeger/
------------------------------------------------------------------

From schunn+ at pitt.edu Thu Dec 22 19:29:41 2005
From: schunn+ at pitt.edu (schunn+@pitt.edu)
Date: Thu, 22 Dec 2005 19:29:41 -0500 (EST)
Subject: Connectionists: Best paper prizes in computational cognitive modeling for Cogsci 2006
Message-ID: <39601.67.171.65.112.1135297781.squirrel@webmail.pitt.edu>

Four prizes worth $1,000 (USD) each will be awarded for the best full paper submissions to the 2006 Annual Meeting of the Cognitive Science Society that involve computational cognitive modeling. The four separate prizes will represent the best modeling work in the respective areas of perception, language, higher-level cognition, and applied cognition. The prizes are open to researchers at any level (student, postdoc, research scientist, faculty) and of any nationality. Any form of computational cognitive modeling relevant to cognitive science is eligible, including (but not limited to) connectionist, symbolic, Bayesian, dynamic systems, or various hybrid approaches. No special submission procedure is required---all full paper submissions to the conference will be automatically considered by the interdisciplinary program committee that is supervising the review process. The full paper submission deadline is February 1st, 2006.
For further details about the conference submission procedure, see http://www.cogsci.rpi.edu/~rsun/cogsci2006/. These prizes are supported by a grant from the US National Science Foundation. Please pass this notice around to relevant colleagues and students.

From doya at irp.oist.jp Fri Dec 23 05:32:58 2005
From: doya at irp.oist.jp (Kenji Doya)
Date: Fri, 23 Dec 2005 19:32:58 +0900
Subject: Connectionists: Okinawa Computational Neuroscience Course 2006: Call for Applications
Message-ID:

Call for Applications

OKINAWA COMPUTATIONAL NEUROSCIENCE COURSE 2006
"Computing Neurons"
June 26 - July 7, 2006. Okinawa, Japan.
http://www.irp.oist.jp/ocnc/2006
Application Deadline: APRIL 10TH, 2006

The aim of the Okinawa Computational Neuroscience Course is to give young researchers with theoretical backgrounds the opportunity to learn about the latest advances in neuroscience, and to give those with experimental backgrounds hands-on experience in computational modeling. We invite graduate students and postgraduate researchers to participate in the course, held from June 26th through July 7th at an oceanfront seminar house of the Okinawa Institute of Science and Technology. Those interested in attending should submit the materials below by e-mail or via the course web page by APRIL 10th, 2006. We hope that this course will be a good opportunity for theoretical and experimental neuroscientists to meet and to explore the attractive nature and culture of Okinawa, the southernmost island prefecture of Japan.

******* Course Outline *******

Okinawa Computational Neuroscience Course (OCNC2006)
Theme: Computing Neurons - What neurons compute; How we know by computing

Our brain is a network of billions of neurons, but even a single neuron is a fantastically complex computing device.
Technology has made it possible to look into the detailed structure of dendritic branches, the variety of ionic channels and receptors, molecular reactions at the synapses, and the network of genes that regulates all of these. The challenge is to understand the meaning and function of these components of the neural machine. To do this we need to put together data from many experiments at different levels into a computational model, and to analyze the kinds of computation that single neurons and their networks can perform. This course invites graduate students and postgraduate researchers who are interested in studies integrating experimental and computational approaches to understanding the cellular mechanisms of neurons.

Lecturers:
Upi Bhalla (NCBS)
Sydney Brenner (OIST)
Yang Dan (UC Berkeley)
Erik DeSchutter (U Antwerp)
Kenji Doya (OIST)
Bard Ermentrout (U Pittsburgh)
Geoff Goodhill (U Queensland)
Shin Ishii (NAIST)
Shinya Kuroda (U Tokyo)
Nicolas Le Novere (European Bioinformatics Institute)
Roberto Malinow (Cold Spring Harbor Lab)
Henry Markram (EPFL)
Terry Sejnowski (Salk Institute)
Susumu Tonegawa (MIT)
(more to be announced)

Student Projects:
a) Introduction to neural/cellular simulator platforms
b) Model construction from experimental data
c) Analysis of neuron models

Students will present posters on their current work early in the course and the results of their projects at the end of the course.
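The kind of single-neuron modeling described above can be illustrated with a minimal leaky integrate-and-fire sketch: a deliberately simple stand-in for the detailed conductance-based models such a course would cover. All parameter values and the function name here are illustrative, not taken from any course material:

```python
import numpy as np

def simulate_lif(I, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Simulate a leaky integrate-and-fire neuron driven by input sequence I.

    Returns the list of spike times (in ms). Units are illustrative:
    membrane potential in mV, time step dt in ms.
    """
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(I):
        # Forward-Euler step of: tau * dv/dt = -(v - v_rest) + i_ext
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:            # threshold crossing: emit a spike and reset
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# Constant suprathreshold drive produces regular, repetitive spiking.
spikes = simulate_lif(np.full(1000, 20.0))   # 100 ms of constant 20 mV drive
print("number of spikes:", len(spikes))
```

Even this toy model shows the basic leaky integration, threshold, and reset dynamics that the more realistic multi-compartment simulators taught in such courses elaborate on.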
Date: June 26th to July 7th, 2006
Place: Okinawa Institute of Science and Technology, Onna village, Okinawa, Japan

Sponsors:
Okinawa Institute of Science and Technology
Nara Institute of Science and Technology
Japanese Neural Network Society

Co-organizers:
Upinder Bhalla, National Center for Biological Sciences, India
Kenji Doya, Okinawa Institute of Science and Technology
Shinya Kuroda, University of Tokyo
Nicolas Le Novere, European Bioinformatics Institute

Advisors:
Sydney Brenner, Okinawa Institute of Science and Technology
Hiroaki Kitano, SONY Computer Science Laboratory
Terrence Sejnowski, Salk Institute
Susumu Tonegawa, Massachusetts Institute of Technology

******* Application *******

Please send the following by e-mail (ocnc at irp.oist.jp) or via the web application page by APRIL 10TH, 2006:
1) First name, 2) Middle initial (if any), 3) Family name, 4) Degree, 5) Date of birth, 6) Gender, 7) Nationality, 8) Affiliation, 9) Position, 10) Advisor, 11) Postal address, 12) Phone, 13) Fax, 14) E-mail, 15) Web page (if any), 16) Educational background, 17) Work experience, 18) List of publications, 19) Research interests (up to 500 words), 20) Motivations for attending the course (up to 500 words), 21) Two referees whom we can ask for recommendations (names, affiliations, e-mail addresses), 22) Need for travel support, 23) How you learned about the course.

We will accept 30 students, based primarily on their research interests (19) and motivations (20). We will also consider the balance of members' research disciplines, geographic origins, and genders. The sponsor will provide lodging and meals during the course. Support for roundtrip airfare to Okinawa will be considered for students without funding. Applicants will be informed of the selection results by e-mail by May 10th. Details of OCNC2004 and 2005 are available on the web page (http://www.irp.oist.jp/ocnc/).
******* Secretariat *******
Okinawa Computational Neuroscience Course
c/o Initial Research Project, Okinawa Institute of Science and Technology
12-22 Suzaki, Gushikawa, Okinawa 904-2234, Japan
Phone: +81-98-921-3933  Fax: +81-98-921-3873
Email: ocnc at irp.oist.jp

For more information, please visit the web page: http://www.irp.oist.jp/ocnc/2006

----
Kenji Doya
Initial Research Project, Okinawa Institute of Science and Technology
12-22 Suzaki, Uruma, Okinawa 904-2234, Japan
Phone: +81-98-921-3843; Fax: +81-98-921-3873
http://www.irp.oist.jp/

From steve at cns.bu.edu Sun Dec 25 06:49:06 2005
From: steve at cns.bu.edu (Stephen Grossberg)
Date: Sun, 25 Dec 2005 06:49:06 -0500
Subject: Connectionists: neural mechanisms of autism
Message-ID:

The following article is now available at http://www.cns.bu.edu/Profiles/Grossberg

Grossberg, S. and Seidman, D. (2006). Neural dynamics of autistic behaviors: Cognitive, emotional, and timing substrates. Psychological Review, in press.

ABSTRACT
What brain mechanisms underlie autism and how do they give rise to autistic behavioral symptoms? This article describes a neural model, called the iSTART model, which proposes how cognitive, emotional, timing, and motor processes involving brain regions such as the prefrontal and temporal cortex, amygdala, hippocampus, and cerebellum may interact to create and perpetuate autistic symptoms. These model processes were originally developed to explain data concerning how the brain controls normal behaviors. The iSTART model shows how autistic behavioral symptoms may arise from prescribed breakdowns in these brain processes, notably a combination of underaroused emotional depression in the amygdala and related affective brain regions, learning of hyperspecific recognition categories in the temporal and prefrontal cortices, and breakdowns of adaptively timed attentional and motor circuits in the hippocampal system and cerebellum.
The model clarifies how malfunctions in a subset of these mechanisms can, through a system-wide vicious circle of environmentally mediated feedback, cause and maintain problems with them all.

Key words: autism, learning, categorization, depression, hypervigilance, adaptive resonance theory, adaptive timing, amygdala, frontal cortex, hippocampus, cerebellum

From a.cichocki at riken.jp Tue Dec 27 11:54:39 2005
From: a.cichocki at riken.jp (A. Cichocki)
Date: Wed, 28 Dec 2005 01:54:39 +0900
Subject: Connectionists: NMF and SCA papers and MATLAB software NMFLAB
Message-ID: <43B171CF.8010006@riken.jp>

Dear List Members:

I would like to bring to your attention our recently updated papers and reports about NMF (non-negative matrix factorization) and SCA (sparse component analysis) for BSS (blind and semi-blind source separation), available at:
http://www.bsp.brain.riken.jp/%7Ecia/recent.html#nmf
http://www.bsp.brain.riken.jp/~cia/recent.html#sca
http://www.bsp.brain.riken.jp/~cia/

We will also soon release new free MATLAB toolboxes: NMFLAB and SCALAB. I would be grateful for any critical comments or suggestions.

Best Wishes,
Andrzej Cichocki
===============
Laboratory for Advanced Brain Signal Processing
RIKEN Brain Science Institute, Japan
Wako-shi, Saitama 351-0198

List of selected new papers and reports about sparse NMF and SCA:

NMF
1. A. Cichocki, R. Zdunek, and S. Amari, "Csiszar's Divergences for Non-Negative Matrix Factorization: Family of New Algorithms", 6th International Conference on Independent Component Analysis and Blind Signal Separation, Charleston SC, USA, March 5-8, 2006. [.pdf]
2. A. Cichocki, S. Amari, and R. Zdunek, "Extended SMART Algorithms for Non-Negative Matrix Factorization", Eighth International Conference on Artificial Intelligence and Soft Computing, ICAISC, Zakopane, Poland, 25-29 June, 2006. [.pdf]
3. R. Zdunek and A. Cichocki, "Non-Negative Matrix Factorization with Quasi-Newton Optimization", Eighth International Conference on Artificial Intelligence and Soft Computing, ICAISC, Zakopane, Poland, 25-29 June, 2006. [.pdf]

SCA
1. P. G. Georgiev, F. Theis, and A. Cichocki, "Sparse component analysis and blind source separation of underdetermined mixtures", IEEE Transactions on Neural Networks, July 2005, Vol. 16, No. 4, pp. 992-996. [.pdf]
2. P. G. Georgiev, F. Theis, and A. Cichocki, "Optimization algorithms for sparse representations and applications", chapter in Multiscale Optimization Methods, Ed. Pardalos, 2005. [.pdf]
3. Y. Li, S. Amari, A. Cichocki, D. W. C. Ho and S. Xie, "Underdetermined Blind Source Separation Based on Sparse Representation", IEEE Transactions on Signal Processing, Vol. 54, No. 2, 2006 (in print). [.pdf]
4. Y. Li, A. Cichocki, and S. Amari, "Blind estimation of channel parameters and source components for EEG signals: A sparse factorization approach", IEEE Transactions on Neural Networks, 2006 (accepted for publication). [draft version pdf]
5. F. J. Theis, P. G. Georgiev, and A. Cichocki, "Robust overcomplete matrix recovery for sparse sources using a generalized Hough transform," in Proceedings of 12th European Symposium on Artificial Neural Networks (ESANN2004), (Bruges, Belgium), pp. 343-348, Apr. 2004.
[.pdf]

From terry at salk.edu Thu Dec 1 17:02:35 2005
From: terry at salk.edu (Terry Sejnowski)
Date: Thu, 01 Dec 2005 14:02:35 -0800
Subject: Connectionists: UCSD Computational Neurobiology Graduate Training
Message-ID:

DEADLINE: DECEMBER 15, 2005

COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM
Department of Biology - University of California, San Diego
http://www.biology.ucsd.edu/grad/CN_overview.html

The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics.

The three major themes in the training program are:

1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons, using modern neuroanatomical, behavioral, neuropharmacological and electrophysiological techniques. Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques will also be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI).

2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA).

3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs.
A synthesis of approaches from mathematics and the physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control.

Participating Faculty include:
* Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Science at UCSD
* Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology
* Darwin Berg (Neurobiology): Regulation of synaptic components: their assembly and localization, function, and long-term stability.
* Geoffrey Boynton (Salk Institute): Visual psychophysics; fMRI recordings from visual cortex.
* Gert Cauwenberghs (Biology): Neuromorphic engineering; analog VLSI chips; wireless recording and nanoscale instrumentation for neural systems; large-scale cortical modeling.
* EJ Chichilnisky (Salk Institute): Retinal multielectrode recording; neural coding; visual perception.
* Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms
* Virginia De Sa (Cognitive Science): Computational basis of perception and learning (both human and machine); multi-sensory integration and contextual influences
* Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research
* Dan Feldman (Biology): Cortical plasticity; spike-time-dependent synaptic plasticity; sensory coding in the whisker system.
* Marla Feller (Neurobiology): Mechanisms and function of spontaneous activity in the developing nervous system, including the retina, spinal cord, hippocampus and neocortex.
* Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation
* Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels
* David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity.
* William Kristan (Neurobiology): Computational neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion. Director, Neurosciences Graduate Program at UCSD
* Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies
* Scott Makeig (Institute for Neural Computation): Analysis of cognitive event-related brain dynamics and fMRI using time-frequency and Independent Component Analysis
* Javier Movellan (Institute for Neural Computation): Sensory fusion and learning algorithms for continuous stochastic systems
* Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects
* Pamela Reinagel (Biology): Sensory and neural coding; natural scene statistics; recordings from the visual system of cats and rodents.
* Massimo Scanziani (Biology): Neural circuits in the somatosensory cortex; physiology of synaptic transmission; inhibitory mechanisms.
* Terrence Sejnowski (Salk Institute/Neurobiology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms.
Director, Institute for Neural Computation
* Martin Sereno (Cognitive Science): Neural bases of visual cognition and language, using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques
* Nicholas Spitzer (Neurobiology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of Neurobiology
* Charles Stevens (Salk Institute): Synaptic physiology; theoretical models of neuroanatomical scaling.
* Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters
* Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior
* Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms

Requests for application materials should be sent to the University of California, San Diego, Division of Biological Sciences 0348, Graduate Admissions Office, 9500 Gilman Drive, La Jolla, CA 92093-0348, or to gradprog at biomail.ucsd.edu. The deadline for completed application materials, including letters of recommendation, is December 15, 2005. A preapplication is not required for the Computational Neurobiology Program. For more information about applying to the UCSD Biology Graduate Program, see the web page below.
http://www.biology.ucsd.edu/grad/admissions/index.html

From juergen at idsia.ch Thu Dec 1 11:31:40 2005
From: juergen at idsia.ch (Juergen Schmidhuber)
Date: Thu, 1 Dec 2005 17:31:40 +0100
Subject: Connectionists: postdoc @ IDSIA, Switzerland
Message-ID: <01c72f931e55e5957a3e6a9054f3140e@idsia.ch>

We are seeking an outstanding postdoc with experience in or interest in topics such as: sequence learning algorithms, adaptive robotics, recurrent neural networks (RNN), sequential active vision, hidden Markov models, dynamic Bayes nets and other Bayesian approaches, universal learning machines, Kolmogorov complexity / algorithmic information theory, artificial evolution (in particular RNN evolution), support vector machines (especially recurrent ones), reinforcement learning, and curiosity-driven learning. Goal: to advance the state of the art in sequence learning in general, and to build vision-based robots and other agents that learn to solve challenging tasks.

More: http://www.idsia.ch/~juergen/postdoc2006.html

JS

From maass at igi.tu-graz.ac.at Thu Dec 1 12:00:39 2005
From: maass at igi.tu-graz.ac.at (Wolfgang Maass)
Date: Thu, 01 Dec 2005 18:00:39 +0100
Subject: Connectionists: Feedback in neural circuits (OR: How to expand liquid computing)
Message-ID: <438F2C37.6010006@igi.tu-graz.ac.at>

The paper "Computational Aspects of Feedback in Neural Circuits" by Wolfgang Maass, Prashant Joshi, and Eduardo Sontag is now available from the homepages of the authors. There will be a talk and poster on it at NIPS 2005 (under the title "Principles of real-time computing with feedback applied to cortical microcircuit models").

Abstract: It had previously been shown that generic cortical microcircuit models can perform complex real-time computations on continuous input streams, provided that these computations can be carried out with a rapidly fading memory.
We investigate in this article the computational capability of such circuits in the more realistic case where not only readout neurons, but in addition a few neurons within the circuit have been trained for specific tasks. This is essentially equivalent to the case where the output of trained readout neurons is fed back into the circuit. We show that this new model overcomes the limitation of a rapidly fading memory. In fact, we prove that in the idealized case without noise it can carry out any conceivable digital or analog computation on time-varying inputs. But even with noise the resulting computational model can perform a large class of biologically relevant real-time computations that require a non-fading memory. We demonstrate these computational implications of feedback both theoretically and through computer simulations of detailed cortical microcircuit models. We show that the application of simple learning procedures (such as linear regression or perceptron learning) enables such circuits, in spite of their complex inherent dynamics, to represent time over behaviorally relevant long time spans, to integrate evidence from incoming spike trains over longer periods of time, and to process new information contained in such spike trains in diverse ways according to the current internal state of the circuit. In particular we show that such generic cortical microcircuits with feedback provide a new model for working memory that is consistent with a large set of biological constraints. Although this article examines primarily the computational role of feedback in circuits of neurons, the mathematical principles on which its analysis is based apply to a large variety of dynamical systems. Hence they may also throw new light on the computational role of feedback in other complex biological dynamical systems, such as for example genetic regulatory networks. 
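The central idea of the abstract, feeding a trained linear readout back into a generic recurrent circuit so that the system is no longer limited to a fading memory, can be sketched with a minimal echo-state-style reservoir in numpy. This is a toy abstraction under assumed illustrative parameters (reservoir size, scaling constants, the running-integral task), not the detailed cortical microcircuit models analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # reservoir ("circuit") size

# Random recurrent weights, scaled so the autonomous dynamics are contractive
# (the rapidly fading memory regime); input and feedback projections.
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)) * 0.9
w_in = rng.normal(0.0, 1.0, N)
w_fb = rng.normal(0.0, 1.0, N)

def run(u, y_fb):
    """Drive the circuit with input u while feeding the sequence y_fb back in."""
    x = np.zeros(N)
    states = []
    for u_t, y_t in zip(u, y_fb):
        x = np.tanh(W @ x + w_in * u_t + w_fb * y_t)
        states.append(x.copy())
    return np.array(states)

# Task needing non-fading memory: output a (scaled) running integral of the input.
u = rng.uniform(-0.5, 0.5, 300)
target = np.cumsum(u) * 0.05

# Train the readout by linear regression with teacher forcing: during training,
# the desired readout value from the previous step is fed back into the circuit.
X = run(u, np.concatenate(([0.0], target[:-1])))
w_out = np.linalg.lstsq(X, target, rcond=None)[0]

# Closed loop: the trained readout's own output now re-enters the circuit.
x = np.zeros(N)
y, preds = 0.0, []
for u_t in u:
    x = np.tanh(W @ x + w_in * u_t + w_fb * y)
    y = w_out @ x
    preds.append(y)
print("closed-loop MSE:", np.mean((np.array(preds) - target) ** 2))
```

The closed feedback loop is what gives the otherwise fading-memory reservoir a persistent internal state, mirroring the paper's point that training a few neurons (equivalently, feeding trained readout output back into the circuit) removes the fading-memory limitation.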
--
Wolfgang Maass
Professor of Computer Science
Technische Universitaet Graz
http://www.igi.tugraz.at/maass/

From t.heskes at science.ru.nl Thu Dec 1 14:41:36 2005
From: t.heskes at science.ru.nl (Tom Heskes)
Date: Thu, 01 Dec 2005 20:41:36 +0100
Subject: Connectionists: Neurocomputing volume 69 (issues 4-6)
Message-ID: <438F51F0.6020607@science.ru.nl>

Neurocomputing Volume 69, Issues 4-6 (January 2006)

------- FULL LENGTH PAPERS

Bifurcations in Morris-Lecar neuron model
Kunichika Tsumoto, Hiroyuki Kitajima, Tetsuya Yoshinaga, Kazuyuki Aihara and Hiroshi Kawakami

Using temporal binding for hierarchical recruitment of conjunctive concepts over delayed lines
Cengiz Günay and Anthony S. Maida

Adaptive conjugate gradient algorithm for perceptron training
G. Nagaraja and R.P. Jagadeesh Chandra Bose

Tumor tissue identification based on gene expression data using DWT feature extraction and PNN classifier
Guangmin Sun, Xiaoying Dong and Guandong Xu

Mathematical modeling and computational analysis of neuronal cell images: Application to dendritic arborization of Golgi-impregnated neurons in dorsal horns of the rat spinal cord
D. Ristanović, B.D. Stefanović, N.T. Milošević, M. Grgurević and J.B. Stanković

Exponential stability and periodic oscillatory of bi-directional associative memory neural network involving delays
Hongyong Zhao

Time-series prediction using a local linear wavelet neural network
Yuehui Chen, Bo Yang and Jiwen Dong

Locally recurrent neural networks for long-term wind speed and power prediction
T.G. Barbounis and J.B. Theocharis

Separation of water artifacts in 2D NOESY protein spectra using congruent matrix pencils
K. Stadlthanner, A.M. Tomé, F.J. Theis, E.W. Lang, W. Gronwald and H.R. Kalbitzer

Dynamic temperature modeling of continuous annealing furnace using GGAP-RBF neural network
Shaoyuan Li, Qing Chen and Guang-Bin Huang

Biologically motivated vergence control system using human-like selective attention model
Sang-Bok Choi, Bum-Soo Jung, Sang-Woo Ban, Hirotaka Niitsuma and Minho Lee

Local regularization assisted orthogonal least squares regression
S. Chen

A new approach to fuzzy classifier systems and its application in self-generating neuro-fuzzy systems
Mu-Chun Su, Chien-Hsing Chou, Eugene Lai and Jonathan Lee

-------
JOURNAL SITE: http://www.elsevier.com/locate/neucom
SCIENCE DIRECT: http://www.sciencedirect.com/science/issue/5660-2006-999309995-612478

From baolshausen at berkeley.edu Fri Dec 2 15:28:05 2005
From: baolshausen at berkeley.edu (Bruno Olshausen)
Date: Fri, 02 Dec 2005 12:28:05 -0800
Subject: Connectionists: Graduate program in neuroscience - UC Berkeley
Message-ID: <4390AE55.60102@berkeley.edu>

GRADUATE PROGRAM IN NEUROSCIENCE - UC BERKELEY
** Application deadline: December 15, 2005 **

The Graduate Program in Neuroscience at UC Berkeley is currently accepting applications for admission for the 2006-2007 academic year. There are numerous opportunities for students interested in focusing on computational and theoretical approaches within the context of an interdisciplinary neuroscience training program.
Faculty supporting this area include:
Martin Banks - Visual space perception, psychophysics, virtual reality
Jose Carmena - Brain-machine interfaces, sensorimotor control, learning
Yang Dan - Information processing in thalamus and cortex
Jack Gallant - Neural mechanisms of visual form perception and attention
Tom Griffiths - Computational models of cognition
Stanley Klein - Computational models of spatial vision, psychophysics
Harold Lecar - Theoretical biophysics, network models
Bruno Olshausen - Models of visual cortex, scene analysis
Fritz Sommer - Network models of associative memory and learning
Frederic Theunissen - Neural mechanisms of complex sound recognition
Frank Werblin - Information processing in the retina

Faculty in other programs also pursuing computational/theoretical approaches to neuroscience questions include:
Michael Gastpar, EECS - Neural coding, information theory
Jitendra Malik, EECS - Models of early vision and object recognition
Alva Noe, Philosophy - Theories of perception and sensorimotor loops
Lokendra Shastri, ICSI - Models of episodic memory in hippocampus
Bin Yu, Statistics - Neural coding, image statistics

In addition, the newly established Redwood Center for Theoretical Neuroscience provides a central workspace for theoreticians, organizes a weekly seminar series and workshops, and hosts visiting scholars. See http://redwood.berkeley.edu

For further information and details of the application process see http://neuroscience.berkeley.edu/grad.php and please note the application deadline above.

--
Bruno A. Olshausen
Director, Redwood Center for Theoretical Neuroscience
and Associate Professor, Helen Wills Neuroscience Institute and School of Optometry, UC Berkeley
132 Barker Hall, #3190, Berkeley, CA 94720-3190
(510) 643-1472 / 4952 (fax)
http://redwood.berkeley.edu

From ccchow at pitt.edu Fri Dec 2 10:42:59 2005
From: ccchow at pitt.edu (Carson Chow)
Date: Fri, 2 Dec 2005 10:42:59 -0500
Subject: Connectionists: Positions at NIH
Message-ID: <5c7190cc1c09fed9fd20c51a9a9835c1@pitt.edu>

The National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institutes of Health (NIH), Department of Health and Human Services, invites applications for tenured or tenure-track positions in the Laboratory of Biological Modeling. The Laboratory is currently comprised of scientists who use computational approaches to understand cell biological and physiological systems. Specific areas of research interest will include mathematical modeling at the subcellular, cellular, tissue and system levels. Excellent computational facilities and resources for rapid achievement of research goals are available. LBM is in close proximity in particular to the NIDDK Computational Chemistry Core Facility, engaged in molecular modeling. The position offers unparalleled opportunities for interdisciplinary collaboration within NIDDK and throughout NIH. For further information about NIDDK, see http://www.niddk.nih.gov. Candidates must have a Ph.D., M.D., or equivalent degree in the physical or biomedical sciences. He or she should have an outstanding record of research accomplishments in mathematical modeling and will be expected to propose and pursue an innovative and independent research program. Applicants should send a curriculum vitae and list of publications, copies of three major publications, a plan for future research, and three letters of reference to Dr. Robert Tycko, Chair of the Search Committee, Laboratory of Chemical Physics, Building 5, Rm 112, 5 Memorial Drive, NIH, Bethesda, MD 20892-0520, tel: 301-402-8272, fax: 301-496-0825, email: tycko at helix.nih.gov. (A closing date has not been set, but it would be best to apply before the end of January, 2006.)

HHS and NIH are Equal Opportunity Employers

Position Description:

The successful candidate will establish an independent group with research interests focused on mathematical modeling at the subcellular, cellular, tissue, or organism levels. Other members of the Laboratory of Biological Modeling (http://lbm.niddk.nih.gov) conduct basic research on a wide variety of topics including insulin secretion (A. Sherman), insulin action (A. Sherman, C. Chow), metabolism (K. Hall), adipocyte differentiation (V. Periwal), calcium homeostasis (A. Sherman), and neuroscience (C. Chow), all relevant to diabetes and obesity. Interaction is expected with experimental laboratories or other computational groups in NIDDK or other NIH institutes.

From Dave_Touretzky at cs.cmu.edu Sat Dec 3 02:27:07 2005
From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu)
Date: Sat, 03 Dec 2005 02:27:07 -0500
Subject: Connectionists: graduate training in the neural basis of cognition
Message-ID: <405.1133594827@ammon.boltz.cs.cmu.edu>

Graduate Training at the Center for the Neural Basis of Cognition

The Center for the Neural Basis of Cognition offers an interdisciplinary doctoral training program operated jointly with eleven affiliated PhD programs at Carnegie Mellon University and the University of Pittsburgh.
Detailed information about this program is available on our web site at http://www.cnbc.cmu.edu The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine. CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies. Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, statistics, or robotics. For more information about the program, and to obtain application materials, visit our web site at www.cnbc.cmu.edu, or contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. 
Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu Web: http://www.cnbc.cmu.edu
The affiliated PhD programs at the two universities are:
Carnegie Mellon: Biological Sciences, Biomedical Engineering, Computer Science, Computational & Statistical Learning, Psychology, Robotics, Statistics
University of Pittsburgh: BioEngineering, Mathematics, Neuroscience, Psychology
The CNBC training faculty includes:
Eric Ahrens (CMU Biology): MRI studies of the vertebrate nervous system
Susan Amara (Pitt Neurobiology): neurotransmitter transport and binding
John Anderson (CMU Psychology): models of human cognition
Galia Avidan (CMU Psychology): fMRI studies of object and face recognition
German Barrionuevo (Pitt Neuroscience): hippocampus and prefrontal cortex
Alison Barth (CMU Biology): molecular basis of plasticity in neocortex
Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex
Guoqiang Bi (Pitt Neuroscience): activity-dependent synaptic modification
J. Patrick Card (Pitt Neuroscience): transneuronal tracing of neural circuits
Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving
Carol Colby (Pitt Neuroscience): spatial reps.
in primate parietal cortex
Justin Crowley (CMU Biology): development of visual cortex
Tracy Cui (Pitt BioEngineering): biosensors, neural microelectrode arrays
Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease
William Eddy (CMU Statistics): analysis of fMRI data
Bard Ermentrout (Pitt Mathematics): oscillations in neural systems
Julie Fiez (Pitt Psychology): fMRI studies of language
Neeraj Gandhi (Pitt Neuroscience): neural control of movement
Chris Genovese (CMU Statistics): making inferences from scientific data
Lori Holt (CMU Psychology): mechanisms of auditory and speech perception
John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia
Satish Iyengar (Pitt Statistics): spike train data analysis
Jon Johnson (Pitt Neuroscience): ligand-gated ion channels; NMDA receptor
Marcel Just (CMU Psychology): visual thinking, language comprehension
Karl Kandler (Pitt Neurobiology): neural development; inhibitory pathways
Robert Kass (CMU Statistics): transmission of info. by collections of neurons
Seong-Gi Kim (Pitt Neurobiology): technology and biophysics of fMRI
Roberta Klatzky (CMU Psychology): human perception and cognition
Richard Koerber (Pitt Neurobiology): devel. and plasticity of spinal networks
Tai Sing Lee (CMU Comp. Sci.): primate visual cortex; computer vision
Michael Lewicki (CMU Comp. Sci.): learning and representation
David Lewis (Pitt Neuroscience): anatomy of frontal cortex
Beatriz Luna (Pitt Psychology): developmental psychology and fMRI
Brian MacWhinney (CMU Psychology): models of language acquisition
Yoky Matsuoka (CMU Robotics): human motor control and motor learning
James McClelland (CMU Psychology): connectionist models of cognition
Steve Meriney (Pitt Neuroscience): mechanisms of synaptic plasticity
Nancy Minshew (Pitt Neurobiology): cognitive and neural basis of autism
Tom Mitchell (CMU Comp.
Sci.): machine learning with application to fMRI
Bita Moghaddam (Pitt Neuroscience): prefrontal cortex and psychiatric disorders
Paula Monaghan-Nichols (Pitt Neurobiology): genetic analysis of verteb. CNS devel.
Carl Olson (CNBC): spatial representations in primate frontal cortex
Charles Perfetti (Pitt Psychology): language and reading processes
David Plaut (CMU Psychology): connectionist models of reading
Michael Pogue-Geile (Pitt Psychology): development of schizophrenia
Lynne Reder (CMU Psychology): models of memory and cognitive processing
Erik Reichle (Pitt Psychology): attention and eye movements in reading
Jonathan Rubin (Pitt Mathematics): analysis of systems of coupled neurons
Walter Schneider (Pitt Psych.): fMRI, models of attention & skill acquisition
Andrew Schwartz (Pitt Bioengineering): motor control, neural prostheses
Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system
Greg Siegle (Pitt Psychology): emotion and cognition; cognitive modeling
Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex
Marc Sommer (Pitt Neuroscience): neural circuitry controlling eye movements
Peter Strick (Pitt Neurobiology): motor control; basal ganglia and cerebellum
Floh Thiels (Pitt Neuroscience): LTP and LTD in hippocampus
Erik Thiessen (Pitt Psychology): child language development
Natasha Tokowicz (Pitt Psychology): language learning; bilingualism
David Touretzky (CMU Comp. Sci.): hippocampal modeling, cognitive robotics
Nathan Urban (CMU Biology): circuitry of the olfactory bulb
Valerie Ventura (CMU Statistics): structure of neural firing patterns
Mark Wheeler (Pitt Psychology): fMRI studies of memory and cognition
Nick Yeung (CMU Psychology): neural mechanisms of attention
Please see http://www.cnbc.cmu.edu for further details.
From matthias.hein at tuebingen.mpg.de Sat Dec 3 11:30:58 2005 From: matthias.hein at tuebingen.mpg.de (Matthias Hein) Date: Sat, 3 Dec 2005 17:30:58 +0100 Subject: Connectionists: PhD-studentship in Learning Theory Message-ID: <008301c5f826$f0e2ac80$1d28260a@kongo> PhD Studentship in Learning Theory/Learning Algorithms at the MPI for Biological Cybernetics A position for a PhD studentship in Learning Theory/Learning Algorithms is available in B. Schölkopf's Empirical Inference department at the Max Planck Institute in Tuebingen, Germany (see http://www.kyb.tuebingen.mpg.de/bs). We invite applications from candidates with an outstanding academic record, in particular a strong mathematical background. Max Planck Institutes are publicly funded research labs with an emphasis on excellence in basic research. Tuebingen is a university town in southern Germany; see http://www.tuebingen.de/kultur/english/index.html for some pictures. If you are interested and are attending NIPS 2005, please contact Matthias Hein (mh at tuebingen.mpg.de) so that we can arrange an informal interview there. Otherwise please send inquiries and applications, including a CV (with complete lists of marks, copies of transcripts, etc.) and a short statement of research interests, to sabrina.nielebock at tuebingen.mpg.de or Sabrina Nielebock Max Planck Institute for Biological Cybernetics Spemannstr. 38 72076 Tuebingen Germany Tel. +49 7071 601 551 Fax +49 7071 601 552 In addition, please arrange for two letters of reference to be sent directly to the address above. Applications will be considered immediately and until the position is filled.
************************************************* Matthias Hein Max-Planck-Institute for Biological Cybernetics Spemannstrasse 38 72076 Tübingen Tel: 07071 - 601 559 Fax: 07071 - 601 552 mail: matthias.hein at tuebingen.mpg.de ************************************************* From jdc at Princeton.EDU Tue Dec 6 20:22:55 2005 From: jdc at Princeton.EDU (Jonathan D. Cohen) Date: Tue, 6 Dec 2005 20:22:55 -0500 Subject: Connectionists: Faculty Position at new Princeton University Institute Message-ID: <824EDA80-CEF4-4A19-8C0F-3CCB48DE47EA@princeton.edu> Princeton University is seeking to make the first of several anticipated new faculty appointments in neuroscience, as part of its new Institute in this area and its growing focus on quantitative approaches to understanding neural coding and dynamics at the systems level. The position is for an Assistant Professor, to begin in September 2006, for a theorist in systems and/or cognitive neuroscience. The appointment will be joint between the Institute and a department appropriate to the individual's background and interests, with possibilities including (but not limited to) Psychology, Molecular Biology, Mathematics, Physics, Electrical Engineering or Computer Science. Applicants should be prepared to teach both an undergraduate and a graduate level course in neuroscience. Please send a curriculum vitae, a one-page research description, and three letters of recommendation to the Search Committee, Neuroscience Institute, Princeton University, Princeton, NJ 08544, or by email to search at neuroscience.princeton.edu. Materials should be submitted as soon as possible. Applications will be considered on a rolling basis, and the search will remain open until the position is filled. Princeton is an equal opportunity, affirmative action employer.
For information about applying to Princeton and how to self-identify, please link to http://web.princeton.edu/sites/dof/ApplicantsInfo.htm From paul.cisek at umontreal.ca Wed Dec 7 11:47:36 2005 From: paul.cisek at umontreal.ca (Paul Cisek) Date: Wed, 7 Dec 2005 11:47:36 -0500 Subject: Connectionists: International Symposium on Computational Neuroscience - Montreal, Canada, May 8-9, 2006 Message-ID: <006a01c5fb4d$ed9b03e0$2de4cc84@Engram> FIRST ANNOUNCEMENT AND CALL FOR POSTERS ------------------------------------------------------------------- XXVIIIth International Symposium COMPUTATIONAL NEUROSCIENCE: From theory to neurons and back again May 8-9, 2006 University of Montréal Montréal, Québec, Canada ------------------------------------------------------------------- The 28th International Symposium of the Groupe de recherche sur le système nerveux central et le Centre de recherche en sciences neurologiques will be held on May 8-9, 2006, at the University of Montréal. The objectives of this symposium are to illustrate the power and utility of computational approaches to address fundamental issues of brain function, from the level of single cells to that of large systems, and to discuss how computational and more traditional physiological methods complement one another. The symposium will include presentations on computational models of sensory and motor systems, learning processes, and information coding. Registration is now open. Please visit http://www.grsnc.umontreal.ca/XXVIIIs/ for information. Submissions are invited for a limited number of poster presentations. Authors of selected posters will be invited to contribute a short chapter to a special issue of the book series Progress in Brain Research. Deadline for poster submissions: Friday, March 31, 2006.
Conference speakers: Larry Abbott Yoshua Bengio Catherine Carr Paul Cisek Simon Giszter Sten Grillner Stephen Grossberg Geoffrey Hinton Len Maler Eve Marder James McClelland David McCrea Bruce McNaughton Alexandre Pouget Stephen Scott Michael Shadlen Reza Shadmehr Robert Shapley Daniel Wolpert Sponsors: Canadian Institute for Advanced Research (CIAR) Canadian Institutes of Health Research (CIHR) Groupe de recherche sur le système nerveux central (GRSNC) Fonds de la recherche en santé du Québec (FRSQ) Université de Montréal (CEDAR) From rsun at rpi.edu Sat Dec 3 11:04:52 2005 From: rsun at rpi.edu (Professor Ron Sun) Date: Sat, 3 Dec 2005 11:04:52 -0500 Subject: Connectionists: Ph.D program in Cognitive Science at RPI Message-ID: <8D291EEE-03D7-4372-88FB-BEBD7C1E72C3@rpi.edu> I am looking for a few Ph.D. students. The Ph.D. program of the Cognitive Science department at RPI is accepting applications. Graduate assistantships and other forms of financial support for graduate students are available. Prospective graduate students with interests in Cognitive Science, especially in learning and skill acquisition and in the relationship between cognition and sociality, are encouraged to apply. Prospective applicants should have a background in computer science (the equivalent of a BS in computer science) and some prior exposure to psychology, artificial intelligence, connectionist models (neural networks), multi-agent systems, and other related areas. Students who have already completed a Master's degree are preferred. RPI is a top-tier research university. The CogSci department has identified the Ph.D. program and research as its primary missions. The department is conducting research in a number of areas: cognitive modeling, human and machine learning, multi-agent interactions and social simulation, neural networks and connectionist models, human and machine reasoning, cognitive engineering, and so on.
See the Web page below regarding my research: http://www.cogsci.rpi.edu/~rsun For the application procedure, see http://www.cogsci.rpi.edu/ The application deadline is Jan. 15, 2006. If you decide to apply, follow the official procedure as outlined on the Web page. Send me a short email (in plain text) AFTER you have completed the application. ======================================================== Professor Ron Sun Cognitive Science Department Rensselaer Polytechnic Institute 110 Eighth Street, Carnegie 302A Troy, NY 12180, USA phone: 518-276-3409 fax: 518-276-3017 email: rsun at rpi.edu web: http://www.cogsci.rpi.edu/~rsun ======================================================= From sethu.vijayakumar at ed.ac.uk Wed Dec 7 09:28:25 2005 From: sethu.vijayakumar at ed.ac.uk (Sethu Vijayakumar) Date: Wed, 07 Dec 2005 14:28:25 +0000 Subject: Connectionists: Preprint: Incremental Online Learning in High Dimensions Message-ID: <4396F189.5030106@ed.ac.uk> The following paper is available for download from: http://homepages.inf.ed.ac.uk/svijayak/publications/vijayakumar-NeuCom2005.pdf or as a featured (free) article from the MIT Press website: http://mitpress.mit.edu/catalog/item/default.asp?sid=22065875-6E38-4AAA-BB11-E00879BDE665&ttype=4&tid=31 Incremental Online Learning in High Dimensions, Neural Computation, vol. 17, no. 12, pp. 2602-2634 (2005) Locally weighted projection regression (LWPR) is a new algorithm for incremental nonlinear function approximation in high-dimensional spaces with redundant and irrelevant input dimensions. At its core, it employs nonparametric regression with locally linear models. In order to stay computationally efficient and numerically robust, each local model performs the regression analysis with a small number of univariate regressions in selected directions in input space, in the spirit of partial least squares regression.
We discuss when and how local learning techniques can successfully work in high-dimensional spaces and review the various techniques for local dimensionality reduction before finally deriving the LWPR algorithm. The properties of LWPR are that it (1) learns rapidly with second-order learning methods based on incremental training, (2) uses statistically sound stochastic leave-one-out cross validation for learning without the need to memorize training data, (3) adjusts its weighting kernels based on only local information in order to minimize the danger of negative interference of incremental learning, (4) has a computational complexity that is linear in the number of inputs, and (5) can deal with a large number of (possibly redundant) inputs, as shown in various empirical evaluations with up to 90-dimensional data sets. For a probabilistic interpretation, predictive variance and confidence intervals are derived. To our knowledge, LWPR is the first truly incremental spatially localized learning method that can successfully and efficiently operate in very high-dimensional spaces. A software (MATLAB/C++) implementation of the LWPR algorithm can be found at: http://homepages.inf.ed.ac.uk/svijayak/software/LWPR/ -- ------------------------------------------------------------------ Sethu Vijayakumar, Ph.D.
Assistant Professor (UK Lecturer) Director, IPAB, School of Informatics, The University of Edinburgh 2107F JCMB, The Kings Buildings, Edinburgh EH9 3JZ, United Kingdom URL: http://homepages.inf.ed.ac.uk/svijayak Ph: +44(0)131 651 3444 SLMC Research Group URL: http://www.ipab.informatics.ed.ac.uk/slmc ------------------------------------------------------------------ Adjunct Assistant Professor, Department of Computer Science, University of Southern California ------------------------------------------------------------------ From te at ecs.soton.ac.uk Tue Dec 6 03:36:10 2005 From: te at ecs.soton.ac.uk (Terry Elliott) Date: Tue, 6 Dec 2005 08:36:10 +0000 (GMT) Subject: Connectionists: PhD studentship Message-ID: University of Southampton School of Electronics and Computer Science PhD Studentship in Computational Neurobiology Applications are invited for a PhD studentship in the general area of Computational Neurobiology, with particular emphasis on mathematical and computational models of synaptic plasticity and neuronal development, under the supervision of Dr Terry Elliott. The successful applicant will join the newly-established Science and Engineering of Natural Systems (SENSe) group within the School. The studentship is funded for UK and EU students at the rate of £12,000 per annum plus fees. The School is the largest of its kind in the UK, and was rated 5* (the top rating) for both Computer Science and Electronics in the last Research Assessment Exercise. The School hosts several major national and European research centres and projects, with funding from numerous sources, including EPSRC, DTI, EU and industry. Applicants should hold, or expect to obtain, a first class or upper second class degree in a highly numerate discipline such as mathematics or physics, although strong applicants from other appropriate subjects will also be considered.
Initial, informal inquiries may be addressed to Dr Terry Elliott, e-mail: te at ecs dot soton dot ac dot uk Further details, including application forms, can be found at: www.ecs.soton.ac.uk/admissions/ or can be obtained by contacting: The Postgraduate Admissions Tutor School of Electronics and Computer Science University of Southampton Highfield Southampton SO17 1BJ UK. E-mail: PhD-Admissions at ecs.soton.ac.uk Tel: +44 (0) 23 8059 2882 Completed application forms should be returned by 30 April, 2006 for a preferred starting date of 1 October, 2006. From chenyu6 at gmail.com Thu Dec 8 21:30:16 2005 From: chenyu6 at gmail.com (Chen Yu) Date: Thu, 8 Dec 2005 21:30:16 -0500 Subject: Connectionists: Postdoc position at Indiana University Message-ID: Ad #1: POSTDOCTORAL POSITION IN MACHINE LEARNING AND COMPUTER VISION The following postdoc position in machine learning and computer vision as applied to visual expertise and visual perception is available at Indiana University, Program in Cognitive Science. Job Title: Postdoctoral Research Associate Job Location: Department of Psychological and Brain Sciences Indiana University Bloomington, IN Closing Date: Application review will begin January 15th. Applications will be considered until the position is filled. The focus of this project will be on using probabilistic modeling and machine learning techniques applied to visual data to infer the processes underlying the development of visual expertise. The successful applicant will have excellent programming skills, experience with C++ and Matlab, and a background in computer vision or machine learning. This project is part of a collaboration between Dr. Richard Shiffrin, Dr. Thomas Busey and Dr. Chen Yu at Indiana University, and is funded through the National Institutes of Justice and the National Institutes of Health. This position is available for two years.
Candidates should send a letter, curriculum vitae, reprints, and names of three referees to (electronic submission preferred): Thomas Busey, PhD Associate Professor Department of Psychological and Brain Sciences, and Program in Cognitive Science Indiana University, Bloomington 1101 E. 10th St Bloomington, IN, 47405 (812) 855-4261 busey at indiana.edu www.indiana.edu/~busey Indiana University is an Affirmative Action/Equal Opportunity employer. Applicants need not be US citizens, and women and minority candidates are especially encouraged to apply. Ad #2: POSTDOCTORAL POSITION IN MACHINE LEARNING AND COMPUTER VISION The following postdoc position in machine learning and computer vision as applied to expertise in fingerprint identification is available at Indiana University, Program in Cognitive Science. Job Title: Postdoctoral Research Associate Job Location: Department of Psychological and Brain Sciences Indiana University Bloomington, IN Closing Date: Application review will begin January 15th. Applications will be considered until the position is filled. The focus of this project will be on using probabilistic modeling and machine learning techniques applied to visual data to infer the processes underlying the development of visual expertise in latent print examiners. The successful applicant will have excellent programming skills, experience with C++ and Matlab, and a strong background in computer vision and machine learning. This project is part of a collaboration between Dr. Thomas Busey and Dr. Chen Yu at Indiana University, and is funded through the National Institutes of Justice. This position is available for two years. Candidates should send a letter, curriculum vitae, reprints, and names of three referees to (electronic submission preferred): Thomas Busey, PhD Associate Professor Department of Psychological and Brain Sciences, and Program in Cognitive Science Indiana University, Bloomington 1101 E.
10th St Bloomington, IN, 47405 (812) 855-4261 busey at indiana.edu www.indiana.edu/~busey Indiana University is an Affirmative Action/Equal Opportunity employer. Applicants need not be US citizens, and women and minority candidates are especially encouraged to apply. From netta at comp.leeds.ac.uk Thu Dec 8 08:54:13 2005 From: netta at comp.leeds.ac.uk (N Cohen) Date: Thu, 8 Dec 2005 13:54:13 +0000 (GMT) Subject: Connectionists: Academic Fellowship in the area of Modelling, Imaging and Design Message-ID: Dear colleagues, below please find an announcement for a prestigious faculty position at the School of Computing, University of Leeds in the UK. This is a five-year fellowship, with the expectation of leading to a permanent academic position. Leeds University has a high concentration of high-quality research, and the School of Computing hosts thriving research spanning theory of computing (Algorithms and Complexity, Program Analysis and Logic Programming), AI (including computer vision, NLP, knowledge representation and learning) and multidisciplinary informatics (including scientific computing, grid computing, biosystems/computational neuroscience, and Visualization and Virtual Reality). The city of Leeds has acquired a much-deserved reputation as a cultural, commercial and social hub of the north of England, with internationally respected theatre, opera, sporting and other activities within easy reach. ===================================================================== Academic Fellowship in the area of Modelling, Imaging and Design A prestigious appointment is available in an internationally-leading research group within the School of Computing. Applications are invited from individuals with post-doctoral (or equivalent) experience who have established their potential for excellence in scholarship and research. The Fellowship is five years in length and, subject to satisfactory completion of probation, guarantees an established academic post.
In the early stages there will be a strong emphasis on research. Based in the School of Computing, topics include Computational Modelling and Simulation; Vision and Imaging; and System Architecture Design for Internet Computing. Further details are available on the Academic Research Fellowship (http://www.comp.leeds.ac.uk/vacancies/20060114arf-fp.shtml) and the School of Computing (http://www.comp.leeds.ac.uk/vacancies/20060114arf-fp2.shtml). Enquiries specific to the research area may be made to Professor Peter Jimack, (0113) 343 5464, email p.k.jimack at leeds.ac.uk Further information on the Academic Fellowship Scheme is available at http://www.rcuk.ac.uk/acfellow/. The rules for the scheme indicate that people already in permanent employment will not normally be eligible for appointment. Fellows will normally be appointed to Research Staff Grade IA or II (£19,460 - £29,128) or (£27,116 - £35,883) depending on experience. (More senior appointments than this may, however, be considered for particularly outstanding candidates.) The University is introducing a new reward framework which will facilitate the recruitment, retention and motivation of world-class staff. Informal enquiries may be made to Margaret Smith, (0113) 343 2001, email m.a.smith at leeds.ac.uk To apply online please visit http://www.leeds.ac.uk and select 'jobs'.
Application packs are also available via email recruitment at adm.leeds.ac.uk or tel (0113) 343 5771 Closing date is 14 January 2006 From r.w.clowes at sussex.ac.uk Fri Dec 9 05:46:45 2005 From: r.w.clowes at sussex.ac.uk (Robert Clowes) Date: Fri, 9 Dec 2005 10:46:45 -0000 Subject: Connectionists: CFP for Integrative Approaches to Machine Consciousness 2006 Message-ID: <005201c5fcad$db591f80$3931b88b@rn.informatics.scitech.susx.ac.uk> 1st CFP for: Integrative Approaches to Machine Consciousness April 5th-6th 2006 part of AISB'06: Adaptation in Artificial and Biological Systems University of Bristol, Bristol, England In April 2006 there will be a continuation of the 2005 Machine Consciousness Symposium as a part of the AISB'06 convention: Integrative Approaches to Machine Consciousness. Abstract submission by: Jan 21st 2006. Machine Consciousness (MC) concerns itself with the study and creation of artefacts which have mental characteristics typically associated with consciousness, such as (self-)awareness, emotion, affect, phenomenal states, imagination, etc. Recently, developments in AI and robotics, especially through the prisms of behavioural and epigenetic robotics, have stressed the embodied, interactive and developmental nature of intelligent agents, features which are now regarded by many as essential to engineering human-level intelligence. Some recent work has suggested that giving robots imaginative or simulation capabilities might be a big step towards achieving MC. Other studies have emphasized 'second person' issues such as intersubjectivity and empathy as a substrate for human consciousness. Alongside this, the infant-caregiver relationship has been recognised as essential to the development of consciousness in its specifically human form.
Bringing these issues to centre stage in the study of artificial consciousness was the focus of last year's AISB symposium, Next Generation Approaches to Machine Consciousness: Imagination, Development, Intersubjectivity, and Embodiment. This conference seeks to continue examination of many of these themes in relation to MC, but with a new focus on attempts to treat the synthesis or fusion of central components of MC in integrated models. We would also be interested in models which show or treat the emergence of processes or systems underlying these core themes. The website for the earlier conference is at http://www.sussex.ac.uk/cogs/mc, and the online version of the proceedings can be found at http://www.aisb.org.uk/publications/proceedings/aisb05/7_MachConsc_Final.pdf. An article introducing and contextualising some of this work can also be found here: ftp://ftp.informatics.sussex.ac.uk/pub/reports/csrp/csrp574.pdf. Submissions are especially invited on the following topics in their relation to MC:
* Imagination
* Development
* Emotion
* Enactive / Embodied Approaches
* Heterophenomenology
* Synthetic Phenomenology
* Intersubjectivity
* Narrative
* General aspects (techniques, theories, constraints)
We especially welcome attempts to study the way these different areas might be related. Preference will be given to submissions that are:
* Relevant: closely related to the themes of the symposium
* Implemented: based on working robotic or other implemented systems
* Novel: not previously presented elsewhere
* Integrative: models that examine the integration or synthesis of core aspects of machine consciousness (especially two or more of the above topics), or their emergence from more basic cognitive functions.
However, it is not expected that all accepted submissions will meet all four criteria of preference. Submissions should be in the form of papers of 6,000 words (6-8 pages) OR abstracts (around 2 pages) based around more speculative ideas.
The latter will be invited to give shorter presentations based on 4-page papers which will appear in the final proceedings. We also aim to publish a selection of the best articles in a special issue of a journal, which we are currently negotiating. Poster submissions are also welcome. Formatting: Papers should be in PDF format, formatted according to Springer LNCS (see 'Proceedings and Other Multiauthor Volumes') at http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-164-2-72376-0,00.html Joint organisers: Rob Clowes, Ron Chrisley & Steve Torrance Program Committee: Igor Aleksander, Giovanna Colombetti, Rodney Cotterill, Frédéric Kaplan, Pentti Haikonen, Germund Hesslow, Owen Holland, Takashi Ikegami, Miguel Salichs, Ricardo Sanz, Murray Shanahan, Jun Tani, Tom Ziemke The conference website and submission information can be found at http://www.informatics.sussex.ac.uk/research/paics/machineconsciousness/ Important Dates: Submission of papers by: Jan 21st 2006 Notification of decision: Feb 4th 2006 Camera-ready copies by: February 20th 2006 From BerndPorr at f2s.com Sun Dec 11 18:14:33 2005 From: BerndPorr at f2s.com (Bernd Porr) Date: Sun, 11 Dec 2005 23:14:33 +0000 Subject: Connectionists: RunBot Message-ID: <439CB2D9.8010106@f2s.com> At the NIPS conference we presented the RunBot in the demo track and at the poster session. There has been interest in the detailed design of the robot, for example to implement the controller on an (analogue) VLSI chip. I am pleased to announce a more detailed article about the RunBot, called "Coupling of Neural Computation with Physical Computation for Stable Dynamic Biped Walking Control", which will appear in Neural Computation.
A final draft of the article can be downloaded here: http://www.berndporr.me.uk/geng_et_al2005/ http://www.cn.stir.ac.uk/~faw1/Publications/papers/geng_etal_nc2005.pdf Please direct any technical questions to Tao Geng: http://www.cn.stir.ac.uk/~tgeng/ Regards /Bernd Porr -- www: http://www.berndporr.me.uk/ http://www.linux-usb-daq.co.uk/ Mobile: +44 (0)7840 340069 Work: +44 (0)141 330 5237 University of Glasgow Department of Electronics & Electrical Engineering Room 519, Rankine Building, Oakfield Avenue, Glasgow, G12 8LT From conrad.sanderson at anu.edu.au Mon Dec 12 21:03:03 2005 From: conrad.sanderson at anu.edu.au (conrad sanderson) Date: Tue, 13 Dec 2005 13:03:03 +1100 Subject: Connectionists: CFP: "Beyond Patches" CVPR 2006 workshop Message-ID: <200512131303.03517.conrad.sanderson@anu.edu.au> Call for Papers: "Beyond Patches" workshop, in conjunction with the CVPR 2006 conference. http://prost.cv.ri.cmu.edu/~slucey/BP-CVPR06/ Submission deadline: 24 March 2006 The concept of an image "patch" in computer vision has many similarities to work in the field of structural pattern recognition. The structural approach takes the view that a pattern is composed of simpler subpatterns which, in turn, are built from even simpler subpatterns. Recently, many inroads have been made into novel areas of computer vision through the use of patch-based representations together with machine learning and pattern recognition techniques. In this workshop, we are soliciting papers from the computer vision and machine learning communities that expand and explore the boundaries of patch representations in computer vision applications. Relevant topics for the workshop include (but are not limited to): * Novel methods for identifying (e.g. SIFT, DoGs, Harris detector) and employing salient patches. * Techniques that explore criteria for deciding the size and shape of a patch based on image content and the application.
* Approaches that explore the employment of multiple and/or heterogeneous patch sizes and shapes during analysis. * Applications that explore how important relative patch position is, and whether there are advantages in allowing those patches to move freely or in a constrained fashion. * Novel methods that explore and extend the concept of patches to video (e.g. space-time patches). * Approaches that draw upon previous work in structural pattern recognition in order to improve current patch-based computer vision algorithms. * Novel applications that extend the concept of patch-based analysis to other, hitherto non-conventional, areas of computer vision. * Novel techniques for estimating dependencies between patches in the same image (e.g. 3D rotations) to improve matching/correspondence. Submissions: Papers in PDF format are required by midnight 24 March 2006 EST. Papers should not exceed 8 double-column pages. Papers must follow the standard IEEE 2-column format of single-spaced text in 10 point Times Roman, with 12 point interline space. All paper submissions must be anonymous. All submissions will be peer-reviewed by members of the program committee.
Workshop site: http://prost.cv.ri.cmu.edu/~slucey/BP-CVPR06/ CVPR 2006 site: http://www.cvpr.org/2006/ From dayan at gatsby.ucl.ac.uk Mon Dec 12 08:45:05 2005 From: dayan at gatsby.ucl.ac.uk (Peter Dayan) Date: Mon, 12 Dec 2005 13:45:05 +0000 Subject: Connectionists: Gatsby PhD Programme In-Reply-To: <20050608123601.GE16153@flies.gatsby.ucl.ac.uk> References: <50907.193.217.174.139.1114196535.squirrel@webmail.uio.no> <20050422195120.GA28336@flies.gatsby.ucl.ac.uk> <20050607155808.GB5355@flies.gatsby.ucl.ac.uk> <50732.193.217.174.139.1118220211.squirrel@webmail.uio.no> <20050608085329.GB16153@flies.gatsby.ucl.ac.uk> <20050608123601.GE16153@flies.gatsby.ucl.ac.uk> Message-ID: <20051212134505.GA26158@flies.gatsby.ucl.ac.uk> Gatsby Computational Neuroscience Unit 4 year PhD Programme The Gatsby Unit is a world-class centre for theoretical neuroscience and machine learning, focusing on unsupervised learning, reinforcement learning, neural dynamics, population coding, interpretation of neural data and perceptual processing. It provides a unique opportunity for a critical mass of theoreticians to interact closely with each other, and with other world-class research groups in related departments at University College London, including Anatomy, Computer Science, Functional Imaging Laboratory, Physics, Physiology, Psychology, Neurology, Ophthalmology, and Statistics, and also with other Universities, notably Cambridge. The Unit always has openings for exceptional PhD candidates. Applicants should have a strong analytical background, a keen interest in neuroscience and a relevant first degree, for example in Computer Science, Engineering, Mathematics, Neuroscience, Physics, Psychology or Statistics. The PhD programme lasts four years, including a first year of intensive instruction in techniques and research in theoretical neuroscience and machine learning. 
It is described at http://www.gatsby.ucl.ac.uk/teaching/phd/ A number of competitive fully-funded studentships are available each year and the Unit also welcomes students with pre-secured funding or with other scholarship/studentship applications in progress. In the first instance, applicants are encouraged to apply informally by sending, in plain text format, a CV, a statement of research interests, and the names and addresses of three referees to admissions at gatsby.ucl.ac.uk. General enquiries should also be directed to this e-mail address. For further details of research interests please see http://www.gatsby.ucl.ac.uk/research.html Applications to begin the programme in September 2006 should be received by the 1st March 2006. From dgw at MIT.EDU Tue Dec 13 16:02:04 2005 From: dgw at MIT.EDU (David Weininger) Date: Tue, 13 Dec 2005 16:02:04 -0500 Subject: Connectionists: Book announcement - Rasmussen Message-ID: <6.2.1.2.2.20051213160202.041b10b0@po14.mit.edu> Hi all: I thought that Connectionists readers might be interested in the following new title from MIT Press. More information about the book is available at http://mitpress.mit.edu/promotions/books/SP2006026218253X. Thanks! Best, David Gaussian Processes for Machine Learning Carl Edward Rasmussen and Christopher K. I. Williams Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increasing attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. 
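For readers new to the topic, the regression setting the book treats can be sketched in a few lines. The following is a minimal, illustrative GP regression example using the standard posterior equations with a squared-exponential kernel; the data, hyperparameters, and variable names are invented here and are not taken from the book.

```python
# Minimal Gaussian process regression sketch (illustrative only).
import numpy as np

def sq_exp_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Toy training data: noisy samples of a sine (made-up example data).
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 20)
y = np.sin(X) + 0.1 * rng.normal(size=X.size)
noise = 0.1 ** 2

# GP posterior mean and covariance at test inputs, via a Cholesky factor.
Xs = np.linspace(0, 2 * np.pi, 50)
K = sq_exp_kernel(X, X) + noise * np.eye(X.size)   # training covariance
Ks = sq_exp_kernel(X, Xs)                          # train-test covariance
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = Ks.T @ alpha                                # posterior mean
v = np.linalg.solve(L, Ks)
var = sq_exp_kernel(Xs, Xs) - v.T @ v              # posterior covariance
```

With 20 noisy observations the posterior mean tracks the underlying sine closely, and the diagonal of `var` gives pointwise predictive uncertainty.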
A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes. Carl Edward Rasmussen is a Research Scientist at the Department of Empirical Inference for Machine Learning and Perception at the Max Planck Institute for Biological Cybernetics, Tübingen. Christopher K. I. Williams is Professor of Machine Learning and Director of the Institute for Adaptive and Neural Computation in the School of Informatics, University of Edinburgh. 8 x 10, 272 pp., cloth, ISBN 0-262-18253-X David Weininger Associate Publicist MIT Press 55 Hayward Street Cambridge, MA 02142-1315 617.253.2079 617.253.1709 fax dgw at mit.edu Check out the new MIT Press Log http://mitpress.mit.edu/presslog From M.Denham at plymouth.ac.uk Tue Dec 13 08:56:22 2005 From: M.Denham at plymouth.ac.uk (Mike Denham) Date: Tue, 13 Dec 2005 13:56:22 -0000 Subject: Connectionists: submission Message-ID: <52A8091888A23F47A013223014B6E9FE078D7CD1@03-CSEXCH.uopnet.plymouth.ac.uk> Centre for Theoretical and Computational Neuroscience, University of Plymouth, UK Postdoctoral Research Fellow (salary range £23,643 - £26,671 (GB Pounds) per annum) Applications are invited for a post of Postdoctoral Research Fellow in the Centre for Theoretical and Computational Neuroscience at the University of Plymouth, UK.
Applicants must have a PhD in the area of neuroscience and possess a good knowledge and understanding of the mathematical methods and computational tools for modelling cortical neural networks at a biologically realistic level. The work of the Research Fellow will be specifically concerned with the development and investigation of a model of the laminar microcircuitry of the primary visual cortex, making use of distributed processing tools on an 80-processor Linux cluster simulation facility. The project will draw on neurobiological experimental and modelling results from several of the major neuroscience research labs in Europe who are collaborators in this research programme, and there will be opportunities for travel to and close interactions with these labs. The Centre for Theoretical and Computational Neuroscience is one of the main UK labs specialising in theoretical and modelling approaches to understanding brain function (visit www.plymneuro.org.uk). It has research groups in vision, audition, sensorimotor control, mathematical neuroscience, biophysics of temporal brain dynamics, and neural computation. It is actively collaborating with several UK, US and European labs and participates in a number of major UK research council and EU funded research projects. The research fellow post is available immediately and an appointment will be made as soon as possible. The appointment will be initially for a fixed term of three years, and will be subject to a probationary period of six months. Informal enquiries, ideally including a CV/résumé, should be made in the first instance by email to the Head of the Centre for Theoretical and Computational Neuroscience, Professor Mike Denham: mdenham at plymouth.ac.uk.
From jaakko.sarela at tkk.fi Mon Dec 12 03:25:26 2005 From: jaakko.sarela at tkk.fi (Jaakko Sarela) Date: Mon, 12 Dec 2005 10:25:26 +0200 Subject: Connectionists: DSS MATLAB package v1-0 Message-ID: <20051212082525.GA17917@mail.cis.hut.fi> Announcement of the stable release (v1-0) of the DSS MATLAB package We have recently introduced a general framework for source separation called denoising source separation (DSS, [1]), where source separation is constructed around denoising procedures. The DSS algorithms may vary from almost blind (ICA) to detailed algorithms in special settings, allowing the prior information to guide the separation. The framework has already been applied in several fields (neuroinformatics [1], climate analysis [2], CDMA signal acquisition [3] and nonlinear ICA for separation of real-life image mixtures [4], etc.). The DSS MATLAB package, developed under the GNU GPL, has reached stable release v1-0. The package is highly customizable and there is a wide collection of denoising functions readily available. The package includes a command-line version as well as a graphical user interface. The package is available via http://www.cis.hut.fi/projects/dss/package/. Best regards Jaakko Särelä and Harri Valpola References: [1] Denoising source separation. J. Särelä and H. Valpola. Journal of Machine Learning Research, 6:233-272, 2005. http://www.cis.hut.fi/projects/dss/publications/#sarela05jmlr [2] Frequency-Based Separation of Climate Signals. A. Ilin and H. Valpola. In the proceedings of the 9th European Conference on Principles and Practice of Knowledge Discovery in Databases (PKDD 2005), Porto, Portugal, pp. 519-526, 2005. http://www.cis.hut.fi/projects/dss/publications/#ilin05pkdd [3] A denoising source separation based approach to interference cancellation for DS-CDMA array systems. K. Raju and J. Särelä. In Proceedings of the 38th Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, pp. 1111 -- 1114, 2004.
http://www.cis.hut.fi/projects/dss/publications/#raju04asilomar [4] Separation of nonlinear image mixtures by denoising source separation. M.S.C. Almeida, H. Valpola and J. Särelä. In Proceedings of the 6th International Conference on Independent Component Analysis and Blind Source Separation, ICA 2006, Charleston, SC, USA, accepted. http://www.cis.hut.fi/projects/dss/publications/#almeida06ica From niki at cse.ohio-state.edu Thu Dec 15 12:42:21 2005 From: niki at cse.ohio-state.edu (Nicoleta Roman) Date: Thu, 15 Dec 2005 12:42:21 -0500 Subject: Connectionists: Ph.D. dissertation announcement: Sound Source Segregation Message-ID: <43A1AAFD.2040107@cse.ohio-state.edu> Dear list members: I would like to bring to your attention my recently completed Ph.D. dissertation, entitled "Auditory-based algorithms for sound segregation in multisource and reverberant environments". An electronic version of the thesis is available at: http://www.ohiolink.edu/etd/view.cgi?osu1124370749 Please find the abstract below. Sincerely, Nicoleta Roman -------- ABSTRACT -------- At a cocktail party, we can selectively attend to a single voice and filter out other interferences. This perceptual ability has motivated a new field of study known as computational auditory scene analysis (CASA) which aims to build speech separation systems that incorporate auditory principles. The psychological process of figure-ground segregation suggests that the target signal should be segregated as foreground while the remaining stimuli are treated as background. Accordingly, the computational goal of CASA should be to estimate an ideal time-frequency (T-F) binary mask, which selects the target if it is stronger than the interference in a local T-F unit. This dissertation investigates four aspects of CASA processing: location-based speech segregation, binaural tracking of multiple moving sources, binaural sound segregation in reverberation, and monaural segregation of reverberant speech.
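The ideal-binary-mask idea at the heart of the abstract above can be sketched in a few lines. The spectrogram data below are random, hypothetical stand-ins for real target and interference signals; the 0 dB local SNR criterion is the one stated in the abstract.

```python
# Illustrative sketch of the ideal T-F binary mask (toy data, not real audio).
import numpy as np

rng = np.random.default_rng(1)

# Toy magnitude spectrograms: 64 frequency bins x 100 time frames.
target = rng.rayleigh(scale=1.0, size=(64, 100))
interference = rng.rayleigh(scale=1.0, size=(64, 100))
mixture = target + interference          # crude additive mixture

# Ideal binary mask: 1 in each T-F unit where the target is stronger
# than the interference (local SNR > 0 dB), 0 elsewhere.
ibm = (target > interference).astype(float)

# Applying the mask to the mixture keeps only target-dominated units.
segregated = ibm * mixture
```

In a real system the mask must be *estimated* from the mixture (e.g. from binaural cues, as the dissertation proposes); the ideal mask computed from the known target and interference serves as the ground-truth computational goal.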
For localization, the auditory system utilizes the interaural time difference (ITD) and interaural intensity difference (IID) between the ears. We observe that within a narrow frequency band, modifications to the relative strength of the target source with respect to the interference trigger systematic changes for ITD and IID, resulting in a characteristic clustering. Consequently, we propose a supervised learning approach to estimate the ideal binary mask. A systematic evaluation shows that the resulting system produces masks very close to the ideal binary ones and yields large speech intelligibility improvements. In realistic environments, source motion requires consideration. Binaural cues are strongly correlated with locations in T-F units dominated by one source, resulting in channel-dependent conditional probabilities. Consequently, we propose a multi-channel integration method of these probabilities in order to compute the likelihood function in a target space. Finally, a hidden Markov model is employed for forming continuous tracks and automatically detecting the number of active sources. Reverberation affects the ITD and IID cues. We therefore propose a binaural segregation system that combines target cancellation through adaptive filtering and a binary decision rule to estimate the ideal binary mask. A major advantage of the proposed system is that it imposes no restrictions on the interfering sources. Quantitative evaluations show that our system outperforms related beamforming approaches. Psychoacoustic evidence suggests that monaural processing plays a vital role in segregation. It is known that reverberation smears the harmonicity of speech signals. We therefore propose a two-stage separation system that combines inverse filtering of target room impulse response with pitch-based segregation.
As a result of the first stage, the harmonicity of a signal arriving from target direction is partially restored while signals arriving from other locations are further smeared, and this leads to improved segregation and considerable signal-to-noise ratio gains. -------------- -------------- From qobi at purdue.edu Thu Dec 15 12:08:19 2005 From: qobi at purdue.edu (Jeffrey Mark Siskind) Date: Thu, 15 Dec 2005 12:08:19 -0500 Subject: Connectionists: CFP: The Fifth IEEE Computer Society Workshop on Perceptual Organization in Computer Vision Message-ID: <200512151708.jBFH8Jf06626@tlamachilistli.ecn.purdue.edu> FIRST CALL FOR PAPERS: POCV 2006 The Fifth IEEE Computer Society Workshop on Perceptual Organization in Computer Vision New York City June 22, 2006, In Conjunction with IEEE CVPR 2006 http://elderlab.yorku.ca/pocv IMPORTANT DATES: * Submission deadline: 11:59pm EST, March 17, 2006 * Notification: April 17, 2006 * Final versions of accepted papers due: April 24, 2006 **Please note that biological vision researchers working in the field of perceptual organization are encouraged to submit work that may stimulate new directions of research in the computer vision community. THEME: Perceptual Organization is the process of establishing a meaningful relational structure over raw visual data, where the extracted relations correspond to the physical structure of the scene. A driving motivation behind perceptual organization research in computer vision is to deliver representations needed for higher-level visual tasks such as object detection, object recognition, activity recognition and scene reconstruction. Because of its wide applicability, the potential payoff from perceptual organization research is enormous. The 5th IEEE POCV Workshop, to be held in conjunction with CVPR 2006 (New York), will bring together experts in perceptual organization and related areas to report on recent research results and to provide ideas for future directions. 
PREVIOUS IEEE POCV WORKSHOPS: * 2004 CVPR (Washington, DC) * 2001 ICCV (Vancouver, Canada) * 1999 ICCV (Crete, Greece) * 1998 CVPR (Santa Barbara, CA) SCOPE: Papers are solicited in all areas of perceptual organization, including but not limited to: * image segmentation * feature grouping * texture segmentation * contour completion * spatiotemporal/motion segmentation * figure-ground discrimination * integration of top-down and bottom-up methods * perceptual organization for object or activity detection/recognition * unification of segmentation, detection and recognition * biologically-motivated methods * neural basis for perceptual organization * learning in perceptual organization * graphical methods * natural scene statistics * evaluation methods ALGORITHM EVALUATION: Research progress in perceptual organization depends in part on quantitative evaluation and comparison of algorithms. Authors reporting results of new algorithms are strongly encouraged to objectively quantify performance and compare against at least one competing approach. BROADER ISSUES: Perceptual organization research faces a number of challenges. One is defining what the precise goal of perceptual organization algorithms should be. What kind of representation should they deliver? What databases should be used for evaluation? How can we quantify performance to allow objective evaluation and comparison between algorithms? How do we know when we’ve succeeded? To try to meet these challenges, we particularly encourage contributions of a more general nature that attempt to address one or more of these questions. These may include definitional papers, theoretical frameworks that might apply to multiple different perceptual organization problems, establishment of useful databases, modeling of underlying natural scene statistics, evaluation methodologies, etc. 
BIOLOGICAL MOTIVATION: Much of the current work in perceptual organization in computer vision has its roots in qualitative principles established by the Gestalt Psychologists nearly a century ago, and this link between computational and biological research continues to this day. Following this tradition, we specifically invite biological vision researchers working in the field of perceptual organization to submit work that may stimulate new directions of research in the computer vision community. WORKSHOP OUTPUT: All accepted papers will be included in the Electronic Proceedings of CVPR, distributed on DVD at the conference, and will be indexed by IEEE Xplore. We are also exploring the possibility of a special journal issue on perceptual organization in computer vision, with a separate call for papers. PAPER SUBMISSION: Submission is electronic, and must be in PDF format. Papers must not exceed 8 double-column pages. Submissions must follow the standard IEEE 2-column format of single-spaced text in 10 point Times Roman, with 12 point interline space. All submissions must be anonymous. Please use the IEEE Computer Society CVPR format kit. Stay tuned for exact details on how to submit. In submitting a paper to the POCV Workshop, authors acknowledge that no paper of substantially similar content has been or will be submitted to another conference or workshop during the POCV review period.
For further details and updates, please see the workshop website: http://elderlab.yorku.ca/pocv WORKSHOP CHAIRS: James Elder, York University jelder at yorku.ca Jeffrey Mark Siskind, Purdue University qobi at purdue.edu PROGRAM COMMITTEE: Ronen Basri, Weizmann Institute, Israel Kim Boyer, Ohio State University, USA James Coughlan, Smith-Kettlewell Institute, USA Sven Dickinson, University of Toronto, Canada Anthony Hoogs, GE Global Research, USA David Jacobs, University of Maryland, USA Ian Jermyn, INRIA, France Benjamin Kimia, Brown University, USA Norbert Kruger, Aalborg University, Denmark Michael Lindenbaum, Technion, Israel Zili Liu, University of California, Los Angeles, USA David Martin, Boston College, USA Gerard Medioni, University of Southern California, USA Zygmunt Pizlo, Purdue University, USA Sudeep Sarkar, University of South Florida, USA Eric Saund, Palo Alto Research Centre, USA Kaleem Siddiqi, McGill University, Canada Manish Singh, Rutgers University, USA Shimon Ullman, Weizmann Institute, Israel Johan Wagemans, University of Leuven, Belgium Song Wang, University of South Carolina, USA Rich Zemel, University of Toronto, Canada Song-Chun Zhu, University of California, Los Angeles, USA Steve Zucker, Yale University, USA From ted.carnevale at yale.edu Wed Dec 14 10:38:37 2005 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Wed, 14 Dec 2005 10:38:37 -0500 Subject: Connectionists: Announcement: The NEURON Book Message-ID: <43A03C7D.60503@yale.edu> Cambridge University Press has announced that distribution of The NEURON Book will begin in January 2006 http://www.cambridge.org/us/catalogue/catalogue.asp?isbn=0521843219 --Ted Carnevale The NEURON Book N.T. Carnevale and M.L. Hines ISBN-10: 0521843219 The authoritative reference on NEURON, the simulation environment for modeling biological neurons and neural networks that enjoys wide use in the experimental and computational neuroscience communities. 
This book will show you how to use NEURON to construct and apply empirically based models. Written primarily for neuroscience investigators, teachers, and students, it assumes no previous knowledge of computer programming or numerical methods. Readers with a background in the physical sciences or mathematics, who have some knowledge about brain cells and circuits and are interested in computational modeling, will also find it helpful. The NEURON Book covers material that ranges from the inner workings of this program to practical considerations involved in specifying the anatomical and biophysical properties that are to be represented in models. It uses a problem-solving approach, with many working examples that readers can try for themselves. Nicholas T. Carnevale is a Senior Research Scientist in the Department of Psychology at Yale University. He directs the NEURON courses at the annual meetings of the Society for Neuroscience, and the NEURON Summer Courses at the University of California, San Diego, and University of Minnesota, Minneapolis. Michael L. Hines is a Research Scientist in the Department of Computer Science at Yale University. He created NEURON in collaboration with John W. Moore at Duke University, Durham NC, and is the principal investigator and chief software architect on the project that continues to support and extend it. From terry at salk.edu Thu Dec 15 15:30:34 2005 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 15 Dec 2005 12:30:34 -0800 Subject: Connectionists: NEURAL COMPUTATION 18:2 In-Reply-To: Message-ID: Neural Computation - Contents - Volume 18, Number 2 - February 1, 2006 Article Polychronization: Computation With Spikes Eugene M. Izhikevich Letters Making Working Memory Work: A Computational Model of Learning in the Prefrontal Cortex and Basal Ganglia Randall C. O'Reilly and Michael J. 
Frank

Identification of Multiple-Input Systems with Highly Coupled Inputs: Application to EMG Prediction from Multiple Intra Cortical Electrodes
David T. Westwick, Eric A. Pohlmeyer, Sara A. Solla, Lee E. Miller and Eric J. Perreault

Oscillatory Networks: Pattern Recognition Without a Superposition Catastrophe
Thomas Burwick

Topographic Product Models Applied to Natural Scene Statistics
Simon Osindero, Max Welling and Geoffrey E. Hinton

A Simple Hebbian/Anti-Hebbian Network Learns the Sparse, Independent Components of Natural Images
Michael S. Falconbridge, Robert L. Stamps and David R. Badcock

Differential Log Likelihood for Evaluating and Learning Gaussian Mixtures
Marc M. Van Hulle

Magnification Control in Self-Organizing Maps and Neural Gas
Thomas Villmann and Jens Christian Claussen

Enhancing Density-Based Data Reduction Using Entropy
D. Huang and Tommy W. S. Chow

-----
ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2006 - VOLUME 18 - 12 ISSUES

                                             Electronic only
                 USA     Canada*   Others    USA      Canada*
Student/Retired  $60     $64.20    $114      $54      $57.78
Individual       $100    $107.00   $154      $90      $96.30
Institution      $730    $781.10   $784      $657     $702.99
* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu
-----

From cindy at bu.edu Fri Dec 16 10:55:01 2005 From: cindy at bu.edu (Cynthia Bradford) Date: Fri, 16 Dec 2005 10:55:01 -0500 Subject: Connectionists: Neural Networks 19(1) 2006 Message-ID: <200512161555.jBGFt1B9015326@kenmore.bu.edu>

NEURAL NETWORKS 19(1) Contents - Volume 19, Number 1 - 2006
------------------------------------------------------------------
EDITORIAL: Another year of exciting Special Issues!

NEURAL NETWORKS REFEREES USED IN 2005

***** Psychology and Cognitive Science *****
J. Molina Vilaplana and J. Lopez Coronado
A neural network model for coordination of hand gesture during reach to grasp

***** Neuroscience and Neuropsychology *****
Tony J. Prescott, Fernando M.
Montes Gonzalez, Kevin Gurney, Mark D. Humphries, and Peter Redgrave
A robot model of the basal ganglia: Behavior and intrinsic processing

***** Mathematical and Computational Analysis *****
Kazunori Iwata, Kazushi Ikeda, and Hideaki Sakai
The asymptotic equipartition property in reinforcement learning and its relation to return maximization

Shengyuan Xu and James Lam
A new approach to exponential stability analysis of neural networks with time-varying delays

Arindam Choudhury, Prasanth B. Nair, and Andy J. Keane
Constructing a speculative kernel machine for pattern classification

***** Engineering and Design *****
Shen Furao and Osamu Hasegawa
An incremental network for on-line supervised classification and topology learning

CURRENT EVENTS
------------------------------------------------------------------

Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp

------------------------------

INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies.
Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.

----------------------------------------------------------------------------
Membership Type       INNS            ENNS       JNNS
----------------------------------------------------------------------------
membership with       $80 (regular)   SEK 660    Y 13,000
Neural Networks                                  (plus Y 2,000 enrollment fee)
                      $20 (student)   SEK 460    Y 11,000
                                                 (plus Y 2,000 enrollment fee)
----------------------------------------------------------------------------
membership without    $30             SEK 200    not available to non-student
Neural Networks                                  (subscribe through another
                                                 society)
                                                 Y 5,000 student
                                                 (plus Y 2,000 enrollment fee)
----------------------------------------------------------------------------

Name: ______________________________________________________
Title: ______________________________________________________
Address: ______________________________________________________
Phone: ______________________________________________________
Fax: ______________________________________________________
Email: ______________________________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
         OR
         [ ] Charge my VISA or MasterCard
             card number _______________________________
             expiration date _____________________________

INNS Membership
2810 Crossroads Drive, Suite 3800
Madison WI 53718 USA
608 443 2461, ext. 138 (phone) 608 443 2474 (fax)
srees at reesgroupinc.com http://www.inns.org

ENNS Membership
University of Skovde P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership JNNS Secretariat c/o Fuzzy Logic Systems Institute 680-41 Kawazu, Iizuka Fukuoka 820-0067 Japan 81 948 24 2771 (phone) 81 948 24 3002 (fax) jnns at flsi.cird.or.jp http://www.jnns.org/ ---------------------------------------------------------------------------- From Martin.Riedmiller at uos.de Fri Dec 16 10:36:16 2005 From: Martin.Riedmiller at uos.de (Martin Riedmiller) Date: Fri, 16 Dec 2005 16:36:16 +0100 Subject: Connectionists: Learning soccer robots - source code release Message-ID: <43A2DEF0.7000908@uos.de> The Brainstormers, current World Champion in RoboCup Soccer Simulation league 2D, proudly announce the release of major parts of their source code. The aim of the Brainstormers project at the Neuroinformatics Group at the University of Osnabrueck is to demonstrate the successful application of machine learning techniques (in particular Neural Reinforcement Learning methods) in competition. The released source code therefore contains a large amount of examples of learned skills and team behaviours, such as NeuroIntercept, NeuroGo2Position, NeuroKick, NeuroAttack etc. The code release is mainly meant to provide a good starting point for new teams in RoboCup but also might provide useful stimulations for more advanced teams (in particular concerning the learnt modules) and for researchers in Artificial Intelligence/ Machine Learning. Links: Brainstormers home page: www.ni.uos.de/brainstormers Download: www.ni.uos.de/index.php?id=880 Have fun, Martin Riedmiller and Thomas Gabel, Neuroinformatics Group, Univ.
of Osnabrueck, www.ni.uos.de From dayan at gatsby.ucl.ac.uk Tue Dec 20 05:56:44 2005 From: dayan at gatsby.ucl.ac.uk (Peter Dayan) Date: Tue, 20 Dec 2005 10:56:44 +0000 Subject: Connectionists: Gatsby PhD Programme: 15th January 2006 closing date Message-ID: <20051220105633.GA17388@flies.gatsby.ucl.ac.uk> I would like to apologise for some erroneous information in my recent posting about the Gatsby Unit's 4-year PhD programme in theoretical neuroscience and machine learning (http://www.gatsby.ucl.ac.uk/teaching/phd/). The closing date for applications (to admissions at gatsby.ucl.ac.uk) is actually 15th January 2006. Peter Dayan From h.jaeger at iu-bremen.de Tue Dec 20 11:44:01 2005 From: h.jaeger at iu-bremen.de (Herbert Jaeger) Date: Tue, 20 Dec 2005 17:44:01 +0100 Subject: Connectionists: CFP Neural Networks Special Issue on ESNs and LSMs Message-ID: <43A834D1.7010302@iu-bremen.de> CALL FOR PAPERS: Neural Networks 2007 Special Issue "Echo State Networks and Liquid State Machines" Guest Co-Editors: Dr. Herbert Jaeger, International University Bremen, h.jaeger at iu-bremen.de Dr. Wolfgang Maass, Technische Universitaet Graz, maass at igi.tugraz.at Dr. Jose C. Principe, University of Florida, principe at cnel.ufl.edu A new approach to analyzing and training recurrent neural networks (RNNs) has emerged over the last few years.
The central idea is to regard an RNN as a nonlinear, excitable medium, which is driven by input signals or fed-back output signals. From the excited response signals inside the medium, simple (typically linear), trainable readout mechanisms distil the desired output signals. The medium consists of a large, randomly connected network, which is not adapted during learning. It is variously referred to as a dynamical reservoir or liquid. There are currently two main flavours of such networks. Echo state networks were developed from a mathematical and engineering background and are composed of simple sigmoid units, updated in discrete time. Liquid state machines were conceived from a mathematical and computational neuroscience perspective and usually are made of biologically more plausible, spiking neurons with continuous-time dynamics. These approaches have quickly gained popularity because of their simplicity, expressiveness, ease of training and biological appeal. This Special Issue aims at establishing a first comprehensive overview of this newly emerging area, demonstrating the versatility of the approach, its mathematical foundations and also its limitations. Submissions are solicited that contribute to this area of research with respect to -- mathematical and algorithmic analysis, -- biological and cognitive modelling, -- engineering applications, -- toolboxes and hardware implementations. One of the main questions in current research in this field concerns the structure of the dynamical reservoir / liquid. Submissions are especially welcome which investigate the relationship between the excitable medium's topology and algebraic properties and the resulting modeling capacity, or methods for pre-adapting the medium by unsupervised or evolutionary mechanisms, or including special-purpose subnetworks (for instance, feature detectors) into the medium.
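The recipe described above (fixed random reservoir, trained linear readout) can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not code from the special issue: the reservoir size, spectral radius 0.9, washout length, and ridge parameter 1e-6 are all our own arbitrary choices, and the task (one-step prediction of a sine wave) is a toy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir ("excitable medium"), not adapted during learning.
n_res, n_in = 200, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

# Toy task: one-step-ahead prediction of a sine wave.
T = 1000
u = np.sin(0.2 * np.arange(T + 1))[:, None]

x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    # Drive the reservoir with the input; sigmoid units in discrete time.
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Only the linear readout is trained, here by ridge regression on the
# excited states (discarding an initial transient, the "washout").
washout = 100
X, Y = states[washout:T], u[washout + 1 : T + 1]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)
mse = np.mean((X @ W_out - Y) ** 2)
print("one-step prediction MSE:", mse)
```

The point of the sketch is the division of labour: all recurrent weights stay random and fixed, so "training" reduces to one linear least-squares solve.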
Submission of Manuscript The manuscripts should be prepared according to the format of Neural Networks and submitted electronically to one of the Guest Editors. The review will take place within 3 months, and only very minor revisions will be accepted. For any further questions, please contact the Guest Editors. DEADLINE FOR SUBMISSION: June 1, 2006. ------------------------------------------------------------------ Dr. Herbert Jaeger Professor for Computational Science International University Bremen Campus Ring 12 28759 Bremen, Germany Phone (+49) 421 200 3215 Fax (+49) 421 200 49 3215 email h.jaeger at iu-bremen.de http://www.faculty.iu-bremen.de/hjaeger/ ------------------------------------------------------------------ From A.Cangelosi at plymouth.ac.uk Wed Dec 21 08:55:32 2005 From: A.Cangelosi at plymouth.ac.uk (Angelo Cangelosi) Date: Wed, 21 Dec 2005 13:55:32 -0000 Subject: Connectionists: PhD position in Computational Neuroscience and Interactive Intelligent Systems Message-ID: <64997DB783F0FD4EB5550AD0D550E2290501DDA1@03-CSEXCH.uopnet.plymouth.ac.uk> Centre for Theoretical and Computational Neuroscience (CTCN) Centre for Interactive Intelligent Systems (CIIS) University of Plymouth PhD Studentship in Computational Neuroscience and Interactive Intelligent Systems The University of Plymouth invites applications for a PhD Studentship (stipend to cover living expenses plus UK/EU fees) in the areas of Computational Neuroscience and/or Interactive Intelligent Systems. There are about twelve academic staff in the two Centres, and their work was awarded a rating of 5 (International Excellence) in the 2001 UK Research Assessment Exercise.
The primary areas of interest and expertise within the CTCN and CIIS include:
- Audition
- Biophysics and modelling of temporal brain dynamics
- Mathematical neuroscience
- Neural computation
- Sensorimotor control
- Vision
- Artificial life models of cognition
- Interactive robotics
- Information visualisation
- Computer music
- Semantic web
Applicants should have, or expect to obtain, a high-grade Bachelor's or Master's degree in computing, neuroscience, psychology/cognitive science, physics, mathematics or an allied discipline. The candidate should ideally possess good computational skills and must have a strong motivation for research. For more information on the activity of the CTCN and CIIS, visit: http://www.plymneuro.org.uk/ http://neuromusic.soc.plymouth.ac.uk/ciis.html For informal enquiries contact Professor Mike Denham (m.denham at plymouth.ac.uk) or Dr. Angelo Cangelosi (a.cangelosi at plymouth.ac.uk). Applications should be sent via email to Mrs. Carole Watson (c.watson at plymouth.ac.uk; tel. +44 1752 233329), Senior Research Administrator, Faculty of Technology, University of Plymouth. The closing deadline for applications is March 20th, 2006. Each application should include (1) a detailed CV, (2) a cover letter, and (3) the application form.
The PhD application form can be downloaded here: http://www.plymouth.ac.uk/pages/view.asp?page=5731 ---------------- Angelo Cangelosi, PhD ---------------- Reader in Artificial Intelligence and Cognition Adaptive Behaviour and Cognition Research Group School of Computing, Communications & Electronics University of Plymouth Portland Square Building (A316) Plymouth PL4 8AA (UK) E-mail: acangelosi at plymouth.ac.uk http://www.tech.plym.ac.uk/soc/staff/angelo (tel) +44 1752 232559 (fax) +44 1752 232540 From wambamm at gmail.com Wed Dec 21 17:12:53 2005 From: wambamm at gmail.com (Michael L Edwards) Date: Wed, 21 Dec 2005 16:12:53 -0600 Subject: Connectionists: WAM-BAMM*06 Announcement Message-ID: <756e5af30512211412x3f6344cm37293fcf257b7d56@mail.gmail.com> The Second Annual World Association of Modelers (WAM) Biologically Accurate Modeling Meeting (BAMM) WAM-BAMM*06 March 23rd - March 25th San Antonio, Texas http://wam-bamm.org The second annual meeting devoted to the promotion and extension of biologically accurate modeling and simulation will be held in San Antonio, Texas, March 23rd - March 25th. Last year's meeting (http://wam-bamm.org/05_links.htm ) attracted more than 100 participants from around the world and was rated by users as 4.5 out of 5.0 (outstanding) with respect to venue, organization, and overall value. This year's meeting will be better still. The meeting's primary objective is to promote communication and collaboration among users and others involved in realistic biological modeling, and also to introduce realistic biological modeling to other interested scientists. This year's meeting will also feature two pre-meetings, one on modeling within the olfactory system, and a second on computational approaches to understanding data in molecular and cellular biology (see website for details).
Subjects considered: Modeling results, modeling as a basis for understanding biological data, modeling-inspired biological experimentation, world modeling community coordination, modeling techniques, simulator design. All computational biologists are invited to present scientific as well as technical work. The meeting encourages participation by modelers using GENESIS, NEURON, any other simulation system, or their own code. We also encourage participation by experimental biologists interested in knowing more about biologically accurate modeling techniques. Supplementary travel grants will be available for students presenting work at the meeting. THE PROGRAM Unique in its structure, this meeting will combine introductory, intermediate, and advanced tutorials in realistic modeling techniques with a full agenda of scientific presentations. TUTORIALS Updated versions of most of the tutorials from WAM-BAMM*05 have been published in article form (both in browseable HTML and downloadable PDF format) in the November 2005 special issue on Realistic Neural Modeling in the free electronic journal Brains, Minds, and Media (http://www.brains-minds-media.org/current/).
Currently Scheduled Tutorials:
- Introduction to realistic neural modeling (David Beeman, University of Colorado Boulder)
- Large-scale parallel network simulations using NEURON (Michael Hines, Yale University)
- How to make the best hand-tuned single-cell model you can (Dieter Jaeger, Emory University)
- XML for Model Specification Workshop (Sharon Crook, Arizona State University, and Padraig Gleeson, University College London)
- Biochemical kinetics modeling with Kinetikit and MOOSE (Upinder S. Bhalla, NCBS, Bangalore)
- GENESIS simulations with hsolve (Hugo Cornelis, UTHSCSA)
We also encourage meeting participants to suggest tutorials. SCIENTIFIC MEETING Thursday and Friday will be devoted to oral and poster presentations by meeting participants and invited speakers.
For additional meeting information please visit http://www.wam-bamm.org. Important Dates: --------------- Deadline for proposed research presentations: January 15, 2006 Submission form is available on the WAM-BAMM web site: www.wam-bamm.org Student registration deadline for travel grants: February 1, 2006 Travel grants for student presenters (see web site). Deadline for early registration: February 1, 2006 Advance registration $99 for graduate students, $149 for all others (30% increase after deadline) Deadline for guaranteed housing at the conference rate: February 27, 2006. The meeting will be held at the historic Menger Hotel in Downtown San Antonio, next to the Alamo and the famous San Antonio River Walk. Room rates $109 (single or double), $119 (3-4). Arrival date for the meeting: March 22, 2006 Last event: The (in)famous WAM-BAM Country Western Banquet: Saturday, March 25, 2006 Depart from San Antonio: Sunday, March 26, 2006 Registration is now open for WAM-BAMM*06 at http://www.WAM-BAMM.org. Travel funds will be available for students presenting papers. The first annual meeting of the World Association of Modelers (WAM) Biologically Accurate Modeling Meeting (BAMM), in association with the second GENESIS Users Meeting GUM*05, was held March 31st - April 2nd, 2005, in beautiful San Antonio, Texas. For further information please visit http://www.wam-bamm.org/ or email us at wam-bamm at wam-bamm.org. Jim Bower Dave Beeman -- James M. Bower Ph.D.
Research Imaging Center University of Texas Health Science Center at San Antonio 7703 Floyd Curl Drive San Antonio, TX 78284-6240 Cajal Neuroscience Center University of Texas San Antonio Phone: 210 567 8080 From h.jaeger at iu-bremen.de Wed Dec 21 05:15:18 2005 From: h.jaeger at iu-bremen.de (Herbert Jaeger) Date: Wed, 21 Dec 2005 11:15:18 +0100 Subject: connectionists: CFP Interdisciplinary College IK2006 Message-ID: <43A92B36.7000704@iu-bremen.de> Call for Participation: ======================= Interdisciplinary College IK2006 held at Guenne, Germany, March 10-17, 2006 An interdisciplinary spring school on neurobiology, neural computation, cognitive science/psychology, and artificial intelligence. Focus Theme: Learning Quick Link and Registration: http://www.ik2006.de/ ================================================== Dates: Friday March 10th to Friday March 17th, 2006 Location: Heinrich-Luebke-Haus, Guenne am Moehnesee, Germany Early Registration Deadline: January 15th, 2006 Late Registration Deadline: February 17th, 2006 Chairs: Rainer Malaka (EML Heidelberg), Manfred Spitzer (University of Ulm) Organizations: Main: Gesellschaft fuer Informatik (GI) supporting Organizations: GK, PASCAL NoE Details: ======== The Interdisciplinary College (Interdisziplinäres Kolleg, IK) is an annual one-week spring school which offers a dense, intensive and state-of-the-art course program in neurobiology, neural computation, cognitive science/psychology, artificial intelligence, robotics and philosophy. It is aimed at students, postgraduates and researchers from academia and industry. By combining humanities, science and technology, the IK endeavours to intensify dialogue and connectedness across the various disciplines. Participants come from various European countries, lecturers from all over the world. All courses are taught in English.
The course program starts out with several basic and methodological courses providing an up-to-date introduction to the four main fields of the IK. In the second part of the week special courses present in-depth discussions of (state-of-the-art research on) specific topics. Additionally, the IK is a unique social event. Participants may enjoy the very special atmosphere: minds meet, music is played and friends are made in long evening and night sessions at the welcoming conference site at the Möhne lake. Focus Theme: Learning The focus of IK 2006 will be learning. What is learning? - As long as nobody asks, we know the answer. Neuroscientists refer to synaptic change, educators to insight, developmental psychologists to phases and stages, cognitive psychologists to categories and rules, modellers and computer scientists to statistics and data-driven inferences. Learning is surely one of the most intensely studied subjects in neurobiology, cognitive science, artificial intelligence, and neuroinformatics. And with life-long learning becoming ever more important, with the PISA study demonstrating mediocre learning practices in schools, and the economy depending upon the learning brains of the next generation as its only resource, we need to take learning seriously. As in previous IKs, we want to tackle the theme at issue from various viewpoints, from the synapse to systems, from animals to algorithms, from organisms to automata, and from theory to practical applications. Developmental aspects (the borderland between maturation and learning), modifying factors (age, emotion, motivation), storage systems (memory in its various forms) will be discussed, as will computational learning theories. The IK will aim in particular at bridging the gap between disciplines. Thus we will discuss how computational approaches such as reinforcement learning are related to neurobiological and cognitive insights.
This multidisciplinary approach can help to establish new learning paradigms and algorithms for artificial cognitive systems and facilitate our understanding of the nature of learning. Courses/lecturers: ==================
Basic Courses
- Artificial Intelligence (Wolfram Burgard, Freiburg)
- Neurobiology (Ansgar Büschges, Cologne)
- Cognitive Science (Hanspeter Mallot, Tübingen)
- Machine Learning and Neural Networks (Herbert Jaeger, Bremen)
Methodological Courses
- Introduction to Kernel Methods (Matthias Seeger, Tübingen)
- How to measure learning and memory: lessons from psychology (Thomas Kammer & Markus Kiefer, Ulm)
- Functional imaging (Thomas Wolbers, Hamburg)
Special Courses: Mechanisms of Learning
- Neuroplasticity (Hubert Dinse, Bochum)
- Learning and Sleep (Lisa Marshall, Lübeck)
- Encoding of prediction errors and microeconomic reward terms by dopamine neurons during Pavlovian conditioning (Philippe Tobler, Cambridge, UK)
- Learning as knowledge acquisition (Gerhard Strube, Freiburg)
Special Courses: Computational Models of Knowledge and Learning
- Reinforcement Learning (Martin Riedmiller, Osnabrück)
- Ontology Learning and Ontology Mapping (Steffen Staab, Koblenz-Landau)
- Neural-symbolic learning and reasoning (Pascal Hitzler, Karlsruhe & Sebastian Bader, Dresden)
- The emergent ontology: knowledge collectives and conceptual design patterns (Aldo Gangemi, Rome, Italy)
Special Courses: Learning by Machines and Robots
- A Neural Theory of Language Learning and Use (Srini Narayanan, Berkeley, USA)
- From Sensorimotor Sequence to Grammatical Construction: Insights from Neurophysiology, Simulation and Robotics (Peter Dominey, Lyon, France)
- The Recruitment Theory of Language Origins (Luc Steels, Brussels/Paris, France)
- Cognitive Developmental Robotics (Minoru Asada, Osaka, Japan)
Special Courses: Development, Evolution and Neuropsychology
- Developmental psychology: insights from the baby lab (NN)
- Psychopathology in Adolescence (Matthias Weisbrod,
Heidelberg)
- Learning and problem solving in monkeys and apes (Josep Call, Leipzig)
- The evolution of cognition and learning (Peter Gärdenfors, Lund, Sweden)
A limited number of travel and registration support grants are available. For more information, including registration, see http://www.ik2006.de ------------------------------------------------------------------ Dr. Herbert Jaeger Professor for Computational Science International University Bremen Campus Ring 12 28759 Bremen, Germany Phone (+49) 421 200 3215 Fax (+49) 421 200 49 3215 email h.jaeger at iu-bremen.de http://www.faculty.iu-bremen.de/hjaeger/ ------------------------------------------------------------------ From schunn+ at pitt.edu Thu Dec 22 19:29:41 2005 From: schunn+ at pitt.edu (schunn+@pitt.edu) Date: Thu, 22 Dec 2005 19:29:41 -0500 (EST) Subject: Connectionists: Best paper prizes in computational cognitive modeling for Cogsci 2006 Message-ID: <39601.67.171.65.112.1135297781.squirrel@webmail.pitt.edu> Four prizes worth $1,000 (USD) each will be awarded for the best full paper submissions to the 2006 Annual Meeting of the Cognitive Science Society that involve computational cognitive modeling. The four separate prizes will represent the best modeling work in the respective areas of: perception, language, higher-level cognition, and applied cognition. The prizes are open to researchers at any level (student, postdoc, research scientist, faculty) of any nationality. Any form of computational cognitive modeling relevant to cognitive science will be eligible, including (but not limited to) connectionist, symbolic, Bayesian, dynamic systems, or various hybrids. No special submission procedure is required---all full paper submissions to the conference will be automatically considered, using the interdisciplinary program committee that is supervising the review process. The full paper submission deadline is February 1st, 2006.
For further details about the conference submission procedure, see http://www.cogsci.rpi.edu/~rsun/cogsci2006/. These prizes are supported by a grant from the US National Science Foundation. Please pass this notice around to relevant colleagues and students. From doya at irp.oist.jp Fri Dec 23 05:32:58 2005 From: doya at irp.oist.jp (Kenji Doya) Date: Fri, 23 Dec 2005 19:32:58 +0900 Subject: Connectionists: Okinawa Computational Neuroscience Course 2006: Call for Applications Message-ID: Call for Applications OKINAWA COMPUTATIONAL NEUROSCIENCE COURSE 2006 "Computing Neurons" June 26 - July 7, 2006. Okinawa, Japan. http://www.irp.oist.jp/ocnc/2006 Application Deadline: APRIL 10TH, 2006 The aim of the Okinawa Computational Neuroscience Course is to provide opportunities for young researchers with theoretical backgrounds to learn about the latest advances in neuroscience, and for those with experimental backgrounds to gain hands-on experience in computational modeling. We invite graduate students and postgraduate researchers to participate in the course, held from June 26th through July 7th at an oceanfront seminar house of the Okinawa Institute of Science and Technology. Those interested in attending the course should send the materials below via e-mail or the course web page by APRIL 10th, 2006. We hope that this course will be a good opportunity for theoretical and experimental neuroscientists to meet together and to explore the attractive nature and culture of Okinawa, the southernmost island prefecture of Japan. ******* Course Outline ******* Okinawa Computational Neuroscience Course (OCNC2006) Theme: Computing Neurons - What neurons compute; How we know by computing - Our brain is a network of billions of neurons, but even a single neuron is a fantastically complex computing device.
Technology has made it possible to look into the detailed structure of dendritic branches, the variety of ionic channels and receptors, molecular reactions at the synapses, and the network of genes that regulates all these. The challenge is to understand the meaning and function of these components of the neural machine. To do this we need to put together data from many experiments at different levels into a computational model, and to analyze the kinds of computation that single neurons and their networks can perform. This course invites graduate students and postgraduate researchers who are interested in studies integrating experimental and computational approaches for understanding cellular mechanisms of neurons. Lectures: Upi Bhalla (NCBS) Sydney Brenner (OIST) Yang Dan (UC Berkeley) Erik DeSchutter (U Antwerp) Kenji Doya (OIST) Bard Ermentrout (U Pittsburgh) Geoff Goodhill (U Queensland) Shin Ishii (NAIST) Shinya Kuroda (U Tokyo) Nicolas Le Novere (European Bioinformatics Institute) Roberto Malinow (Cold Spring Harbor Lab) Henry Markram (EPFL) Terry Sejnowski (Salk Institute) Susumu Tonegawa (MIT) (more to be announced) Student Projects a) Introduction to neural/cellular simulator platforms b) Model construction from experimental data c) Analysis of neuron models Students will present posters on their current work early in the course and on the results of their projects at the end of the course.
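To make the "computational model of a neuron" idea concrete, here is a deliberately minimal single-neuron sketch: a leaky integrate-and-fire model driven by a constant current. This is our own illustrative example, not course material, and every parameter value below is an arbitrary textbook-style choice rather than data from any particular cell; the detailed conductance-based models discussed in the course are far richer.

```python
# Leaky integrate-and-fire neuron, forward-Euler integration.
dt = 0.1           # time step (ms)
tau_m = 20.0       # membrane time constant (ms)
v_rest = -70.0     # resting potential (mV)
v_reset = -70.0    # reset potential after a spike (mV)
v_thresh = -54.0   # spike threshold (mV)
r_m = 10.0         # membrane resistance (megaohm)
i_ext = 1.8        # constant injected current (nA)

v = v_rest
spikes = []
for step in range(int(1000 / dt)):            # simulate 1 second
    dv = (-(v - v_rest) + r_m * i_ext) / tau_m
    v += dt * dv                               # integrate the membrane equation
    if v >= v_thresh:                          # threshold crossing: emit a spike
        spikes.append(step * dt)
        v = v_reset                            # and reset

print(f"{len(spikes)} spikes in 1 s of simulated time")
```

Even this caricature already shows the modeling workflow the course teaches: write down the membrane dynamics, integrate them numerically, and compare the resulting spike train to what the real cell does.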
Date: June 26th to July 7th, 2006 Place: Okinawa Institute of Science and Technology Onna village, Okinawa, Japan Sponsors: Okinawa Institute of Science and Technology Nara Institute of Science and Technology Japanese Neural Network Society Co-organizers: Upinder Bhalla, National Center for Biological Sciences, India Kenji Doya, Okinawa Institute of Science and Technology Shinya Kuroda, University of Tokyo Nicolas Le Novere, European Bioinformatics Institute Advisors: Sydney Brenner, Okinawa Institute of Science and Technology Hiroaki Kitano, SONY Computer Science Laboratory Terrence Sejnowski, Salk Institute Susumu Tonegawa, Massachusetts Institute of Technology ******* Application ******* Please send the following by e-mail (ocnc at irp.oist.jp) or the web application page by APRIL 10TH, 2006. 1) First name, 2) Middle initial (if any), 3) Family name, 4) Degree, 5) Date of birth, 6) Gender, 7) Nationality, 8) Affiliation, 9) Position, 10) Advisor, 11) Postal address, 12) Phone, 13) Fax, 14) E-mail, 15) Web page (if any), 16) Educational background, 17) Work experience, 18) List of publications, 19) Research interests (up to 500 words), 20) Motivations for attending the course (up to 500 words), 21) Two referees whom we can ask for recommendations (names, affiliations, e-mail addresses), 22) Need for travel support, 23) How you learned about the course. We will accept 30 students based primarily on their research interests (19) and motivations (20). We will also consider the balance of members' research disciplines, geographic origins, and genders. The sponsor will provide lodging and meals during the course. Support for roundtrip airfare to Okinawa will be considered for students without funding. Applicants will be informed of the selection results via e-mail by May 10th. The details of OCNC2004 and 2005 are available on the web page (http://www.irp.oist.jp/ocnc/).
******* Secretariat ******* Okinawa Computational Neuroscience Course c/o Initial Research Project, Okinawa Institute of Science and Technology 12-22 Suzaki, Gushikawa Okinawa 904-2234, Japan Phone: +81-98-921-3933 Fax: +81-98-921-3873 Email: ocnc at irp.oist.jp For more information, please visit the web page: http://www.irp.oist.jp/ocnc/2006 ---- Kenji Doya Initial Research Project, Okinawa Institute of Science and Technology 12-22 Suzaki, Uruma, Okinawa 904-2234, Japan Phone:+81-98-921-3843; Fax:+81-98-921-3873 http://www.irp.oist.jp/ From steve at cns.bu.edu Sun Dec 25 06:49:06 2005 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sun, 25 Dec 2005 06:49:06 -0500 Subject: Connectionists: neural mechanisms of autism Message-ID: The following article is now available at http://www.cns.bu.edu/Profiles/Grossberg Grossberg, S. and Seidman, D. (2006). Neural dynamics of autistic behaviors: Cognitive, emotional, and timing substrates. Psychological Review, in press. ABSTRACT What brain mechanisms underlie autism and how do they give rise to autistic behavioral symptoms? This article describes a neural model, called the iSTART model, which proposes how cognitive, emotional, timing, and motor processes that involve brain regions like prefrontal and temporal cortex, amygdala, hippocampus, and cerebellum may interact to create and perpetuate autistic symptoms. These model processes were originally developed to explain data concerning how the brain controls normal behaviors. The iSTART model shows how autistic behavioral symptoms may arise from prescribed breakdowns in these brain processes, notably a combination of underaroused emotional depression in the amygdala and related affective brain regions, learning of hyperspecific recognition categories in temporal and prefrontal cortices, and breakdowns of adaptively timed attentional and motor circuits in the hippocampal system and cerebellum.
The model clarifies how malfunctions in a subset of these mechanisms can, through a system-wide vicious circle of environmentally mediated feedback, cause and maintain problems with them all. Key words: autism, learning, categorization, depression, hypervigilance, adaptive resonance theory, adaptive timing, amygdala, frontal cortex, hippocampus, cerebellum From a.cichocki at riken.jp Tue Dec 27 11:54:39 2005 From: a.cichocki at riken.jp (A. Cichocki) Date: Wed, 28 Dec 2005 01:54:39 +0900 Subject: Connectionists: NMF and SCA papers and MATLAB software NMFLAB Message-ID: <43B171CF.8010006@riken.jp> Dear List Members: I would like to bring to your attention our recently updated papers and reports about NMF (non-negative matrix factorization) and SCA (Sparse Component Analysis) for BSS (Blind and Semi-blind Source Separation) available at: http://www.bsp.brain.riken.jp/%7Ecia/recent.html#nmf http://www.bsp.brain.riken.jp/~cia/recent.html#sca http://www.bsp.brain.riken.jp/~cia/ We will also soon release new free MATLAB toolboxes: NMFLAB and SCALAB. I would be grateful for any critical comments or suggestions. Best Wishes, Andrzej Cichocki =============== Laboratory for Advanced Brain Signal Processing Riken, Brain Science Institute, JAPAN Wako Shi, Saitama 351-0198 List of selected new papers and reports about sparse NMF and SCA: NMF 1. A. Cichocki, R. Zdunek, and S. Amari, "Csiszar's Divergences for Non-Negative Matrix Factorization: Family of New Algorithms", 6th International Conference on Independent Component Analysis and Blind Signal Separation, Charleston SC, USA, March 5-8, 2006. [.pdf ] 2. A. Cichocki, S. Amari, and R. Zdunek, "Extended SMART Algorithms for Non-Negative Matrix Factorization", Eighth International Conference on Artificial Intelligence and Soft Computing, ICAISC, Zakopane, Poland, 25-29 June, 2006. [.pdf ] 3. R. Zdunek, and A.
Cichocki, "Non-Negative Matrix Factorization with Quasi-Newton Optimization", Eighth International Conference on Artificial Intelligence and Soft Computing, ICAISC, Zakopane, Poland, 25-29 June, 2006. [.pdf ] SCA 1. P. G. Georgiev, F. Theis, and A. Cichocki, "Sparse component analysis and blind source separation of underdetermined mixtures", IEEE Transactions on Neural Networks, July 2005, Vol. 16, No. 4, pp. 992-996. [.pdf ] 2. P. G. Georgiev, F. Theis, and A. Cichocki, "Optimization algorithms for sparse representations and applications", Chapter in the book Multiscale Optimization Methods, Ed. Pardalos, 2005. [.pdf ] 3. Y. Li, S. Amari, A. Cichocki, D. W. C. Ho and S. Xie, "Underdetermined Blind Source Separation Based on Sparse Representation", IEEE Transactions on Signal Processing, Vol. 54, No. 2, 2006 (in print). [.pdf ] 4. Y. Li, A. Cichocki, and S. Amari, "Blind estimation of channel parameters and source components for EEG signals: A sparse factorization approach", IEEE Transactions on Neural Networks, 2006 (accepted for publication). [ draft version pdf ] 5. F. J. Theis, P. G. Georgiev, and A. Cichocki, "Robust overcomplete matrix recovery for sparse sources using a generalized Hough transform," in Proceedings of 12th European Symposium on Artificial Neural Networks (ESANN2004), (Bruges, Belgium), pp. 343-348, Apr. 2004. [.pdf ]
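For readers new to NMF, the classical baseline that the algorithm papers above build on can be sketched very compactly: the Lee-Seung multiplicative updates for the squared Euclidean error. This is a generic illustrative sketch in NumPy, not code from NMFLAB or the listed papers; the matrix sizes, rank, and iteration count are arbitrary toy choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: any nonnegative matrix V, to be factorized as V ~= W @ H
# with both factors kept elementwise nonnegative.
V = rng.random((50, 40))
r = 5                      # rank of the factorization (illustrative choice)
W = rng.random((50, r))
H = rng.random((r, 40))

eps = 1e-9                 # guard against division by zero
for _ in range(200):
    # Lee-Seung multiplicative updates for squared Euclidean error:
    # each step preserves nonnegativity and does not increase the error.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print("relative reconstruction error:", rel_err)
```

The papers listed above replace or generalize exactly this update rule, e.g. with Csiszar-divergence cost functions or quasi-Newton optimization, while keeping the same nonnegative factorization structure.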