From mieko at hip.atr.co.jp Mon Jan 3 21:05:12 2000
From: mieko at hip.atr.co.jp (Mieko Namba)
Date: Tue, 4 Jan 2000 11:05:12 +0900
Subject: Neural Networks 13(1)
Message-ID: <200001040206.LAA29389@mailhost.hip.atr.co.jp>

NEURAL NETWORKS 13(1)
Contents - Volume 13, Number 1 - 2000
_______________________________________________________________

EDITORIAL:
Our millennium issue!
S. Grossberg, M. Kawato, J.G. Taylor

NEURAL NETWORKS LETTERS:
A learning rule for dynamic recruitment and decorrelation
K.P. Kording, P. Konig

CURRENT OPINIONS:
A proposed name for aperiodic brain activity: stochastic chaos
W.J. Freeman
Neural networks are useful tools for drug design
G. Schneider
How good are support vector machines?
S. Raudys

ARTICLES:
*** Psychology and Cognitive Science ***
Anxiety-like behavior in rats: a computational model
C. Salum, S. Morato, A. Roque-Da-Silva

*** Neuroscience and Neuropsychology ***
Self-organization of orientation maps in a formal neuron model using a cluster learning rule
J. Kuroiwa, S. Inawashiro, S. Miyake, H. Aso

*** Mathematical and Computational Analysis ***
A cascade associative memory model with a hierarchical memory structure
M. Hirahara, N. Oka, T. Kindo
Cascade associative memory storing hierarchically correlated patterns with various correlations
M. Hirahara, N. Oka, T. Kindo
On impulsive autoassociative neural networks
Z.-H. Guan, J. Lam, G. Chen
Pattern segmentation in a binary/analog world: unsupervised learning versus memory storing
C. Lourenco, A. Babloyantz, M. Hougardy
Partially pre-calculated weights for the backpropagation learning regime and high accuracy function mapping using continuous input RAM-based sigma-pi Nets
R.S. Neville, T.J. Stonham, R.J. Glover

*** Engineering and Design ***
Neural net based MRAC for a class of nonlinear plants
M.S. Ahmed

*** Technology and Applications ***
Training neural networks to be insensitive to weight random variations
S. Orcioni

BOOK REVIEW:
Reinforcement learning: an introduction
R.P.N. Rao
_______________________________________________________________

Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl, usinfo-f at elsevier.com, or info at elsevier.co.jp

------------------------------

INNS/ENNS/JNNS Membership includes a subscription to Neural Networks:

The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
-----------------------------------------------------------------------------
Membership Type         INNS                ENNS                  JNNS
-----------------------------------------------------------------------------
membership with         $80                 660 SEK               Y 15,000 [including
Neural Networks                                                   2,000 entrance fee]
                        or $55 (student)    or 460 SEK (student)  or Y 13,000 (student)
                                                                  [including 2,000
                                                                  entrance fee]
-----------------------------------------------------------------------------
membership without      $30                 200 SEK               not available to
Neural Networks                                                   non-students (subscribe
                                                                  through another society);
                                                                  Y 5,000 (student)
                                                                  [including 2,000
                                                                  entrance fee]
-----------------------------------------------------------------------------
Institutional rates     $1132               2230 NLG              Y 149,524
-----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
         OR
         [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O. Box 408
531 28 Skovde
Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns

JNNS Membership
c/o Professor Tsukada
Faculty of Engineering
Tamagawa University
6-1-1, Tamagawa Gakuen, Machida-city
Tokyo 113-8656 Japan
81 42 739 8431 (phone)
81 42 739 8858 (fax)
jnns at jnns.inf.eng.tamagawa.ac.jp
http://jnns.inf.eng.tamagawa.ac.jp/home-j.html

*****************************************************************

From jfgf at cs.berkeley.edu Mon Jan 3 19:00:20 2000
From: jfgf at cs.berkeley.edu (Nando de Freitas)
Date: Mon, 03 Jan 2000 16:00:20 -0800
Subject: NIPS MCMC (Markov Chain Monte-Carlo methods) Workshop update
Message-ID: <38713814.451FE4D5@cs.berkeley.edu>

Dear connectionists

The talks, software for the tutorial examples and several related papers are now available from the NIPS MCMC for machine learning workshop page: http://www.cs.berkeley.edu/~jfgf/nips99.html

[ Moderator's note: Here is the description of the workshop from the web page:

MCMC techniques are a set of powerful simulation methods that may be applied to solve integration and optimisation problems in large dimensional spaces. These two types of problems are the major stumbling blocks of Bayesian statistics and decision analysis. The basic idea of MCMC methods is to draw a large number of samples distributed according to the posterior distributions of interest, or weighted so that consistent simulation-based estimates can be computed. MCMC methods were introduced in the physics literature in the 1950's, but only became popular in other fields at the beginning of the 1990's. The development of these methods is at the origin of the recent Bayesian revolution in applied statistics and related fields including econometrics and biometrics. The methods are not yet well-known in machine learning and neural networks, despite their ability to allow statistical estimation to be performed for realistic and thus often highly complex models.
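To make the basic idea concrete, here is a minimal random-walk Metropolis sketch in Python. It is an illustration only, not part of the workshop materials: the toy target density, step size, and function names are made up for the example. Samples are proposed from a symmetric random walk and accepted with probability min(1, p(proposal)/p(current)), so the chain is asymptotically distributed according to the (unnormalised) target and sample averages give consistent, simulation-based estimates of posterior expectations.

    # Illustrative sketch only; the toy unnormalised posterior and all names are hypothetical.
    import numpy as np

    def log_posterior(theta):
        # Toy unnormalised log-posterior: an equal-weight mixture of two unit-variance Gaussians.
        return np.logaddexp(-0.5 * (theta - 2.0) ** 2, -0.5 * (theta + 2.0) ** 2)

    def random_walk_metropolis(n_samples=10000, step=1.0, seed=0):
        rng = np.random.default_rng(seed)
        theta, samples = 0.0, []
        for _ in range(n_samples):
            proposal = theta + step * rng.normal()              # symmetric proposal
            log_accept = log_posterior(proposal) - log_posterior(theta)
            if np.log(rng.uniform()) < log_accept:              # Metropolis accept/reject step
                theta = proposal
            samples.append(theta)                               # record the current state
        return np.array(samples)

    draws = random_walk_metropolis()
    print("estimated posterior mean:", draws.mean())            # a simulation-based estimate

In higher-dimensional problems the same accept/reject idea applies, but gradient-based proposals such as the hybrid Monte Carlo method discussed next mix far more efficiently than a plain random walk.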
Neal (1996) introduced MCMC methods, specifically the hybrid Monte Carlo method, into the analysis of neural networks. He showed that the approach can lead to many benefits. MCMC methods have also been successfully applied to interesting inference problems in probabilistic graphical models. However, many recent advances in MCMC simulation, including model selection and model mixing, perfect sampling, parallel chains, forward-backward sampling and sequential MCMC among others, have been overlooked by the neural networks community. This workshop will attempt to provide a simple tutorial review of these state-of-the-art simulation-based computational methods. It will also focus on application domains and encourage audience participation. Speakers will be encouraged to keep the presentation at a tutorial level.

-- Dave Touretzky, CONNECTIONISTS moderator ]

Happy New Year !!!

Nando

--
Computer Science Division            | Phone : (510) 642-2038
387 Soda Hall                        | Fax   : (510) 642-5775
University of California, Berkeley   | E-mail: jfgf at cs.berkeley.edu
Berkeley, CA 94720-1776 USA          | URL   : http://www.cs.berkeley.edu/~jfgf

From smyth at sifnos.ICS.UCI.EDU Tue Jan 4 17:41:37 2000
From: smyth at sifnos.ICS.UCI.EDU (Padhraic Smyth)
Date: Tue, 04 Jan 2000 14:41:37 -0800
Subject: faculty positions in biomedical image/signal analysis at UC Irvine
Message-ID: <200001041441.aa10281@gremlin-relay.ics.uci.edu>

Dear Connectionist Colleagues,

UCI has 5 new faculty slots currently open in our biomedical engineering department - the department was started last year based on an award from the Whitaker Foundation and is expected to grow substantially over the next few years. See http://soeweb.eng.uci.edu/bme/jobs.stm for details.

One area of particular relevance to readers of this list is biomedical image and signal analysis: there are excellent opportunities for collaborative research across campus in this area at present, e.g., we have very active medical and biological research programs in brain imaging (e.g., in Alzheimer's research, autism research) with significant opportunities for interdisciplinary projects. Note that researchers whose focus is specifically in medical imaging (for example) are likely to be of more interest to UCI than researchers interested in image analysis in general. Other listed research areas of potential interest to connectionists are computational neuroscience, quantitative modeling of biological systems, and parallel and/or distributed biomedical computational systems.

Note that although the advertised deadline is January 1, late applications are still welcome - I meant to send this email out last Fall :). Positions are available at both senior and junior levels.

Please do not send your applications to me personally (use the address on the Web page). But feel free to let me know you have applied, particularly if you apply in the image/signal analysis area.

Padhraic Smyth
Information and Computer Science
University of California, Irvine

From terry at salk.edu Wed Jan 5 15:59:26 2000
From: terry at salk.edu (terry@salk.edu)
Date: Wed, 5 Jan 2000 12:59:26 -0800 (PST)
Subject: NEURAL COMPUTATION 12:1
Message-ID: <200001052059.MAA09969@hebb.salk.edu>

Neural Computation - Contents - Volume 12, Number 1 - January 1, 2000

ARTICLES
Correctness of Local Probability Propagation in Graphical Models with Loops
Yair Weiss
Population Dynamics of Spiking Neurons: Fast Transients, Asynchronous States, and Locking
Wulfram Gerstner
Dynamics of Strongly-Coupled Spiking Neurons
Paul C. Bressloff and S. Coombes

NOTES
On Connectedness: A Solution Based On Oscillatory Correlation
DeLiang L. Wang
Practical Identifiability of Finite Mixtures of Multivariate Bernoulli Distributions
Miguel A. Carreira-Perpinan and Steve Renals

LETTERS
The Effects of Pair-wise And Higher-order Correlations on the Firing Rate of a Postsynaptic Neuron
S. M. Bohte, H. Spekreijse and P. R. Roelfsema
Effects of Spike Timing On Winner-Take-All Competition in Model Cortical Circuits
Erik D. Lumer
Model Dependence in Quantification of Spike Interdependence by Joint Peri-Stimulus Time Histogram
Hiroyuki Ito and Satoshi Tsuji
Reinforcement Learning in Continuous Time and Space
Kenji Doya

-----

ON-LINE - http://neco.mitpress.org/
ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 1999 - VOLUME 12 - 12 ISSUES
                   USA      Canada*    Other Countries
Student/Retired    $60      $64.20     $108
Individual         $88      $94.16     $136
Institution        $430     $460.10    $478
* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889  FAX: (617) 258-6779  mitpress-orders at mit.edu

-----

From morten at compute.it.siu.edu Thu Jan 6 20:09:55 2000
From: morten at compute.it.siu.edu (Dr. Morten H. Christiansen)
Date: Thu, 6 Jan 2000 19:09:55 -0600 (CST)
Subject: Special Issue of Cognitive Science on Connectionist Language Processing
Message-ID: 

The members of this list may be interested in the most recent issue of Cognitive Science which is a Special Issue on connectionist language processing:

Christiansen, M.H., Chater, N. & Seidenberg, M.S. (Eds.) (1999). Connectionist models of human language processing: Progress and prospects. Special issue of Cognitive Science, Vol. 23(4), 415-634.

PREFACE
Connectionist Models of Human Language Processing: Progress and Prospects
Editors Morten H. Christiansen, Nick Chater & Mark S. Seidenberg

This Special Issue appraises the progress made so far and the prospects for future development of connectionist models of natural language processing. This project is timely - the decade since the publication of Rumelhart & McClelland's influential PDP volumes has seen an explosive growth of connectionist modeling of natural language, ranging from models of early speech perception, to syntax and to discourse level phenomena. The breadth and variety of this work is illustrated in the review, which forms the introductory paper in the volume. How much has been achieved by this vast research effort?

Part I presents some of the most recent progress by leading connectionist researchers, in a range of topics of central interest in language processing. Gaskell & Marslen-Wilson describe recent developments in connectionist models of speech perception. Plunkett & Juola report on progress in the highly controversial area of connectionist models of morphology. Tabor & Tanenhaus describe their work utilizing recurrent networks to model parsing within a dynamic perspective. Dell, Chang & Griffin provide accounts of lexical and syntactic aspects of language production. Plaut outlines recent developments in connectionist models of reading.

Where Part I brings us to the forefront of current connectionist modeling of natural language processing, Part II considers the prospects for future research. Seidenberg and MacDonald argue that connectionism provides a fundamentally new way of looking at language processing and acquisition, which challenges traditional viewpoints derived from linguistics.
By contrast, Smolensky attempts to synthesize lessons learned from both linguistics and connectionist research, arguing that progress will come from providing an integration of the two approaches. Steedman takes on the role as an "outside" observer, seeking to put connectionist natural language processing in perspective. Connectionist modeling has had a vast impact throughout cognitive science, and has been both most productive and most controversial in the area of natural language processing and acquisition. This issue can be used as an overview of the "state of the art" in connectionist models of natural language processing. But more important, we hope that it serves also as a contribution to the current research effort in this area, and as a stimulus to informed debate concerning future research on human natural language. TABLE OF CONTENTS (Abstracts can be found at http://siva.usc.edu/~morten/cs.SI-abtracts.html) Introduction Connectionist Natural Language Processing: The State of the Art. Morten H. Christiansen & Nick Chater Part I: Progress Ambiguity, Competition and Blending in Spoken Word Recognition. M. Gareth Gaskell & William D. Marslen-Wilson A Connectionist Model of English Past Tense and Plural Morphology. Kim Plunkett & Patrick Juola Dynamical Models of Sentence Processing. Whitney Tabor & Michael K. Tanenhaus Connectionist Models of Language Production: Lexical Access and Grammatical Encoding. Gary S. Dell, Franklin Chang & Zenzi M. Griffin A Connectionist Approach to Word Reading and Acquired Dyslexia: Extension to Sequential Processing. David C. Plaut Part II: Prospects A Probabilistic Constraints Approach to Language Acquisition and Processing. Mark S. Seidenberg & Maryellen C. MacDonald Grammar-based Connectionist Approaches to Language. Paul Smolensky Connectionist Sentence Processing in Perspective. Mark Steedman [Sorry, I can provide no hardcopies - for electronic copies, please contact the authors directly]. Best regards, Morten Christiansen ---------------------------------------------------------------------- Morten H. Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A Personal Web Page: http://www.siu.edu/~psycho/faculty/mhc.html Lab Web Site: http://www.siu.edu/~morten/csl ---------------------------------------------------------------------- From morten at compute.it.siu.edu Fri Jan 7 11:42:57 2000 From: morten at compute.it.siu.edu (Dr. Morten H. Christiansen) Date: Fri, 7 Jan 2000 10:42:57 -0600 (CST) Subject: Graduate Openings in Brain and Cognitive Sciences Message-ID: Dear Colleague, Please bring the following information to the attention of potential graduate school applicants from your program with an interest in Brain and Cognitive Sciences. GRADUATE PROGRAM IN BRAIN AND COGNITIVE SCIENCES IN THE DEPARTMENT OF PSYCHOLOGY AT SOUTHERN ILLINOIS UNIVERSITY, CARBONDALE. The Department of Psychology at Southern Illinois University has several openings for fall 2000 admission to its newly established Ph.D. program in Brain and Cognitive Sciences. The program emphasizes cognitive behavior approached from a combination of developmental (infancy and childhood, adolescence and aging), neurobiological (neurophysiology, neuropsychology, genetics), behavioral (human and animal experimentation) and computational (neural networks, statistical analyses) perspectives. 
As an integral part of their training, students become active participants in ongoing faculty research programs in the Brain and Cognitive Sciences. Students will receive training in two or more different research methodologies, and are expected to develop a multidisciplinary approach to their own research. Current research by the Brain and Cognitive Sciences faculty includes perinatal risk factors in child development, neurophysiological and behavioral correlates of infant and child cognitive and language development, personality and social correlates of cognitive aging, child play and social behaviors, identity development across the life span, neural network modeling of language acquisition and processing, artificial grammar learning, sentence processing, evolution of language and the brain, the pharmacological modulation of memory, effects of psychoactive drugs, reversible inactivation of discrete brain areas and memory, recovery of function from brain damage, electrophysiological models (e.g., long-term potentiation), the neurophysiology of memory, animal learning, and human learning and memory. For more information about the program and application procedures, please visit our web site at: http://www.siu.edu/~psycho/bcs Visit also the Department's web site at: http://www.siu.edu/~psycho The deadline for applications is February 1st, 2000. Complete applications received by January 15, 2000 may be considered for one of the prestigious Morris Fellowships. Best regards, Morten Christiansen Coordinator of the Brain and Cognitive Sciences Program ---------------------------------------------------------------------- Morten H. Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A Personal Web Page: http://www.siu.edu/~psycho/faculty/mhc.html Lab Web Site: http://www.siu.edu/~morten/csl ---------------------------------------------------------------------- From caroly at cns.bu.edu Fri Jan 7 15:02:12 2000 From: caroly at cns.bu.edu (Carol Yanakakis Jefferson) Date: Fri, 7 Jan 2000 15:02:12 -0500 Subject: Cognitive and Neural Systems: A Tenth Anniversary Celebration Message-ID: <200001072002.PAA01349@cochlea.bu.edu> COGNITIVE AND NEURAL SYSTEMS: A TENTH ANNIVERSARY CELEBRATION Tuesday, May 23,2000 at the Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, MA 02215 This one-day event celebrates the tenth anniversary of our department. It will be filled with talks by past graduates of the department, and will include plenty of time for discussion and celebration. The event is open to the public and there is no admission fee. If you plan to attend, please send email to Carol Jefferson (caroly at cns.bu.edu) by May 1, 2000 so that we can estimate attendance for purposes of planning enough food and drink. The celebration will come right before the Fourth International Conference on Cognitive and Neural Systems, which occurs from Wednesday, May 24 through Saturday, May 27. This conference drew around 300 participants from 31 countries last year, and focuses on the two themes: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? 
For further information about this conference, see http://cns.bu.edu/meetings/ TENTH ANNIVERSARY PROGRAM 8:30-9:00 Provost Dennis Berkey and Stephen Grossberg, Boston University Introduction and Welcome 9:00-9:30 Gregory Francis, Purdue University Orientational Afterimages: Evidence for FACADE 9:30-10:00 Alexander Grunewald, Cal Tech The Perception of Visual Motion: Psychophysics, Physiology and Modeling 10:00-10:30 John Reynolds, NIMH Visual Salience, Competition and Selective Attention 10:30-11:00 Coffee Break 11:00-11:30 David Somers, MIT Attentional Mechanisms in Human Visual Cortex: Evidence from fMRI 11:30-12:00 Luiz Pessoa, NIMH Attentional Strategies for Object Recognition 12:00-12:30 Bruce Fischl, Mass General Hospital Surface-Based Analysis of the Human Cerebral Cortex 12:30-2:00 Lunch (everyone on their own) 2:00-2:30 Paul Cisek, University of Montreal Two Action Systems: Specification and Selection in the Cerebral Cortex 2:30-3:00 John Fiala, Boston University Structural Dynamics of Synapses 3:00-3:30 Karen Roberts, Cognex Corp. Alignment and Inspection of Boundary Contours 3:30-4:00 Coffee Break 4:00-4:30 Gary Bradski, Intel Corp. Motion Segmentation and Pose Recognition with Motion History Gradients 4:30-5:00 Rob Cunningham, MIT Lincoln Laboratory Detecting Computer Attackers: Recognizing Patterns of Malicious, Stealthy Behavior 5:00-8:00 Reception From arbib at pollux.usc.edu Fri Jan 7 18:52:35 2000 From: arbib at pollux.usc.edu (Michael Arbib) Date: Fri, 07 Jan 2000 15:52:35 -0800 Subject: Faculty Position in Computational Neuroscience/Neuroinformatics Message-ID: <200001072355.PAA01066@relay.usc.edu> The University of Southern California (USC) Department of Computer Science (www.usc.edu/dept/cs) invites applications for a tenure-track position in computational neuroscience/neuroinformatics. We are looking for an individual who has an exceptional track record in modeling large-scale neural systems and working with experimentalists to link their data to these models. The successful applicant will be involved in teaching in a computer science and interdisciplinary environment, and will also have the technical ability to serve as Associate Director of the USC Brain Project (http://www-hbp.usc.edu). In particular, the candidate is expected to supervise and contribute to the development of a database environment for integration of empirical neuroscience data and brain models. The computer science community at USC is large and diverse, with faculty both on the University Park Campus and at USC's Information Sciences Institute (ISI). Research topics include algorithms and cryptography, collaborative agents, computational neuroscience, computer architecture, databases and information management, educational technology, genomics & DNA computing, graphics and multi-media, knowledge acquisition, knowledge representation, learning, natural language processing, networking, neural networks, neuroinformatics, ontologies, planning, robotics, software engineering, virtual humans, and vision. Computer Science at USC has a long history as a key component of the University's Neuroscience Program (NIBS: Neural, Informational and Behavioral Sciences) with work in cognitive neuroscience, computational neuroscience, language mechanisms, neural engineering (through the Center for Neural Engineering (http://www.usc.edu/dept/engineering/CNE), neuroinformatics, vision, and visuomotor coordination (with links to biomimetic robotics). 
In particular, the USC Brain Project (USCBP), funded in part by the Human Brain Project consortium, integrates empirical research in the neuroanatomy, neurochemistry and neurophysiology of synaptic plasticity, motivation, and visuomotor coordination with research in neuroinformatics, adapting such computational techniques as databases, the World Wide Web, data mining, and visualization to the analysis of neuro-science data, and employing computational neuroscience to study the relations between structure and function. USC is also part of the Dynamic Brain Project, an international research focus on computational motor control. USC's broader computer science community includes not only the Computer Science Department, the Computer Engineering Program and ISI but also the Integrated Media Systems Center (IMSC), an NSF Engineering Research Center focusing on computer interfaces, information management and media communications; and the newly created Institute for Creative Technologies that brings together expertise from USC's Schools of Engineering, Cinema-Television, and Communications (Annenberg) plus the entertainment industry, to develop the art and technology for compellingly realistic virtual experiences. Preliminary enquiries and requests for further information may be sent to Michael Arbib (arbib at pollux.usc.edu). Applicants should send a comprehensive resume, a list of references, and a statement of goals to: Paulina Tagle, Computer Science, USC SAL 300, Los Angeles, CA 90089-0781 (paulina at pollux.usc.edu). USC is an Equal Opportunity/Affirmative Action Employer and encourages applications from women and minorities. From arbib at pollux.usc.edu Fri Jan 7 19:01:06 2000 From: arbib at pollux.usc.edu (Michael Arbib) Date: Fri, 07 Jan 2000 16:01:06 -0800 Subject: Faculty Position in Machine Learning Message-ID: <200001080004.QAA01197@relay.usc.edu> The University of Southern California (USC) Department of Computer Science (http://www.usc.edu/dept/cs/) invites applications for tenure-track positions from outstanding candidates in Machine Learning. Exceptional candidates in other areas of Artificial Intelligence (and Computer Science) may also be considered. We are particularly seeking candidates with strong collaborative inclinations. USC's Intelligent Systems Group comprises faculty both on the main campus and at USC's Information Sciences Institute (ISI). It ranks fourth overall in terms of AAAI Fellows and includes the current Chair of SIGART along with two former chairs. Research topics include collaborative agents, computational neuroscience, educational technology, knowledge acquisition, knowledge representation, learning, natural language processing, neural networks, ontologies, planning, robotics, virtual humans, and vision. The computer science community at USC is similarly large and diverse, with emphases in addition to intelligent systems in such areas as algorithms and cryptography, computer architecture, databases and information management, genomics & DNA computing, graphics and multi-media, networking, neuroinformatics, and software engineering. 
The broader computer science community includes not only the Computer Science Department, the Computer Engineering Program and ISI but also the Integrated Media Systems Center (IMSC), an NSF Engineering Research Center focusing on computer interfaces, information management and media communications; and the newly created Institute for Creative Technologies that brings together expertise from USC's Schools of Engineering, Cinema-Television, and Communications (Annenberg) plus the entertainment industry, to develop the art and technology for compellingly realistic virtual experiences. If interested, please send a comprehensive resume, a list of references, and a statement of goals to: Paulina Tagle, Computer Science, USC SAL 300, Los Angeles, CA 90089-0781 (paulina at pollux.usc.edu). USC is an Equal Opportunity/Affirmative Action Employer and encourages applications from women and minorities. Distributed Multimedia Information Management and Databases Committee: Databases and Information Management: McLeod (Chair), Boehm [?], Nikias Long Ad: The University of Southern California (USC) Department of Computer Science (http://www.usc.edu/dept/cs/) invites applications for tenure-track positions from outstanding candidates in Distributed Multimedia Information Management and Databases. This position will be in conjunction with the USC Integrated Media Systems Center (IMSC), an NSF ERC in the area of Integrated Media Systems. The applicant will be expected to take a major leadership role in IMSC Research and collaboratory projects. The computer science community at USC is large and diverse, with emphases in such areas as algorithms and cryptography, collaborative agents, computational neuroscience, computer architecture, databases and information management, educational technology, genomics & DNA computing, graphics and multi-media, knowledge acquisition, knowledge representation, learning, natural language processing, networking, neural networks, neuroinformatics, ontologies, planning, robotics, software engineering, virtual humans, and vision. The broader computer science community includes not only the Computer Science Department, the Computer Engineering Program and IMSC, but also the Information Sciences Institute and the newly created Institute for Creative Technologies that brings together expertise from USC's Schools of Engineering, Cinema-Television, and Communications (Annenberg) plus the entertainment industry, to develop the art and technology for compellingly realistic virtual experiences. If interested, please send a comprehensive resume, a list of references, and a statement of goals to: Paulina Tagle, Computer Science, USC SAL 300, Los Angeles, CA 90089-0781 (paulina at pollux.usc.edu). USC is an Equal Opportunity/Affirmative Action Employer and encourages applications from women and minorities. From aslin at cvs.rochester.edu Sun Jan 9 15:00:28 2000 From: aslin at cvs.rochester.edu (Richard Aslin) Date: Sun, 9 Jan 2000 15:00:28 -0500 Subject: postdoc positions at the University of Rochester Message-ID: The University of Rochester seeks five or more outstanding postdoctoral fellows with research interests in several areas of the Cognitive Sciences, including language, learning, and development. Three grants from NIH and NSF provide support. (1) An NIH training grant is affiliated with the Center for the Sciences of Language. 
The Center brings together faculty and students with interests in spoken and signed languages from the Departments of Brain and Cognitive Sciences, Computer Science, Linguistics, and Philosophy, as well as the interdepartmental program in Neuroscience. We encourage applicants from any of these disciplines who have expertise in any area of natural language. We are particularly interested in postdoctoral fellows who want to contribute to an interdisciplinary community.

(2) A second NIH training grant spans the disciplines of Learning, Development, and Behavior. Applicants should have expertise in human or animal research on learning and developmental plasticity or in computational modeling. Contributing faculty are in the Departments of Brain and Cognitive Sciences, Computer Science, and the interdepartmental program in Neuroscience.

(3) An NSF research grant on Learning and Intelligent Systems is directed to questions of rapid statistical learning in a variety of domains. Applicants should have expertise in behavioral, computational, or neurobiological approaches to statistical learning in humans or animals. Contributing faculty are in the Departments of Brain and Cognitive Sciences at Rochester and the Department of Psychology at Harvard.

The NIH fellowships are open only to US citizens or permanent residents. Applicants should send a letter describing their graduate training and research interests and a curriculum vitae, and arrange to have three letters of recommendation sent to: Professor Richard N. Aslin, Department of Brain and Cognitive Sciences, Meliora Hall, University of Rochester, Rochester, NY 14627-0268. Review of applications will begin on February 15, 2000 and continue until all of the positions are filled, with expected start dates ranging from June 30 to September 1, 2000. Learn more about the relevant departments, faculty, and training opportunities by visiting the University of Rochester web site at http://www.rochester.edu.

--------------------------------------------------------
Richard N. Aslin
Department of Brain and Cognitive Sciences
Meliora Hall
University of Rochester
Rochester, NY 14627
email: aslin at cvs.rochester.edu
phone: (716) 275-8687
FAX: (716) 442-9216
http://www.cvs.rochester.edu/people/r_aslin/r_aslin.html

From berthouz at etl.go.jp Tue Jan 11 01:39:26 2000
From: berthouz at etl.go.jp (Luc Berthouze)
Date: Tue, 11 Jan 2000 15:39:26 +0900
Subject: postdoc position at the Electrotechnical Laboratory (ETL), Japan
Message-ID: <200001110639.PAA08517@aidan.etl.go.jp>

We are seeking a postdoctoral fellow with research interests in computational neuroscience and cognitive science to collaborate in a project aiming at identifying the neural correlates of sensorimotor categorization. Candidates should have expertise in computational modeling, in human or animal research on learning and some experience in applying neural models to artificial systems (robots or simulations). The fellowship, for a two-year period, includes salary, accommodation and airfare. Candidates should contact Luc Berthouze for more details.

-----
Dr. Luc Berthouze
Information Science Division
Electrotechnical Laboratory
Umezono 1-1-4, Tsukuba 305-8568, Japan
Tel: +81-298-545369
Fax: +81-298-545857
email: berthouz at etl.go.jp

From m.niranjan at dcs.shef.ac.uk Tue Jan 11 08:16:59 2000
From: m.niranjan at dcs.shef.ac.uk (Mahesan Niranjan)
Date: Tue, 11 Jan 2000 13:16:59 +0000 (GMT)
Subject: Research Assistantships
Message-ID: <200001111316.NAA02109@bayes.dcs.shef.ac.uk>

______________________________________________________________________________
Research Assistantships: Modelling Tools for Air Pollution

Two-year postdoctoral positions, one at Sheffield University CS Department and one at Anglia Polytechnic University (APU) Geography Department, will be advertised shortly for immediate start.

The project, funded by the European Community, involves a consortium of 9 partners, the lead contractor being the Department of Environmental Sciences, University of East Anglia. The project acronym is APPETISE [the 'A' is for 'Air', one of the 'P's is for Pollution, and the rest is a necessary condition for EC funding :-)].

A substantial part of the project will involve the use of data driven models, such as neural networks and other time series modelling tools applied to air-pollution, traffic and meteorological data from an urban environment. As is common with such projects, there are milestones, workpackages and deliverables, but there will also be good space for original research. The work at APU will have a slight bias towards instrumentation, data collection and handling. The work at Sheffield will have a theoretical/modelling bias.

If interested, or if you know anyone who might be interested, please let us know:
Mahesan Niranjan : m.niranjan at dcs.shef.ac.uk
Alison Greig : A.J.Greig at anglia.ac.uk

____________________________________________________________________
Mahesan Niranjan                    Phone: 44 114 222 1805
Professor of Computer Science       Fax:   44 114 222 1810
University of Sheffield             Email: M.Niranjan at dcs.shef.ac.uk
http://www.dcs.shef.ac.uk/~niranjan
____________________________________________________________________

From terry at salk.edu Tue Jan 11 15:35:51 2000
From: terry at salk.edu (terry@salk.edu)
Date: Tue, 11 Jan 2000 12:35:51 -0800 (PST)
Subject: NEURAL COMPUTATION 12:2
Message-ID: <200001112035.MAA12438@hebb.salk.edu>

Neural Computation - Contents - Volume 12, Number 2 - February 1, 2000

ARTICLE
Minimizing Binding Errors Using Learned Conjunctive Features
Bartlett Mel and Jozsef Fiser

NOTES
Relationship Between Phase And Energy Methods For Disparity Computation
Ning Qian and Sam Mikaelian
N-tuple Network, CART And Bagging
Aleksander Kolcz
Improving The Practice Of Classifier Performance Assessment
N. M. Adams and D. J. Hand

LETTERS
Do Simple Cells In Primary Visual Cortex Form A Tight Frame?
Emilio Salinas and L.F. Abbott
Learning Overcomplete Representations
Michael S. Lewicki and Terrence J. Sejnowski
Noise In Integrate-And-Fire Neurons: From Stochastic Input To Escape Rates
Hans E. Plesser and Wulfram Gerstner
Modeling Synaptic Plasticity In Conjunction With The Timing of Pre- And Postsynaptic Action Potentials
Werner M. Kistler and J. Leo van Hemmen
On-line EM Algorithm For the Normalized Gaussian Network
Masa-aki Sato and Shin Ishii
A General Probability Estimation Approach for Neural Computation
Maxim Khaikine and Klaus Holthausen
On The Synthesis Of Brain-State-In-A-Box Neural Models With Application To Associative Memory
Fation Sevrani and Kenichi Abe

-----

ON-LINE - http://neco.mitpress.org/
ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES
                   USA      Canada*    Other Countries
Student/Retired    $60      $64.20     $108
Individual         $88      $94.16     $136
Institution        $430     $460.10    $478
* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889  FAX: (617) 258-6779  mitpress-orders at mit.edu

-----

From bert at mbfys.kun.nl Wed Jan 12 08:17:20 2000
From: bert at mbfys.kun.nl (bert@mbfys.kun.nl)
Date: Wed, 12 Jan 2000 14:17:20 +0100 (MET)
Subject: Graphical model software
Message-ID: <200001121317.OAA02671@bertus.mbfys.kun.nl>

SOFTWARE ANNOUNCEMENT

We would like to announce BayesBuilder, a tool for constructing and testing Bayesian networks. This software can be used free of charge for non-commercial purposes.

BayesBuilder supports the following features:
- User friendly graphical interface.
- Comprehensive help function.
- Defining several views on parts of the network, which is essential for building large networks.
- Importing networks from the Hugin Format, the Netica Format, the Microsoft Bayesian Network Format and the Bayesian Interchange Format.
- Exporting the status of the network to a database of cases, and importing from the cases database to the network.
- Undo/redo support.
- Automatic network layout.

System requirements: Win32 Release for Windows 95, Windows 98 and Windows NT (4.0) on Intel hardware. A Pentium or faster processor. A minimum of 32 megabytes of RAM is required. A minimum of 22 Mb on hard disk is required. Minimum desktop area: 800x600 pixels. Color palette: 256 colors, VGA.

For more information and for free downloading of BayesBuilder, please go to http://www.mbfys.kun.nl/snn/Research/bayesbuilder/

BayesBuilder was used by our group to develop Promedas, a diagnostic decision support system for anaemia. For more information about Promedas and a free demo CD, see http://www.mbfys.kun.nl/snn/Research/promedas/

Suggestions and comments are welcome.

SNN
PO Box 9101
6500 HB Nijmegen
The Netherlands
University of Nijmegen
Tel.: +31-(0)24 3614241
Fax.: +31-(0)24 3541435
mailto:snn at mbfys.kun.nl

Best regards,

Bert Kappen

From ecai2000 at mcculloch.ing.unifi.it Wed Jan 12 10:34:35 2000
From: ecai2000 at mcculloch.ing.unifi.it (Paolo Frasconi)
Date: Wed, 12 Jan 2000 16:34:35 +0100
Subject: ECAI-2000 Workshop: Connectionist-symbolic integration
Message-ID: <000a01bf5d12$87b69e70$6e0fd996@dsi.unifi.it>

CALL FOR PAPERS AND PARTICIPATION

ECAI-2000 Workshop on
Foundations of connectionist-symbolic integration: representation, paradigms, and algorithms
http://www.dsi.unifi.it/~paolo/ECAI2000

Humboldt University, Berlin (Germany)
August 21, 2000

BACKGROUND

In recent years much attention has been paid to the integration of connectionist systems with symbol based techniques. Whereas such an approach has clear advantages, it also encounters serious difficulties and challenges. Various models and ideas have been proposed to address various problems and aspects in this integration.
The few unified approaches that have been proposed are still very limited, showing both the lack of a full understanding of the relevant aspects of this new discipline and the broad complexity in scope and tasks. In this workshop, we aim at fostering a deep discussion about at least three topics that we believe to be fundamental for the development of a successful theory of Connectionist-Symbolic Integration: representation, paradigms, and algorithms. Concerning representation, it is fully recognized that structured representations are ubiquitous in different fields such as knowledge representation, language modeling and pattern recognition. The interest in developing connectionist architectures capable of dealing with these rich representations (as opposed to "flat" or vector-based representations) can be traced back to the end of the 80's. Today, after more than ten years since the explosion of interest in connectionism, research in architectures and algorithms for learning structured representations still has a lot to explore and no definitive answers have emerged. Different integration paradigms have also been proposed: these are the unified and the hybrid approaches to integration. Whereas the purely connectionist ("connectionist-to-the-top") approach claims that complex symbol processing functionalities can be achieved via neural networks alone, the hybrid approach is premised on the complementarity of the two paradigms and aims at their synergistic combination in systems comprising both neural and symbolic components. In fact, these trends can be viewed as two ends of an entire spectrum. Topics of interest include: - Algorithms for extraction, injection and refinement of structured knowledge from, into and by neural networks. - Inductive discovery/formation of structured knowledge. - Classification, recognition, prediction, matching and manipulation of structured information. - Neural models to infer hierarchical categories. - Applications of hybrid symbolic-connectionist models to real world problems. All these topics are usually investigated and probed independently from each other and making use of different assumptions and techniques. The organizers believe it is necessary to enforce a higher level of cross-interaction among these issues, making use of all the computational tools we have available, such as deterministic and probabilistic approaches, event-based modeling, computational logic, computational learning theory, and so on. Moreover, special attention will be given to applications domains, with the aim to devise a taxonomy that may be useful to the selection of the most suited integration paradigms and techniques to be used. We hope, also, to be able to discuss some application cases where to verify the basic ideas emerged in the literature and in the workshop's discussion itself. PARTICIPATION Participation in the workshop is open to all members of the AI community. Participants are expected to register for the main ECAI-2000 conference (please see http://www.ecai2000.hu-berlin.de for details). The number of participants is limited. The workshop will feature invited talks, contributed presentations, and open discussion. Submitted papers will be reviewed by at least two referees. Articles reporting work in progress are encouraged. However, papers should be original and not already submitted for publication. All submissions should be sent to the organizers by e-mail, in PostScript or PDF format, to the address ecai2000 at mcculloch.ing.unifi.it. 
Common compression utilities (such as gzip, compress, or winzip) can be used. Submitted papers should not exceed 10 pages. Other researchers interested in attending the workshop without contributing a paper should send a position paper of 1-2 pages describing their interest in the mentioned topics. IMPORTANT DATES Submission Deadline: March 31, 2000 Submission Notification: May 15, 2000 Final Submission Due: June 10, 2000 Workshop Held: August 21, 2000 WORKSHOP ORGANIZERS: Paolo Frasconi, University of Florence, Italy (paolo at dsi.unifi.it) Marco Gori, University of Siena, Italy, (marco at ing.unisi.it) Franz Kurfess, Concordia University, Canada (franz at cs.concordia.ca) Alessandro Sperduti, University of Pisa, Italy (perso at di.unipi.it) From renner at ecst.csuchico.edu Wed Jan 12 13:36:38 2000 From: renner at ecst.csuchico.edu (Renee Renner) Date: Wed, 12 Jan 2000 10:36:38 -0800 (PST) Subject: IC-AI 2000 CFP Message-ID: *** apologies to recipients of multiple lists *** C A L L F O R P A P E R S ============================= Neural Network Subsystems A SPECIAL SESSION OF ==================== The 2000 International Conference on Artificial Intelligence (IC-AI'2000) June 26 - 29, 2000 Monte Carlo Resort, Las Vegas, Nevada, USA SESSION CHAIR: R.S. Renner renner at ecst.csuchico.edu SESSION INFORMATION: ************************************************************************** Artificial neural networks are increasingly being incorporated as components or subsystems of higher-order systems. These systems may represent ensembles of neural networks, fuzzy-neural systems, neural-genetic systems, or other such hybrid intelligent systems intended for classification, prediction, model selection, analysis, data mining or control. This session is open to neural network architectures, algorithms, applications, and tools that contribute to or provide a framework for interfacing ANNs with larger intelligent systems. *************************************************************************** GENERAL INFORMATION: The IC-AI'2000 will be held simultaneously (same location and dates) with The International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA) and The International Conference on Imaging Science, Systems, and Technology (CISST). (A link to the IC-AI'2000 official web site is available from: http://www.ashland.edu/~iajwa/Conferences/index.html) IMPORTANT DATES: February 28, 2000 (Monday): Draft papers (about 4 pages) due April 3, 2000 (Monday): Notification of acceptance May 1, 2000 (Monday): Camera-Ready papers & Prereg. due June 26 - 29, 2000: IC-AI'2000 Conference All accepted papers are expected to be presented at the conference. SCOPE: Topics of interest for other sessions include, but are not limited to, the following: O. Intelligent Information Systems O. Intelligent Software Engineering O. Intelligent Agents O. Intelligent Networks O. Intelligent Databases O. Brain Models O. Evolutionary Algorithms O. Data mining O. Machine Learning O. Reasoning Strategies O. Automated Problem Solving O. Distributed AI Algorithms and Techniques O. Distributed AI Systems and Architectures O. Expert Systems O. Fuzzy Logic O. Genetic Algorithms O. Heuristic Searching O. Knowledge Acquisition O. Knowledge Discovery O. Knowledge Representation O. Knowledge-Intensive Problem Solving Techniques O. Languages and Programming Techniques for AI O. Software Tools for AI O. Natural Language Processing O. Neural Networks and Applications O. 
Multisource Information Fusion: Theory and Applications O. Multisource-Multisensor Data Fusion O. Learning and Adaptive Sensor Fusion O. Multisensor Data Fusion Using Neural and Fuzzy Techniques O. Integration of AI with other Technologies O. Evaluation of AI Tools O. Evolutionary Computation O. Social Impact of AI O. Applications - Computer Vision O. Applications - Signal Processing O. Applications - Military O. Applications - Surveillance O. Applications - Robotics O. Applications - Medicine O. Applications - Pattern Recognition O. Applications - Face Recognition O. Applications - Finger Print Recognition O. Applications - Finance and Marketing O. Applications - Stock Market O. Applications - Education O. Emerging Applications SUBMISSION OF PAPERS: Prospective authors are invited to submit three copies of their draft paper (about 4 pages) to the session chair, Dr. R.S. Renner, by the due date: R.S. Renner California State University, Chico Department of Computer Science Chico, CA 95929-0410, U.S.A. Tel: (530) 898-5419 Fax: (530) 898-5995 E-mail: renner at ecst.csuchico.edu E-mail and Fax submissions are also acceptable. The length of the Camera-Ready papers (if accepted) will be limited to 7 pages. Papers must not have been previously published or currently submitted for publication elsewhere. The first page of the draft paper should include: title of the paper, name, affiliation, postal address, E-mail address, telephone number, and Fax number for each author. The first page should also include the name of the author who will be presenting the paper (if accepted) and a maximum of 5 keywords. EVALUATION PROCESS: Papers will be evaluated for originality, significance, clarity, and soundness. Each paper will be refereed by two researchers in the topical area. The Camera-Ready papers will be reviewed by one person. PUBLICATION: The conference proceedings will be published by CSREA Press (ISBN). The proceedings will be available at the conference. Some accepted papers will also be considered for journal publication (soon after the conference). ORGANIZERS/SPONSORS: A number of university faculty members and their staff in cooperation with the Monte Carlo Resort (Conference Division, Las Vegas), will be organizing the conference. The conference will be sponsored by Computer Science Research, Education, & Applications Press (CSREA: USA Federal EIN # 58-2171953) in cooperation with research centers, international associations, international research groups, and developers of high-performance machines and systems. The complete list of sponsors and co-sponsors will be available at a later time. (Last conference's sponsors included: CSREA, the National Supercomputing Center for Energy and the Environment - DOE, The International Association for Mathematics and Computers in Simulation, The International Technology Institute (ITI), The Java High Performance Computing research group, World Scientific and Engineering Society, Sundance Digital Signal Processing Inc., the Computer Vision Research and Applications Tech., ...) LOCATION OF CONFERENCE: The conference will be held in the Monte Carlo Resort hotel, Las Vegas, Nevada, USA. This is a mega hotel with excellent conference facilities and over 3000 rooms. The hotel is minutes from the Las Vegas airport with free shuttles to and from the airport.
This hotel has many vacation and recreational attractions, including: waterfalls, casino, spa, pools & kiddie pools, sunning decks, Easy River water ride, wave pool with cascades, lighted tennis courts, health spa (with workout equipment, whirlpool, sauna, ...), arcade virtual reality game rooms, nightly shows, snack bars, a number of restaurants, shopping area, bars, ... Many of these attractions are open 24 hours a day and most are suitable for families and children. The hotel's negotiated room rate for conference attendees is very reasonable (79 USD + tax) per night (no extra charge for double occupancy) for the duration of the conference. The hotel is within walking distance from most other Las Vegas attractions (major shopping areas, recreational destinations, fine dining and night clubs, free street shows, ...). For the benefit of our international colleagues: the state of Nevada neighbors the states of California, Oregon, Idaho, Utah, and Arizona. Las Vegas is only a few driving hours away from other major cities, including: Los Angeles, San Diego, Phoenix, Grand Canyon, ...

EXHIBITION: An exhibition is planned for the duration of the conference. We have reserved 20+ exhibit spaces. Interested parties should contact H. R. Arabnia (address is given below). All exhibitors will be considered to be the co-sponsors of the conference.

SESSION CONTACT:
Renee S. Renner
California State University, Chico
Department of Computer Science
Chico, CA 95929-0410, U.S.A.
Tel: (530) 898-5419
Fax: (530) 898-5995
E-mail: renner at ecst.csuchico.edu

CONFERENCE CONTACT:
Hamid R. Arabnia
The University of Georgia
Department of Computer Science
415 Graduate Studies Research Center
Athens, Georgia 30602-7404, U.S.A.
Tel: (706) 542-3480
Fax: (706) 542-2966
E-mail: hra at cs.uga.edu

***********************************************************************

From ted.carnevale at yale.edu Wed Jan 12 16:08:44 2000
From: ted.carnevale at yale.edu (Ted Carnevale)
Date: Wed, 12 Jan 2000 16:08:44 -0500
Subject: Two new papers
Message-ID: <387CED5C.4728CDFD@yale.edu>

The following two articles may be of interest to those who are interested in synaptic integration, either from the theoretical or experimental standpoint, and/or empirically-based modeling of neurons.

The first paper will be most relevant to those who are interested in modeling the roles of biophysical mechanisms and use-dependent plasticity in the operation of individual neurons and neural networks.

Carnevale, N.T., and Hines, M.L. Expanding NEURON's repertoire of mechanisms with NMODL. Neural Computation 12:839-851, 2000

This is an "executive summary"; for those who need to know the details, a preprint of this paper that includes many more figures, examples, and an extensive index is available from our WWW site at http://www.neuron.yale.edu/neuron/papers/nc99/nmodl.htm

The second paper shows that many classes of neurons (especially nonpyramidal cells in vertebrate CNS) are fundamentally similar to the processing elements of artificial neural nets, in the sense that synaptic impact at the spike trigger zone is determined by the properties of the synapse itself, and not by the anatomical location of the synapse. It also challenges the widely-held notion that active currents are necessary to overcome location-dependent attenuation of synaptic inputs (with one important exception, as noted below).

Passive normalization of synaptic integration influenced by dendritic architecture. David B. Jaffe and Nicholas T. Carnevale J. Neurophysiol. 1999 82(6): p. 3268-3285

If you or your institution subscribes to Journal of Neurophysiology, you can get their PDF file of the article from a link at http://jn.physiology.org/cgi/content/abstract/82/6/3268 Otherwise, you may pick up a preprint from http://www.neuron.yale.edu/neuron/papers/jnp99/pasnorm.pdf

The two most significant findings reported in this paper are:

1. The peak amplitude of individual PSPs as a function of synaptic location is best predicted by the spatial profile of transfer impedance (Zc), rather than the more commonly studied somatopetal voltage transfer ratio (Vsoma/Vsynapse).

2. Active currents are generally NOT necessary to overcome location-dependent attenuation of PSP amplitudes in real neurons. Dendritic fields that are organized around a central or "primary" dendrite were the only exception to this rule.

In other words, peak PSP amplitude observed at the spike trigger zone is very nearly as large as at the synaptic location, and shows little variation with synaptic location in cells such as interneurons, granule cells of the dentate gyrus, and CA3 pyramidal neurons, even when active currents are NOT present. This also applies to synapses onto the basilar branches of CA1 pyramidal cells and deep neocortical pyramids. Since this reduction of location-dependent PSP amplitude variance does not require active currents, we call this phenomenon "passive synaptic normalization."

As noted above, passive synaptic normalization does not occur in dendritic fields that have terminal branches organized around a central or "primary" dendrite, e.g. the apical dendrites of CA1 and deep neocortical pyramidal cells. In subsequent work that we presented at the most recent meeting of the Neuroscience Society (Carnevale, N.T. and Jaffe, D.B.: Dendritic architecture can compensate for synaptic location without active currents: the importance of input and transfer impedance for synaptic integration. Neuroscience Society Abstracts 25:1741, 1999), we found that the lack of passive synaptic normalization in such dendritic fields is due to the loading effect of terminal branches, which tend to flatten the spatial profile of input impedance along the primary dendrite.

--Ted

From ken at phy.ucsf.EDU Wed Jan 12 18:24:15 2000
From: ken at phy.ucsf.EDU (Ken Miller)
Date: Wed, 12 Jan 2000 15:24:15 -0800 (PST)
Subject: UCSF Postdoctoral Fellowships in Theoretical Neurobiology: 2nd Notice
Message-ID: <14461.3359.538556.812685@django.ucsf.edu>

The Sloan Center for Theoretical Neurobiology at UCSF solicits applications for post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in mathematics, theoretical or experimental physics, or computer science, and commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required.

Applications for postdoctoral fellowships are due Feb. 1, 2000. We also offer predoctoral training, but the application deadline for this year has passed.

FOR FULL INFORMATION, PLEASE SEE: http://www.sloan.ucsf.edu/sloan/sloan-info.html

In particular, we have recently added to our web site a description of a set of sample projects of particular current interest to the Sloan faculty, to give a more concrete idea of our work to those contemplating entering neuroscience from another field.

PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR EMAIL sloan-info at phy.ucsf.edu.
From erik at bbf.uia.ac.be Thu Jan 13 09:11:22 2000 From: erik at bbf.uia.ac.be (Erik De Schutter) Date: Thu, 13 Jan 2000 15:11:22 +0100 Subject: CNS 2000 Call For Papers Message-ID: CALL FOR PAPERS Ninth Annual Computational Neuroscience Meeting CNS*2000 July 16-20, 2000 Brugge, Belgium http://cns.numedeon.com/cns2000 DEADLINE FOR SUMMARIES AND ABSTRACTS: **>> 11:59 pm January 26, 2000 <<<<** This is the ninth annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. THIS YEAR'S MEETING The meeting in 2000 will take place for the first time in Europe, in Brugge, Belgium from the 16th to the 20th of July. The meeting will officially start at 9 am, Sunday, July 16th and end with the annual banquet on Thursday evening, July 20th. There will be no parallel sessions. The meeting will include time for informal workshops organized both before and during the meeting. The meeting will be held at the Congress Centre Old Saint-John in Brugge (http://www.brugge.be/toerisme/en/meetinge.htm). Brugge is known in some circles as "The Venice of the North" and is an old and beautiful city with a modern conference center and easy access to international travel connections. Housing accommodations will be available in numerous nearby hotels. SUBMISSION INSTRUCTIONS With this announcement we solicit paper submissions to the meeting. Papers can include experimental, model-based, as well as more abstract theoretical approaches to understanding neurobiological computation. We especially encourage papers that mix experimental and theoretical studies. We also accept papers that describe new technical approaches to theoretical and experimental issues in computational neuroscience. Papers for the meeting should be submitted electronically using a custom designed JAVA/HTML interface found at the meeting web site: http://cns.numedeon.com/cns2000. Authors must submit two descriptions of completed work. First, a 100 word abstract must be provided that succinctly describes the research results. This abstract is published in the conference program as well as on the meeting web site. Authors must also submit a 1000 word description of their research. This description is in the review process and should clearly state the objectives and context for the work as well as the results and its significance. Information on all authors must be entered. Submissions will not be considered if they lack any of the required information or if they arrive late. All submissions will be acknowledged immediately by email. It is important to note that this notice, as well as all other communication related to the paper will be sent to the designated correspondence author only. Full instructions for submission can be found at the meeting web site: http://cns.numedeon.com/cns2000. THE REVIEW PROCESS All papers submitted to CNS are peer reviewed. Because the meeting this year will be held in Europe, we have accelerated the process of paper acceptance to allow more time to make travel plans. For this reason, the review process will take place in two rounds. In the first papers will be judged and accepted for the meeting based on the clarity with which the work is described and the biological relevance of the research. 
For this reason authors should be careful to make the connection to biology clear in both the 100 word abstract and the 1000 word research summary. We expect to notify authors of meeting acceptance by the second week of February. The second stage of review will take place in March and involves evaluation of each submission by two referees. The primary objective of this round of review will be to select papers for oral presentation. All accepted papers not selected for oral talks as well as papers explicitly submitted as poster presentations will be included in one of three evening poster sessions. Authors will be notified of the presentation format of their papers no later than the second week of May, 2000. CONFERENCE PROCEEDINGS All research accepted and presented at the CNS meeting is eligible for publication in the CNS proceedings. The proceedings volume is published each year as a special supplement to the journal 'Neurocomputing'. In addition the proceedings are published in a hard bound edition by Elsevier Press. 6 page proceedings papers are submitted in October following the meeting. For reference, papers presented at CNS*98 can be found in volumes 26 and 27 of Neurocomputing published in 1999. STUDENT TRAVEL GRANTS We have made an extra effort this year to raise funds to provide travel grant supplements for students and postdoctoral fellows presenting papers. Also, we will have travels grants available both for Europeans and USA participants. While grants are awarded based on need, we anticipate that any presenting student requiring a travel supplement will be able to receive some assistance. In addition, the program committee has arranged very inexpensive housing for students in Brugge. FURTHER MEETING CORRESPONDENCE Additional questions about this year's meeting or the paper submission process can be sent via email to cns2000 at bbb.caltech.edu or via surface mail to: CNS*2000 Division of Biology 216-76 Caltech Pasadena, CA 91125 CNS*2000 ORGANIZING COMMITTEE: Co-meeting Chair / Logistics - Erik De Schutter, University of Antwerp Co-meeting Chair / Finances and Program - Jim Bower, Caltech Governmental Liaison - Dennis Glanzman, NIMH/NIH Workshop Organizer - Maneesh Sahani, University College, London CNS*2000 PROGRAM COMMITTEE: Avrama Blackwell, George Mason University Anders Lansner, Royal Institute of Technology, Sweden Chip Levy, University of Virginia Ray Glantz, Rice University David Horn, University of Tel Aviv Ranu Jung, University of Kentucky Steven J. Schiff, George Mason University Simon Thorpe, CNRS, Toulouse, France From eric at research.nj.nec.com Thu Jan 13 15:28:20 2000 From: eric at research.nj.nec.com (Eric B. Baum) Date: Thu, 13 Jan 2000 15:28:20 -0500 (EST) Subject: Career Opportunities Message-ID: <14462.13606.868870.801430@borg22.nj.nec.com> The NEC Research Institute (NECI) has immediate openings for outstanding researchers in computer science. Candidates are expected to establish a basic research program of international stature. Ph.D.s are required. NECI currently has programs in theory; machine learning; web computing; computer vision; computational linguistics; operating systems, programming languages and compilers; and parallel architecture. We are primarily seeking applicants who work on machine learning of relevance to web applications, or on systems with relevance to web applications. However, we will also give consideration to exceptional applicants in any of our existing areas of focus. 
The NEC Research Institute, founded ten years ago, has as its mission basic research in Computer Science and Physics underlying future computer and communication technologies. The Institute offers unusual opportunities in that: 1) Members are free to decide on their own basic research directions and projects; 2) Positions include budgets for research, support staff, travel, and equipment; and 3) All results are published in the open literature. Located near Princeton, NJ, NECI has close ties with many outstanding research universities in the area. The Institute's laboratories are state-of-the-art and include several high-end parallel compute servers. For more details about NECI, see http://www.neci.nj.nec.com. Applicants must show documentation of eligibility for employment. NECI is an equal opportunity employer. Full applications should include resumes, copies of selected publications, names of at least three references, and a two-page statement of proposed research directions. Applications will be reviewed beginning February 1, 2000. Please send applications or inquiries to: David L. Waltz VP, Computer Science Research NEC Research Institute 4 Independence Way Princeton, NJ 08540 cs-candidates at research.nj.nec.com From akaysha at kongzi.unm.edu Fri Jan 14 01:30:49 2000 From: akaysha at kongzi.unm.edu (Akaysha Tang) Date: Thu, 13 Jan 2000 23:30:49 -0700 (MST) Subject: No subject Message-ID: From jchsieh at vghtpe.gov.tw Sat Jan 15 20:32:26 2000 From: jchsieh at vghtpe.gov.tw (Jen-Chuen Hsieh) Date: Sun, 16 Jan 2000 09:32:26 +0800 Subject: Faculty and Post-doc positions available Message-ID: <006b01bf5fc1$92bab660$c43efea9@fmrilab> Faculty and Post-Doc Position(s) Available in Taiwan (National Yang-Ming University and Taipei Veterans General Hospital) on fMRI/Magnetoencephalography, Advanced Signal Processing, and Human Brain Science (preferably cognitive neuropsychology). Wanted: Cognitive neuroscientists, cognitive neuropsychologists, programmers, computer scientists, and physicists to join our growing Human Brain Research Group. The National Yang-Ming University and Taipei Veterans General Hospital of Taiwan have on campus an encompassing setup of PET, 3T-MRI, whole-head MEG, ERP and TMS for brain research. With my best regards! JC -------------------------------------------------- Jen-Chuen Hsieh, MD, PhD Associate Professor & Project Coordinator Integrated Brain Research Unit Taipei Veterans General Hospital; Institute of Neuroscience, School of Life Science Department of Medicine, School of Medicine National Yang-Ming University No.201, Sect.2, Shih-Pai Rd.
112, Taipei Taiwan email: jchsieh at vghtpe.gov.tw tel: (886)-2-28757480 fax: (886)-2-28757612 From terry at salk.edu Mon Jan 17 06:23:12 2000 From: terry at salk.edu (terry@salk.edu) Date: Mon, 17 Jan 2000 03:23:12 -0800 (PST) Subject: Telluride Workshop 2000 Message-ID: <200001171123.DAA15542@hebb.salk.edu> NEUROMORPHIC ENGINEERING WORKSHOP Sunday, JUNE 25 - Saturday, JULY 15, 2000 TELLURIDE, COLORADO ------------------------------------------------------------------------ Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Timmer HORIUCHI (Johns Hopkins University) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) ------------------------------------------------------------------------ We invite applications for a three week summer workshop that will be held in Telluride, Colorado from Sunday, June 26 to Sunday, July 15, 2000. The application deadline is Friday, March 3, and application instructions are described at the bottom of this document. The 1999 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, NASA, the Office of Naval Research, and by the Center for Neuromorphic Systems Engineering at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available here. We strongly encourage interested parties to browse through the previous workshop web pages. GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on active participation, with demonstration systems and hands-on-experience for all participants. Neuromorphic engineering has a wide range of applications from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware (with an emphasis on analog and asynchronous digital VLSI), are inspired by biological systems. However, existing applications are modest and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. The assumption underlying this three week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems. FORMAT: The three week summer workshop will include background lectures on systems neuroscience (in particular learning, oculo-motor and other motor systems and attention), practical tutorials on analog VLSI design, small mobile robots (Koalas and Kheperas), hands-on projects, and special interest groups. 
Participants are required to take part and possibly complete at least one of the projects proposed (soon to be defined). They are furthermore encouraged to become involved in as many of the other activities proposed as interest and time allow. There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials, rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons, and after dinner. The analog VLSI practical tutorials will cover all aspects of analog VLSI design, simulation, layout, and testing during the three weeks of the workshop. The first week covers basics of transistors, simple circuit design and simulation. This material is intended for participants who have no experience with analog VLSI. The second week will focus on design frames for silicon retinas, from the silicon compilation and layout of on-chip video scanners, to building the peripheral boards necessary for interfacing analog VLSI retinas to video output monitors. Retina chips will be provided. The third week will feature sessions on floating gates, including lectures on the physics of tunneling and injection, and on inter-chip communication systems. We will also feature a tutorial on the use of small, mobile robots, focussing on Koala's, as an ideal platform for vision, auditory and sensory-motor circuits. Projects that are carried out during the workshop will be centered in a number of groups, including * active vision * audition * olfaction * motor control * central pattern generator * robotics, multichip communication * analog VLSI * learning The active perception project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include a robot head active vision system consisting of a three degree-of-freedom binocular camera system that is fully programmable. The central pattern generator group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPG's and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots. The robotics group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning. The audition group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform. The multichip communication project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed. 
LOCATION AND ARRANGEMENTS: The workshop will take place at the new Telluride Public High School (and not at the Elementary School which is being renovated this year) located in the small town of Telluride, 9000 feet high in Southwest Colorado, about 6 hours drive away from Denver (350 miles). Continental and United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). Internet access will be provided. Technical staff present throughout the workshops will assist with software and hardware issues. We will have a network of workstations running UNIX, MACs and PCs running LINUX and Microsoft Windows. Unless otherwise arranged with one of the organizers, we expect participants to stay for the entire duration of this three week workshop. FINANCIAL ARRANGEMENT: We have several funding requests pending to pay for most of the costs associated with this workshop. As in 1999, after notification of acceptances have been mailed out around April 1. 2000, participants are expected to pay a $275.- workshop fee. In case of real hardship, this can be waived. Shared condominiums will be provided for all academic participants at no cost to them. We expect participant from National Laboratories and Industry to pay for these modestly priced condominiums. We expect to have funds to reimburse student participants for travel (up to $500 for US domestic travel and up to $800 for overseas travel). Please specify on the application whether such financial help is needed. HOW TO APPLY: The deadline for receipt of applications is March 3. 2000. Applicants should be at the level of graduate students or above (i.e. postdoctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply. Application should include: * First name, Last name, valid email address. * Curriculum Vitae. * One page summary of background and interests relevant to the workshop. * Description of special equipment needed for demonstrations that could be brought to the workshop. * Two letters of recommendation Complete applications should be sent to: Prof. Terrence Sejnowski The Salk Institute 10010 North Torrey Pines Road San Diego, CA 92037 email: terry at salk.edu FAX: (619) 587 0417 Applicants will be notified by email around March 31. 
1999 From vera at cs.cas.cz Mon Jan 17 17:30:35 2000 From: vera at cs.cas.cz (Vera Kurkova) Date: Mon, 17 Jan 00 17:30:35 CET Subject: Call for papers NNW'2000 Message-ID: <63036.vera@uivt1.uivt.cas.cz> Call for papers and participation: NNW 2000 10th Anniversary International Conference on Artificial Neural Networks and Intelligent Systems Prague, Czech Republic, July 9-12, 2000 Purpose of the conference The main focus of NNW2000 is the development and application of computational paradigms inspired by natural processes, namely artificial neural networks, evolutionary algorithms, and related subjects including adaptive agents, artificial life, soft computing, etc. The conference takes place in the year of the 10th anniversary of founding the Neural Network World international scientific journal. It is jointly organized by Institute of Computer Science, Academy of Sciences of the Czech Republic, Neural Network World Editorial Board and Action M Agency. Important Dates Submission of draft version of paper: February 15, 2000 Notification of acceptance: April 15, 2000 Delivery of revised papers: May 30, 2000 NNW2000 conference: July 9-12, 2000 Conference topics The following list indicates the areas of interest, but it is not exhaustive: * Neural Networks: Architectures, Algorithms, Approximation, Complexity, Biological Foundations * Evolutionary Computation: Genetic Algorithms, Genetic Programming, Classifier Systems, Artificial Life * Hybrid Systems: Fuzzy Logic, Soft Computing, Neuro-Fuzzy Controller, Genetic Learning of Neural Networks * Adaptive Agents: Models and Architectures, Distribution and Cooperation, AI Agents, Software Agents, Complex Adaptive Systems * Applications: Pattern Recognition, Signal Processing, Simulation, Hardware Implementation, Robotics Call for papers Prospective authors are invited to submit a draft version of paper describing original results for review by an international Program Committee. The paper should be written in English and should not exceed 8 pages. Proposals for tutorials, special sessions, and workshops are also invited. Electronic submission of paper is preferred. It is possible via the conference web site or by email to nnw2000 at uivt.cas.cz Alternatively, three printed copies can be sent to the following address: Roman Neruda Institute of Computer Science Academy of Sciences of the Czech Republic PO Box 5 18207 Prague Czech Republic Accepted papers will be published in the special issue of the Neural Network World journal and available at the conference. Submitting the final version of papers will follow the standard Neural Network World instructions for authors: LaTeX versions (standard article document class) with Encapsulated Postscript figures are preferred, alternative formats (such as MS Word) are possible. 
Further information * Updated information is available at the conference web site: http://www.cs.cas.cz/nnw2000 * Contact the organizers by email: nnw2000 at cs.cas.cz From barba at cvs.rochester.edu Mon Jan 17 12:13:07 2000 From: barba at cvs.rochester.edu (Barbara Arnold) Date: Mon, 17 Jan 2000 13:13:07 -0400 Subject: 22nd CVS Symposium 2000 (Center for Visual Science) Message-ID: 22nd CVS Symposium 2000 NEURAL CODING June 1 - 3, 2000 For more information or an application, see our website: www.cvs.rochester.edu or contact Barbara Arnold at 716-275-8659 or barba at cvs.rochester.edu One of the fundamental difficulties in understanding the neural basis of perception/cognition is understanding the computational or informational significance of neural activity. This is true at all levels: from individual synapses and neurons, to local circuits and large-scale organization. The enormous complexity of the brain and the behavior it generates demands more sophisticated development of theories of neural coding and communication on a large scale. In the tradition of past CVS Symposia, our goal is to bring recent developments in this fundamentally important topic to a broader audience than that captured by more specialized meetings. We have designed the symposium to bring together leading scientists with diverse perspectives to provide an opportunity for cross-fertilization and interaction that is not usually available. PROGRAM FOR THE MEETING Thursday, June 1 I. Information Coding in Spike Trains II. Early Circuits Friday, June 2 III. Coding Experience: development and plasticity IV. Functional specialization and distributed codes Saturday, June 3 V. Large Scale information flow. SPEAKERS FOR MEETING MOSHE ABELES, Hebrew University DANA BALLARD, University of Rochester KEN BRITTEN, UC Davis CAROL COLBY, University of Pittsburgh MAURIZIO CORBETTA, Washington University DAVID J. FIELD, Cornell University ZACHARY F. MAINEN, Cold Spring Harbor Lab RAFAEL MARCOS YUSTE, Columbia University KEN MILLER, University of California-San Francisco R.CLAY REID, Harvard Medical School TERRENCE J. SEJNOWSKI, Salk Institute for Biological Studies JEFFREY D. SCHALL, Vanderbilt University ROBERT SHAPLEY, New York University ADAM SILLITO, University College London MICHAEL WELIKY, University of Rochester ANTHONY M. ZADOR, Salk Institute ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Barbara N. Arnold Administrator email: barba at cvs.rochester.edu Center for Visual Science phone: 716 275 8659 University of Rochester fax: 716 271 3043 Meliora Hall 274 Rochester NY 14627-0270 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From amari at brain.riken.go.jp Tue Jan 18 00:09:35 2000 From: amari at brain.riken.go.jp (Shunichi Amari) Date: Tue, 18 Jan 2000 14:09:35 +0900 Subject: RIKEN Summer School Message-ID: <20000118140935P.amari@brain.riken.go.jp> RIKEN Brain Science Institute will organize Summer Program 2000 The Brain Science Institute (BSI) at RIKEN is offering a summer program to train advanced students interested in brain function. Applicants may choose either a laboratory internship for two months with one of the 30 Laboratories at BSI, or participate in an intensive two-week lecture course featuring a distinguished international faculty. Summer Interns (Plan A) also enroll in the Lecture Course (Plan B). Travel and lodging expenses will be supported. 
Deadline: March 31, 2000 INTERNSHIP (PLAN A) July 4 - Sept 1 (Laboratories at RIKEN BSI) - Neuronal Function Research Group (K.Mori/Y.Yoshihara/R.Yano/T.K.Hensch) - Neuronal Circuit Mechanisms Research Group (M.Ito/H.Niki/T.Knopfel) - Cognitive Brain Science Group (K.Tanaka/M.Tanifuji/A.A.Ioannides) - Developmental Brain Science Group (K.Mikoshiba/H.Okamoto/T.Furuichi/K.Kajiwara) - Molecular Neuropathology Group (N.Nukina/K.Yamakawa/R.Takahashi/T.Okamoto) - Aging and Psychiatric Research Group (T.C.Saido/A.Takashima/T.Yoshikawa) - Brainway Group (G.Matsumoto/M.Ichikawa) - Brain-Style Information Systems Research Group (S.Amari/S.Tanaka/A.J.Cichocki) - Advanced Technology Development Center (ATDC) (C.Itakura/T.Hashikawa/S.Itohara/A.Miyawaki/M.Ogawa) LECTURE COURSE (PLAN B) Topic: 'How the Brain Works: Experimental and Theoretical Approaches' July 4 - July 15 The purpose of the course is to provide basic concepts necessary for understanding computation in the brain at different levels from synapses to systems, and from both experimental and theoretical perspectives. Individual lecturers will provide basics of their field and advanced topics. (Lecturers) Tomoyuki Takahashi (University of Tokyo) Anthony M. Zador (Cold Spring Harbor Laboratory) Idan Segev (The Hebrew University) Bruce L. McNaughton (University of Arizona) Kensaku Mori (RIKEN/University of Tokyo) Ad Aertsen (Albert-Ludwigs-University) Shun-ichi Amari (RIKEN) Kathleen S. Rockland (University of Iowa) Masakazu Konishi (California Institute of Technology) Mitsuo Kawato (ATR Human Info. Proc. Res. Labs.) Earl K. Miller (RIKEN/Massachusetts Institute of Technology) Nancy G. Kanwisher (Massachusetts Institute of Technology) Shimon Ullman (The Weizmann Institute of Science) Keiji Tanaka (RIKEN) Okihide Hikosaka (Juntendo University) Jun Tanji (Tohoku University) Charles Jennings (Nature Neuroscience, Editor) others to be included FURTHER INFORMATION Application forms: visit our web site http://summer.brain.riken.go.jp or send inquiries to Summer Program Organizing Committee, BSI, RIKEN 2-1 Hirosawa, Wako-shi, Saitama 351-0198, JAPAN E-mail: info at summer.brain.riken.go.jp Fax: +81-48-462-4914 From priel at math.tau.ac.il Tue Jan 18 05:37:09 2000 From: priel at math.tau.ac.il (avner priel) Date: Tue, 18 Jan 2000 12:37:09 +0200 (GMT+0200) Subject: PhD Thesis Message-ID: Dear Connectionists, My PhD thesis is now available from the following URL: http://www.math.tau.ac.il/~priel/papers.html Below please find the abstract. Best wishes, Priel Avner. ----------------------------------------------------- Priel Avner < priel at math.tau.ac.il > < http://www.math.tau.ac.il/~priel > School of Mathematical Sciences, Tel-Aviv University, Israel. ----------------------------------------------------------------------- ----------------------------------------------------------------------- "Dynamic and Static Properties of Neural Networks with FeedBack" Ph.D. Thesis Avner Priel Department of Physics Bar-Ilan University, Israel. ABSTRACT: This thesis describes an analytical and numerical study of time series generated by a special type of recurrent neural network, a continuous-valued feed-forward network in which the next input vector is determined from past output values.
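For readers who want a concrete picture of this class of model, here is a minimal sketch of such a generator in Python/NumPy; the delay-line length, layer sizes, weights and transfer functions are arbitrary illustrative choices and are not taken from the thesis.

import numpy as np

rng = np.random.default_rng(0)
N, H = 20, 5                                   # delay-line length, hidden units (arbitrary)
W_in = rng.normal(0.0, 1.0 / np.sqrt(N), size=(H, N))   # input -> hidden weights
w_out = rng.normal(0.0, 1.0 / np.sqrt(H), size=H)       # hidden -> output weights

def step(x):
    # One update: map the current delay line to a scalar output,
    # then shift that output back in as the newest input component.
    s = np.tanh(w_out @ np.tanh(W_in @ x))
    x_next = np.roll(x, 1)
    x_next[0] = s
    return x_next, s

x = rng.uniform(-1.0, 1.0, size=N)             # arbitrary initial condition
series = []
for t in range(1000):
    x, s = step(x)
    series.append(s)
print(series[-5:])                             # tail of the generated time series

After a transient, the generated series settles onto an attractor whose character (fixed point, quasi-periodic orbit, or chaos) depends on the weights and transfer function, which is the regime structure analysed in the thesis.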
The topics covered in this work include the analysis of the sequences generated by the network in the stable and unstable regimes of the parameter space, the effect of an additive noise on the long-term properties of the network and the ability of the model to capture the rule of a long-range correlated sequence. The asymptotic solutions of the sequences generated by the model in the stable regime are found analytically for various architectures, transfer functions and choice of the weights. We find that the generic solution is a quasi-periodic attractor (excluding the cases where the solution is a fixed point). We find a hierarchy among the complexity of time series generated by different architectures; more hidden units can generate higher dimensional attractors. The relaxation time from an arbitrary initial condition to the vicinity of the asymptotic attractor is studied for the case of a perceptron and a two-layered perceptron. In both cases, the relaxation time scales linearly with the size of the network. Although networks with monotonic, as well as non-monotonic, transfer functions are capable of generating chaotic sequences, the unstable regions of the parameter space exhibit different features. Non-monotonic functions can produce robust chaos, whereas monotonic functions generate fragile chaos only. In the case of non-monotonic functions, the number of positive Lyapunov exponents increases as a function of one of the free parameters in the model; hence, high dimensional chaotic attractors can be generated. We study also a combination of monotonic and non-monotonic functions. The stability of the asymptotic results obtained for the model is tested by analysing the effect of an additive noise introduced in the output of the network. A single attractor in the presence of noise is broadened. The phase of a noisy model diffuses w.r.t.\ a noise-free model with a diffusion constant which is inversely proportional to the size of the network; hence, phase coherence is maintained for a time length that is proportional to the network's size. When the network has more than a single possible attractor, they become meta-stable in the presence of noise. We study the properties of an important quantity - the mean first passage time, and derive a relation between the size of the network, the distance from the bifurcation point and the mean first passage time to escape from the basin of attraction. The last subject we address concerns the capability of our model to learn the rule of a sequence obeying a power-law correlation function. An ensemble of long sequences is generated and used to train the network. The trained networks are then used to generate a subsequent sequences. It is found that the generated sequences have a similar power-law correlation function as the training sequences. By studying the properties of the trained networks, we conclude that the correlation function of the weight matrix should be dominated by vertical power-law correlations in order to generate long-range correlated sequences. This conclusion is verified by numerical simulations. Analysis of the mean-field approximation of the correlation function leads to the same qualitative conclusion regarding the weight matrix. From kruschke at indiana.edu Wed Jan 19 06:35:15 2000 From: kruschke at indiana.edu (John K. 
Kruschke) Date: Wed, 19 Jan 2000 06:35:15 -0500 (EST) Subject: Postdoctoral Position in Cognitive Modeling Message-ID: POSTDOCTORAL TRAINING FELLOWSHIPS in the COGNITIVE SCIENCE PROGRAM at INDIANA UNIVERSITY in MODELING OF COGNITIVE PROCESSES The Psychology Department and Cognitive Science Program at Indiana University anticipate one or more Postdoctoral Traineeships in the area of Modeling of Cognitive Processes, funded by the National Institutes of Health. Appointments will pay rates appropriate for a new or recent Ph.D. and will be for one or two years, beginning July 1, 2000 or later. Traineeships will be offered to qualified individuals who wish to further their training in mathematical modeling or computer simulation modeling, in any substantive area of cognitive psychology or Cognitive Science. Trainees will be expected to carry out original theoretical and empirical research in association with one or more of these faculty and their laboratories, and to interact with other relevant faculty and other pre- and postdoctoral trainees. In addition, they should plan to take or audit courses offered within the Cognitive Modeling Program. We are particularly interested in applicants with strong mathematical, scientific, and research credentials. Indiana University has superb computational and research facilities, and faculty with outstanding credentials in this area of research, including James Townsend, director of the training program, and Jerome Busemeyer, Robert Nosofsky, John Kruschke, Michael Gasser, Robert Goldstone, Geoffrey Bingham, Tom Busey, Donald Robinson, Robert Port, and Richard Shiffrin. Applicants should send an up-to-date vita, relevant reprints and preprints, a personal letter describing their research interests, background, goals, and career plans, and reference letters from two individuals. Women, minority group members, and handicapped individuals are urged to apply. Deadline for submission of application materials is April 1, 2000, but we encourage earlier applications. PLEASE NOTE: The conditions of our grant restrict all awards to U.S. citizens or current green card holders. Awards also have a 'payback' provision, generally requiring awardees to carry out research or teach (not necessarily at IU) for a minimum period after termination of the traineeship. Cognitive Science information may be obtained at http://www.psych.indiana.edu/ Send Materials to Professor Jerome R. Busemeyer Department of Psychology, Rm 367 Indiana University 1101 E. 10th St. Bloomington, IN 47405-7007 Voice: 812-855-4882 Fax: 812-855-1086 email: jbusemey at indiana.edu Indiana University is an Affirmative Action Employer From gini at elet.polimi.it Wed Jan 19 12:55:10 2000 From: gini at elet.polimi.it (Giuseppina Gini) Date: Wed, 19 Jan 2000 18:55:10 +0100 Subject: Research fellowships Message-ID: The positions offered can be of interest to people in this list. - Giuseppina Gini ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +++++ PRE- AND POSTDOCTORAL POSITIONS AVAILABLE Applications are invited from young researchers with an interest in joining a four-year (2000-2003) EU network project aimed at the development of models to predict toxicity, starting Spring 2000. The network consists of laboratories (see the list below) working in toxicology, computational chemistry and computer science. Grants are available to work in one of the seven laboratories of the Network. 
Grant will be from 1650 to 4320 Euro/month, depending on qualification, experience and location. Pre-doc positions are compatible with PhD studies. Post-doc positions require that the candidate holds (or is near to obtain) a PhD. Interested candidates should send an application (including CV, summary of research interests and preferred location) to: Dr Emilio Benfenati Head, Laboratory of Environmental Chemistry and Toxicology Istituto di Ricerche Farmacologiche "Mario Negri" Via Eritrea 62, 20157 Milano, Italy Tel: +39-02-39014420 Fax: +39-02-39001916 e-mail: benfenati at irfmn.mnegri.it http://www.irfmn.mnegri.it/ambsal/chem-toxi/Benfenati.htm http://www.irfmn.mnegri.it/ambsal/chem-toxi/Default.htm Network Title: Intelligent Modelling Algorithms for the General Evaluation of TOXicities (IMAGETOX) The Co-ordinator: Istituto di Ricerche Farmacologiche "Mario Negri", Laboratory of Environmental Chemistry and Toxicology, Milan, Italy Contact person: Dr Emilio Benfenati (benfenati at irfmn.mnegri.it) The Members - Dr. M. Cronin, Liverpool John Moores University, School of Pharmacy and Chemistry, United Kingdom (m.t.cronin at livjm.ac.uk) - Prof. J. Hermens, Utrecht University, Research Institute of Toxicology, Faculty of Veterinary Sciences, The Netherlands (j.hermens at ritox.vet.uu.nl) - Prof. G. Gini, Politecnico di Milano, Dipartimento di Elettronica e Informazione, Milano, Italy (gini at elet.polimi.it) - Dr. M. Vracko, National Institute of Chemistry, Ljubljana, Slovenia (marjan.vracko at ki.si) - Prof. G. Sch??rmann, UFZ Centre for Environmental Research, Leipzig, Germany (gs at uoe.ufz.de) - Prof. M. Karelson, University of Tartu, Department of Chemistry, Estonia (mati at chem.ut.ee). Conditions for EU grants The researcher must be 35 years old or less at the time of his appointment (allowances are possible for special cases). The researcher must be a holder of a doctoral degree or of a degree in an appropriate subject in Science or Engineering. The appointment will be for a fixed-term. The applicant must be a national of a Member State of the European Community or of an Associated State or have resided in the European Community for at least five years prior to his appointment. The applicant must choose a Centre located in a state different from his national state and he must not have carried out his activities in that state for more than 12 of the 24 months prior to his appointment. Research Program The present project aims to improve the power of models for toxicity prediction. It will take advantage of the recent advances in computer science (such as multivariate analysis, neural networks, expert systems, machine learning, and simulated annealing). Candidates in Computer Science are expected to work on Machine Learning and KDD and to develop theories and systems. These will be applied to the validation (verifying the robustness of reported models and new predictive ones) and development for real world applications. Different models will be developed for (eco)toxicology and for environmental fate prediction. For (eco)toxicology different species will be considered, as well as different mechanisms of toxic action. In environmental fate, partitioning between compartments in the environment and into biota will be studied, as well as degradation. - - - - Giuseppina C. Gini DEI, Politecnico di Milano piazza L. 
da Vinci 32, I-20133 MILANO fax: (+39) 02-2399.3411 phone: (+39) 02-2399.3626 e-mail: gini at elet.polimi.it http://www.elet.polimi.it/people/gini/ http://www.elet.polimi.it/AAAI-PT member http://www.worldses.org From palm at neuro.informatik.uni-ulm.de Thu Jan 20 06:06:24 2000 From: palm at neuro.informatik.uni-ulm.de (Guenther Palm) Date: Thu, 20 Jan 00 12:06:24 +0100 Subject: KES2000-Sessions Message-ID: <10001201106.AA17467@neuro.informatik.uni-ulm.de> Dear connectionists, I am organizing two special sessions for the KES2000 conference. The conference will take place from August 30 to September 1, 2000 in Brighton, Sussex, U.K. More information on the conference can be obtained from the KES2000 Web site: http://luna.bton.ac.uk/~kes2000/ The topics of the sessions are: 1) Processing of hierarchical structures in neural networks. 2) Biomedical applications of neural networks. Accepted session contributions will be published in the conference proceedings by IEEE. The procedure for submissions is as follows: 1) Please send me a short statement concerning the topic of your intended contribution BEFORE FEBRUARY 04, 2000. This statement may contain a title and a short abstract and should not exceed half a page. Based on these statements we may further focus the topics of the two sessions. I will send out a call for papers ON FEBRUARY 11, 2000. 2) Send the camera-ready version of your paper (four A4 pages) BY MARCH 15, 2000. The papers will be reviewed and the results will be communicated to the authors in APRIL, 2000. ------------------------------------------------------------- Guenther Palm Neural Information Processing University of Ulm D-89069 Ulm Germany palm at neuro.informatik.uni-ulm.de From mpessk at guppy.mpe.nus.edu.sg Fri Jan 21 01:52:16 2000 From: mpessk at guppy.mpe.nus.edu.sg (S. Sathiya Keerthi) Date: Fri, 21 Jan 2000 14:52:16 +0800 (SGT) Subject: Tech Report on Convergence of SMO algorithm for SVMs Message-ID: The following Tech Report in gzipped postscript form is available at: http://guppy.mpe.nus.edu.sg/~mpessk/svm/conv1.ps.gz ----------------------------------------------------------------------- Convergence of a Generalized SMO Algorithm for SVM Classifier Design S.S. Keerthi Control Division Dept. of Mechanical and Production Engineering National University of Singapore Tech Rept. CD-00-01 Convergence of a generalized version of the modified SMO algorithms given by Keerthi et al. for SVM classifier design is proved. The convergence results are also extended to modified SMO algorithms for solving $\nu$-SVM classifier problems. ----------------------------------------------------------------------- From Zoubin at gatsby.ucl.ac.uk Thu Jan 20 13:28:17 2000 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Thu, 20 Jan 2000 18:28:17 +0000 (GMT) Subject: Preprints Available Message-ID: <200001201828.SAA21740@cajal.gatsby.ucl.ac.uk> The following 8 preprints from the Gatsby Computational Neuroscience Unit are now available on the web. These papers will appear in the Proceedings of NIPS 99 (Advances in Neural Information Processing Systems 12, edited by S. A. Solla, T. K. Leen, and K.-R. Müller, MIT Press).
Zoubin Ghahramani Gatsby Computational Neuroscience Unit http://www.gatsby.ucl.ac.uk University College London ---------------------------------------------------------------------- Author: Hagai Attias Title: A Variational Bayesian Framework for Graphical Models URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99vb.ps ---------------------------------------------------------------------- Author: Hagai Attias Title: Independent Factor Analysis with Temporally Structured Sources URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99dfa.ps ---------------------------------------------------------------------- Author: Zoubin Ghahramani and Matthew J Beal Title: Variational Inference for Bayesian Mixtures of Factor Analysers URL: http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.ps.gz http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.pdf ---------------------------------------------------------------------- Author: Geoffrey E. Hinton and Andrew D. Brown Title: Spiking Boltzmann Machines URL: http://www.gatsby.ucl.ac.uk/~andy/papers/nips99_sbm.ps.gz ---------------------------------------------------------------------- Author: Geoffrey E. Hinton, Zoubin Ghahramani and Yee Whye Teh Title: Learning to Parse Images URL: http://www.gatsby.ucl.ac.uk/~ywteh/crednets ---------------------------------------------------------------------- Author: Zhaoping Li Title: Can V1 mechanisms account for figure-ground and medial axis effects? URL: http://www.gatsby.ucl.ac.uk/~zhaoping/prints/nips99abstract.html ---------------------------------------------------------------------- Author: Sam Roweis Title: Constrained Hidden Markov Models URL: http://www.gatsby.ucl.ac.uk/~roweis/papers/sohmm.ps.gz ---------------------------------------------------------------------- Author: Brian Sallans Title: Learning Factored Representations for Partially Observable Markov Decision Processes URL: PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps gzip'd PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps.gz PDF: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.pdf ==================================ABSTRACTS: ==================================Author: Hagai Attias Title: A Variational Bayesian Framework for Graphical Models URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99vb.ps ---------------------------------------------------------------------- Author: Hagai Attias Title: Independent Factor Analysis with Temporally Structured Sources URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99dfa.ps ---------------------------------------------------------------------- Authors: Zoubin Ghahramani and Matthew J Beal Title: Variational Inference for Bayesian Mixtures of Factor Analysers Abstract: We present an algorithm that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. This procedure can automatically determine the optimal number of components and the local dimensionality of each component (i.e.\ the number of factors in each factor analyser). Alternatively it can be used to infer posterior distributions over number of components and dimensionalities. Since all parameters are integrated out the method is not prone to overfitting. Using a stochastic procedure for adding components it is possible to perform the variational optimisation incrementally and to avoid local maxima. Results show that the method works very well in practice and correctly infers the number and dimensionality of nontrivial synthetic examples. 
By importance sampling from the variational approximation we show how to obtain unbiased estimates of the true evidence, the exact predictive density, and the KL divergence between the variational posterior and the true posterior, not only in this model but for variational approximations in general. URL: http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.ps.gz http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.pdf ---------------------------------------------------------------------- Authors: Geoffrey E. Hinton and Andrew D. Brown Title: Spiking Boltzmann Machines Abstract: We first show how to represent sharp posterior probability distributions using real valued coefficients on broadly-tuned basis functions. Then we show how the precise times of spikes can be used to convey the real-valued coefficients on the basis functions quickly and accurately. Finally we describe a simple simulation in which spiking neurons learn to model an image sequence by fitting a dynamic generative model. URL: http://www.gatsby.ucl.ac.uk/~andy/papers/nips99_sbm.ps.gz ---------------------------------------------------------------------- Authors: Geoffrey E. Hinton, Zoubin Ghahramani and Yee Whye Teh Title: Learning to Parse Images Abstract: We describe a class of probabilistic models that we call credibility networks. Using parse trees as internal representations of images, credibility networks are able to perform segmentation and recognition simultaneously, removing the need for ad hoc segmentation heuristics. Promising results in the problem of segmenting handwritten digits were obtained. URL: http://www.gatsby.ucl.ac.uk/~ywteh/crednets ---------------------------------------------------------------------- Author: Zhaoping Li Title: Can V1 mechanisms account for figure-ground and medial axis effects? Abstract: When a visual image consists of a figure against a background, V1 cells are physiologically observed to give higher responses to image regions corresponding to the figure relative to their responses to the background. The medial axis of the figure also induces relatively higher responses compared to responses to other locations in the figure (except for the boundary between the figure and the background). Since the receptive fields of V1 cells are very small compared with the global scale of the figure-ground and medial axis effects, it has been suggested that these effects may be caused by feedback from higher visual areas. I show how these effects can be accounted for by V1 mechanisms when the size of the figure is small or is of a certain scale. They are a manifestation of the processes of pre-attentive segmentation which detect and highlight the boundaries between homogeneous image regions. URL: http://www.gatsby.ucl.ac.uk/~zhaoping/prints/nips99abstract.html ---------------------------------------------------------------------- Author: Sam Roweis Title: Constrained Hidden Markov Models Abstract: By thinking of each state in a hidden Markov model as corresponding to some spatial region of a fictitious _topology space_ it is possible to naturally define neighbouring states as those which are connected in that space. The transition matrix can then be constrained to allow transitions only between neighbours; this means that all valid state sequences correspond to connected paths in the topology space. 
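As an illustration of the constraint just described (a sketch only, not Roweis's implementation): with a ring-shaped topology space, the transition matrix can be built by zeroing all entries except those linking a state to itself and its two neighbours; the number of states and the choice of ring topology are arbitrary here.

import numpy as np

K = 10                                   # number of hidden states (arbitrary)
A = np.zeros((K, K))
for i in range(K):
    for j in (i - 1, i, i + 1):          # ring topology: self plus the two neighbours
        A[i, j % K] = 1.0
A /= A.sum(axis=1, keepdims=True)        # row-normalise into a stochastic matrix

# Any state sequence sampled from A is a connected path on the ring.
rng = np.random.default_rng(0)
state, path = 0, [0]
for t in range(20):
    state = rng.choice(K, p=A[state])
    path.append(int(state))
print(path)
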
I show how such _constrained HMMs_ can learn to discover underlying structure in complex sequences of high dimensional data, and apply them to the problem of recovering mouth movements from acoustics in continuous speech. URL: http://www.gatsby.ucl.ac.uk/~roweis/papers/sohmm.ps.gz ---------------------------------------------------------------------- Author: Brian Sallans University of Toronto and Gatsby Unit, UCL sallans at cs.toronto.edu Title: Learning Factored Representations for Partially Observable Markov Decision Processes Abstract: The problem of reinforcement learning in a non-Markov environment is explored using a dynamic Bayesian network, where conditional independence assumptions between random variables are compactly represented by network parameters. The parameters are learned on-line, and approximations are used to perform inference and to compute the optimal value function. The relative effects of inference and value function approximations on the quality of the final policy are investigated, by learning to solve a moderately difficult driving task. The two value function approximations, linear and quadratic, were found to perform similarly, but the quadratic model was more sensitive to initialization. Both performed below the level of human performance on the task. The dynamic Bayesian network performed comparably to a model using a localist hidden state representation, while requiring exponentially fewer parameters. URL: PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps gzip'd PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps.gz PDF: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.pdf ---------------------------------------------------------------------- From munro at lis.pitt.edu Fri Jan 21 17:00:38 2000 From: munro at lis.pitt.edu (Paul Munro) Date: Fri, 21 Jan 2000 17:00:38 -0500 (EST) Subject: Two NIPS papers available Message-ID: The following two papers from our group can be downloaded. The first paper is available only in postscript and the second is in both postscript and acrobat versions. The URLs can be found below with the abstracts. Paul Munro Internet: munro at sis.pitt.edu SIS Bldg 735 Voice: 412-624-9427 Department of Information Science Fax (new #): 412-624-2788 University of Pittsburgh Pittsburgh PA 15260 Personal HTML page = http://www.pitt.edu/~pwm/ (To appear in: Advances in Neural Information Processing Systems 12, edited by S. A. Solla, T. K. Leen, and K.-R. Mueller, MIT Press) Effects of spatial and temporal contiguity on the acquisition of spatial information Thea Ghiselli-Crippa and Paul W. Munro URL: www.pitt.edu/~pwm/nips99a.ps ABSTRACT Spatial information comes in two forms: direct spatial information (for example, retinal position) and indirect temporal contiguity information, since objects encountered sequentially are in general spatially close. The acquisition of spatial information by a neural network is investigated here. Given a spatial layout of several objects, networks are trained on a prediction task. Networks using temporal sequences with no direct spatial information are found to develop internal representations that have distances correlated with distances in the external layout. The influence of spatial information is analyzed by providing direct spatial information to the system during training that is either consistent with the layout or inconsistent with it. This approach allows examination of the relative contributions of spatial and temporal contiguity.
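The data-generating idea behind the temporal contiguity condition can be sketched as follows (illustrative only: the layout, neighbourhood rule and sizes are not taken from the paper, and the prediction network itself is omitted).

import numpy as np

rng = np.random.default_rng(1)
n_objects = 12                                        # arbitrary
layout = rng.uniform(0.0, 1.0, size=(n_objects, 2))   # hypothetical 2-D spatial layout

# Transition probabilities fall off with spatial distance, so items that
# follow each other in time tend to be spatially close.
d = np.linalg.norm(layout[:, None, :] - layout[None, :, :], axis=-1)
P = np.exp(-(d / 0.2) ** 2)
np.fill_diagonal(P, 0.0)
P /= P.sum(axis=1, keepdims=True)

seq = [0]
for t in range(5000):
    seq.append(int(rng.choice(n_objects, p=P[seq[-1]])))

# The (item_t, item_t+1) pairs would be one-hot encoded and used to train a
# network on next-item prediction; hidden-layer distances between items could
# then be compared with the distances in `layout`.
print(list(zip(seq[:10], seq[1:11])))
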
LTD facilitates learning in a noisy environment Paul Munro and Gerardina Hernandez URL: www.pitt.edu/~pwm/nips99b.ps www.pitt.edu/~pwm/nips99b.pdf ABSTRACT Long-term potentiation (LTP) has long been held as a biological substrate for associative learning. Recently, evidence has emerged that long-term depression (LTD) results when the presynaptic cell fires after the postsynaptic cell. The computational utility of LTD is explored here. Synaptic modification kernels for both LTP and LTD have been proposed by other laboratories based studies of one postsynaptic unit. Here, the interaction between time-dependent LTP and LTD is studied in small networks. From harnad at coglit.ecs.soton.ac.uk Sun Jan 23 18:02:18 2000 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sun, 23 Jan 2000 23:02:18 +0000 (GMT) Subject: GESTURAL ORIGIN OF LANGUAGE: Psyc Call for Commentators Message-ID: Place/Catania: THE ROLE OF THE HAND IN THE EVOLUTION OF LANGUAGE The target article whose abstract appears below has today appeared in PSYCOLOQUY, a refereed online journal of Open Peer Commentary sponsored by the American Psychological Association. http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.007 ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/2000.volume.11/ psyc.00.11.007.language-gesture.1.place OPEN PEER COMMENTARY on this target article is now invited. Qualified professional biobehavioural, neural or cognitive scientists should consult PSYCOLOQUY's Websites or send email (below) for Instructions if not familiar with format or acceptance criteria for commentaries (all submissions are refereed). To submit articles or to seek information: EMAIL: psyc at pucc.princeton.edu URLs: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc ----------------------------------------------------------------------- psycoloquy.00.11.007.language-gesture.1.place Sun Jan 23 2000 ISSN 1055-0143 (59 paras, 58 refs, 1 figure, 1281 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 2000 Ullin T. Place THE ROLE OF THE HAND IN THE EVOLUTION OF LANGUAGE Target Article on Language Origins Ullin T. Place School of Philosophy University of Leeds School of Psychology University of Wales, Bangor, Wales UK Charles Catania Department of Psychology University of Maryland, Baltimore County 1000 Hilltop Circle Baltimore, Maryland 21250 USA catania at umbc.edu ABSTRACT: This target article has four sections. Section I sets out four principles which should guide any attempt to reconstruct the evolution of an existing biological characteristic. Section II sets out thirteen principles specific to a reconstruction of the evolution of language. Section III sets out eleven pieces of evidence for the view that vocal language must have been preceded by an earlier language of gesture. Based on those principles and evidence, Section IV sets out seven proposed stages in the process whereby language evolved: (1) the use of mimed movement to indicate an action to be performed, (2) the development of referential pointing which, when combined with mimed movement, leads to a language of gesture, (3) the development of vocalisation, initially as a way of imitating the calls of animals, (4) counting on the fingers leading into (5) the development of symbolic as distinct from iconic representation, (6) the introduction of the practice of question and answer, and (7) the emergence of syntax as a way of disambiguating utterances that can otherwise be disambiguated only by gesture. 
KEYWORDS: evolution, equivalence, gesture, homesigning, iconic, language, miming, pointing, protolanguage, referring, sentence, symbolic, syntax, vocalisation EDITOR'S NOTE: Ullin T. Place died on January 2, 2000. His target article had been reviewed for PSYCOLOQUY and was essentially complete at the time of his death. Some minor editing has been done by PSYCOLOQUY Associate Editor A. Charles Catania, mainly to bring the manuscript into conformity with PSYCOLOQUY style. Catania will consider replying to commentaries on this article, but also welcomes the participation of others who may feel they are familiar enough with Place's perspectives to do so. Retrieve the full target article at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.007 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/2000.volume.11/ psyc.00.11.007.language-gesture.1.place From fmdist at hotmail.com Sun Jan 23 14:37:21 2000 From: fmdist at hotmail.com (Fionn Murtagh) Date: Sun, 23 Jan 2000 11:37:21 PST Subject: RA posn. -visualizn. of user behavior in information spaces Message-ID: <20000123193721.29808.qmail@hotmail.com> A Research Assistant position will be available soon in Computer Science, The Queen's University of Belfast, in the area of visualization of user behavior in information spaces. Please let Fionn Murtagh (address below) know of your interest in this position. A short description of the work to be undertaken follows. The European (5th Framework) project "IRAIA - Getting Orientation in Complex Information Spaces as an Emergent Behavior of Autonomous Information Agents", which will last for two years, will be starting in March 2000. "Information retrieval systems of the future will be huge information repositories, distributed all over the world. Even the users will contribute to these repositories by communicating their experiences to other users who follow their footsteps. IRAIA's design metaphor focuses on the ants' system for communicating information. Navigating the web should allow people to leave pointers for those who might also navigate along the same paths." QUB work in this project will include the development of an architecture of different layers of abstraction that support the construction of a coordinate system based on ontologies and investigation of user behavior. Such user profiling will be based on information visualizations such as Kohonen self-organizing feature maps or similar active maps based on linkage graphs. These will be interfaced to the agent (CORBA, EJB) environment used. In more open research, the fact that map visualizations are used means that we will also seek to relate and exploit intriguing technologies used in digital image transmission - thinwire transmission technologies, and foveation-based strategies, based on multiscale transforms. ------------------------------------------------------------ Prof. F. 
Murtagh, School of Computer Science, The Queen's University of Belfast, Belfast BT7 1NN, Northern Ireland http://www.cs.qub.ac.uk/~F.Murtagh f.murtagh at qub.ac.uk Centre for Image and Vision Systems http://www.qub.ac.uk/ivs ------------------------------------------------------------ ______________________________________________________ Get Your Private, Free Email at http://www.hotmail.com From Thomas.Wennekers at mis.mpg.de Mon Jan 24 12:52:18 2000 From: Thomas.Wennekers at mis.mpg.de (Thomas Wennekers) Date: Mon, 24 Jan 2000 18:52:18 +0100 (MET) Subject: 3 papers on complex modes of synchronization Message-ID: <200001241752.SAA15915@s4-22.mis.mpg.de> Dear connectionists, The following three papers on complex modes of synchronization in networks of graded response and spiking neurons are now available from the web page: http://www.informatik.uni-ulm.de/ni/mitarbeiter/TWennekers.html Regards, Thomas. _________________________________________________________________ Generalized and Partial Synchronization of Coupled Neural Networks Frank Pasemann and Thomas Wennekers to appear in "Network: Computation in Neural Systems" Abstract: Synchronization of neural signals has been proposed as a temporal coding scheme representing cooperated computation in distributed cortical networks. Previous theoretical studies in that direction mainly focused on the synchronization of coupled oscillatory subsystems and neglected more complex dynamical modes, that already exist on the single-unit level. In the present work we study the parametrized time-discrete dynamics of two coupled recurrent networks of graded neurons. Conditions for the existence of partially synchronized dynamics of these systems are derived, referring to a situation where only subsets of neurons in each sub-network are synchronous. The coupled networks can have different architectures and even a different number of neurons. Periodic as well as quasiperiodic and chaotic attractors constrained to a manifold $M$ of synchronized components are observed. Examples are discussed for coupled 3-neuron networks having different architectures, and for coupled 2-neuron and 3-neuron networks. Partial synchronization of different degrees is demonstrated by numerical results for selected sets of parameters. In conclusion, the results show that synchronization phenomena far beyond completely synchronized oscillations can occur even in simple coupled networks. The type of the synchronization depends in an intricate way on stimuli, history and connectivity as well as other parameters of the network. Specific inputs can further switch between different operational modes in a complex way, suggesting a similarly rich spatio-temporal behavior in real neural systems. __________________________________________________________________ Complete Synchronization in Coupled Neuromodules of Different Types Frank Pasemann and Thomas Wennekers Theory in Biosciences 118:267-283, 1999. Abstract: We discuss the parametrized dynamics of two coupled recurrent neural networks comprising either additive sigmoid neurons in discrete time or biologically more plausible time-continuous leaky-integrate-and-fire cells. General conditions for the existence of synchronized activity in such networks are given, which guarantee that corresponding neurons in both coupled sub-networks evolve synchronously. It is, in particular, demonstrated that even the coupling of totally different network structures can result in complex dynamics constrained to a synchronization manifold $M$. 
For additive sigmoid neurons the synchronized dynamics can be periodic, quasiperiodic as well as chaotic, and its stability can be determined by Lyapunov exponent techniques. For leaky-integrate-and-fire cells synchronized orbits are typically periodic, often with an extremely long period duration. In addition to synchronized attractors there often co-exist asynchronous periodic, quasiperiodic and even chaotic attractors. ___________________________________________________________________ "Generalized Types of Synchronization in Networks of Spiking Neurons" Thomas Wennekers and Frank Pasemann: Submitted to Computational Neuroscience Conference, CNS 2000. Abstract: The synchronization of neural signals has been proposed as a temporal coding scheme in distributed cortical networks. Theoretical studies in that direction mainly focused on the synchronization of coupled oscillatory subsystems. In the present work we show that several complex types of synchronization previously described for graded response neurons appear similarly also in biologically realistic networks of spiking and compartmental neurons. This includes synchronized complex spatio-temporal behavior, partial and generalized synchronization. The results suggest a similarly rich spatio-temporal behavior in real neural systems and may guide experimental research towards the study of complex modes of synchronization and their neuromodulation. _________________________________________________________________ Thomas Wennekers Max-Planck-Institute for Mathematics in the Sciences Inselstrasse 22-26 04103 Leipzig Germany Phone: +49-341-9959-533 Fax: +49-341-9959-555 Email: Thomas.Wennekers at mis.mpg.de WWW : www.mis.mpg.de www.informatik.uni-ulm.de/ni/mitarbeiter/TWennekers.html ________________________________________________________________ From H.Bolouri at herts.ac.uk Mon Jan 24 15:03:23 2000 From: H.Bolouri at herts.ac.uk (Hamid Bolouri) Date: Mon, 24 Jan 2000 12:03:23 -0800 Subject: CFP: Computation in Cells Message-ID: <20000124120323.I13844@cns.caltech.edu> Call for Papers (submission deadline 14 February 2000) COMPUTATION IN CELLS : molecular & cellular networks as computational systems, e.g.: robustness in biochemical networks. computational and dynamical motifs in molecular biology. tools and algorithms for unravelling biochemical systems. computational properties of signalling pathways. models of gene regulation and genetic regulatory networks. evolution of biochemical and genetic networks. deterministic computation from stochastic interactions. cell differentiation and pattern formation. April 17 & 18th 2000 University of Hertfordshire, UK http://strc.herts.ac.uk/NSGweb/emergent/ A UK Eng. & Phys. Sci. Research Council Emergent Computing workshop Co-sponsored by: the Wellcome Trust, the British Computer Society, & British Telecom Invited speakers (who have confirmed so far, more to come!): Baltazar Aguda (Laurentian U, Canada) Maria Blair (Sheffield Hallam, UK) Mark Borisuk (Caltech, USA) Dennis Bray (U. Cambridge, UK) Igor Goryanin (Glaxo-SmithKline, UK) Charles Hodgman (Glaxo-SmithKline, UK) Maria Samsonova (Inst. for High Performance Computing, Russia) Denis Thieffry (Free U. of Brussels & U. Gent, Belgium) David Willshaw (U. 
Edinburgh, UK) Tau-Mu Yi (Caltech, USA) From taketani at ics.uci.edu Mon Jan 24 00:50:25 2000 From: taketani at ics.uci.edu (Makoto Taketani) Date: Mon, 24 Jan 2000 14:50:25 +0900 Subject: Paper on a new method to study in-vitro network operations Message-ID: <4.1-J.20000124144641.013f1a80@binky.ics.uci.edu> The following recent paper may be of interest to those on this list interested in new methods to study in-vitro network operations. A new planar multielectrode array for extracellular recording: application to hippocampal acute slice Journal of Neuroscience Methods, 93, 61-67. Abstract The present paper describes a new planar multielectrode array (the MED probe) and its electronics (the MED system) which perform electrophysiological studies on acute hippocampal slices. The MED probe has 64 planar microelectrodes, is covered with a non-toxic, uniform insulation layer, and is further coated with polyethylenimine and serum. The MED probe is shown to be appropriate for both stimulation and recording. In particular, multi-channel recordings of field EPSPs obtained by stimulating with a pair of planar microelectrodes were established for rat hippocampal acute slices. The recordings were stable for six hours. Finally, a spatial distribution of long-term potentiation was studied using the MED system. The full article can be downloaded from http://www.med64.com/publications.htm ------------------------------------------------------- Makoto Taketani, Ph.D. Center for the Neurobiology of Learning and Memory University of California Irvine, CA 92697-3800 Tel: 949-824-5770; FAX: 949-824-5737 Net: taketani at ics.uci.edu ------------------------------------------------------- From oby at cs.tu-berlin.de Tue Jan 25 05:35:55 2000 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Tue, 25 Jan 2000 11:35:55 +0100 (MET) Subject: preprints available Message-ID: <200001251035.LAA11438@pollux.cs.tu-berlin.de> Dear Connectionists, attached please find abstracts and preprint locations of two manuscripts on the analysis of optical recording data and on visual cortex modelling. Comments are welcome! Cheers Klaus ----------------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ ============================================================================= Principal component analysis and blind separation of sources for optical imaging of intrinsic signals M. Stetter^1, I. Schiessl^1, T. Otto^1, F. Sengpiel^2, M. Huebener^2, T. Bonhoeffer^2, and K. Obermayer^1 ^1 Fachbereich Informatik, Technische Universitaet Berlin ^2 Max-Planck-Institute for Neurobiology, Martinsried The analysis of data sets from optical imaging of intrinsic signals requires the separation of signals, which accurately reflect stimulated neuronal activity (mapping signal), from signals related to background activity. Here we show that blind separation of sources by Extended Spatial Decorrelation (ESD) is a powerful method for the extraction of the mapping signal from the total recorded signal. ESD is based on the assumptions (i) that each signal component varies smoothly across space and (ii) that every component has zero cross-correlation functions with the other components.
In contrast to the standard analysis of optical imaging data, the proposed method (i) is applicable to non-orthogonal stimulus-conditions, (ii) can remove the global signal, blood-vessel patterns and movement artifacts, (iii) works without ad hoc assumptions about the data structure in the frequency domain, and (iv) provides a confidence measure for the signals (Z-score). We first demonstrate, on orientation maps from cat and ferret visual cortex, that Principal Component Analysis (PCA), which acts as a preprocessing step to ESD, can already remove global signals from image stacks, as long as data stacks for at least two -- not necessarily orthogonal -- stimulus conditions are available. We then show that the full ESD analysis can further reduce global signal components and -- finally -- concentrate the mapping signal within a single component both for differential image stacks and for image stacks recorded during presentation of a single stimulus. in: NeuroImage, in press. Available at: http://ni.cs.tu-berlin.de/publications/ ----------------------------------------------------------------------------- A mean field model for orientation tuning, contrast saturation and contextual effects in the primary visual cortex M. Stetter, H. Bartsch, and K. Obermayer Fachbereich Informatik, Technische Universitaet Berlin Orientation selective cells in the primary visual cortex of monkeys and cats are often characterized by an orientation-tuning width that is invariant under stimulus contrast. At the same time their contrast response function saturates or even super-saturates for high values of contrast. When two bar stimuli are presented within their classical receptive field, the neuronal response decreases with intersection angle. When two stimuli are presented inside and outside the classical receptive field, the response of the cell increases with intersection angle. Both cats and monkeys show iso-orientation suppression, which was sometimes reported to be combined with cross-orientation facilitation. This property has previously been described as sensitivity to orientation contrast. We address the emergence of these effects by a model which describes the processing of geniculocortical signals through cortical circuitry. We hypothesize that short intracortical fibers mediate the classical receptive field effects whereas long-range collaterals evoke contextual effects such as sensitivity to orientation contrast. We model this situation by setting up a mean-field description of two neighboring cortical hypercolumns, which may process a non-overlapping center and a (nonclassical) surround stimulus. Both hypercolumns interact via idealized long-range connections. For an isolated model hypercolumn we find that either contrast saturation or contrast-invariant orientation tuning emerges, depending on the strength of the lateral excitation. There is no parameter regime, however, where both phenomena emerge simultaneously. In the regime where contrast saturation is found, the model also correctly reproduces suppression due to a second, cross-oriented grid within the classical receptive field. If two model hypercolumns are mutually coupled by long-range connections which are iso-orientation specific, nonclassical surround stimuli show either suppression or facilitation for all surround orientations. Sensitivity to orientation contrast is not observed. This property requires excitatory-to-excitatory long-range couplings that are less orientation specific than those targeting inhibitory neurons. in: Biological Cybernetics, in press. Available at: http://ni.cs.tu-berlin.de/publications/
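To make the decorrelation idea in the first abstract above concrete, here is a small Python/NumPy sketch of the underlying second-order principle in its simplest single-shift form: whiten a stack of recorded frames with PCA, then rotate the whitened components so that one spatially shifted covariance matrix becomes diagonal. This is a generic illustration only (essentially the classical AMUSE procedure applied across pixels), not the authors' ESD implementation; the toy source images, the one-pixel shift, the noise level and all array sizes are assumptions made purely for the example.

    import numpy as np

    rng = np.random.default_rng(1)
    ny, nx, n_frames = 64, 64, 6

    # Toy "recordings": every frame is an unknown mixture of two smooth source images
    # (a localized blob standing in for the mapping signal, and a global gradient)
    # plus independent pixel noise.
    y, x = np.mgrid[0:ny, 0:nx]
    sources = np.stack([np.exp(-((x - 20) ** 2 + (y - 40) ** 2) / 100.0),
                        (x + y) / (nx + ny)])
    mixing = rng.normal(size=(n_frames, 2))
    X = mixing @ sources.reshape(2, -1) + 0.05 * rng.normal(size=(n_frames, ny * nx))

    # Step 1: PCA whitening across frames (the pixels play the role of samples).
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]
    evals, evecs = np.linalg.eigh(C0)
    keep = evals > 1e-6 * evals.max()                  # drop numerically empty directions
    W = (evecs[:, keep] / np.sqrt(evals[keep])).T      # whitening matrix
    Z = W @ X

    # Step 2: rotate the whitened components so that the covariance between the images
    # and a one-pixel-shifted copy of themselves becomes diagonal.
    Z_images = Z.reshape(-1, ny, nx)
    Z_shifted = np.roll(Z_images, 1, axis=2).reshape(Z.shape[0], -1)
    C1 = Z @ Z_shifted.T / Z.shape[1]
    C1 = (C1 + C1.T) / 2                               # symmetrize before the eigendecomposition
    _, V = np.linalg.eigh(C1)
    components = (V.T @ Z).reshape(-1, ny, nx)         # estimated source images, up to order and sign
    print("estimated component images:", components.shape)

The full ESD method described in the abstract exploits spatial smoothness by using many shifts jointly and adds a Z-score confidence measure; the single-shift version above is only the smallest working instance of the same second-order idea.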
From X.Yao at cs.bham.ac.uk Wed Jan 26 05:37:55 2000 From: X.Yao at cs.bham.ac.uk (Xin Yao) Date: Wed, 26 Jan 2000 10:37:55 +0000 (GMT) Subject: evolutionary computing + neural networks Message-ID: -------------------------------------------------------------------------- CALL FOR PAPERS Special Issue of Journal of INTEGRATED COMPUTER-AIDED ENGINEERING (Founded in 1993) on Evolutionary Computing and Neural Networks URL: http://www.cs.bham.ac.uk/~xin/icae_cfps.html The international journal Integrated Computer-Aided Engineering is planning a special issue on Combination of Evolutionary Computing and Neural Networks to be published in early 2001. We are particularly interested in manuscripts focusing on synergetic combination of evolutionary computation and neural networks. Please send five copies of your original unpublished manuscript by *March 1, 2000* to the Guest Editor: Professor Xin Yao School of Computer Science, The University of Birmingham Edgbaston, Birmingham B15 2TT, United Kingdom Phone: +44 121 414 3747, Fax: +44 121 414 4281 Email: x.yao at cs.bham.ac.uk And one copy to the Editor-in-Chief: Professor Hojjat Adeli, Editor-in-Chief, ICAE Dept. of Civil and Environmental Engineering and Geodetic Science The Ohio State University, 470 Hitchcock Hall, 2070 Neil Avenue Columbus, OH 43210 U.S.A. Email: Adeli.1 at osu.edu Submission of a manuscript implies that it is the author's original unpublished work and has not been submitted for publication elsewhere. Potential contributors can request a complimentary sample copy of the journal from the publisher, IOS Press (www.iospress.nl, Fax in Netherlands: 31-20-620 3419, Fax in U.S.A.: 1-703-323 3668). -------------------------------------------------------------------------- From jaksa at neuron-ai.fei.tuke.sk Wed Jan 26 12:33:57 2000 From: jaksa at neuron-ai.fei.tuke.sk (Rudolf Jaksa) Date: Wed, 26 Jan 2000 18:33:57 +0100 (CET) Subject: ISCI 2000 Message-ID: <14479.12293.769077.360302@neuron-ai.tuke.sk> ***************************************** Announcement of the fellowship program for young scientists' participation in ISCI 2000 **************************************** The conference is supported by: International Neural Network Society, European Neural Network Society, Asian-Pacific Neural Network Assembly, Nuclear Power Plants Research Institute / Slovakia, European Union - 5th FP financial support Type of event: Euroconference **************************************** Who can ask for financial support? Any researcher from the EU who wants to attend the symposium can ask for financial support that covers travel expenses, accommodation and the registration fee. Applicants must be under 35. Preference will be given to active persons who are submitting a paper to the symposium. Proceedings will be published by Springer-Verlag. ***************************************** Which of the invited speakers will attend the event? The list of invited and confirmed speakers is as follows: Prof. Zadeh - USA Prof. Goldberg - USA Prof. Bezdek - USA Prof. Werbos - USA Prof. Zurada - USA Prof. Adeli - USA Dr. Igelnik - USA Dr. Merenyi - USA Prof. Hirota - Japan Prof. Fukushima - Japan Prof. Takagi - Japan Prof. Kasabov - New Zealand Prof. Moraga - Germany Prof. Pap - Yugoslavia Prof. Kacprzyk - Poland Prof. Duch - Poland Dr. Kurkova - Czech Republic Prof. Gams - Slovenia ***************************************** What is the deadline for fellowship application?
The deadline for fellowship application is 25-th of February 2000. Applicant should send mail with his basic personal data to e-mail isci at neuron-ai.tuke.sk Decision about grant award will be announced until 25-th of March 2000 ****************************************** ****************************************** ****************************************** ****************************************************************** International Symposium on Computational Intelligence ISCI - 2000 Kosice - Slovakia August 30 - September 1, 2000 ****************************************************************** Web pages worldwide: -------------------- Europe: http://neuron-ai.tuke.sk/cig/isci USA: http://cns.bu.edu/~kopco/isci Symposium is organized by: -------------------------- Faculty of Electrical Engineering and Informatics Technical University of Kosice, Slovakia Faculty of Chemical Technology and Faculty of Civil Engineering Slovak Technical University, Bratislava, Slovakia Symposium is supported by: -------------------------- International Neural Network Society Asian Pacific Neural Network Assembly European Neural Network Society Nuclear Power Plants Research Institute, Inc., Trnava, Slovakia Invited speakers - confirmed : Prof. Zadeh - USA Prof. Goldberg - USA Prof. Bezdek - USA Prof. Zurada - USA Prof. Igelnik - USA Prof. Werbos - USA Dr. E. Merenyi - USA Prof. Hirota - Japan Prof. Fukushima - Japan Prof. H. Takagi - Japan Prof. Kasabov - New Zealand Prof. Moraga - Germany Prof. Gams - Slovenia Prof. W. Duch - Poland Prof. Kacprzyk - Poland Prof. Pap - Yugoslavia Dr. Kurkova - Czech Republic Presentation of Companies : - Ecanse - Siemens Software System some others are under negotiation. Honorary chairpersons: ---------------------- Gail Carpenter - USA Lotfi Zadeh - USA David Goldberg - USA General chairmen: ----------------- V. Kvasnicka - Slovakia R. Mesiar - Slovakia P. Sincak - Slovakia Program Committee: ------------------ B. Kosko - USA J. Zurada - USA H. Adeli - USA E. Merenyi - USA B. Igelnik - USA Z. Michalewicz - USA R. Yager - USA M. Pelikan - USA N. Kopco - USA C. Moraga - Germany B. Reusch - Germany T. Beck - Germany D. Nauck - Germany H. Takagi - Japan K. Hirota - Japan K. Fukushima - Japan N. Kasabov - New Zeland D. Floreano - Switzerland J. Godjevac - Switzerland V. Babovic - Denmark M. O. Odetayo - UK L. Smith - Scotland, UK B. Krose - Netherlands A. Sperduti - Italy T. Gedeon - Australia Mohammadian Masoud - Australia V. Kurkova - Czech Republic J. Tvrdik - Czech Republic P. Osmera - Czech Republic J. Lazansky - Czech Republic P. Hajek - Czech Republic M. Mares - Czech Republic M. Navara - Czech Republic M. Novak - Czech Republic I. Taufer - Czech Republic L. Rutkowski - Poland J. Kacprzyk - Poland R. Tadeusiewicz - Poland L. Trysbus - Poland M. Gams - Slovenia L. Koczy - Hungary A. Varkonyi - Koczy - Hungary I. Ajtonyi - Hungary I. Rudas - Hungary J. Dombi - Hungary M. Jelasity - Hungary L. Godo - Spain F. Esteva - Spain E. Kerre - Belgium B. Bouchon-Meunier - France G. Raidl - Austria A. Uhl - Austria E. P. Klement - Austria E. Pap - Yugoslavia Slovak Program Committee members: --------------------------------- M. Kolcun, M. Hrehus, P. Tino, A. Cizmar, L. Benuskova, V. Pirc S. Figedy, L. Michaeli, I. Mokris, V. Olej, J. Pospichal, B. Riecan, P. Vojtas, G. Andrejkova, I. Farkas, J. Csonto, J. Chmurny, J. Sarnovsky, S. Kozak, L. Madarasz, D. Durackova, D. Krokavec, A. Kolesarova, J. Kelemen, R. Blasko. 
Scope of the Symposium: ----------------------- This symposium will be looking for answers concerning the following questions: * What is the state of the art in Computational Intelligence? * What are the potential applications of Computational Intelligence to real-world problems? * What are the future trends in Computational Intelligence? Researchers from all over the world are welcome at this symposium, which will have the following goals: 1. Integration of scientific communities working with fuzzy systems, neural networks and evolutionary computation approaches. 2. Strengthening of links between theory and real-world applications of CI and Soft Computing. 3. Promotion of Computational Intelligence in Central Europe with emphasis on commercial presentation of CI-oriented companies. 4. Support of new technologies that improve quality of life and will lead to a user-friendly information society of the 21st century. 5. An opportunity for young researchers to learn and present new results from the CI domain and to integrate into the international research community. The symposium will cover the following topics: ---------------------------------------------- Neural Networks Fuzzy Systems Evolutionary Computation Neuro-Fuzzy & Fuzzy-Neuro Systems Neuro-Fuzzy Hybrid Systems Artificial Life Application of Computational Intelligence tools in various real-world applications Presentations of CI oriented companies and their products Presentations of Companies: --------------------------- One of the goals of the symposium is to offer a possibility for industrial companies to present their CI-based products and services aimed at various kinds of customers, including the banking industry, control engineering, speech and voice recognition, prediction and pattern recognition techniques, medicine and many other application areas. Companies will have a chance to present their products in oral presentations and also in a permanent display of their products in the Symposium building. Space of up to 9 square meters per company will be available for display purposes. Venue: ------ The city of Kosice is located in the eastern part of Slovakia. The first signs of habitation can be traced back to the end of the Old Stone Age. The first written mention of the settlement dates back to the year 1230. The city was granted royal privileges that were helpful in the development of crafts, businesses, and production. The oldest guild regulations date from the year 1307 and the city received its own coat of arms - the oldest coat of arms of any city in Europe - in 1369. In 1657, due to the economic, administrative and political importance of the city, the first university was established; later the university was converted into a royal university. Kosice is an important cultural, industrial, and educational center in Slovakia and Central Europe with more than 20,000 students. It has 250,000 inhabitants and the historical center is one of the best-renovated centers in central Europe. You may wish to visit Kosice virtually at http://www.kosice.sk A post-meeting trip will be organized to the High Tatras. This mountain region is an area whose natural beauty makes it one of the most remarkable recreation areas not only in Slovakia, but also in the rest of Europe. A complete spectrum of hotels and restaurants awaits guests whether they come looking for the beauty of the outdoors or for simple relaxation. The High Tatras environment also provides a special variety of sports and recreation facilities.
More information can be found at http://www.tatry.sk ------------------------------------------------------------------
Important Dates:
-----------------
* Extended abstract (max. 4 pages) submission deadline: February 21, 2000
* Notification of authors: March 20, 2000
* Camera-ready: May 15, 2000
* Symposium date: August 30 - September 1, 2000
Proceedings ---------- Proceedings from this symposium will be published in Springer-Verlag's "Studies in CI" series (edited by Prof. Janusz Kacprzyk).
Symposium fees:
---------------
                                                Before May 1   After May 1
University rate:                                 280 USD        360 USD
Student's rate:                                  100 USD        150 USD
Industrial rate:                                 400 USD        450 USD
Univ. exhibition space (4 sq. meters):           380 USD        450 USD
Industr. exhibition stand (8 sq. meters):        680 USD        880 USD
Symposium packages:
Univ. exhibition space (4 sq. meters) + ISCI fee (one person):        590 USD    680 USD
Industrial exhibition stand (8 sq. meters) + ISCI fee (one person):   910 USD   1100 USD
================================================================== ****************************************************************** A special fellowship program is available for Slovak and Czech participants from academia. For detailed information please contact the Symposium secretariat. ****************************************************************** All additional information including the format for electronic submission is available on the Symposium Web pages. Mailing address of the symposium secretariat: Dr. J. Vascak Computational Intelligence Group KKUI-FEI, TU Kosice, Letna 9, 042 00 Kosice, Slovakia E-mail: isci at neuron-ai.tuke.sk From arenart at delta.ft.uam.es Thu Jan 27 03:04:06 2000 From: arenart at delta.ft.uam.es (Alfonso Renart) Date: Thu, 27 Jan 2000 09:04:06 +0100 (CET) Subject: 3 papers on Multi-modular associative N.Networks. Message-ID: Dear Connectionists: The following 3 papers are available at the website: http://www.ft.uam.es/neurociencia/GRUPO/publications_group.html They deal with the subject of autoassociative recurrent networks in systems of several modules and their application to the study of working memory mechanisms in delay tasks. Sincerely, Alfonso Renart. %%%%%%%%%%%%%%%%%%%%% Renart A., Parga N. and Rolls E. T. Backprojections in the cerebral cortex: implications for memory storage Neural Computation 11 (6): 1349-1388, 1999. Abstract: Cortical areas are characterized by forward and backward connections between adjacent cortical areas in a processing stream. Within each area there are recurrent collateral connections between the pyramidal cells. We analyze the properties of this architecture for memory storage and processing. Hebb-like synaptic modifiability in the connections, and attractor states, are incorporated. We show the following: (1) The number of memories that can be stored in the connected modules is of the same order of magnitude as the number that can be stored in any one module using the recurrent collateral connections, and is proportional to the number of effective connections per neuron. (2) Cooperation between modules leads to a small increase in the memory capacity. (3) Cooperation can also help retrieval in a module which is cued with a noisy or incomplete pattern. (4) If the connection strength between modules is strong, then global memory states which reflect the pairs of patterns on which the modules were trained together are found. (5) If the intermodule connection strengths are weaker, then separate, local, memory states can exist in each module.
(6) The boundaries between the global and local retrieval states, and the non-retrieval state, are delimited. All these properties are analyzed quantitatively with the techniques of statistical physics. %%%%% Renart A., Parga N. and Rolls E. T. Associative memory properties of multiple cortical modules NETWORK 10: 237-255, 1999. Abstract: The existence of recurrent collateral connections between pyramidal cells within a cortical area and, in addition, reciprocal connections between connected cortical areas, is well established. In this work we analyze the properties of a tri-modular architecture of this type in which two input modules have convergent connections to a third module (which in the brain might be the next module in cortical processing or a bi-modal area receiving connections from two different processing pathways). Memory retrieval is analyzed in this system which has Hebb-like synaptic modifiability in the connections and attractor states. Local activity features are stored in the intra-modular connections while the associations between corresponding features in different modules present during training are stored in the inter-modular connections. The response of the network when tested with corresponding and contradictory stimuli to the two input pathways is studied in detail. The model is solved quantitatively using techniques of statistical physics. In one type of test, a sequence of stimuli was applied, with a delay between them. It is found that if the coupling between the modules is low a regime exists in which they retain the capability to retrieve any of their stored features independently of the features being retrieved by the other modules. Although independent in this sense, the modules still influence each other in this regime through persistent modulatory currents which are strong enough to initiate recall in the whole network when only a single module is stimulated, and to raise the mean firing rates of the neurons in the attractors if the features in the different modules are corresponding. Some of these mechanisms might be useful for the description of many phenomena observed in single neuron activity recorded during short term memory tasks such as delayed match-to-sample. It is also shown that with contradictory stimulation of the two input modules the model accounts for many of the phenomena observed in the McGurk effect, in which contradictory auditory and visual inputs can lead to misperception. %%%%% Renart A., Parga N. and E.T. Rolls A recurrent model of the interaction between PF and IT cortex in delay memory tasks Proceedings of: NEURAL INFORMATION PROCESSING SYSTEMS, 1999 (NIPS99) (Denver. Nov. 29 - Dec 4, 1999). Abstract: A very simple model of two reciprocally connected attractor neural networks is studied analytically in situations similar to those encountered in delay match-to-sample tasks with intervening stimuli and in tasks of memory guided attention. The model qualitatively reproduces many of the experimental data on these types of tasks and provides a framework for the understanding of the experimental observations in the context of the attractor neural network scenario. 
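For readers new to the formalism behind the three abstracts above, the following short Python/NumPy sketch illustrates the generic single-module attractor scenario they start from: binary Hopfield-type units, Hebbian (outer-product) storage, and retrieval of a stored pattern from a degraded cue. It is only a toy illustration of that generic framework, not the authors' multi-modular mean-field model; the network size, pattern count, noise level and random seed are arbitrary choices made for the example.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 200, 10                          # units in the module, number of stored patterns

    # Hebbian (outer-product) storage of P random binary (+/-1) patterns
    patterns = rng.choice([-1, 1], size=(P, N))
    W = (patterns.T @ patterns) / N         # symmetric connection matrix
    np.fill_diagonal(W, 0.0)                # no self-connections

    # Cue the module with a degraded copy of pattern 0 (20% of the units flipped)
    state = patterns[0].copy()
    flipped = rng.choice(N, size=N // 5, replace=False)
    state[flipped] *= -1

    # Asynchronous updates until the state stops changing, i.e. an attractor is reached
    for _ in range(20):
        previous = state.copy()
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1
        if np.array_equal(state, previous):
            break

    overlap = (state @ patterns[0]) / N     # 1.0 means the stored pattern was retrieved exactly
    print(f"overlap with the cued pattern after retrieval: {overlap:.2f}")

In this toy setting, the abstracts' statement that storage capacity is proportional to the number of effective connections per neuron corresponds to the classical result that the number of retrievable random patterns grows roughly linearly with the number of units; how this carries over to reciprocally coupled modules, and when coupled modules settle into global versus local memory states, is what the papers above analyse quantitatively.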
From kak at ee.lsu.edu Thu Jan 27 11:36:50 2000 From: kak at ee.lsu.edu (Subhash Kak) Date: Thu, 27 Jan 2000 10:36:50 -0600 (CST) Subject: Paper on instantaneously trained neural networks Message-ID: <200001271636.KAA00330@ee.lsu.edu> The following paper on instantaneous learning and its applications to time-series prediction and metasearch engine design is available at: http://www.ee.lsu.edu/kak/x5kak.lo.pdf ------------------------------ Subhash Kak, Faster web search and prediction using instantaneously trained neural networks, IEEE Intelligent Systems, vol. 14, pp. 79-82, November/December 1999. Abstract: Over the past few years, we have developed new neural network designs that model working memory in their ability to learn and generalize instantaneously. These networks are almost as good as backpropagation in the quality of their generalization. With their speed advantage, they can be used in many real-time applications of signal processing, data compression, forecasting, and pattern recognition. In this paper, we describe the networks and their applications to two problems: (1) prediction of time-series; (2) design of an intelligent metasearch engine for the Web. The description of these two applications will provide enough information to see how they could be used in other situations. From Leo.van.Hemmen at Physik.TU-Muenchen.DE Thu Jan 27 09:38:35 2000 From: Leo.van.Hemmen at Physik.TU-Muenchen.DE (J. Leo van Hemmen) Date: Thu, 27 Jan 2000 15:38:35 +0100 Subject: Biological Cybernetics' Welcome to 2000 Message-ID: Dear Friends: In the February issue 82/2 (2000) of ``Biological Cybernetics'', the first that was published this year, its Editors-in-Chief Gert Hauske and I have published an Editorial welcoming the new..., well, take whatever you like best: year, decade, century, or millennium. Since new publication formats have been introduced, we think it could make for interesting reading for most of you. We have therefore appended the text as a LaTeX file. If you don't have LaTeX, it is equally readable once you know that \emph{...} means that {...} should be italicized or, in LaTeX terminology, emphasized. Enjoy reading, Leo van Hemmen. >>>$<<< \documentclass[12pt]{article} \usepackage[]{} \begin{document} \pagestyle{empty} \section*{Editorial} As we begin the year 2000, it is time to step back for some historical perspective and to ask how a prominent journal in computational neuroscience and cybernetics might better serve its scientific community in the coming century. What has \emph{Biological Cybernetics} achieved as a long-standing forum for exchange of ideas in this field and where are we going next? It is fair to say that the first important papers in our field were published in this Journal, and that we continue to be a major conduit for influential literature in this domain. It is, however, also clear that progress in neuroscience and in our understanding of information processing in biological systems, in general, will accelerate into the next decade at an unprecedented rate. As we have stressed in our Editorial of last July (issue 81/1), such rapid growth of thought makes it more urgent than ever to facilitate the interaction between experimental reality and theoretical understanding. In the present context, our operational definition of `theory' is mathematical description of neurobiological reality. Theory, then, aims for more, viz., disclosure of underlying mathematical structures that, together, unify our understanding. 
To this end, we need fundamental concepts that give structure to the many particular observations we make as scientists, such as momentum and angular momentum in mechanics. As an everyday example of the insights gained from both experimental and theoretical understanding, consider how much better is our knowledge of a bicycle once we have both ridden one, and studied it as a device for creating and conserving angular momentum. And so, we expect, will be the progress in understanding computational neuroscience as a marriage of theory and experiment. In computational neuroscience the hunt for fundamental notions is open. Maybe there is none, which we doubt. A famous example underlining the usefulness of theoretical concepts is Hassenstein and Reichardt's velocity detector. In our opinion, it is a fascinating challenge to see what theoretical concepts look like and what they are. It is our aim that experiment and theory will join their efforts in advancing conceptual understanding and in so doing generate synergy that will benefit both. As for \emph{Biological Cybernetics}, the discussion will evolve in three clearly delineated publication formats. First, important new results are welcomed as \emph{Letters to the Editor}. As a rule, results satisfying the three criteria `novel, important, and well-presented' will be published within three months after submission. Fast publication speed requires electronic submission. Letters can be up to eight pages in print with no more than six pages text and two pages figures and references -- the shorter, the better. (See the ``Instructions for Authors'' for technical details.) Next, regular manuscripts will appear as \emph{Original Papers}. Finally, there are \emph{Reviews}, scholarly reports of rapid developments that the Editorial Board considers of key importance to information processing in neuronal systems. They can be either invited or unsolicited. We will also invite, and welcome, contributions submitted for publication as \emph{Prospects}, a novel form of `review' emphasizing future developments more than the typical Review does, and giving more license to personal speculation, provided it is clearly explained. The Journal has seen a substantial increase in the number of submitted manuscripts since our previous editorial calling for its intrinsic coverage, essentially all aspects of communication and control in biological information processing. In recognition of the need for vigorous and wide-ranging exchange of ideas for progress in science, we start the new millennium for \emph{Biological Cybernetics} with the redactional setup described above. We are looking forward to your participation in this exchange both as a reader and as an author of papers, be they Letter, Original Paper, Review, or Prospects. Together with the Editorial Board, it is you who makes the Journal a vital medium of communication -- the more so at the beginning of a new millennium. \vspace{0.7cm} \noindent Gert Hauske \\ J.\ Leo van Hemmen. \end{document} >>>$<<< Prof. Dr. J.
Leo van Hemmen Physik Department TU M"unchen D-85747 Garching bei M"unchen Germany Phone: +49(89)289.12362 (office) and .12380 (secretary) Fax: +49(89)289.14656 e-mail: Leo.van.Hemmen at ph.tum.de From recruiting at phz.com Thu Jan 27 15:54:16 2000 From: recruiting at phz.com (PHZ Recruiting) Date: Thu, 27 Jan 2000 15:54:16 -0500 Subject: Financial Modeling/Trading Positions Available Message-ID: <200001272054.PAA23876@phz-9.phz.com> PHZ Capital Partners LP is a Boston area trading firm that manages client money using proprietary quantitative algorithms. Our models of the global financial markets are based on a cross-disciplinary blend of financial market theory, novel statistics, and advanced computing technology. PHZ's unique approach and strong trading performance to date have led to exceptional client interest and rapid asset growth. To further expand our business, PHZ is now looking for one or more talented, hard working people to join our research and trading team to work on our next generation of trading systems. Depending on candidate interests and skills, these positions will involve exploratory market and data analysis, development of cutting edge modeling, trading, and risk management algorithms and models, and execution of these strategies through live trading. The successful applicant for these positions will have a demonstrated knack for solving real world analytical problems. We are looking for candidates with a Ph.D. in computer science, statistics, finance, or a related field, or someone with 3-5 years of work experience in an applied research setting. Strong software engineering skills are required (esp. on PCs and Unix). Experience working with large real world numerical data sets and statistical modeling tools (e.g. Splus or SAS) is highly desirable. Applicants should have a keen interest in learning more about the world financial markets, although finance industry experience is not required. The growth potential of these positions is large, both in terms of responsibilities and compensation. Initial compensation will be competitive based on qualifications and will include a significant variable component linked to firm trading performance. PHZ was founded in 1993 and is partially owned by Goldman Sachs. Our clients include large institutions and high net worth individuals. Our staff is a group of highly motivated, friendly people, and we have a fun, comfortable working environment. We are located in a pleasant suburb 17 miles west of Boston. Interested applicants should fax resumes to Jim Hutchinson at 508-653-1745, or email resumes (plain ascii or MS Word format) to recruiting at phz.com. 
From moatl at cs.tu-berlin.de Sat Jan 29 11:28:05 2000 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Sat, 29 Jan 2000 17:28:05 +0100 Subject: EU ADVANCED COURSE IN COMPUTATIONAL NEUROSCIENCE: ANNOUNCEMENT Message-ID: <38931515.F0ECE310@cs.tu-berlin.de> EU ADVANCED COURSE IN COMPUTATIONAL NEUROSCIENCE (AN IBRO NEUROSCIENCE SCHOOL) AUGUST 21 - SEPTEMBER 15, 2000 INTERNATIONAL CENTRE FOR THEORETICAL PHYSICS, TRIESTE, ITALY DIRECTORS: Erik De Schutter (University of Antwerp, Belgium) Klaus Obermayer (Technical University Berlin, Germany) Alessandro Treves (SISSA, Trieste, Italy) Eilon Vaadia (Hebrew University, Jerusalem, Israel) The EU Advanced Course in Computational Neuroscience introduces students to the panoply of problems and methods of computational neuroscience, simultaneously addressing several levels of neural organisation, from subcellular processes to operations of the entire brain. The course consists of two complementary parts. A distinguished international faculty gives morning lectures on topics in experimental and computational neuroscience. The rest of the day is devoted to practicals, including learning how to use simulation software and how to implement a model of the system the student wishes to study on individual unix workstations. The first week of the course introduces students to essential neurobiological concepts and to the most important techniques in modeling single cells, networks and neural systems. Students learn how to apply software packages like GENESIS, MATLAB, NEURON, XPP, etc. to the solution of their problems. During the following three weeks the lectures will cover specific brain functions. Each week, topics ranging from modeling single cells and subcellular processes through the simulation of simple circuits, large neuronal networks and system level models of the brain will be covered. The course ends with a presentation of the students' projects. The EU Advanced Course in Computational Neuroscience is designed for advanced graduate students and postdoctoral fellows in a variety of disciplines, including neuroscience, physics, electrical engineering, computer science and psychology. Students are expected to have a basic background in neurobiology as well as some computer experience. A total of 32 students will be accepted. Students of any nationality can apply. About 20 students will be from the European Union and affiliated countries (Iceland, Israel, Liechtenstein and Norway plus all countries which are negotiating future membership with the EU). These students are supported by the European Commission and we specifically encourage applications from researchers who work in less-favoured regions of the EU, from women and from researchers from industry. IBRO and ICTP provide support for participation from students of non-Western countries, in particular countries from the former Soviet Union, Africa and Asia, while The Brain Science Foundation supports Japanese students. Students receiving support from the mentioned sources will receive travel grants and free full board at the Adriatico Guest House. More information and application forms can be obtained: - http://www.bbf.uia.ac.be/EU_course.shtml Please apply electronically using a web browser if possible. - email: eucourse at bbf.uia.ac.be - by mail: Prof. E. De Schutter Born-Bunge Foundation University of Antwerp - UIA, Universiteitsplein 1 B2610 Antwerp Belgium FAX: +32-3-8202669 APPLICATION DEADLINE: April 15, 2000.
Applicants will be notified of the results of the selection procedures by May 31, 2000. COURSE FACULTY: Moshe Abeles (Hebrew University of Jerusalem, Israel), Carol Barnes (University of Arizona, USA), Avrama Blackwell (George Mason University, Washington, USA), Valentino Braitenberg (MPI Tuebingen, Germany), Jean Bullier (Universite Paul Sabatier, Toulouse, France), Ron Calabrese (Emory University, Atlanta, USA), Carol Colby (University Pittsburgh, USA), Virginia de Sa (University California San Francisco, USA), Alain Destexhe (Laval University, Canada), Opher Donchin (Hebrew University of Jerusalem, Israel), Karl J. Friston (Institute of Neurology, London, England), Bruce Graham (University of Edinburgh, Scotland), Julian J.B. Jack (Oxford University, England), Mitsuo Kawato (ATR HIP Labs, Kyoto, Japan), Jennifer Lund (University College London, England), Miguel Nicolelis (Duke University, Durham, USA), Klaus Obermayer (Technical University Berlin, Germany), Stefano Panzeri (University of Newcastle, England), Alex Pouget (University of Rochester, USA), John M. Rinzel (New York University, USA), Nicolas Schweighofer (ATR ERATO, Kyoto, Japan), Idan Segev (Hebrew University of Jerusalem, Israel), Terry Sejnowski (Salk Institute, USA), Haim Sompolinsky (Hebrew University of Jerusalem, Israel), Martin Stetter (Technical University Berlin, Germany), Shigeru Tanaka (RIKEN, Japan), Alex M. Thomson (Royal Free Hospital, London, England), Naftali Tishby (Hebrew University of Jerusalem, Israel), Alessandro Treves (SISSA, Trieste, Italy), Eilon Vaadia (Hebrew University of Jerusalem, Israel), Charlie Wilson (University of Texas, San Antonio, USA), More to be announced... The 2000 EU Advanced Course in Computational Neuroscience is supported by the European Commission (5th Framework program), by the International Centre for Theoretical Physics (Trieste), by the Boehringer Ingelheim Foundation, by the International Brain Research Organization and by The Brain Science Foundation (Tokyo). -- ---------------------------------------------------------------------- Dr. Martin Stetter phone: ++49-30-314-73117 FR2-1, Informatik fax: ++49-30-314-73121 Technische Universitaet Berlin web: http://www.ni.cs.tu-berlin.de Franklinstrasse 28/29 D-10587 Berlin, Germany ---------------------------------------------------------------------- From fmdist at hotmail.com Sun Jan 30 11:18:50 2000 From: fmdist at hotmail.com (Fionn Murtagh) Date: Sun, 30 Jan 2000 08:18:50 PST Subject: position - Internship, IBM, time series prediction Message-ID: <20000130161850.21940.qmail@hotmail.com> Internship starting May 2000, IBM TJ Watson research center, must have PhD and experience of wavelet/multiscale methods, also neural nets, for signal/time series modeling and prediction. Contact F Murtagh, f.murtagh at qub.ac.uk ______________________________________________________ Get Your Private, Free Email at http://www.hotmail.com From rod at dcs.gla.ac.uk Mon Jan 31 06:59:41 2000 From: rod at dcs.gla.ac.uk (Roderick Murray-Smith) Date: Mon, 31 Jan 2000 11:59:41 +0000 Subject: Ph.D. & Post-doc vacancies in European Network Message-ID: <3895792D.CD2290F0@dcs.gla.ac.uk> Several positions in this European Commission funded research network might be of interest to researchers who are active in statistically-oriented work, Bayesian networks, or stochastic simulation and who are interested in engineering applications, especially with dynamic systems. 
Multi-Agent Control: Probabilistic reasoning, optimal coordination, stability analysis and controller design for intelligent hybrid systems http://www.dcs.gla.ac.uk/mac/ Vacancies in the MAC network (deadline 1st March 2000): ------------------------------------------------------------------------ The Multi-Agent Control (MAC) network is a collaboration between the Universities of Glasgow, Strathclyde, Maynooth, NTNU, DTU and the Jozef Stefan Institute (participants). This project is funded by the European Commission as a Research Training Network. The University of Glasgow acts as the project coordinator. There are vacancies for researchers at each of the members of the network as follows. 1. Pre-doctoral position at University of Glasgow 2. Pre-doctoral position at University of Strathclyde 3. Pre-doctoral position at National University of Ireland, Maynooth 4. Post-Doctoral Position at Norwegian University of Science and Technology 5. Pre-doctoral Position at Technical University of Denmark 6. Post-Doctoral Position at Institut Jozef Stefan, Ljubljana, Slovenia. NOTE: To be eligible it is *essential* that applicants satisfy EU requirements (i.e. be citizen/resident of EU member or associated state - see below for further details). Highlights of the programme for potential applicants are: Challenging programme of interdisciplinary research Industrially relevant research problems Excellent training programme Mobility between network nodes Industrial secondments Competitive salary and relocation package Project Goals ---------------- The overall research objective of the network is to develop rigorous methods for analysis and design of Multi-Agent Control systems. Due to the interdisciplinary nature of this objective, the network has been structured to include expertise from relevant problem domains: probabilistic reasoning, optimisation, stability analysis, control theory and computing science. The specific design problems to be addressed are: 1. To develop probabilistic reasoning methods for design that accommodate the inherent uncertainty in the system's knowledge of the state of the world. The work will build on new developments of computationally-intensive statistical inference tools for modelling complex physical systems and human control behaviour. 2. To develop tools for rigorously analysing the potentially very strong and safety critical interactions between the outcome of controller decisions and the dynamic behaviour of the overall system. 3. To develop formal methods of design which incorporate, in a single framework, the design of the switching logic, co-ordination between multiple agents, as well as optimisation of performance within given constraints on the overall system behaviour. The emphasis in this network is to develop a theory to support the design of computer-controlled systems where performance and safety are crucial. The efficacy of the research results will be evaluated using a number of test-bed industrial applications (aerospace, automotive, process and renewable energy fields). These applications will be supplied by a number of major European industrial companies, some of which are members of the network, and others that have expressed an interest in the scientific output of the network. Software tools developed during prototyping, as well as the scientific results, will be made available to the wider academic and industrial community. Funding ---------- The project is funded by the European Commission under a Research Training Network.
The European Commission requires that the candidate is aged 35 years or less at the time of his appointment and is a national of a Member State of the Community or of an Associated State (excluding the country in which you plan to work), or has resided in the Community for at least five years prior to the appointment. It is emphasised that these eligibility conditions are strict requirements. Note that pre-doctoral positions are essentially fully paid Ph.D. positions, where the candidate is expected to gain a Ph.D. by the end of the work period. Post-doctoral positions require the candidate to have qualified for a Ph.D. or equivalent before starting work. Further information ----------------------- Please visit the project web site http://www.dcs.gla.ac.uk/mac/ for more information. Details about the individual vacancies can be found at http://www.dcs.gla.ac.uk/mac/vacancies.htm -- Roderick Murray-Smith Department of Computing Science Glasgow University Glasgow G12 8QQ Scotland http://www.dcs.gla.ac.uk/~rod From nnsp2000 at ee.usyd.edu.au Mon Jan 31 23:38:57 2000 From: nnsp2000 at ee.usyd.edu.au (NNSP 2000) Date: Tue, 1 Feb 2000 15:38:57 +1100 Subject: IEEE NNSP 2000 Call for Papers Message-ID: <00ec01bf6c6f$2d7e8820$581b4e81@ee.usyd.edu.au.pe088> ***************************************************************** CALL FOR PAPERS 2000 IEEE Workshop on Neural Networks for Signal Processing December 11-13, 2000, Sydney, Australia Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council (pending) ***************************************************************** Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the tenth of a series of IEEE workshops on Neural Networks for Signal Processing will be held at the University of Sydney Campus, Sydney, Australia. The workshop will feature keynote lectures, technical presentations, and panel discussions. Papers are solicited for, but not limited to, the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, generalization, design algorithms, optimization, parameter estimation, nonlinear signal processing, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, sonar and radar, data fusion, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, blind source separation, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. PAPER SUBMISSION PROCEDURE Prospective authors are invited to submit a full paper using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2000 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academic Publishers.
SCHEDULE Submission of full paper: March 31, 2000 Notification of acceptance: May 31, 2000 Submission of photo-ready accepted paper: July 15, 2000 Advanced registration, before: September 15, 2000 ORGANIZATION Honorary Chair Bernard WIDROW Stanford University General Chairs Ling GUAN University of Sydney email: ling at ee.usyd.edu.au Kuldip PALIWAL Griffith University email: kkp at shiva2.me.gu.edu.au Program Chairs Tülay ADALI University of Maryland, Baltimore County email: adali at umbc.edu Jan LARSEN Technical University of Denmark email: jl at imm.dtu.dk Finance Chair Raymond Hau-San WONG University of Sydney email: hswong at ee.usyd.edu.au Proceedings Chairs Elizabeth J. WILSON Raytheon Co. email: bwilson at ed.ray.com Scott C. DOUGLAS Southern Methodist University email: douglas at seas.smu.edu Publicity Chair Marc van HULLE Katholieke Universiteit, Leuven email: marc at neuro.kuleuven.ac.be Registration and Local Arrangements Stuart PERRY Defence Science and Technology Organisation email: Stuart.Perry at dsto.defence.gov.au Europe Liaison Jean-Francois CARDOSO ENST email: cardoso at sig.enst.fr America Liaison Amir ASSADI University of Wisconsin at Madison email: ahassadi at facstaff.wisc.edu Asia Liaison Andrew BACK Katestone Scientific email: andrew.back at usa.net PROGRAM COMMITTEE: Amir Assadi Yianni Attikiouzel John Asenstorfer Andrew Back Geoff Barton Hervé Bourlard Andy Chalmers Zheru Chi Andrzej Cichocki Tharam Dillon Tom Downs Hsin Chia Fu Suresh Hangenahally Marwan Jabri Haosong Kong Shigeru Katagiri Anthony Kuh Yi Liu Fa-Long Luo David Miller Christophe Molina M Mohammadian Erkki Oja Soo-Chang Pei Jose Principe Ponnuthurai Suganthan Ah Chung Tsoi Marc Van Hulle A.N. Venetsanopoulos Yue Wang Wilson Wen
From smyth at sifnos.ICS.UCI.EDU Tue Jan 4 17:41:37 2000 From: smyth at sifnos.ICS.UCI.EDU (Padhraic Smyth) Date: Tue, 04 Jan 2000 14:41:37 -0800 Subject: faculty positions in biomedical image/signal analysis at UC Irvine Message-ID: <200001041441.aa10281@gremlin-relay.ics.uci.edu> Dear Connectionist Colleagues, UCI has 5 new faculty slots currently open in our biomedical engineering department - the department was started last year based on an award from the Whitaker Foundation and is expected to grow substantially over the next few years. See http://soeweb.eng.uci.edu/bme/jobs.stm for details. One area of particular relevance to readers of this list is biomedical image and signal analysis: there are excellent opportunities for collaborative research across campus in this area at present, e.g., we have very active medical and biological research programs in brain imaging (e.g., in Alzheimer's research, autism research) with significant opportunities for interdisciplinary projects. Note that researchers whose focus is specifically in medical imaging (for example) are likely to be of more interest to UCI than researchers interested in image analysis in general. Other listed research areas of potential interest to connectionists are computational neuroscience, quantitative modeling of biological systems, and parallel and/or distributed biomedical computational systems. Note that although the advertised deadline is January 1, late applications are still welcome - I meant to send this email out last Fall :). Positions are available at both senior and junior levels. Please do not send your applications to me personally (use the address on the Web page). But feel free to let me know you have applied, particularly if you apply in the image/signal analysis area. Padhraic Smyth Information and Computer Science University of California, Irvine
From terry at salk.edu Wed Jan 5 15:59:26 2000 From: terry at salk.edu (terry@salk.edu) Date: Wed, 5 Jan 2000 12:59:26 -0800 (PST) Subject: NEURAL COMPUTATION 12:1 Message-ID: <200001052059.MAA09969@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 1 - January 1, 2000 ARTICLES Correctness of Local Probability Propagation in Graphical Models with Loops Yair Weiss Population Dynamics of Spiking Neurons: Fast Transients, Asynchronous States, and Locking Wulfram Gerstner Dynamics of Strongly-Coupled Spiking Neurons Paul C. Bressloff and S. Coombes NOTES On Connectedness: A Solution Based On Oscillatory Correlation DeLiang L. Wang Practical Identifiability of Finite Mixtures of Multivariate Bernoulli Distributions Miguel A. Carreira-Perpinan and Steve Renals LETTERS The Effects of Pair-wise And Higher-order Correlations on the Firing Rate of a Postsynaptic Neuron S. M. Bohte, H. Spekreijse and P. R. Roelfsema Effects of Spike Timing On Winner-Take-All Competition in Model Cortical Circuits Erik D.
Lumer Model Dependence in Quantification of Spike Interdependence by Joint Peri-Stimulus Time Histogram Hiroyuki Ito and Satoshi Tsuji Reinforcement Learning in Continuous Time and Space Kenji Doya ----- ON-LINE - http://neco.mitpress.org/ ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 1999 - VOLUME 12 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $430 $460.10 $478 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From morten at compute.it.siu.edu Thu Jan 6 20:09:55 2000 From: morten at compute.it.siu.edu (Dr. Morten H. Christiansen) Date: Thu, 6 Jan 2000 19:09:55 -0600 (CST) Subject: Special Issue of Cognitive Science on Connectionist Language Processing Message-ID: The members of this list may be interested in the most recent issue of Cognitive Science which is a Special Issue on connectionist language processing: Christiansen, M.H., Chater, N. & Seidenberg, M.S. (Eds.) (1999). Connectionist models of human language processing: Progress and prospects. Special issue of Cognitive Science, Vol. 23(4), 415-634. PREFACE Connectionist Models of Human Language Processing: Progress and Prospects Editors Morten H. Christiansen, Nick Chater & Mark S. Seidenberg This Special Issue appraises the progress made so far and the prospects for future development of connectionist models of natural language processing. This project is timely - the decade since the publication of Rumelhart & McClelland's influential PDP volumes has seen an explosive growth of connectionist modeling of natural language, ranging from models of early speech perception, to syntax and to discourse level phenomena. The breadth and variety of this work is illustrated in the review, which forms the introductory paper in the volume. How much has been achieved by this vast research effort? Part I presents some of the most recent progress by leading connectionist researchers, in a range of topics of central interest in language processing. Gaskell & Marslen-Wilson describe recent developments in connectionist models of speech perception. Plunkett & Juola report on progress in the highly controversial area of connectionist models of morphology. Tabor & Tanenhaus describe their work utilizing recurrent networks to model parsing within a dynamic perspective. Dell, Chang & Griffin provide accounts of lexical and syntactic aspects of language production. Plaut outlines recent developments in connectionist models of reading. Where Part I brings us to the forefront of current connectionist modeling of natural language processing, Part II considers the prospects for future research. Seidenberg and MacDonald argue that connectionism provides a fundamentally new way of looking at language processing and acquisition, which challenges traditional viewpoints derived from linguistics. By contrast, Smolensky attempts to synthesize lessons learned from both linguistics and connectionist research, arguing that progress will come from providing an integration of the two approaches. Steedman takes on the role as an "outside" observer, seeking to put connectionist natural language processing in perspective. Connectionist modeling has had a vast impact throughout cognitive science, and has been both most productive and most controversial in the area of natural language processing and acquisition. 
This issue can be used as an overview of the "state of the art" in connectionist models of natural language processing. But more important, we hope that it serves also as a contribution to the current research effort in this area, and as a stimulus to informed debate concerning future research on human natural language. TABLE OF CONTENTS (Abstracts can be found at http://siva.usc.edu/~morten/cs.SI-abtracts.html) Introduction Connectionist Natural Language Processing: The State of the Art. Morten H. Christiansen & Nick Chater Part I: Progress Ambiguity, Competition and Blending in Spoken Word Recognition. M. Gareth Gaskell & William D. Marslen-Wilson A Connectionist Model of English Past Tense and Plural Morphology. Kim Plunkett & Patrick Juola Dynamical Models of Sentence Processing. Whitney Tabor & Michael K. Tanenhaus Connectionist Models of Language Production: Lexical Access and Grammatical Encoding. Gary S. Dell, Franklin Chang & Zenzi M. Griffin A Connectionist Approach to Word Reading and Acquired Dyslexia: Extension to Sequential Processing. David C. Plaut Part II: Prospects A Probabilistic Constraints Approach to Language Acquisition and Processing. Mark S. Seidenberg & Maryellen C. MacDonald Grammar-based Connectionist Approaches to Language. Paul Smolensky Connectionist Sentence Processing in Perspective. Mark Steedman [Sorry, I can provide no hardcopies - for electronic copies, please contact the authors directly]. Best regards, Morten Christiansen ---------------------------------------------------------------------- Morten H. Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A Personal Web Page: http://www.siu.edu/~psycho/faculty/mhc.html Lab Web Site: http://www.siu.edu/~morten/csl ---------------------------------------------------------------------- From morten at compute.it.siu.edu Fri Jan 7 11:42:57 2000 From: morten at compute.it.siu.edu (Dr. Morten H. Christiansen) Date: Fri, 7 Jan 2000 10:42:57 -0600 (CST) Subject: Graduate Openings in Brain and Cognitive Sciences Message-ID: Dear Colleague, Please bring the following information to the attention of potential graduate school applicants from your program with an interest in Brain and Cognitive Sciences. GRADUATE PROGRAM IN BRAIN AND COGNITIVE SCIENCES IN THE DEPARTMENT OF PSYCHOLOGY AT SOUTHERN ILLINOIS UNIVERSITY, CARBONDALE. The Department of Psychology at Southern Illinois University has several openings for fall 2000 admission to its newly established Ph.D. program in Brain and Cognitive Sciences. The program emphasizes cognitive behavior approached from a combination of developmental (infancy and childhood, adolescence and aging), neurobiological (neurophysiology, neuropsychology, genetics), behavioral (human and animal experimentation) and computational (neural networks, statistical analyses) perspectives. As an integral part of their training, students become active participants in ongoing faculty research programs in the Brain and Cognitive Sciences. Students will receive training in two or more different research methodologies, and are expected to develop a multidisciplinary approach to their own research. 
Current research by the Brain and Cognitive Sciences faculty includes perinatal risk factors in child development, neurophysiological and behavioral correlates of infant and child cognitive and language development, personality and social correlates of cognitive aging, child play and social behaviors, identity development across the life span, neural network modeling of language acquisition and processing, artificial grammar learning, sentence processing, evolution of language and the brain, the pharmacological modulation of memory, effects of psychoactive drugs, reversible inactivation of discrete brain areas and memory, recovery of function from brain damage, electrophysiological models (e.g., long-term potentiation), the neurophysiology of memory, animal learning, and human learning and memory. For more information about the program and application procedures, please visit our web site at: http://www.siu.edu/~psycho/bcs Visit also the Department's web site at: http://www.siu.edu/~psycho The deadline for applications is February 1st, 2000. Complete applications received by January 15, 2000 may be considered for one of the prestigious Morris Fellowships. Best regards, Morten Christiansen Coordinator of the Brain and Cognitive Sciences Program ---------------------------------------------------------------------- Morten H. Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A Personal Web Page: http://www.siu.edu/~psycho/faculty/mhc.html Lab Web Site: http://www.siu.edu/~morten/csl ---------------------------------------------------------------------- From caroly at cns.bu.edu Fri Jan 7 15:02:12 2000 From: caroly at cns.bu.edu (Carol Yanakakis Jefferson) Date: Fri, 7 Jan 2000 15:02:12 -0500 Subject: Cognitive and Neural Systems: A Tenth Anniversary Celebration Message-ID: <200001072002.PAA01349@cochlea.bu.edu> COGNITIVE AND NEURAL SYSTEMS: A TENTH ANNIVERSARY CELEBRATION Tuesday, May 23,2000 at the Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, MA 02215 This one-day event celebrates the tenth anniversary of our department. It will be filled with talks by past graduates of the department, and will include plenty of time for discussion and celebration. The event is open to the public and there is no admission fee. If you plan to attend, please send email to Carol Jefferson (caroly at cns.bu.edu) by May 1, 2000 so that we can estimate attendance for purposes of planning enough food and drink. The celebration will come right before the Fourth International Conference on Cognitive and Neural Systems, which occurs from Wednesday, May 24 through Saturday, May 27. This conference drew around 300 participants from 31 countries last year, and focuses on the two themes: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? 
For further information about this conference, see http://cns.bu.edu/meetings/ TENTH ANNIVERSARY PROGRAM 8:30-9:00 Provost Dennis Berkey and Stephen Grossberg, Boston University Introduction and Welcome 9:00-9:30 Gregory Francis, Purdue University Orientational Afterimages: Evidence for FACADE 9:30-10:00 Alexander Grunewald, Cal Tech The Perception of Visual Motion: Psychophysics, Physiology and Modeling 10:00-10:30 John Reynolds, NIMH Visual Salience, Competition and Selective Attention 10:30-11:00 Coffee Break 11:00-11:30 David Somers, MIT Attentional Mechanisms in Human Visual Cortex: Evidence from fMRI 11:30-12:00 Luiz Pessoa, NIMH Attentional Strategies for Object Recognition 12:00-12:30 Bruce Fischl, Mass General Hospital Surface-Based Analysis of the Human Cerebral Cortex 12:30-2:00 Lunch (everyone on their own) 2:00-2:30 Paul Cisek, University of Montreal Two Action Systems: Specification and Selection in the Cerebral Cortex 2:30-3:00 John Fiala, Boston University Structural Dynamics of Synapses 3:00-3:30 Karen Roberts, Cognex Corp. Alignment and Inspection of Boundary Contours 3:30-4:00 Coffee Break 4:00-4:30 Gary Bradski, Intel Corp. Motion Segmentation and Pose Recognition with Motion History Gradients 4:30-5:00 Rob Cunningham, MIT Lincoln Laboratory Detecting Computer Attackers: Recognizing Patterns of Malicious, Stealthy Behavior 5:00-8:00 Reception From arbib at pollux.usc.edu Fri Jan 7 18:52:35 2000 From: arbib at pollux.usc.edu (Michael Arbib) Date: Fri, 07 Jan 2000 15:52:35 -0800 Subject: Faculty Position in Computational Neuroscience/Neuroinformatics Message-ID: <200001072355.PAA01066@relay.usc.edu> The University of Southern California (USC) Department of Computer Science (www.usc.edu/dept/cs) invites applications for a tenure-track position in computational neuroscience/neuroinformatics. We are looking for an individual who has an exceptional track record in modeling large-scale neural systems and working with experimentalists to link their data to these models. The successful applicant will be involved in teaching in a computer science and interdisciplinary environment, and will also have the technical ability to serve as Associate Director of the USC Brain Project (http://www-hbp.usc.edu). In particular, the candidate is expected to supervise and contribute to the development of a database environment for integration of empirical neuroscience data and brain models. The computer science community at USC is large and diverse, with faculty both on the University Park Campus and at USC's Information Sciences Institute (ISI). Research topics include algorithms and cryptography, collaborative agents, computational neuroscience, computer architecture, databases and information management, educational technology, genomics & DNA computing, graphics and multi-media, knowledge acquisition, knowledge representation, learning, natural language processing, networking, neural networks, neuroinformatics, ontologies, planning, robotics, software engineering, virtual humans, and vision. Computer Science at USC has a long history as a key component of the University's Neuroscience Program (NIBS: Neural, Informational and Behavioral Sciences) with work in cognitive neuroscience, computational neuroscience, language mechanisms, neural engineering (through the Center for Neural Engineering (http://www.usc.edu/dept/engineering/CNE), neuroinformatics, vision, and visuomotor coordination (with links to biomimetic robotics). 
In particular, the USC Brain Project (USCBP), funded in part by the Human Brain Project consortium, integrates empirical research in the neuroanatomy, neurochemistry and neurophysiology of synaptic plasticity, motivation, and visuomotor coordination with research in neuroinformatics, adapting such computational techniques as databases, the World Wide Web, data mining, and visualization to the analysis of neuro-science data, and employing computational neuroscience to study the relations between structure and function. USC is also part of the Dynamic Brain Project, an international research focus on computational motor control. USC's broader computer science community includes not only the Computer Science Department, the Computer Engineering Program and ISI but also the Integrated Media Systems Center (IMSC), an NSF Engineering Research Center focusing on computer interfaces, information management and media communications; and the newly created Institute for Creative Technologies that brings together expertise from USC's Schools of Engineering, Cinema-Television, and Communications (Annenberg) plus the entertainment industry, to develop the art and technology for compellingly realistic virtual experiences. Preliminary enquiries and requests for further information may be sent to Michael Arbib (arbib at pollux.usc.edu). Applicants should send a comprehensive resume, a list of references, and a statement of goals to: Paulina Tagle, Computer Science, USC SAL 300, Los Angeles, CA 90089-0781 (paulina at pollux.usc.edu). USC is an Equal Opportunity/Affirmative Action Employer and encourages applications from women and minorities. From arbib at pollux.usc.edu Fri Jan 7 19:01:06 2000 From: arbib at pollux.usc.edu (Michael Arbib) Date: Fri, 07 Jan 2000 16:01:06 -0800 Subject: Faculty Position in Machine Learning Message-ID: <200001080004.QAA01197@relay.usc.edu> The University of Southern California (USC) Department of Computer Science (http://www.usc.edu/dept/cs/) invites applications for tenure-track positions from outstanding candidates in Machine Learning. Exceptional candidates in other areas of Artificial Intelligence (and Computer Science) may also be considered. We are particularly seeking candidates with strong collaborative inclinations. USC's Intelligent Systems Group comprises faculty both on the main campus and at USC's Information Sciences Institute (ISI). It ranks fourth overall in terms of AAAI Fellows and includes the current Chair of SIGART along with two former chairs. Research topics include collaborative agents, computational neuroscience, educational technology, knowledge acquisition, knowledge representation, learning, natural language processing, neural networks, ontologies, planning, robotics, virtual humans, and vision. The computer science community at USC is similarly large and diverse, with emphases in addition to intelligent systems in such areas as algorithms and cryptography, computer architecture, databases and information management, genomics & DNA computing, graphics and multi-media, networking, neuroinformatics, and software engineering. 
The broader computer science community includes not only the Computer Science Department, the Computer Engineering Program and ISI but also the Integrated Media Systems Center (IMSC), an NSF Engineering Research Center focusing on computer interfaces, information management and media communications; and the newly created Institute for Creative Technologies that brings together expertise from USC's Schools of Engineering, Cinema-Television, and Communications (Annenberg) plus the entertainment industry, to develop the art and technology for compellingly realistic virtual experiences. If interested, please send a comprehensive resume, a list of references, and a statement of goals to: Paulina Tagle, Computer Science, USC SAL 300, Los Angeles, CA 90089-0781 (paulina at pollux.usc.edu). USC is an Equal Opportunity/Affirmative Action Employer and encourages applications from women and minorities. Distributed Multimedia Information Management and Databases Committee: Databases and Information Management: McLeod (Chair), Boehm [?], Nikias Long Ad: The University of Southern California (USC) Department of Computer Science (http://www.usc.edu/dept/cs/) invites applications for tenure-track positions from outstanding candidates in Distributed Multimedia Information Management and Databases. This position will be in conjunction with the USC Integrated Media Systems Center (IMSC), an NSF ERC in the area of Integrated Media Systems. The applicant will be expected to take a major leadership role in IMSC Research and collaboratory projects. The computer science community at USC is large and diverse, with emphases in such areas as algorithms and cryptography, collaborative agents, computational neuroscience, computer architecture, databases and information management, educational technology, genomics & DNA computing, graphics and multi-media, knowledge acquisition, knowledge representation, learning, natural language processing, networking, neural networks, neuroinformatics, ontologies, planning, robotics, software engineering, virtual humans, and vision. The broader computer science community includes not only the Computer Science Department, the Computer Engineering Program and IMSC, but also the Information Sciences Institute and the newly created Institute for Creative Technologies that brings together expertise from USC's Schools of Engineering, Cinema-Television, and Communications (Annenberg) plus the entertainment industry, to develop the art and technology for compellingly realistic virtual experiences. If interested, please send a comprehensive resume, a list of references, and a statement of goals to: Paulina Tagle, Computer Science, USC SAL 300, Los Angeles, CA 90089-0781 (paulina at pollux.usc.edu). USC is an Equal Opportunity/Affirmative Action Employer and encourages applications from women and minorities. From aslin at cvs.rochester.edu Sun Jan 9 15:00:28 2000 From: aslin at cvs.rochester.edu (Richard Aslin) Date: Sun, 9 Jan 2000 15:00:28 -0500 Subject: postdoc positions at the University of Rochester Message-ID: The University of Rochester seeks five or more outstanding postdoctoral fellows with research interests in several areas of the Cognitive Sciences, including language, learning, and development. Three grants from NIH and NSF provide support. (1) An NIH training grant is affiliated with the Center for the Sciences of Language. 
The Center brings together faculty and students with interests in spoken and signed languages from the Departments of Brain and Cognitive Sciences, Computer Science, Linguistics, and Philosophy, as well as the interdepartmental program in Neuroscience. We encourage applicants from any of these disciplines who have expertise in any area of natural language. We are particularly interested in postdoctoral fellows who want to contribute to an interdisciplinary community. (2) A second NIH training grant spans the disciplines of Learning, Development, and Behavior. Applicants should have expertise in human or animal research on learning and developmental plasticity or in computational modeling. Contributing faculty are in the Departments of Brain and Cognitive Sciences, Computer Science, and the interdepartmental program in Neuroscience. (3) An NSF research grant on Learning and Intelligent Systems is directed to questions of rapid statistical learning in a variety of domains. Applicants should have expertise in behavioral, computational, or neurobiological approaches to statistical learning in humans or animals. Contributing faculty are in the Departments of Brain and Cognitive Sciences at Rochester and the Department of Psychology at Harvard. The NIH fellowships are open only to US citizens or permanent residents. Applicants should send a letter describing their graduate training and research interests and a curriculum vitae, and arrange to have three letters of recommendation sent to: Professor Richard N. Aslin, Department of Brain and Cognitive Sciences, Meliora Hall, University of Rochester, Rochester, NY 14627-0268. Review of applications will begin on February 15, 2000 and continue until all of the positions are filled, with expected start dates ranging from June 30 to September 1, 2000. Learn more about the relevant departments, faculty, and training opportunities by visiting the University of Rochester web site at http://www.rochester.edu. -------------------------------------------------------- Richard N. Aslin Department of Brain and Cognitive Sciences Meliora Hall University of Rochester Rochester, NY 14627 email: aslin at cvs.rochester.edu phone: (716) 275-8687 FAX: (716) 442-9216 http://www.cvs.rochester.edu/people/r_aslin/r_aslin.html
From berthouz at etl.go.jp Tue Jan 11 01:39:26 2000 From: berthouz at etl.go.jp (Luc Berthouze) Date: Tue, 11 Jan 2000 15:39:26 +0900 Subject: postdoc position at the Electrotechnical Laboratory (ETL), Japan Message-ID: <200001110639.PAA08517@aidan.etl.go.jp> We are seeking a postdoctoral fellow with research interests in computational neuroscience and cognitive science to collaborate in a project aimed at identifying the neural correlates of sensorimotor categorization. Candidates should have expertise in computational modeling, in human or animal research on learning, and some experience in applying neural models to artificial systems (robots or simulations). The fellowship, for a two-year period, includes salary, accommodation and airfare. Candidates should contact Luc Berthouze for more details. ----- Dr.
Luc Berthouze Information Science Division Electrotechnical Laboratory Umezono 1-1-4, Tsukuba 305-8568, Japan Tel: +81-298-545369 Fax: +81-298-545857 email: berthouz at etl.go.jp
From m.niranjan at dcs.shef.ac.uk Tue Jan 11 08:16:59 2000 From: m.niranjan at dcs.shef.ac.uk (Mahesan Niranjan) Date: Tue, 11 Jan 2000 13:16:59 +0000 (GMT) Subject: Research Assistantships Message-ID: <200001111316.NAA02109@bayes.dcs.shef.ac.uk> ______________________________________________________________________________ Research Assistantships: Modelling Tools for Air Pollution Two-year post-doctoral positions, one at Sheffield University CS Department and one at Anglia Polytechnic University (APU) Geography Department, will be advertised shortly for immediate start. The project, funded by the European Community, involves a consortium of 9 partners, the lead contractor being the Department of Environmental Sciences, University of East Anglia. The project acronym is APPETISE [the 'A' is for 'Air', one of the 'P's is for Pollution, and the rest is a necessary condition for EC funding :-)]. A substantial part of the project will involve the use of data-driven models, such as neural networks and other time series modelling tools, applied to air-pollution, traffic and meteorological data from an urban environment. As is common with such projects, there are milestones, workpackages and deliverables, but there will also be good space for original research. The work at APU will have a slight bias towards instrumentation, data collection and handling. The work at Sheffield will have a theoretical/modelling bias. If interested, or if you know anyone who might be interested, please let us know: Mahesan Niranjan : m.niranjan at dcs.shef.ac.uk Alison Greig : A.J.Greig at anglia.ac.uk ____________________________________________________________________ Mahesan Niranjan Phone: 44 114 222 1805 Professor of Computer Science Fax: 44 114 222 1810 University of Sheffield Email: M.Niranjan at dcs.shef.ac.uk http://www.dcs.shef.ac.uk/~niranjan ____________________________________________________________________
From terry at salk.edu Tue Jan 11 15:35:51 2000 From: terry at salk.edu (terry@salk.edu) Date: Tue, 11 Jan 2000 12:35:51 -0800 (PST) Subject: NEURAL COMPUTATION 12:2 Message-ID: <200001112035.MAA12438@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 2 - February 1, 2000 ARTICLE Minimizing Binding Errors Using Learned Conjunctive Features Bartlett Mel and Jozsef Fiser NOTES Relationship Between Phase And Energy Methods For Disparity Computation Ning Qian and Sam Mikaelian N-tuple Network, CART And Bagging Aleksander Kolcz Improving The Practice Of Classifier Performance Assessment N. M. Adams and D. J. Hand LETTERS Do Simple Cells In Primary Visual Cortex Form A Tight Frame? Emilio Salinas and L.F. Abbott Learning Overcomplete Representations Michael S. Lewicki and Terrence J. Sejnowski Noise In Integrate-And-Fire Neurons: From Stochastic Input To Escape Rates Hans E. Plesser and Wulfram Gerstner Modeling Synaptic Plasticity In Conjunction With The Timing of Pre- And Postsynaptic Action Potentials Werner M. Kistler and J.
Leo van Hemmen On-line EM Algorithm For the Normalized Gaussian Network Masa-aki Sato and Shin Ishii A General Probability Estimation Approach for Neural Computation Maxim Khaikine and Klaus Holthausen On The Synthesis Of Brain-State-In-A-Box Neural Models With Application To Associative Memory Fation Sevrani and Kennichi Abe ----- ON-LINE - http://neco.mitpress.org/ ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $430 $460.10 $478 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From bert at mbfys.kun.nl Wed Jan 12 08:17:20 2000 From: bert at mbfys.kun.nl (bert@mbfys.kun.nl) Date: Wed, 12 Jan 2000 14:17:20 +0100 (MET) Subject: Graphical model software Message-ID: <200001121317.OAA02671@bertus.mbfys.kun.nl> SOFTWARE ANNOUNCEMENT We would like to announce BayesBuilder, a tool for constructing and testing Bayesian networks. This software can be used free of charge for non-commercial purposes. BayesBuilder supports the following features: - User friendly graphical interface. - Comprehensive help function. - Defining several views on parts of the network, which is essential for building large networks. - Importing networks from the Hugin Format, the Netica Format, the Microsoft Bayesian Network Format and the Bayesian Interchange Format. - Exporting the status of the network to a database of cases, and importing from the cases database to the network. - Undo/redo support. - Automatic network layout. System requirements: Win32 Release for Windows 95, Windows 98 and Windows NT (4.0) on Intel hardware. A Pentium or faster processor. A minimum of 32 megabytes of RAM is required. A minimum of 22 Mb on harddisk is required. Minimum desktop area: 800x600 pixels. Color pallette: 256 colors, VGA. For more information and for free downloading of BayesBuilder, please go to http://www.mbfys.kun.nl/snn/Research/bayesbuilder/ BayesBuilder was used by our group to develop Promedas, a diagnostic decision support system for anaemia. For more information about Promedas and a free demo CD, see http://www.mbfys.kun.nl/snn/Research/promedas/ Suggestions and comments are welcome. SNN PO Box 9101 6500 HB Nijmegen The Netherlands University of Nijmegen Tel.: +31-(0)24 3614241 Fax.: +31-(0)24 3541435 mailto:snn at mbfys.kun.nl Best regards, Bert Kappen From ecai2000 at mcculloch.ing.unifi.it Wed Jan 12 10:34:35 2000 From: ecai2000 at mcculloch.ing.unifi.it (Paolo Frasconi) Date: Wed, 12 Jan 2000 16:34:35 +0100 Subject: ECAI-2000 Workshop: Connectionist-symbolic integration Message-ID: <000a01bf5d12$87b69e70$6e0fd996@dsi.unifi.it> CALL FOR PAPERS AND PARTICIPATION ECAI-2000 Workshop on Foundations of connectionist-symbolic integration: representation, paradigms, and algorithms http://www.dsi.unifi.it/~paolo/ECAI2000 Humboldt University, Berlin (Germany) August 21, 2000 BACKGROUND In recent years much attention has been paid to the integration of connectionist systems with symbol based techniques. Whereas such an approach has clear advantages, it also encounters serious difficulties and challenges. Various models and ideas have been proposed to address various problems and aspects in this integration. 
The few unified approaches that have been proposed are still very limited, showing both the lack of a full understanding of the relevant aspects of this new discipline and the broad complexity in scope and tasks. In this workshop, we aim to foster a deep discussion of at least three topics that we believe to be fundamental for the development of a successful theory of Connectionist-Symbolic Integration: representation, paradigms, and algorithms. Concerning representation, it is fully recognized that structured representations are ubiquitous in different fields such as knowledge representation, language modeling and pattern recognition. The interest in developing connectionist architectures capable of dealing with these rich representations (as opposed to "flat" or vector-based representations) can be traced back to the end of the 80's. Today, more than ten years after the explosion of interest in connectionism, research in architectures and algorithms for learning structured representations still has a lot to explore and no definitive answers have emerged. Different integration paradigms have also been proposed: the unified and the hybrid approaches to integration. Whereas the purely connectionist ("connectionist-to-the-top") approach claims that complex symbol processing functionalities can be achieved via neural networks alone, the hybrid approach is premised on the complementarity of the two paradigms and aims at their synergistic combination in systems comprising both neural and symbolic components. In fact, these trends can be viewed as two ends of an entire spectrum. Topics of interest include: - Algorithms for extraction, injection and refinement of structured knowledge from, into and by neural networks. - Inductive discovery/formation of structured knowledge. - Classification, recognition, prediction, matching and manipulation of structured information. - Neural models to infer hierarchical categories. - Applications of hybrid symbolic-connectionist models to real world problems. All these topics are usually investigated independently of each other, using different assumptions and techniques. The organizers believe it is necessary to enforce a higher level of cross-interaction among these issues, making use of all the computational tools we have available, such as deterministic and probabilistic approaches, event-based modeling, computational logic, computational learning theory, and so on. Moreover, special attention will be given to application domains, with the aim of devising a taxonomy that may be useful in selecting the most suitable integration paradigms and techniques. We also hope to be able to discuss some application cases in which the basic ideas that have emerged in the literature and in the workshop's own discussion can be verified. PARTICIPATION Participation in the workshop is open to all members of the AI community. Participants are expected to register for the main ECAI-2000 conference (please see http://www.ecai2000.hu-berlin.de for details). The number of participants is limited. The workshop will feature invited talks, contributed presentations, and open discussion. Submitted papers will be reviewed by at least two referees. Articles reporting work in progress are encouraged. However, papers should be original and not already submitted for publication. All submissions should be sent to the organizers by e-mail, in PostScript or PDF format, to the address ecai2000 at mcculloch.ing.unifi.it.
Common compression utilities (such as gzip, compress, or winzip) can be used. Submitted papers should not exceed 10 pages. Other researchers interested in attending the workshop without contributing a paper should send a position paper of 1-2 pages describing their interest in the mentioned topics. IMPORTANT DATES Submission Deadline: March 31, 2000 Submission Notification: May 15, 2000 Final Submission Due: June 10, 2000 Workshop Held: August 21, 2000 WORKSHOP ORGANIZERS: Paolo Frasconi, University of Florence, Italy (paolo at dsi.unifi.it) Marco Gori, University of Siena, Italy, (marco at ing.unisi.it) Franz Kurfess, Concordia University, Canada (franz at cs.concordia.ca) Alessandro Sperduti, University of Pisa, Italy (perso at di.unipi.it) From renner at ecst.csuchico.edu Wed Jan 12 13:36:38 2000 From: renner at ecst.csuchico.edu (Renee Renner) Date: Wed, 12 Jan 2000 10:36:38 -0800 (PST) Subject: IC-AI 2000 CFP Message-ID: *** apologies to recipients of multiple lists *** C A L L F O R P A P E R S ============================= Neural Network Subsystems A SPECIAL SESSION OF ==================== The 2000 International Conference on Artificial Intelligence (IC-AI'2000) June 26 - 29, 2000 Monte Carlo Resort, Las Vegas, Nevada, USA SESSION CHAIR: R.S. Renner renner at ecst.csuchico.edu SESSION INFORMATION: ************************************************************************** Artificial neural networks are increasingly being incorporated as components or subsystems of higher-order systems. These systems may represent ensembles of neural networks, fuzzy-neural systems, neural-genetic systems, or other such hybrid intelligent systems intended for classification, prediction, model selection, analysis, data mining or control. This session is open to neural network architectures, algorithms, applications, and tools that contribute to or provide a framework for interfacing ANNs with larger intelligent systems. *************************************************************************** GENERAL INFORMATION: The IC-AI'2000 will be held simultaneously (same location and dates) with The International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA) and The International Conference on Imaging Science, Systems, and Technology (CISST). (A link to the IC-AI'2000 official web site is available from: http://www.ashland.edu/~iajwa/Conferences/index.html) IMPORTANT DATES: February 28, 2000 (Monday): Draft papers (about 4 pages) due April 3, 2000 (Monday): Notification of acceptance May 1, 2000 (Monday): Camera-Ready papers & Prereg. due June 26 - 29, 2000: IC-AI'2000 Conference All accepted papers are expected to be presented at the conference. SCOPE: Topics of interest for other sessions include, but are not limited to, the following: O. Intelligent Information Systems O. Intelligent Software Engineering O. Intelligent Agents O. Intelligent Networks O. Intelligent Databases O. Brain Models O. Evolutionary Algorithms O. Data mining O. Machine Learning O. Reasoning Strategies O. Automated Problem Solving O. Distributed AI Algorithms and Techniques O. Distributed AI Systems and Architectures O. Expert Systems O. Fuzzy Logic O. Genetic Algorithms O. Heuristic Searching O. Knowledge Acquisition O. Knowledge Discovery O. Knowledge Representation O. Knowledge-Intensive Problem Solving Techniques O. Languages and Programming Techniques for AI O. Software Tools for AI O. Natural Language Processing O. Neural Networks and Applications O. 
Multisource Information Fusion: Theory and Applications O. Multisource-Multisensor Data Fusion O. Learning and Adaptive Sensor Fusion O. Multisensor Data Fusion Using Neural and Fuzzy Techniques O. Integration of AI with other Technologies O. Evaluation of AI Tools O. Evolutionary Computation O. Social Impact of AI O. Applications - Computer Vision O. Applications - Signal Processing O. Applications - Military O. Applications - Surveillance O. Applications - Robotics O. Applications - Medicine O. Applications - Pattern Recognition O. Applications - Face Recognition O. Applications - Finger Print Recognition O. Applications - Finance and Marketing O. Applications - Stock Market O. Applications - Education O. Emerging Applications SUBMISSION OF PAPERS: Prospective authors are invited to submit three copies of their draft paper (about 4 pages) to the session chair, Dr. R.S. Renner, by the due date: R.S. Renner California State University, Chico Department of Computer Science Chico, CA 95929-0410, U.S.A. Tel: (530) 898-5419 Fax: (530) 898-5995 E-mail: renner at ecst.csuchico.edu E-mail and Fax submissions are also acceptable. The length of the Camera-Ready papers (if accepted) will be limited to 7 pages. Papers must not have been previously published or currently submitted for publication elsewhere. The first page of the draft paper should include: title of the paper, name, affiliation, postal address, E-mail address, telephone number, and Fax number for each author. The first page should also include the name of the author who will be presenting the paper (if accepted) and a maximum of 5 keywords. EVALUATION PROCESS: Papers will be evaluated for originality, significance, clarity, and soundness. Each paper will be refereed by two researchers in the topical area. The Camera-Ready papers will be reviewed by one person. PUBLICATION: The conference proceedings will be published by CSREA Press (ISBN). The proceedings will be available at the conference. Some accepted papers will also be considered for journal publication (soon after the conference). ORGANIZERS/SPONSORS: A number of university faculty members and their staff in cooperation with the Monte Carlo Resort (Conference Division, Las Vegas ), will be organizing the conference. The conference will be sponsored by Computer Science Research, Education, & Applications Press (CSREA: USA Federal EIN # 58-2171953) in cooperation with research centers, international associations, international research groups, and developers of high-performance machines and systems. The complete list of sponsors and co-sponsors will be available at a later time. (Last conference's sponsors included: CSREA, the National Supercomputing Center for Energy and the Environment - DOE, The International Association for Mathematics and Computers in Simulation, The International Technology Institute (ITI), The Java High Performance Computing research group, World Scientific and Engineering Society, Sundance Digitial Signal Processing Inc., the Computer Vision Research and Applications Tech., ...) LOCATION OF CONFERENCE: The conference will be held in the Monte Carlo Resort hotel, Las Vegas, Nevada, USA. This is a mega hotel with excellent conference facilities and over 3000 rooms. The hotel is minutes from the Las Vegas airport with free shuttles to and from the airport. 
This hotel has many vacation and recreational attractions, including: waterfalls, casino, spa, pools & kiddie pools, sunning decks, Easy River water ride, wave pool with cascades, lighted tennis courts, health spa (with workout equipment, whirlpool, sauna, ...), arcade virtual reality game rooms, nightly shows, snack bars, a number of restaurants, shopping area, bars, ... Many of these attractions are open 24 hours a day and most are suitable for families and children. The negotiated hotel's room rate for conference attendees is very reasonable (79USD + tax) per night (no extra charge for double occupancy) for the duration of the conference. The hotel is within walking distance from most other Las Vegas attractions (major shopping areas, recreational destinations, fine dining and night clubs, free street shows, ...). For the benefit of our international colleagues: the state of Nevada neighbors with the states of California, Oregon, Idaho, Utah, and Arizona. Las Vegas is only a few driving hours away from other major cities, including: Los Angeles, San Diego, Phoenix, Grand Canyon, ... EXHIBITION: An exhibition is planned for the duration of the conference. We have reserved 20+ exhibit spaces. Interested parties should contact H. R. Arabnia (address is given below). All exhibitors will be considered to be the co-sponsors of the conference. SESSION CONTACT: Renee S. Renner California State University, Chico Department of Computer Science Chico, CA 95929-0410, U.S.A. Tel: (530) 898-5419 Fax: (530) 898-5995 E-mail: renner at ecst.csuchico.edu CONFERENCE CONTACT: Hamid R. Arabnia The University of Georgia Department of Computer Science 415 Graduate Studies Research Center Athens, Georgia 30602-7404, U.S.A. Tel: (706) 542-3480 Fax: (706) 542-2966 E-mail: hra at cs.uga.edu *********************************************************************** From ted.carnevale at yale.edu Wed Jan 12 16:08:44 2000 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Wed, 12 Jan 2000 16:08:44 -0500 Subject: Two new papers Message-ID: <387CED5C.4728CDFD@yale.edu> The following two articles may be of interest to those who are interested in synaptic integration, either from the theoretical or experimental standpoint, and/or empirically-based modeling of neurons. The first paper will be most relevant to those who are interested in modeling the roles of biophysical mechanisms and use-dependent plasticity in the operation of individual neurons and neural networks. Carnevale, N.T., and Hines, M.L. Expanding NEURON=92s repertoire of mechanisms with NMODL. Neural Computation 12:839-851, 2000 This is an "executive summary"; for those who need to know the details, a preprint of this paper that includes many more figures, examples, and an extensive index is available from our WWW site at http://www.neuron.yale.edu/neuron/papers/nc99/nmodl.htm The second paper shows that many classes of neurons (especially nonpyramidal cells in vertebrate CNS) are fundamentally similar to the processing elements of artificial neural nets, in the sense that synaptic impact at the spike trigger zone is determined by the properties of the synapse itself, and not by the anatomical location of the synapse. It also challenges the widely-held notion that active currents are necessary to overcome location-dependent attenuation of synaptic inputs (with one important exception, as noted below). Passive normalization of synaptic integration influenced by dendritic architecture. David B. Jaffe and Nicholas T. Carnevale J. Neurophysiol. 1999 82(6): p. 
3268-3285 If you or your institutuion subscribes to Journal of Neurophysiology, you can get their PDF file of the article from a link at http://jn.physiology.org/cgi/content/abstract/82/6/3268 Otherwise, you may pick up a preprint from http://www.neuron.yale.edu/neuron/papers/jnp99/pasnorm.pdf The two most significant findings reported in this paper are: 1. The peak amplitude of individual PSPs as a function of synaptic location is best predicted by the spatial profile of transfer impedance (Zc), rather than the more commonly studied somatopetal voltage transfer ratio (Vsoma/Vsynapse). 2. Active currents are generally NOT necessary to overcome location-dependent attenuation of PSP amplitudes in real neurons. Dendritic fields that are organized around a central or "primary" dendrite were the only exception to this rule. In other words, peak PSP amplitude observed at the spike trigger zone is very nearly as large as at the synaptic location, and shows little variation with synaptic location in cells such as interneurons, granule cells of the dentate gyrus, and CA3 pyramidal neurons, even when active currents are NOT present. This also applies to synapses onto the basilar branches of CA1 pyramidal cells and deep neocortical pyramids. Since this reduction of location-dependent PSP amplitude variance does not require active currents, we call this phenomenon "passive synaptic normalization." As noted above, passive synaptic normalization does not occur in dendritic fields that have terminal branches organized around a central or "primary" dendrite, e.g. the apical dendrites of CA1 and deep neocortical pyramidal cells. In subsequent work that we presented at the most recent meeting of the Neuroscience Society Carnevale, N.T. and Jaffe, D.B.: Dendritic architecture can compensate for synaptic location without active currents: the importance of input and transfer impedance for synaptic integration. Neuroscience Society Abstracts 25:1741, 1999. we found that the lack of passive synaptic normalization in such dendritic fields is due to the loading effect of terminal branches, which tend to flatten the spatial profile of input impedance along the primary dendrite. --Ted From ken at phy.ucsf.EDU Wed Jan 12 18:24:15 2000 From: ken at phy.ucsf.EDU (Ken Miller) Date: Wed, 12 Jan 2000 15:24:15 -0800 (PST) Subject: UCSF Postdoctoral Fellowships in Theoretical Neurobiology: 2nd Notice Message-ID: <14461.3359.538556.812685@django.ucsf.edu> The Sloan Center for Theoretical Neurobiology at UCSF solicits applications for post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in mathematics, theoretical or experimental physics, or computer science, and commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. Applications for postdoctoral fellowships are due Feb.~1, 2000. We also offer predoctoral training, but the application deadline for this year has passed. FOR FULL INFORMATION, PLEASE SEE: http://www.sloan.ucsf.edu/sloan/sloan-info.html In particular, we have recently added to our web site a description of a set of sample projects of particular current interest to the Sloan faculty, to give a more concrete idea of our work to those contemplating entering neuroscience from another field. PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR EMAIL sloan-info at phy.ucsf.edu. 
From erik at bbf.uia.ac.be Thu Jan 13 09:11:22 2000 From: erik at bbf.uia.ac.be (Erik De Schutter) Date: Thu, 13 Jan 2000 15:11:22 +0100 Subject: CNS 2000 Call For Papers Message-ID: CALL FOR PAPERS Ninth Annual Computational Neuroscience Meeting CNS*2000 July 16-20, 2000 Brugge, Belgium http://cns.numedeon.com/cns2000 DEADLINE FOR SUMMARIES AND ABSTRACTS: **>> 11:59 pm January 26, 2000 <<<<** This is the ninth annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. THIS YEAR'S MEETING The meeting in 2000 will take place for the first time in Europe, in Brugge, Belgium from the 16th to the 20th of July. The meeting will officially start at 9 am, Sunday, July 16th and end with the annual banquet on Thursday evening, July 20th. There will be no parallel sessions. The meeting will include time for informal workshops organized both before and during the meeting. The meeting will be held at the Congress Centre Old Saint-John in Brugge (http://www.brugge.be/toerisme/en/meetinge.htm). Brugge is known in some circles as "The Venice of the North" and is an old and beautiful city with a modern conference center and easy access to international travel connections. Housing accommodations will be available in numerous nearby hotels. SUBMISSION INSTRUCTIONS With this announcement we solicit paper submissions to the meeting. Papers can include experimental, model-based, as well as more abstract theoretical approaches to understanding neurobiological computation. We especially encourage papers that mix experimental and theoretical studies. We also accept papers that describe new technical approaches to theoretical and experimental issues in computational neuroscience. Papers for the meeting should be submitted electronically using a custom designed JAVA/HTML interface found at the meeting web site: http://cns.numedeon.com/cns2000. Authors must submit two descriptions of completed work. First, a 100 word abstract must be provided that succinctly describes the research results. This abstract is published in the conference program as well as on the meeting web site. Authors must also submit a 1000 word description of their research. This description is in the review process and should clearly state the objectives and context for the work as well as the results and its significance. Information on all authors must be entered. Submissions will not be considered if they lack any of the required information or if they arrive late. All submissions will be acknowledged immediately by email. It is important to note that this notice, as well as all other communication related to the paper will be sent to the designated correspondence author only. Full instructions for submission can be found at the meeting web site: http://cns.numedeon.com/cns2000. THE REVIEW PROCESS All papers submitted to CNS are peer reviewed. Because the meeting this year will be held in Europe, we have accelerated the process of paper acceptance to allow more time to make travel plans. For this reason, the review process will take place in two rounds. In the first papers will be judged and accepted for the meeting based on the clarity with which the work is described and the biological relevance of the research. 
For this reason, authors should be careful to make the connection to biology clear in both the 100-word abstract and the 1000-word research summary. We expect to notify authors of meeting acceptance by the second week of February. The second stage of review will take place in March and involves evaluation of each submission by two referees. The primary objective of this round of review will be to select papers for oral presentation. All accepted papers not selected for oral talks, as well as papers explicitly submitted as poster presentations, will be included in one of three evening poster sessions. Authors will be notified of the presentation format of their papers no later than the second week of May, 2000. CONFERENCE PROCEEDINGS All research accepted and presented at the CNS meeting is eligible for publication in the CNS proceedings. The proceedings volume is published each year as a special supplement to the journal 'Neurocomputing'. In addition, the proceedings are published in a hardbound edition by Elsevier Press. Six-page proceedings papers are submitted in October following the meeting. For reference, papers presented at CNS*98 can be found in volumes 26 and 27 of Neurocomputing, published in 1999. STUDENT TRAVEL GRANTS We have made an extra effort this year to raise funds to provide travel grant supplements for students and postdoctoral fellows presenting papers. Also, travel grants will be available for both European and USA participants. While grants are awarded based on need, we anticipate that any presenting student requiring a travel supplement will be able to receive some assistance. In addition, the program committee has arranged very inexpensive housing for students in Brugge. FURTHER MEETING CORRESPONDENCE Additional questions about this year's meeting or the paper submission process can be sent via email to cns2000 at bbb.caltech.edu or via surface mail to: CNS*2000 Division of Biology 216-76 Caltech Pasadena, CA 91125 CNS*2000 ORGANIZING COMMITTEE: Co-meeting Chair / Logistics - Erik De Schutter, University of Antwerp Co-meeting Chair / Finances and Program - Jim Bower, Caltech Governmental Liaison - Dennis Glanzman, NIMH/NIH Workshop Organizer - Maneesh Sahani, University College, London CNS*2000 PROGRAM COMMITTEE: Avrama Blackwell, George Mason University Anders Lansner, Royal Institute of Technology, Sweden Chip Levy, University of Virginia Ray Glantz, Rice University David Horn, University of Tel Aviv Ranu Jung, University of Kentucky Steven J. Schiff, George Mason University Simon Thorpe, CNRS, Toulouse, France From eric at research.nj.nec.com Thu Jan 13 15:28:20 2000 From: eric at research.nj.nec.com (Eric B. Baum) Date: Thu, 13 Jan 2000 15:28:20 -0500 (EST) Subject: Career Opportunities Message-ID: <14462.13606.868870.801430@borg22.nj.nec.com> The NEC Research Institute (NECI) has immediate openings for outstanding researchers in computer science. Candidates are expected to establish a basic research program of international stature. Ph.D.s are required. NECI currently has programs in theory; machine learning; web computing; computer vision; computational linguistics; operating systems, programming languages and compilers; and parallel architecture. We are primarily seeking applicants who work on machine learning of relevance to web applications, or on systems with relevance to web applications. However, we will also give consideration to exceptional applicants in any of our existing areas of focus. 
The NEC Research Institute, founded ten years ago, has as its mission basic research in Computer Science and Physics underlying future computer and communication technologies. The Institute offers unusual opportunities in that: 1) Members are free to decide on their own basic research directions and projects; 2) Positions include budgets for research, support staff, travel, and equipment; and 3) All results are published in the open literature. Located near Princeton, NJ, NECI has close ties with many outstanding research universities in the area. The Institute's laboratories are state-of-the-art and include several high-end parallel compute servers. For more details about NECI, see http://www.neci.nj.nec.com. Applicants must show documentation of eligibility for employment. NECI is an equal opportunity employer. Full applications should include resumes, copies of selected publications, names of at least three references, and a two-page statement of proposed research directions. Applications will be reviewed beginning February 1, 2000. Please send applications or inquiries to: David L. Waltz VP, Computer Science Research NEC Research Institute 4 Independence Way Princeton, NJ 08540 cs-candidates at research.nj.nec.com From akaysha at kongzi.unm.edu Fri Jan 14 01:30:49 2000 From: akaysha at kongzi.unm.edu (Akaysha Tang) Date: Thu, 13 Jan 2000 23:30:49 -0700 (MST) Subject: No subject Message-ID: From jchsieh at vghtpe.gov.tw Sat Jan 15 20:32:26 2000 From: jchsieh at vghtpe.gov.tw (Jen-Chuen Hsieh) Date: Sun, 16 Jan 2000 09:32:26 +0800 Subject: Faculty and Post-doc positions available Message-ID: <006b01bf5fc1$92bab660$c43efea9@fmrilab> Faculty and Post-Doc Position(s) Available in Taiwan (National Yang-Ming University and Taipei Veterans General Hospital) on fMRI/Magnetoencephalography, Advanced Signal Processing, and Human Brain Science (preferably cognitive neuropsychology). Wanted: Cognitive neuroscientists, cognitive neuropsychologists, programmers, computer scientists, and physicists to join our growing Human Brain Research Group. The National Yang-Ming University and Taipei Veterans General Hospital of Taiwan have on campus an encompassing setup of PET, 3T-MRI, whole-head MEG, ERP and TMS for brain research. With my best regards! JC -------------------------------------------------- Jen-Chuen Hsieh, MD, PhD Associate Professor & Project Coordinator Integrated Brain Research Unit Taipei Veterans General Hospital; Institute of Neuroscience, School of Life Science Department of Medicine, School of Medicine National Yang-Ming University No.201, Sect.2, Shih-Pai Rd. 
112, Taipei, Taiwan email: jchsieh at vghtpe.gov.tw tel: (886)-2-28757480 fax: (886)-2-28757612 From terry at salk.edu Mon Jan 17 06:23:12 2000 From: terry at salk.edu (terry@salk.edu) Date: Mon, 17 Jan 2000 03:23:12 -0800 (PST) Subject: Telluride Workshop 2000 Message-ID: <200001171123.DAA15542@hebb.salk.edu> NEUROMORPHIC ENGINEERING WORKSHOP Sunday, JUNE 25 - Saturday, JULY 15, 2000 TELLURIDE, COLORADO ------------------------------------------------------------------------ Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Timmer HORIUCHI (Johns Hopkins University) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) ------------------------------------------------------------------------ We invite applications for a three-week summer workshop that will be held in Telluride, Colorado from Sunday, June 25 to Saturday, July 15, 2000. The application deadline is Friday, March 3, and application instructions are described at the bottom of this document. The 1999 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, NASA, the Office of Naval Research, and by the Center for Neuromorphic Systems Engineering at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available on the previous workshop web pages, and we strongly encourage interested parties to browse through them. GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological and engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on active participation, with demonstration systems and hands-on experience for all participants. Neuromorphic engineering has a wide range of applications, from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware (with an emphasis on analog and asynchronous digital VLSI), are inspired by biological systems. However, existing applications are modest, and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. The assumption underlying this three-week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems. FORMAT: The three-week summer workshop will include background lectures on systems neuroscience (in particular learning, oculo-motor and other motor systems, and attention), practical tutorials on analog VLSI design, small mobile robots (Koalas and Kheperas), hands-on projects, and special interest groups. 
Participants are required to take part and possibly complete at least one of the projects proposed (soon to be defined). They are furthermore encouraged to become involved in as many of the other activities proposed as interest and time allow. There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials, rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons, and after dinner. The analog VLSI practical tutorials will cover all aspects of analog VLSI design, simulation, layout, and testing during the three weeks of the workshop. The first week covers basics of transistors, simple circuit design and simulation. This material is intended for participants who have no experience with analog VLSI. The second week will focus on design frames for silicon retinas, from the silicon compilation and layout of on-chip video scanners, to building the peripheral boards necessary for interfacing analog VLSI retinas to video output monitors. Retina chips will be provided. The third week will feature sessions on floating gates, including lectures on the physics of tunneling and injection, and on inter-chip communication systems. We will also feature a tutorial on the use of small, mobile robots, focussing on Koala's, as an ideal platform for vision, auditory and sensory-motor circuits. Projects that are carried out during the workshop will be centered in a number of groups, including * active vision * audition * olfaction * motor control * central pattern generator * robotics, multichip communication * analog VLSI * learning The active perception project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include a robot head active vision system consisting of a three degree-of-freedom binocular camera system that is fully programmable. The central pattern generator group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPG's and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots. The robotics group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning. The audition group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform. The multichip communication project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed. 
LOCATION AND ARRANGEMENTS: The workshop will take place at the new Telluride Public High School (and not at the Elementary School, which is being renovated this year) located in the small town of Telluride, 9000 feet high in Southwest Colorado, about a six-hour drive from Denver (350 miles). Continental and United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems-level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). Internet access will be provided. Technical staff present throughout the workshops will assist with software and hardware issues. We will have a network of workstations running UNIX, Macs, and PCs running Linux and Microsoft Windows. Unless otherwise arranged with one of the organizers, we expect participants to stay for the entire duration of this three-week workshop. FINANCIAL ARRANGEMENT: We have several funding requests pending to pay for most of the costs associated with this workshop. As in 1999, after notifications of acceptance have been mailed out around April 1, 2000, participants are expected to pay a $275 workshop fee. In case of real hardship, this can be waived. Shared condominiums will be provided for all academic participants at no cost to them. We expect participants from national laboratories and industry to pay for these modestly priced condominiums. We expect to have funds to reimburse student participants for travel (up to $500 for US domestic travel and up to $800 for overseas travel). Please specify on the application whether such financial help is needed. HOW TO APPLY: The deadline for receipt of applications is March 3, 2000. Applicants should be at the level of graduate students or above (i.e. postdoctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply. Applications should include: * First name, Last name, valid email address. * Curriculum Vitae. * One page summary of background and interests relevant to the workshop. * Description of special equipment needed for demonstrations that could be brought to the workshop. * Two letters of recommendation. Complete applications should be sent to: Prof. Terrence Sejnowski The Salk Institute 10010 North Torrey Pines Road San Diego, CA 92037 email: terry at salk.edu FAX: (619) 587 0417 Applicants will be notified by email around March 31, 
2000. From vera at cs.cas.cz Mon Jan 17 17:30:35 2000 From: vera at cs.cas.cz (Vera Kurkova) Date: Mon, 17 Jan 00 17:30:35 CET Subject: Call for papers NNW'2000 Message-ID: <63036.vera@uivt1.uivt.cas.cz> Call for papers and participation: NNW 2000 10th Anniversary International Conference on Artificial Neural Networks and Intelligent Systems Prague, Czech Republic, July 9-12, 2000 Purpose of the conference The main focus of NNW2000 is the development and application of computational paradigms inspired by natural processes, namely artificial neural networks, evolutionary algorithms, and related subjects including adaptive agents, artificial life, soft computing, etc. The conference takes place in the year of the 10th anniversary of the founding of the Neural Network World international scientific journal. It is jointly organized by the Institute of Computer Science of the Academy of Sciences of the Czech Republic, the Neural Network World Editorial Board, and the Action M Agency. Important Dates Submission of draft version of paper: February 15, 2000 Notification of acceptance: April 15, 2000 Delivery of revised papers: May 30, 2000 NNW2000 conference: July 9-12, 2000 Conference topics The following list indicates the areas of interest, but it is not exhaustive: * Neural Networks: Architectures, Algorithms, Approximation, Complexity, Biological Foundations * Evolutionary Computation: Genetic Algorithms, Genetic Programming, Classifier Systems, Artificial Life * Hybrid Systems: Fuzzy Logic, Soft Computing, Neuro-Fuzzy Controller, Genetic Learning of Neural Networks * Adaptive Agents: Models and Architectures, Distribution and Cooperation, AI Agents, Software Agents, Complex Adaptive Systems * Applications: Pattern Recognition, Signal Processing, Simulation, Hardware Implementation, Robotics Call for papers Prospective authors are invited to submit a draft version of a paper describing original results for review by an international Program Committee. The paper should be written in English and should not exceed 8 pages. Proposals for tutorials, special sessions, and workshops are also invited. Electronic submission of papers is preferred; it is possible via the conference web site or by email to nnw2000 at uivt.cas.cz. Alternatively, three printed copies can be sent to the following address: Roman Neruda Institute of Computer Science Academy of Sciences of the Czech Republic PO Box 5 18207 Prague Czech Republic Accepted papers will be published in a special issue of the Neural Network World journal and will be available at the conference. Submitting the final version of papers will follow the standard Neural Network World instructions for authors: LaTeX versions (standard article document class) with Encapsulated Postscript figures are preferred; alternative formats (such as MS Word) are possible. 
Further information * Updated information is available at the conference web site: http://www.cs.cas.cz/nnw2000 * Contact the organizers by email: nnw2000 at cs.cas.cz From barba at cvs.rochester.edu Mon Jan 17 12:13:07 2000 From: barba at cvs.rochester.edu (Barbara Arnold) Date: Mon, 17 Jan 2000 13:13:07 -0400 Subject: 22nd CVS Symposium 2000 (Center for Visual Science) Message-ID: 22nd CVS Symposium 2000 NEURAL CODING June 1 - 3, 2000 For more information or an application, see our website: www.cvs.rochester.edu or contact Barbara Arnold at 716-275-8659 or barba at cvs.rochester.edu One of the fundamental difficulties in understanding the neural basis of perception/cognition is understanding the computational or informational significance of neural activity. This is true at all levels: from individual synapses and neurons, to local circuits and large-scale organization. The enormous complexity of the brain and the behavior it generates demands more sophisticated development of theories of neural coding and communication on a large scale. In the tradition of past CVS Symposia, our goal is to bring recent developments in this fundamentally important topic to a broader audience than that captured by more specialized meetings. We have designed the symposium to bring together leading scientists with diverse perspectives to provide an opportunity for cross-fertilization and interaction that is not usually available. PROGRAM FOR THE MEETING Thursday, June 1 I. Information Coding in Spike Trains II. Early Circuits Friday, June 2 III. Coding Experience: development and plasticity IV. Functional specialization and distributed codes Saturday, June 3 V. Large Scale information flow. SPEAKERS FOR MEETING MOSHE ABELES, Hebrew University DANA BALLARD, University of Rochester KEN BRITTEN, UC Davis CAROL COLBY, University of Pittsburgh MAURIZIO CORBETTA, Washington University DAVID J. FIELD, Cornell University ZACHARY F. MAINEN, Cold Spring Harbor Lab RAFAEL MARCOS YUSTE, Columbia University KEN MILLER, University of California-San Francisco R.CLAY REID, Harvard Medical School TERRENCE J. SEJNOWSKI, Salk Institute for Biological Studies JEFFREY D. SCHALL, Vanderbilt University ROBERT SHAPLEY, New York University ADAM SILLITO, University College London MICHAEL WELIKY, University of Rochester ANTHONY M. ZADOR, Salk Institute ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Barbara N. Arnold Administrator email: barba at cvs.rochester.edu Center for Visual Science phone: 716 275 8659 University of Rochester fax: 716 271 3043 Meliora Hall 274 Rochester NY 14627-0270 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From amari at brain.riken.go.jp Tue Jan 18 00:09:35 2000 From: amari at brain.riken.go.jp (Shunichi Amari) Date: Tue, 18 Jan 2000 14:09:35 +0900 Subject: RIKEN Summer School Message-ID: <20000118140935P.amari@brain.riken.go.jp> RIKEN Brain Science Institute will organize Summer Program 2000 The Brain Science Institute (BSI) at RIKEN is offering a summer program to train advanced students interested in brain function. Applicants may choose either a laboratory internship for two months with one of the 30 Laboratories at BSI, or participate in an intensive two-week lecture course featuring a distinguished international faculty. Summer Interns (Plan A) also enroll in the Lecture Course (Plan B). Travel and lodging expenses will be supported. 
Deadline: March 31, 2000 INTERNSHIP (PLAN A) July 4 - Sept 1 (Laboratories at RIKEN BSI) * Neuronal Function Research Group (K.Mori/Y.Yoshihara/R.Yano/T.K.Hensch) * Neuronal Circuit Mechanisms Research Group (M.Ito/H.Niki/T.Knopfel) * Cognitive Brain Science Group (K.Tanaka/M.Tanifuji/A.A.Ioannides) * Developmental Brain Science Group (K.Mikoshiba/H.Okamoto/T.Furuichi/K.Kajiwara) * Molecular Neuropathology Group (N.Nukina/K.Yamakawa/R.Takahashi/T.Okamoto) * Aging and Psychiatric Research Group (T.C.Saido/A.Takashima/T.Yoshikawa) * Brainway Group (G.Matsumoto/M.Ichikawa) * Brain-Style Information Systems Research Group (S.Amari/S.Tanaka/A.J.Cichocki) * Advanced Technology Development Center (ATDC) (C.Itakura/T.Hashikawa/S.Itohara/A.Miyawaki/M.Ogawa) LECTURE COURSE (PLAN B) Topic: 'How the Brain Works: Experimental and Theoretical Approaches' July 4 - July 15 The purpose of the course is to provide basic concepts necessary for understanding computation in the brain at different levels from synapses to systems, and from both experimental and theoretical perspectives. Individual lecturers will provide basics of their field and advanced topics. (Lecturers) Tomoyuki Takahashi (University of Tokyo) Anthony M. Zador (Cold Spring Harbor Laboratory) Idan Segev (The Hebrew University) Bruce L. McNaughton (University of Arizona) Kensaku Mori (RIKEN/University of Tokyo) Ad Aertsen (Albert-Ludwigs-University) Shun-ichi Amari (RIKEN) Kathleen S. Rockland (University of Iowa) Masakazu Konishi (California Institute of Technology) Mitsuo Kawato (ATR Human Info. Proc. Res. Labs.) Earl K. Miller (RIKEN/Massachusetts Institute of Technology) Nancy G. Kanwisher (Massachusetts Institute of Technology) Shimon Ullman (The Weizmann Institute of Science) Keiji Tanaka (RIKEN) Okihide Hikosaka (Juntendo University) Jun Tanji (Tohoku University) Charles Jennings (Nature Neuroscience, Editor) others to be included FURTHER INFORMATION Application forms: visit our web site http://summer.brain.riken.go.jp or send inquiries to Summer Program Organizing Committee, BSI, RIKEN 2-1 Hirosawa, Wako-shi, Saitama 351-0198, JAPAN E-mail: info at summer.brain.riken.go.jp Fax: +81-48-462-4914 From priel at math.tau.ac.il Tue Jan 18 05:37:09 2000 From: priel at math.tau.ac.il (avner priel) Date: Tue, 18 Jan 2000 12:37:09 +0200 (GMT+0200) Subject: PhD Thesis Message-ID: Dear Connectionists, My PhD thesis is now available from the following URL: http://www.math.tau.ac.il/~priel/papers.html Below please find the abstract. Best wishes, Priel Avner. ----------------------------------------------------- Priel Avner < priel at math.tau.ac.il > < http://www.math.tau.ac.il/~priel > School of Mathematical Sciences, Tel-Aviv University, Israel. ----------------------------------------------------------------------- ----------------------------------------------------------------------- "Dynamic and Static Properties of Neural Networks with FeedBack" Ph.D. Thesis Avner Priel Department of Physics Bar-Ilan University, Israel. ABSTRACT: This thesis describes an analytical and numerical study of time series generated by a special type of recurrent neural network: a continuous-valued feed-forward network in which the next input vector is determined from past output values. 
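[Editor's note: as a rough illustration of the kind of generator studied in the thesis (this is not code from the thesis; the network sizes, weight scales and the tanh transfer function are illustrative assumptions), the following Python sketch iterates a small feed-forward network whose input is a delay line holding its own past outputs.]

import numpy as np

def generate_sequence(n_steps=2000, n_delay=8, n_hidden=3, seed=0):
    # Iterate a two-layer feed-forward network whose input vector is a
    # delay line holding its own last n_delay output values.
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=1.5, size=(n_hidden, n_delay))  # input -> hidden weights
    w_out = rng.normal(scale=1.0, size=n_hidden)            # hidden -> output weights
    x = rng.normal(scale=0.1, size=n_delay)                 # initial delay-line contents
    outputs = []
    for _ in range(n_steps):
        s = np.tanh(w_out @ np.tanh(W_in @ x))              # graded (continuous-valued) output
        outputs.append(s)
        x = np.roll(x, 1)                                   # shift the delay line ...
        x[0] = s                                            # ... and feed the output back
    return np.array(outputs)

# After a transient the orbit settles onto an attractor; in the stable regime
# the thesis finds this to be generically quasi-periodic (or a fixed point).
seq = generate_sequence()
print(seq[-10:])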
The topics covered in this work include the analysis of the sequences generated by the network in the stable and unstable regimes of the parameter space, the effect of additive noise on the long-term properties of the network, and the ability of the model to capture the rule of a long-range correlated sequence. The asymptotic solutions of the sequences generated by the model in the stable regime are found analytically for various architectures, transfer functions, and choices of the weights. We find that the generic solution is a quasi-periodic attractor (excluding the cases where the solution is a fixed point). We find a hierarchy among the complexity of time series generated by different architectures; more hidden units can generate higher dimensional attractors. The relaxation time from an arbitrary initial condition to the vicinity of the asymptotic attractor is studied for the case of a perceptron and a two-layered perceptron. In both cases, the relaxation time scales linearly with the size of the network. Although networks with monotonic, as well as non-monotonic, transfer functions are capable of generating chaotic sequences, the unstable regions of the parameter space exhibit different features. Non-monotonic functions can produce robust chaos, whereas monotonic functions generate fragile chaos only. In the case of non-monotonic functions, the number of positive Lyapunov exponents increases as a function of one of the free parameters in the model; hence, high dimensional chaotic attractors can be generated. We also study a combination of monotonic and non-monotonic functions. The stability of the asymptotic results obtained for the model is tested by analysing the effect of additive noise introduced in the output of the network. A single attractor in the presence of noise is broadened. The phase of a noisy model diffuses w.r.t. a noise-free model with a diffusion constant which is inversely proportional to the size of the network; hence, phase coherence is maintained for a time length that is proportional to the network's size. When the network has more than a single possible attractor, the attractors become meta-stable in the presence of noise. We study the properties of an important quantity, the mean first passage time, and derive a relation between the size of the network, the distance from the bifurcation point, and the mean first passage time to escape from the basin of attraction. The last subject we address concerns the capability of our model to learn the rule of a sequence obeying a power-law correlation function. An ensemble of long sequences is generated and used to train the network. The trained networks are then used to generate subsequent sequences. It is found that the generated sequences have a power-law correlation function similar to that of the training sequences. By studying the properties of the trained networks, we conclude that the correlation function of the weight matrix should be dominated by vertical power-law correlations in order to generate long-range correlated sequences. This conclusion is verified by numerical simulations. Analysis of the mean-field approximation of the correlation function leads to the same qualitative conclusion regarding the weight matrix. From kruschke at indiana.edu Wed Jan 19 06:35:15 2000 From: kruschke at indiana.edu (John K. 
Kruschke) Date: Wed, 19 Jan 2000 06:35:15 -0500 (EST) Subject: Postdoctoral Position in Cognitive Modeling Message-ID: POSTDOCTORAL TRAINING FELLOWSHIPS in the COGNITIVE SCIENCE PROGRAM at INDIANA UNIVERSITY in MODELING OF COGNITIVE PROCESSES The Psychology Department and Cognitive Science Program at Indiana University anticipate one or more Postdoctoral Traineeships in the area of Modeling of Cognitive Processes, funded by the National Institutes of Health. Appointments will pay rates appropriate for a new or recent Ph.D. and will be for one or two years, beginning July 1, 2000 or later. Traineeships will be offered to qualified individuals who wish to further their training in mathematical modeling or computer simulation modeling, in any substantive area of cognitive psychology or Cognitive Science. Trainees will be expected to carry out original theoretical and empirical research in association with one or more of these faculty and their laboratories, and to interact with other relevant faculty and other pre- and postdoctoral trainees. In addition, they should plan to take or audit courses offered within the Cognitive Modeling Program. We are particularly interested in applicants with strong mathematical, scientific, and research credentials. Indiana University has superb computational and research facilities, and faculty with outstanding credentials in this area of research, including James Townsend, director of the training program, and Jerome Busemeyer, Robert Nosofsky, John Kruschke, Michael Gasser, Robert Goldstone, Geoffrey Bingham, Tom Busey, Donald Robinson, Robert Port, and Richard Shiffrin. Applicants should send an up-to-date vita, relevant reprints and preprints, a personal letter describing their research interests, background, goals, and career plans, and reference letters from two individuals. Women, minority group members, and handicapped individuals are urged to apply. Deadline for submission of application materials is April 1, 2000, but we encourage earlier applications. PLEASE NOTE: The conditions of our grant restrict all awards to U.S. citizens or current green card holders. Awards also have a 'payback' provision, generally requiring awardees to carry out research or teach (not necessarily at IU) for a minimum period after termination of the traineeship. Cognitive Science information may be obtained at http://www.psych.indiana.edu/ Send Materials to Professor Jerome R. Busemeyer Department of Psychology, Rm 367 Indiana University 1101 E. 10th St. Bloomington, IN 47405-7007 Voice: 812-855-4882 Fax: 812-855-1086 email: jbusemey at indiana.edu Indiana University is an Affirmative Action Employer From gini at elet.polimi.it Wed Jan 19 12:55:10 2000 From: gini at elet.polimi.it (Giuseppina Gini) Date: Wed, 19 Jan 2000 18:55:10 +0100 Subject: Research fellowships Message-ID: The positions offered can be of interest to people in this list. - Giuseppina Gini ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +++++ PRE- AND POSTDOCTORAL POSITIONS AVAILABLE Applications are invited from young researchers with an interest in joining a four-year (2000-2003) EU network project aimed at the development of models to predict toxicity, starting Spring 2000. The network consists of laboratories (see the list below) working in toxicology, computational chemistry and computer science. Grants are available to work in one of the seven laboratories of the Network. 
Grants will be from 1650 to 4320 Euro/month, depending on qualifications, experience, and location. Pre-doc positions are compatible with PhD studies. Post-doc positions require that the candidate holds (or is about to obtain) a PhD. Interested candidates should send an application (including CV, summary of research interests and preferred location) to: Dr Emilio Benfenati Head, Laboratory of Environmental Chemistry and Toxicology Istituto di Ricerche Farmacologiche "Mario Negri" Via Eritrea 62, 20157 Milano, Italy Tel: +39-02-39014420 Fax: +39-02-39001916 e-mail: benfenati at irfmn.mnegri.it http://www.irfmn.mnegri.it/ambsal/chem-toxi/Benfenati.htm http://www.irfmn.mnegri.it/ambsal/chem-toxi/Default.htm Network Title: Intelligent Modelling Algorithms for the General Evaluation of TOXicities (IMAGETOX) The Co-ordinator: Istituto di Ricerche Farmacologiche "Mario Negri", Laboratory of Environmental Chemistry and Toxicology, Milan, Italy Contact person: Dr Emilio Benfenati (benfenati at irfmn.mnegri.it) The Members - Dr. M. Cronin, Liverpool John Moores University, School of Pharmacy and Chemistry, United Kingdom (m.t.cronin at livjm.ac.uk) - Prof. J. Hermens, Utrecht University, Research Institute of Toxicology, Faculty of Veterinary Sciences, The Netherlands (j.hermens at ritox.vet.uu.nl) - Prof. G. Gini, Politecnico di Milano, Dipartimento di Elettronica e Informazione, Milano, Italy (gini at elet.polimi.it) - Dr. M. Vracko, National Institute of Chemistry, Ljubljana, Slovenia (marjan.vracko at ki.si) - Prof. G. Schüürmann, UFZ Centre for Environmental Research, Leipzig, Germany (gs at uoe.ufz.de) - Prof. M. Karelson, University of Tartu, Department of Chemistry, Estonia (mati at chem.ut.ee). Conditions for EU grants The researcher must be 35 years old or less at the time of his appointment (allowances are possible for special cases). The researcher must be a holder of a doctoral degree or of a degree in an appropriate subject in Science or Engineering. The appointment will be for a fixed term. The applicant must be a national of a Member State of the European Community or of an Associated State, or have resided in the European Community for at least five years prior to his appointment. The applicant must choose a Centre located in a state different from his national state, and he must not have carried out his activities in that state for more than 12 of the 24 months prior to his appointment. Research Program The present project aims to improve the power of models for toxicity prediction. It will take advantage of recent advances in computer science (such as multivariate analysis, neural networks, expert systems, machine learning, and simulated annealing). Candidates in Computer Science are expected to work on Machine Learning and KDD and to develop theories and systems. These will be applied to validation (verifying the robustness of reported models and of new predictive ones) and to the development of real-world applications. Different models will be developed for (eco)toxicology and for environmental fate prediction. For (eco)toxicology, different species will be considered, as well as different mechanisms of toxic action. In environmental fate, partitioning between compartments in the environment and into biota will be studied, as well as degradation. - - - - Giuseppina C. Gini DEI, Politecnico di Milano piazza L. 
da Vinci 32, I-20133 MILANO fax: (+39) 02-2399.3411 phone: (+39) 02-2399.3626 e-mail: gini at elet.polimi.it http://www.elet.polimi.it/people/gini/ http://www.elet.polimi.it/AAAI-PT member http://www.worldses.org From palm at neuro.informatik.uni-ulm.de Thu Jan 20 06:06:24 2000 From: palm at neuro.informatik.uni-ulm.de (Guenther Palm) Date: Thu, 20 Jan 00 12:06:24 +0100 Subject: KES2000-Sessions Message-ID: <10001201106.AA17467@neuro.informatik.uni-ulm.de> Dear connectionists, I am organizing two special sessions for the KES2000 conference. The conference will take place from August 30 to September 1, 2000 in Brighton, Sussex, U.K. More information on the conference can be obtained from the KES2000 Web site: http://luna.bton.ac.uk/~kes2000/ The topics of the sessions are: 1) Processing of hierarchical structures in neural networks. 2) Biomedical applications of neural networks. Accepted session contributions will be published in the conference proceedings by IEEE. The procedure for submissions is as follows: 1) Please send me a short statement concerning the topic of your intended contribution BEFORE FEBRUARY 04, 2000. This statement may contain a title and a short abstract and should not exceed half a page. Based on these statements we may further focus the topics of the two workshops. I will send out a call for papers ON FEBRUARY 11, 2000. 2) Send the camera-ready version of your paper (four A4 pages) BY MARCH 15, 2000. The papers will be reviewed and the results will be communicated to the authors in APRIL, 2000. ------------------------------------------------------------- Guenther Palm Neural Information Processing University of Ulm D-89069 Ulm Germany palm at neuro.informatik.uni-ulm.de From mpessk at guppy.mpe.nus.edu.sg Fri Jan 21 01:52:16 2000 From: mpessk at guppy.mpe.nus.edu.sg (S. Sathiya Keerthi) Date: Fri, 21 Jan 2000 14:52:16 +0800 (SGT) Subject: Tech Report on Convergence of SMO algorithm for SVMs Message-ID: The following Tech Report in gzipped postscript form is available at: http://guppy.mpe.nus.edu.sg/~mpessk/svm/conv1.ps.gz ----------------------------------------------------------------------- Convergence of a Generalized SMO Algorithm for SVM Classifier Design S.S. Keerthi Control Division Dept. of Mechanical and Production Engineering National University of Singapore Tech Rept. CD-00-01 Convergence of a generalized version of the modified SMO algorithms given by Keerthi et al. for SVM classifier design is proved. The convergence results are also extended to modified SMO algorithms for solving $\nu$-SVM classifier problems. ----------------------------------------------------------------------- From Zoubin at gatsby.ucl.ac.uk Thu Jan 20 13:28:17 2000 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Thu, 20 Jan 2000 18:28:17 +0000 (GMT) Subject: Preprints Available Message-ID: <200001201828.SAA21740@cajal.gatsby.ucl.ac.uk> The following 8 preprints from the Gatsby Computational Neuroscience Unit are now available on the web. These papers will appear in the Proceedings of NIPS 99 (Advances in Neural Information Processing Systems 12, edited by S. A. Solla, T. K. Leen, and K.-R. Müller, MIT Press). 
Zoubin Ghahramani Gatsby Computational Neuroscience Unit http://www.gatsby.ucl.ac.uk University College London ---------------------------------------------------------------------- Author: Hagai Attias Title: A Variational Bayesian Framework for Graphical Models URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99vb.ps ---------------------------------------------------------------------- Author: Hagai Attias Title: Independent Factor Analysis with Temporally Structured Sources URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99dfa.ps ---------------------------------------------------------------------- Author: Zoubin Ghahramani and Matthew J Beal Title: Variational Inference for Bayesian Mixtures of Factor Analysers URL: http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.ps.gz http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.pdf ---------------------------------------------------------------------- Author: Geoffrey E. Hinton and Andrew D. Brown Title: Spiking Boltzmann Machines URL: http://www.gatsby.ucl.ac.uk/~andy/papers/nips99_sbm.ps.gz ---------------------------------------------------------------------- Author: Geoffrey E. Hinton, Zoubin Ghahramani and Yee Whye Teh Title: Learning to Parse Images URL: http://www.gatsby.ucl.ac.uk/~ywteh/crednets ---------------------------------------------------------------------- Author: Zhaoping Li Title: Can V1 mechanisms account for figure-ground and medial axis effects? URL: http://www.gatsby.ucl.ac.uk/~zhaoping/prints/nips99abstract.html ---------------------------------------------------------------------- Author: Sam Roweis Title: Constrained Hidden Markov Models URL: http://www.gatsby.ucl.ac.uk/~roweis/papers/sohmm.ps.gz ---------------------------------------------------------------------- Author: Brian Sallans Title: Learning Factored Representations for Partially Observable Markov Decision Processes URL: PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps gzip'd PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps.gz PDF: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.pdf ==================================ABSTRACTS: ==================================Author: Hagai Attias Title: A Variational Bayesian Framework for Graphical Models URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99vb.ps ---------------------------------------------------------------------- Author: Hagai Attias Title: Independent Factor Analysis with Temporally Structured Sources URL: http://www.gatsby.ucl.ac.uk/~hagai/nips99dfa.ps ---------------------------------------------------------------------- Authors: Zoubin Ghahramani and Matthew J Beal Title: Variational Inference for Bayesian Mixtures of Factor Analysers Abstract: We present an algorithm that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. This procedure can automatically determine the optimal number of components and the local dimensionality of each component (i.e.\ the number of factors in each factor analyser). Alternatively it can be used to infer posterior distributions over number of components and dimensionalities. Since all parameters are integrated out the method is not prone to overfitting. Using a stochastic procedure for adding components it is possible to perform the variational optimisation incrementally and to avoid local maxima. Results show that the method works very well in practice and correctly infers the number and dimensionality of nontrivial synthetic examples. 
By importance sampling from the variational approximation we show how to obtain unbiased estimates of the true evidence, the exact predictive density, and the KL divergence between the variational posterior and the true posterior, not only in this model but for variational approximations in general. URL: http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.ps.gz http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips99.pdf ---------------------------------------------------------------------- Authors: Geoffrey E. Hinton and Andrew D. Brown Title: Spiking Boltzmann Machines Abstract: We first show how to represent sharp posterior probability distributions using real valued coefficients on broadly-tuned basis functions. Then we show how the precise times of spikes can be used to convey the real-valued coefficients on the basis functions quickly and accurately. Finally we describe a simple simulation in which spiking neurons learn to model an image sequence by fitting a dynamic generative model. URL: http://www.gatsby.ucl.ac.uk/~andy/papers/nips99_sbm.ps.gz ---------------------------------------------------------------------- Authors: Geoffrey E. Hinton, Zoubin Ghahramani and Yee Whye Teh Title: Learning to Parse Images Abstract: We describe a class of probabilistic models that we call credibility networks. Using parse trees as internal representations of images, credibility networks are able to perform segmentation and recognition simultaneously, removing the need for ad hoc segmentation heuristics. Promising results in the problem of segmenting handwritten digits were obtained. URL: http://www.gatsby.ucl.ac.uk/~ywteh/crednets ---------------------------------------------------------------------- Author: Zhaoping Li Title: Can V1 mechanisms account for figure-ground and medial axis effects? Abstract: When a visual image consists of a figure against a background, V1 cells are physiologically observed to give higher responses to image regions corresponding to the figure relative to their responses to the background. The medial axis of the figure also induces relatively higher responses compared to responses to other locations in the figure (except for the boundary between the figure and the background). Since the receptive fields of V1 cells are very small compared with the global scale of the figure-ground and medial axis effects, it has been suggested that these effects may be caused by feedback from higher visual areas. I show how these effects can be accounted for by V1 mechanisms when the size of the figure is small or is of a certain scale. They are a manifestation of the processes of pre-attentive segmentation which detect and highlight the boundaries between homogeneous image regions. URL: http://www.gatsby.ucl.ac.uk/~zhaoping/prints/nips99abstract.html ---------------------------------------------------------------------- Author: Sam Roweis Title: Constrained Hidden Markov Models Abstract: By thinking of each state in a hidden Markov model as corresponding to some spatial region of a fictitious _topology space_ it is possible to naturally define neighbouring states as those which are connected in that space. The transition matrix can then be constrained to allow transitions only between neighbours; this means that all valid state sequences correspond to connected paths in the topology space. 
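[Editor's note: a minimal sketch of the neighbourhood constraint described above, assuming states placed on a ring in the topology space; the state count, the neighbourhood radius and all variable names are illustrative choices, not taken from the paper.]

import numpy as np

def neighbour_constrained_transitions(n_states=20, radius=1, seed=0):
    # Allow transitions only between states whose ring distance is <= radius.
    rng = np.random.default_rng(seed)
    idx = np.arange(n_states)
    dist = np.abs(idx[:, None] - idx[None, :])
    dist = np.minimum(dist, n_states - dist)       # wrap-around (ring) distance
    mask = (dist <= radius).astype(float)
    A = rng.random((n_states, n_states)) * mask    # zero out forbidden transitions
    A /= A.sum(axis=1, keepdims=True)              # row-normalise to probabilities
    return A

A = neighbour_constrained_transitions()
# Each row of A sums to 1 and is non-zero only for neighbouring states, so any
# state sequence with non-zero probability is a connected path on the ring.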
I show how such _constrained HMMs_ can learn to discover underlying structure in complex sequences of high dimensional data, and apply them to the problem of recovering mouth movements from acoustics in continuous speech. URL: http://www.gatsby.ucl.ac.uk/~roweis/papers/sohmm.ps.gz ---------------------------------------------------------------------- Author: Brian Sallans University of Toronto and Gatsby Unit, UCL sallans at cs.toronto.edu Title: Learning Factored Representations for Partially Observable Markov Decision Processes Abstract: The problem of reinforcement learning in a non-Markov environment is explored using a dynamic Bayesian network, where conditional independence assumptions between random variables are compactly represented by network parameters. The parameters are learned on-line, and approximations are used to perform inference and to compute the optimal value function. The relative effects of inference and value function approximations on the quality of the final policy are investigated by learning to solve a moderately difficult driving task. The two value function approximations, linear and quadratic, were found to perform similarly, but the quadratic model was more sensitive to initialization. Both performed below the level of human performance on the task. The dynamic Bayesian network performed comparably to a model using a localist hidden state representation, while requiring exponentially fewer parameters. URL: PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps gzip'd PS: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.ps.gz PDF: http://www.gatsby.ucl.ac.uk/~sallans/papers/nips99.pdf ---------------------------------------------------------------------- From munro at lis.pitt.edu Fri Jan 21 17:00:38 2000 From: munro at lis.pitt.edu (Paul Munro) Date: Fri, 21 Jan 2000 17:00:38 -0500 (EST) Subject: Two NIPS papers available Message-ID: The following two papers from our group can be downloaded. The first paper is available only in postscript and the second is in both postscript and acrobat versions. The URLs can be found below with the abstracts. Paul Munro Internet: munro at sis.pitt.edu SIS Bldg 735 Voice: 412-624-9427 Department of Information Science Fax (new #): 412-624-2788 University of Pittsburgh Pittsburgh PA 15260 Personal HTML page = http://www.pitt.edu/~pwm/ (To appear in: Advances in Neural Information Processing Systems 12, edited by S. A. Solla, T. K. Leen, and K.-R. Mueller, MIT Press) Effects of spatial and temporal contiguity on the acquisition of spatial information Thea Ghiselli-Crippa and Paul W. Munro URL: www.pitt.edu/~pwm/nips99a.ps ABSTRACT Spatial information comes in two forms: direct spatial information (for example, retinal position) and indirect temporal contiguity information, since objects encountered sequentially are in general spatially close. The acquisition of spatial information by a neural network is investigated here. Given a spatial layout of several objects, networks are trained on a prediction task. Networks using temporal sequences with no direct spatial information are found to develop internal representations that have distances correlated with distances in the external layout. The influence of spatial information is analyzed by providing direct spatial information to the system during training that is either consistent with the layout or inconsistent with it. This approach allows examination of the relative contributions of spatial and temporal contiguity. 
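[Editor's note: the following toy reconstruction of the prediction task described in the abstract above is an editorial sketch, not the authors' simulation; the layout size, network size, learning rate and the exponential fall-off of transition probability with distance are all assumptions. It trains a small network to predict the next object visited in a proximity-biased walk and then checks whether distances between hidden codes correlate with distances in the external layout.]

import numpy as np

rng = np.random.default_rng(1)
n_obj, n_hid, n_steps, lr = 10, 5, 30000, 0.1

# Random 2-D layout; the probability of visiting object j right after object i
# falls off with spatial distance, so temporal contiguity carries (indirect)
# spatial information.
layout = rng.random((n_obj, 2))
d = np.linalg.norm(layout[:, None] - layout[None, :], axis=-1)
P = np.exp(-d / 0.2)
np.fill_diagonal(P, 0.0)
P /= P.sum(axis=1, keepdims=True)

W1 = rng.normal(scale=0.1, size=(n_hid, n_obj))  # one-hot input -> hidden
W2 = rng.normal(scale=0.1, size=(n_obj, n_hid))  # hidden -> softmax output

cur = 0
for _ in range(n_steps):
    nxt = rng.choice(n_obj, p=P[cur])
    h = np.tanh(W1[:, cur])                      # hidden code of the current object
    y = np.exp(W2 @ h)
    y /= y.sum()                                 # predicted distribution over the next object
    err = y.copy()
    err[nxt] -= 1.0                              # softmax cross-entropy gradient
    grad_h = W2.T @ err
    W2 -= lr * np.outer(err, h)
    W1[:, cur] -= lr * grad_h * (1.0 - h**2)
    cur = nxt

# Compare distances between hidden codes with distances in the external layout.
H = np.tanh(W1.T)                                # row i = hidden code of object i
hd = np.linalg.norm(H[:, None] - H[None, :], axis=-1)
iu = np.triu_indices(n_obj, 1)
print("hidden/layout distance correlation:", np.corrcoef(hd[iu], d[iu])[0, 1])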
LTD facilitates learning in a noisy environment Paul Munro and Gerardina Hernandez URL: www.pitt.edu/~pwm/nips99b.ps www.pitt.edu/~pwm/nips99b.pdf ABSTRACT Long-term potentiation (LTP) has long been held as a biological substrate for associative learning. Recently, evidence has emerged that long-term depression (LTD) results when the presynaptic cell fires after the postsynaptic cell. The computational utility of LTD is explored here. Synaptic modification kernels for both LTP and LTD have been proposed by other laboratories based studies of one postsynaptic unit. Here, the interaction between time-dependent LTP and LTD is studied in small networks. From harnad at coglit.ecs.soton.ac.uk Sun Jan 23 18:02:18 2000 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sun, 23 Jan 2000 23:02:18 +0000 (GMT) Subject: GESTURAL ORIGIN OF LANGUAGE: Psyc Call for Commentators Message-ID: Place/Catania: THE ROLE OF THE HAND IN THE EVOLUTION OF LANGUAGE The target article whose abstract appears below has today appeared in PSYCOLOQUY, a refereed online journal of Open Peer Commentary sponsored by the American Psychological Association. http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.007 ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/2000.volume.11/ psyc.00.11.007.language-gesture.1.place OPEN PEER COMMENTARY on this target article is now invited. Qualified professional biobehavioural, neural or cognitive scientists should consult PSYCOLOQUY's Websites or send email (below) for Instructions if not familiar with format or acceptance criteria for commentaries (all submissions are refereed). To submit articles or to seek information: EMAIL: psyc at pucc.princeton.edu URLs: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc ----------------------------------------------------------------------- psycoloquy.00.11.007.language-gesture.1.place Sun Jan 23 2000 ISSN 1055-0143 (59 paras, 58 refs, 1 figure, 1281 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 2000 Ullin T. Place THE ROLE OF THE HAND IN THE EVOLUTION OF LANGUAGE Target Article on Language Origins Ullin T. Place School of Philosophy University of Leeds School of Psychology University of Wales, Bangor, Wales UK Charles Catania Department of Psychology University of Maryland, Baltimore County 1000 Hilltop Circle Baltimore, Maryland 21250 USA catania at umbc.edu ABSTRACT: This target article has four sections. Section I sets out four principles which should guide any attempt to reconstruct the evolution of an existing biological characteristic. Section II sets out thirteen principles specific to a reconstruction of the evolution of language. Section III sets out eleven pieces of evidence for the view that vocal language must have been preceded by an earlier language of gesture. Based on those principles and evidence, Section IV sets out seven proposed stages in the process whereby language evolved: (1) the use of mimed movement to indicate an action to be performed, (2) the development of referential pointing which, when combined with mimed movement, leads to a language of gesture, (3) the development of vocalisation, initially as a way of imitating the calls of animals, (4) counting on the fingers leading into (5) the development of symbolic as distinct from iconic representation, (6) the introduction of the practice of question and answer, and (7) the emergence of syntax as a way of disambiguating utterances that can otherwise be disambiguated only by gesture. 
KEYWORDS: evolution, equivalence, gesture, homesigning, iconic, language, miming, pointing, protolanguage, referring, sentence, symbolic, syntax, vocalisation EDITOR'S NOTE: Ullin T. Place died on January 2, 2000. His target article had been reviewed for PSYCOLOQUY and was essentially complete at the time of his death. Some minor editing has been done by PSYCOLOQUY Associate Editor A. Charles Catania, mainly to bring the manuscript into conformity with PSYCOLOQUY style. Catania will consider replying to commentaries on this article, but also welcomes the participation of others who may feel they are familiar enough with Place's perspectives to do so. Retrieve the full target article at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.007 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/2000.volume.11/ psyc.00.11.007.language-gesture.1.place From fmdist at hotmail.com Sun Jan 23 14:37:21 2000 From: fmdist at hotmail.com (Fionn Murtagh) Date: Sun, 23 Jan 2000 11:37:21 PST Subject: RA posn. -visualizn. of user behavior in information spaces Message-ID: <20000123193721.29808.qmail@hotmail.com> A Research Assistant position will be available soon in Computer Science, The Queen's University of Belfast, in the area of visualization of user behavior in information spaces. Please let Fionn Murtagh (address below) know of your interest in this position. A short description of the work to be undertaken follows. The European (5th Framework) project "IRAIA - Getting Orientation in Complex Information Spaces as an Emergent Behavior of Autonomous Information Agents", which will last for two years, will be starting in March 2000. "Information retrieval systems of the future will be huge information repositories, distributed all over the world. Even the users will contribute to these repositories by communicating their experiences to other users who follow their footsteps. IRAIA's design metaphor focuses on the ants' system for communicating information. Navigating the web should allow people to leave pointers for those who might also navigate along the same paths." QUB work in this project will include the development of an architecture of different layers of abstraction that support the construction of a coordinate system based on ontologies and investigation of user behavior. Such user profiling will be based on information visualizations such as Kohonen self-organizing feature maps or similar active maps based on linkage graphs. These will be interfaced to the agent (CORBA, EJB) environment used. In more open research, the fact that map visualizations are used means that we will also seek to relate and exploit intriguing technologies used in digital image transmission - thinwire transmission technologies, and foveation-based strategies, based on multiscale transforms. ------------------------------------------------------------ Prof. F. 
Murtagh, School of Computer Science, The Queen's University of Belfast, Belfast BT7 1NN, Northern Ireland http://www.cs.qub.ac.uk/~F.Murtagh f.murtagh at qub.ac.uk Centre for Image and Vision Systems http://www.qub.ac.uk/ivs ------------------------------------------------------------ ______________________________________________________ Get Your Private, Free Email at http://www.hotmail.com From Thomas.Wennekers at mis.mpg.de Mon Jan 24 12:52:18 2000 From: Thomas.Wennekers at mis.mpg.de (Thomas Wennekers) Date: Mon, 24 Jan 2000 18:52:18 +0100 (MET) Subject: 3 papers on complex modes of synchronization Message-ID: <200001241752.SAA15915@s4-22.mis.mpg.de> Dear connectionists, The following three papers on complex modes of synchronization in networks of graded response and spiking neurons are now available from the web page: http://www.informatik.uni-ulm.de/ni/mitarbeiter/TWennekers.html Regards, Thomas. _________________________________________________________________ Generalized and Partial Synchronization of Coupled Neural Networks Frank Pasemann and Thomas Wennekers to appear in "Network: Computation in Neural Systems" Abstract: Synchronization of neural signals has been proposed as a temporal coding scheme representing cooperated computation in distributed cortical networks. Previous theoretical studies in that direction mainly focused on the synchronization of coupled oscillatory subsystems and neglected more complex dynamical modes, that already exist on the single-unit level. In the present work we study the parametrized time-discrete dynamics of two coupled recurrent networks of graded neurons. Conditions for the existence of partially synchronized dynamics of these systems are derived, referring to a situation where only subsets of neurons in each sub-network are synchronous. The coupled networks can have different architectures and even a different number of neurons. Periodic as well as quasiperiodic and chaotic attractors constrained to a manifold $M$ of synchronized components are observed. Examples are discussed for coupled 3-neuron networks having different architectures, and for coupled 2-neuron and 3-neuron networks. Partial synchronization of different degrees is demonstrated by numerical results for selected sets of parameters. In conclusion, the results show that synchronization phenomena far beyond completely synchronized oscillations can occur even in simple coupled networks. The type of the synchronization depends in an intricate way on stimuli, history and connectivity as well as other parameters of the network. Specific inputs can further switch between different operational modes in a complex way, suggesting a similarly rich spatio-temporal behavior in real neural systems. __________________________________________________________________ Complete Synchronization in Coupled Neuromodules of Different Types Frank Pasemann and Thomas Wennekers Theory in Biosciences 118:267-283, 1999. Abstract: We discuss the parametrized dynamics of two coupled recurrent neural networks comprising either additive sigmoid neurons in discrete time or biologically more plausible time-continuous leaky-integrate-and-fire cells. General conditions for the existence of synchronized activity in such networks are given, which guarantee that corresponding neurons in both coupled sub-networks evolve synchronously. It is, in particular, demonstrated that even the coupling of totally different network structures can result in complex dynamics constrained to a synchronization manifold $M$. 
For additive sigmoid neurons the synchronized dynamics can be periodic, quasiperiodic as well as chaotic, and its stability can be determined by Lyapunov exponent techniques. For leaky-integrate-and-fire cells synchronized orbits are typically periodic, often with an extremely long period duration. In addition to synchronized attractors there often co-exist asynchronous periodic, quasiperiodic and even chaotic attractors. ___________________________________________________________________ "Generalized Types of Synchronization in Networks of Spiking Neurons" Thomas Wennekers and Frank Pasemann: Submitted to Computational Neuroscience Conference, CNS 2000. Abstract: The synchronization of neural signals has been proposed as a temporal coding scheme in distributed cortical networks. Theoretical studies in that direction mainly focused on the synchronization of coupled oscillatory subsystems. In the present work we show that several complex types of synchronization previously described for graded response neurons appear similarly also in biologically realistic networks of spiking and compartmental neurons. This includes synchronized complex spatio-temporal behavior, partial and generalized synchronization. The results suggest a similarly rich spatio-temporal behavior in real neural systems and may guide experimental research towards the study of complex modes of synchronization and their neuromodulation. _________________________________________________________________ Thomas Wennekers Max-Planck-Institute for Mathematics in the Sciences Inselstrasse 22-26 04103 Leipzig Germany Phone: +49-341-9959-533 Fax: +49-341-9959-555 Email: Thomas.Wennekers at mis.mpg.de WWW : www.mis.mpg.de www.informatik.uni-ulm.de/ni/mitarbeiter/TWennekers.html ________________________________________________________________ From H.Bolouri at herts.ac.uk Mon Jan 24 15:03:23 2000 From: H.Bolouri at herts.ac.uk (Hamid Bolouri) Date: Mon, 24 Jan 2000 12:03:23 -0800 Subject: CFP: Computation in Cells Message-ID: <20000124120323.I13844@cns.caltech.edu> Call for Papers (submission deadline 14 February 2000) COMPUTATION IN CELLS : molecular & cellular networks as computational systems, e.g.: robustness in biochemical networks. computational and dynamical motifs in molecular biology. tools and algorithms for unravelling biochemical systems. computational properties of signalling pathways. models of gene regulation and genetic regulatory networks. evolution of biochemical and genetic networks. deterministic computation from stochastic interactions. cell differentiation and pattern formation. April 17 & 18th 2000 University of Hertfordshire, UK http://strc.herts.ac.uk/NSGweb/emergent/ A UK Eng. & Phys. Sci. Research Council Emergent Computing workshop Co-sponsored by: the Wellcome Trust, the British Computer Society, & British Telecom Invited speakers (who have confirmed so far, more to come!): Baltazar Aguda (Laurentian U, Canada) Maria Blair (Sheffield Hallam, UK) Mark Borisuk (Caltech, USA) Dennis Bray (U. Cambridge, UK) Igor Goryanin (Glaxo-SmithKline, UK) Charles Hodgman (Glaxo-SmithKline, UK) Maria Samsonova (Inst. for High Performance Computing, Russia) Denis Thieffry (Free U. of Brussels & U. Gent, Belgium) David Willshaw (U. 
Edinburgh, UK) Tau-Mu Yi (Caltech, USA)

From taketani at ics.uci.edu Mon Jan 24 00:50:25 2000 From: taketani at ics.uci.edu (Makoto Taketani) Date: Mon, 24 Jan 2000 14:50:25 +0900 Subject: Paper on a new method to study in-vitro network operations Message-ID: <4.1-J.20000124144641.013f1a80@binky.ics.uci.edu>

The following recent paper may be of interest to readers of this list who are interested in new methods for studying in-vitro network operations.

A new planar multielectrode array for extracellular recording: application to hippocampal acute slice Journal of Neuroscience Methods, 93, 61-67.

Abstract The present paper describes a new planar multielectrode array (the MED probe) and its electronics (the MED system), which are used to perform electrophysiological studies on acute hippocampal slices. The MED probe has 64 planar microelectrodes, is covered with a non-toxic, uniform insulation layer, and is further coated with polyethylenimine and serum. The MED probe is shown to be appropriate for both stimulation and recording. In particular, multi-channel recordings of field EPSPs obtained by stimulating with a pair of planar microelectrodes were established for rat hippocampal acute slices. The recordings were stable for six hours. Finally, a spatial distribution of long-term potentiation was studied using the MED system.

The full article can be downloaded from http://www.med64.com/publications.htm

------------------------------------------------------- Makoto Taketani, Ph.D. Center for the Neurobiology of Learning and Memory University of California Irvine, CA 92697-3800 Tel: 949-824-5770; FAX: 949-824-5737 Net: taketani at ics.uci.edu -------------------------------------------------------

From oby at cs.tu-berlin.de Tue Jan 25 05:35:55 2000 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Tue, 25 Jan 2000 11:35:55 +0100 (MET) Subject: preprints available Message-ID: <200001251035.LAA11438@pollux.cs.tu-berlin.de>

Dear Connectionists, attached please find abstracts and preprint locations of two manuscripts on the analysis of optical recording data and on visual cortex modelling. Comments are welcome! Cheers Klaus

----------------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ =============================================================================

Principal component analysis and blind separation of sources for optical imaging of intrinsic signals M. Stetter^1, I. Schiessl^1, T. Otto^1, F. Sengpiel^2, M. Huebener^2, T. Bonhoeffer^2, and K. Obermayer^1 ^1 Fachbereich Informatik, Technische Universitaet Berlin ^2 Max-Planck-Institute for Neurobiology, Martinsried

The analysis of data sets from optical imaging of intrinsic signals requires the separation of signals that accurately reflect stimulated neuronal activity (the mapping signal) from signals related to background activity. Here we show that blind separation of sources by Extended Spatial Decorrelation (ESD) is a powerful method for the extraction of the mapping signal from the total recorded signal. ESD is based on the assumptions (i) that each signal component varies smoothly across space and (ii) that every component has zero cross-correlation functions with the other components.
In contrast to the standard analysis of optical imaging data, the proposed method (i) is applicable to non-orthogonal stimulus-conditions, (ii) can remove the global signal, blood-vessel patterns and movement artifacts, (iii) works without ad hoc assumptions about the data structure in the frequency domain, and (iv) provides a confidence measure for the signals (Z-score). We first demonstrate on orientation maps from cat and ferret visual cortex, that Principal Component Analysis (PCA), which acts as a preprocessing step to ESD, can already remove global signals from image stacks, as long as data stacks for at least two -- not necessarily orthogonal -- stimulus conditions are available. We then show that the full ESD analysis can further reduce global signal components and -- finally -- concentrate the mapping signal within a single component both for differential image stacks and for image stacks recorded during presentation of a single stimulus. in: NeuroImage, in press available at: http://ni.cs.tu-berlin.de/publications/ ----------------------------------------------------------------------------- A mean field model for orientation tuning, contrast saturation and contextual effects in the primary visual cortex M. Stetter, H. Bartsch, and K. Obermayer Fachbereich Informatik, Technische Universitaet Berlin Orientation selective cells in the primary visual cortex of monkeys and cats are often characterized by an orientation-tuning width that is invariant under stimulus contrast. At the same time their contrast response function saturates or even super-saturates for high values of contrast. When two bar stimuli are presented within their classical receptive field, the neuronal response decreases with intersection angle. When two stimuli are presented inside and outside the classical receptive field, the response of the cell increases with intersection angle. Both cats and monkeys show iso-orientation suppression, which was sometimes reported to be combined with cross-orientation facilitation. This property has previously been described as sensitivity to orientation contrast. We address the emergence of these effects by a model which describes the processing of geniculocortical signals through cortical circuitry. We hypothesize that short intracortical fibers mediate the classical receptive field effects whereas long-range collaterals evoke contextual effects such as sensitivity to orientation contrast. We model this situation by setting up a mean-field description of two neighboring cortical hypercolumns, which may process a non-overlapping center and a (nonclassical) surround stimulus. Both hypercolumns interact via idealized long-range connections. For an isolated model hypercolumn we find, that either contrast saturation or contrast-invariant orientation tuning emerges, depending on the strength of the lateral excitation. There is no parameter regime, however, where both phenomena emerge simultaneously. In the regime, where contrast saturation is found, the model also correctly reproduces suppression due to a second, cross-oriented grid within the classical receptive field. If two model hypercolumns are mutually coupled by long-range connections which are iso-orientation specific, nonclassical surround stimuli show either suppression or facilitation for all surround orientations. Sensitivity to orientation contrast is not observed. This property requires excitatory-to-excitatory long-range couplings that are less orientation specific than those targeting inhibitory neurons. 
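The first of the two abstracts above rests on two working assumptions: every underlying component varies smoothly across space, and different components have vanishing spatial cross-correlations. The short NumPy sketch below shows one classical way to turn exactly these assumptions into a separation algorithm, namely a Molgedey-Schuster-style joint use of a zero-shift and a spatially shifted correlation matrix. It is a toy illustration of second-order blind separation, not the authors' ESD implementation; the function name, the shift parameter, and the synthetic test images are assumptions made only for this example.

import numpy as np

def spatial_decorrelation(frames, shift=3):
    # frames: array of shape (n_recordings, ny, nx); each recorded frame is
    # modelled as a linear mixture of underlying source images (mapping
    # signal, global signal, vessel pattern, ...).
    n, ny, nx = frames.shape
    X = frames.reshape(n, -1)
    X = X - X.mean(axis=1, keepdims=True)
    # Correlation matrices at zero shift and at a small spatial shift.
    # Smooth sources stay correlated with their own shifted copies, while
    # mutually uncorrelated sources make both matrices (nearly) diagonal
    # in the source coordinate system.
    Xs = np.roll(frames, shift, axis=2).reshape(n, -1)
    C0 = X @ X.T / X.shape[1]
    C1 = X @ Xs.T / X.shape[1]
    C1 = 0.5 * (C1 + C1.T)                       # symmetrize
    # Solving the generalized eigenproblem C1 w = lambda C0 w jointly
    # diagonalizes both matrices; the eigenvectors give unmixing directions.
    _, eigvecs = np.linalg.eig(np.linalg.solve(C0, C1))
    W = np.real(eigvecs).T
    return (W @ X).reshape(n, ny, nx), W

# Toy usage: two smooth synthetic sources mixed into three "recorded" frames.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:64, 0:64]
s1 = np.sin(xx / 6.0)                                    # map-like component
s2 = np.exp(-((xx - 32)**2 + (yy - 32)**2) / 400.0)      # global component
A = rng.normal(size=(3, 2))
frames = np.tensordot(A, np.stack([s1, s2]), axes=1) \
         + 0.01 * rng.normal(size=(3, 64, 64))
estimated_sources, W = spatial_decorrelation(frames)

Up to scaling and ordering, two of the returned components should resemble the two synthetic sources; real optical-imaging stacks would of course require the preprocessing and confidence measures described in the abstract.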
The paper will appear in Biological Cybernetics (in press) and is available at: http://ni.cs.tu-berlin.de/publications/

From X.Yao at cs.bham.ac.uk Wed Jan 26 05:37:55 2000 From: X.Yao at cs.bham.ac.uk (Xin Yao) Date: Wed, 26 Jan 2000 10:37:55 +0000 (GMT) Subject: evolutionary computing + neural networks Message-ID:

-------------------------------------------------------------------------- CALL FOR PAPERS Special Issue of Journal of INTEGRATED COMPUTER-AIDED ENGINEERING (Founded in 1993) on Evolutionary Computing and Neural Networks URL: http://www.cs.bham.ac.uk/~xin/icae_cfps.html

The international journal Integrated Computer-Aided Engineering is planning a special issue on the Combination of Evolutionary Computing and Neural Networks, to be published in early 2001. We are particularly interested in manuscripts focusing on the synergetic combination of evolutionary computation and neural networks. Please send five copies of your original unpublished manuscript by *March 1, 2000* to the Guest Editor: Professor Xin Yao School of Computer Science, The University of Birmingham Edgbaston, Birmingham B15 2TT, United Kingdom Phone: +44 121 414 3747, Fax: +44 121 414 4281 Email: x.yao at cs.bham.ac.uk

And one copy to the Editor-in-Chief: Professor Hojjat Adeli, Editor-in-Chief, ICAE Dept. of Civil and Environmental Engineering and Geodetic Science The Ohio State University, 470 Hitchcock Hall, 2070 Neil Avenue Columbus, OH 43210 U.S.A. Email: Adeli.1 at osu.edu

Submission of a manuscript implies that it is the author's original unpublished work and has not been submitted for publication elsewhere. Potential contributors can request a complimentary sample copy of the journal from the publisher, IOS Press (www.iospress.nl, Fax in Netherlands: 31-20-620 3419, Fax in U.S.A.: 1-703-323 3668). --------------------------------------------------------------------------

From jaksa at neuron-ai.fei.tuke.sk Wed Jan 26 12:33:57 2000 From: jaksa at neuron-ai.fei.tuke.sk (Rudolf Jaksa) Date: Wed, 26 Jan 2000 18:33:57 +0100 (CET) Subject: ISCI 2000 Message-ID: <14479.12293.769077.360302@neuron-ai.tuke.sk>

***************************************** Announcement of the fellowship program for young scientists' participation in ISCI 2000 *****************************************

The conference is supported by: the International Neural Network Society, the European Neural Network Society, the Asian-Pacific Neural Network Assembly, the Nuclear Power Plant Research Institute / Slovakia, and European Union 5th Framework Programme financial support. Type of event: Euroconference

***************************************** Who can ask for financial support? Any researcher from the EU who wants to attend the symposium can apply for financial support covering travel expenses, accommodation, and the registration fee. Applicants must be under 35. Preference will be given to active participants who submit a paper to the symposium. Proceedings will be published by Springer-Verlag.

***************************************** Which invited speakers will attend the event? The list of invited and confirmed speakers is as follows: Prof. Zadeh - USA Prof. Goldberg - USA Prof. Bezdek - USA Prof. Werbos - USA Prof. Zurada - USA Prof. Adeli - USA Dr. Igelnik - USA Dr. Merenyi - USA Prof. Hirota - Japan Prof. Fukushima - Japan Prof. Takagi - Japan Prof. Kasabov - New Zealand Prof. Moraga - Germany Prof. Pap - Yugoslavia Prof. Kacprzyk - Poland Prof. Duch - Poland Dr. Kurkova - Czech Republic Prof. Gams - Slovenia

***************************************** What is the deadline for fellowship application?
The deadline for fellowship application is 25-th of February 2000. Applicant should send mail with his basic personal data to e-mail isci at neuron-ai.tuke.sk Decision about grant award will be announced until 25-th of March 2000 ****************************************** ****************************************** ****************************************** ****************************************************************** International Symposium on Computational Intelligence ISCI - 2000 Kosice - Slovakia August 30 - September 1, 2000 ****************************************************************** Web pages worldwide: -------------------- Europe: http://neuron-ai.tuke.sk/cig/isci USA: http://cns.bu.edu/~kopco/isci Symposium is organized by: -------------------------- Faculty of Electrical Engineering and Informatics Technical University of Kosice, Slovakia Faculty of Chemical Technology and Faculty of Civil Engineering Slovak Technical University, Bratislava, Slovakia Symposium is supported by: -------------------------- International Neural Network Society Asian Pacific Neural Network Assembly European Neural Network Society Nuclear Power Plants Research Institute, Inc., Trnava, Slovakia Invited speakers - confirmed : Prof. Zadeh - USA Prof. Goldberg - USA Prof. Bezdek - USA Prof. Zurada - USA Prof. Igelnik - USA Prof. Werbos - USA Dr. E. Merenyi - USA Prof. Hirota - Japan Prof. Fukushima - Japan Prof. H. Takagi - Japan Prof. Kasabov - New Zealand Prof. Moraga - Germany Prof. Gams - Slovenia Prof. W. Duch - Poland Prof. Kacprzyk - Poland Prof. Pap - Yugoslavia Dr. Kurkova - Czech Republic Presentation of Companies : - Ecanse - Siemens Software System some others are under negotiation. Honorary chairpersons: ---------------------- Gail Carpenter - USA Lotfi Zadeh - USA David Goldberg - USA General chairmen: ----------------- V. Kvasnicka - Slovakia R. Mesiar - Slovakia P. Sincak - Slovakia Program Committee: ------------------ B. Kosko - USA J. Zurada - USA H. Adeli - USA E. Merenyi - USA B. Igelnik - USA Z. Michalewicz - USA R. Yager - USA M. Pelikan - USA N. Kopco - USA C. Moraga - Germany B. Reusch - Germany T. Beck - Germany D. Nauck - Germany H. Takagi - Japan K. Hirota - Japan K. Fukushima - Japan N. Kasabov - New Zeland D. Floreano - Switzerland J. Godjevac - Switzerland V. Babovic - Denmark M. O. Odetayo - UK L. Smith - Scotland, UK B. Krose - Netherlands A. Sperduti - Italy T. Gedeon - Australia Mohammadian Masoud - Australia V. Kurkova - Czech Republic J. Tvrdik - Czech Republic P. Osmera - Czech Republic J. Lazansky - Czech Republic P. Hajek - Czech Republic M. Mares - Czech Republic M. Navara - Czech Republic M. Novak - Czech Republic I. Taufer - Czech Republic L. Rutkowski - Poland J. Kacprzyk - Poland R. Tadeusiewicz - Poland L. Trysbus - Poland M. Gams - Slovenia L. Koczy - Hungary A. Varkonyi - Koczy - Hungary I. Ajtonyi - Hungary I. Rudas - Hungary J. Dombi - Hungary M. Jelasity - Hungary L. Godo - Spain F. Esteva - Spain E. Kerre - Belgium B. Bouchon-Meunier - France G. Raidl - Austria A. Uhl - Austria E. P. Klement - Austria E. Pap - Yugoslavia Slovak Program Committee members: --------------------------------- M. Kolcun, M. Hrehus, P. Tino, A. Cizmar, L. Benuskova, V. Pirc S. Figedy, L. Michaeli, I. Mokris, V. Olej, J. Pospichal, B. Riecan, P. Vojtas, G. Andrejkova, I. Farkas, J. Csonto, J. Chmurny, J. Sarnovsky, S. Kozak, L. Madarasz, D. Durackova, D. Krokavec, A. Kolesarova, J. Kelemen, R. Blasko. 
Scope of the Symposium: ----------------------- This symposium will be looking for answers to the following questions:
* What is the state of the art in Computational Intelligence?
* What are the potential applications of Computational Intelligence to real-world problems?
* What are the future trends in Computational Intelligence?

Researchers from all over the world are welcome at this symposium, which has the following goals:
1. Integration of the scientific communities working with fuzzy systems, neural networks and evolutionary computation approaches.
2. Strengthening of links between theory and real-world applications of CI and Soft Computing.
3. Promotion of Computational Intelligence in Central Europe, with emphasis on commercial presentation of CI-oriented companies.
4. Support of new technologies that improve quality of life and will lead to a user-friendly information society of the 21st century.
5. An opportunity for young researchers to learn and present new results from the CI domain and to integrate into the international research community.

The symposium will cover the following topics: ---------------------------------------------- Neural Networks, Fuzzy Systems, Evolutionary Computation, Neuro-Fuzzy & Fuzzy-Neuro Systems, Neuro-Fuzzy Hybrid Systems, Artificial Life, applications of Computational Intelligence tools to various real-world problems, and presentations of CI-oriented companies and their products.

Presentations of Companies: --------------------------- One of the goals of the symposium is to offer industrial companies a possibility to present their CI-based products and services aimed at various kinds of customers, including the banking industry, control engineering, speech and voice recognition, prediction and pattern recognition techniques, medicine, and many other application areas. Companies will have a chance to present their products in an oral presentation and also in a permanent display in the Symposium building. Space of up to 9 square meters per company will be available for display purposes.

Venue: ------ The city of Kosice is located in the eastern part of Slovakia. The first signs of habitation can be traced back to the end of the Older Stone Age. The first written mention of the settlement dates back to the year 1230. The city was granted royal privileges that were helpful in the development of crafts, businesses, and production. The oldest guild regulations were registered in the year 1307, and the city received its own coat of arms - the oldest coat of arms of all the cities in Europe - in 1369. In 1657, due to the economic, administrative and political importance of the city, the first university was established; later the university was converted into a royal university. Kosice is an important cultural, industrial, and educational center in Slovakia and Central Europe with more than 20,000 students. It has 250,000 inhabitants, and the historical center is one of the best-renovated centers in Central Europe. You may wish to visit Kosice virtually at http://www.kosice.sk

A post-meeting trip will be organized to the High Tatras. This mountain region is an area whose natural beauty makes it one of the most remarkable recreation areas not only in Slovakia, but also in the rest of Europe. A complete spectrum of hotels and restaurants awaits guests, whether they come looking for the beauty of the outdoors or for simple relaxation. The High Tatras environment also provides a wide variety of sports and recreation facilities.
More information can be found at http://www.tatry.sk ------------------------------------------------------------------

Important Dates:
----------------
* Extended Abstract (max. 4 pages) submission deadline: February 21, 2000
* Notification of authors: March 20, 2000
* Camera-ready: May 15, 2000
* Symposium date: August 30 - September 1, 2000

Proceedings
-----------
Proceedings from this symposium will be published in Springer-Verlag's "Studies in CI" series (edited by Prof. Janusz Kacprzyk).

Symposium fees:
---------------
                                              Before May 1   After May 1
University rate:                                 280 USD       360 USD
Student's rate:                                  100 USD       150 USD
Industrial rate:                                 400 USD       450 USD
Univ. exhibition space (4 sq. meters):           380 USD       450 USD
Industr. exhibition stand (8 sq. meters):        680 USD       880 USD
Symposium packages:
Univ. exhibition space (4 sq. meters)
  + ISCI fee (one person)                      = 590 USD       680 USD
Industrial exhibition stand (8 sq. meters)
  + ISCI fee (one person)                      = 910 USD      1100 USD
==================================================================
******************************************************************
A special fellowship program is available for Slovak and Czech participants from academia. For detailed information please contact the Symposium secretariat.
******************************************************************
All additional information, including the format for electronic submission, is available on the Symposium Web pages. Mailing address of the symposium secretariat: Dr. J. Vascak Computational Intelligence Group KKUI-FEI, TU Kosice, Letna 9, 042 00 Kosice, Slovakia E-mail: isci at neuron-ai.tuke.sk

From arenart at delta.ft.uam.es Thu Jan 27 03:04:06 2000 From: arenart at delta.ft.uam.es (Alfonso Renart) Date: Thu, 27 Jan 2000 09:04:06 +0100 (CET) Subject: 3 papers on Multi-modular associative N.Networks. Message-ID:

Dear Connectionists: The following 3 papers are available at the website: http://www.ft.uam.es/neurociencia/GRUPO/publications_group.html They deal with the subject of autoassociative recurrent networks in systems of several modules and their application to the study of working memory mechanisms in delay tasks. Sincerely, Alfonso Renart.

%%%%%%%%%%%%%%%%%%%%% Renart A., Parga N. and Rolls E. T. Backprojections in the cerebral cortex: implications for memory storage Neural Computation 11 (6): 1349-1388, 1999.

Abstract: Cortical areas are characterized by forward and backward connections between adjacent cortical areas in a processing stream. Within each area there are recurrent collateral connections between the pyramidal cells. We analyze the properties of this architecture for memory storage and processing. Hebb-like synaptic modifiability in the connections, and attractor states, are incorporated. We show the following: (1) The number of memories that can be stored in the connected modules is of the same order of magnitude as the number that can be stored in any one module using the recurrent collateral connections, and is proportional to the number of effective connections per neuron. (2) Cooperation between modules leads to a small increase in the memory capacity. (3) Cooperation can also help retrieval in a module which is cued with a noisy or incomplete pattern. (4) If the connection strength between modules is strong, then global memory states, which reflect the pairs of patterns on which the modules were trained together, are found. (5) If the intermodule connection strengths are weaker, then separate, local memory states can exist in each module.
(6) The boundaries between the global and local retrieval states, and the non-retrieval state, are delimited. All these properties are analyzed quantitatively with the techniques of statistical physics. %%%%% Renart A., Parga N. and Rolls E. T. Associative memory properties of multiple cortical modules NETWORK 10: 237-255, 1999. Abstract: The existence of recurrent collateral connections between pyramidal cells within a cortical area and, in addition, reciprocal connections between connected cortical areas, is well established. In this work we analyze the properties of a tri-modular architecture of this type in which two input modules have convergent connections to a third module (which in the brain might be the next module in cortical processing or a bi-modal area receiving connections from two different processing pathways). Memory retrieval is analyzed in this system which has Hebb-like synaptic modifiability in the connections and attractor states. Local activity features are stored in the intra-modular connections while the associations between corresponding features in different modules present during training are stored in the inter-modular connections. The response of the network when tested with corresponding and contradictory stimuli to the two input pathways is studied in detail. The model is solved quantitatively using techniques of statistical physics. In one type of test, a sequence of stimuli was applied, with a delay between them. It is found that if the coupling between the modules is low a regime exists in which they retain the capability to retrieve any of their stored features independently of the features being retrieved by the other modules. Although independent in this sense, the modules still influence each other in this regime through persistent modulatory currents which are strong enough to initiate recall in the whole network when only a single module is stimulated, and to raise the mean firing rates of the neurons in the attractors if the features in the different modules are corresponding. Some of these mechanisms might be useful for the description of many phenomena observed in single neuron activity recorded during short term memory tasks such as delayed match-to-sample. It is also shown that with contradictory stimulation of the two input modules the model accounts for many of the phenomena observed in the McGurk effect, in which contradictory auditory and visual inputs can lead to misperception. %%%%% Renart A., Parga N. and E.T. Rolls A recurrent model of the interaction between PF and IT cortex in delay memory tasks Proceedings of: NEURAL INFORMATION PROCESSING SYSTEMS, 1999 (NIPS99) (Denver. Nov. 29 - Dec 4, 1999). Abstract: A very simple model of two reciprocally connected attractor neural networks is studied analytically in situations similar to those encountered in delay match-to-sample tasks with intervening stimuli and in tasks of memory guided attention. The model qualitatively reproduces many of the experimental data on these types of tasks and provides a framework for the understanding of the experimental observations in the context of the attractor neural network scenario. 
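To give a concrete feel for the kind of architecture analysed in these three papers, the NumPy sketch below builds a toy version of two reciprocally connected autoassociative modules: Hebbian storage of random binary patterns within each module, weaker Hebbian associations between corresponding patterns across modules, and simple +/-1 retrieval dynamics. This is a Hopfield-style caricature for intuition only, not the authors' analytical model; the sizes N and P, the coupling strength g, and the noise level are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5   # neurons per module, stored patterns per module (illustrative)
g = 0.4         # relative strength of the inter-modular couplings (illustrative)

# Random +/-1 patterns; pattern mu in module A is associated with pattern mu in module B.
xi_a = rng.choice([-1, 1], size=(P, N))
xi_b = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings: intra-modular autoassociation plus inter-modular heteroassociation.
J_aa = (xi_a.T @ xi_a) / N; np.fill_diagonal(J_aa, 0)
J_bb = (xi_b.T @ xi_b) / N; np.fill_diagonal(J_bb, 0)
J_ab = g * (xi_a.T @ xi_b) / N       # field exerted by module B on module A
J_ba = g * (xi_b.T @ xi_a) / N       # field exerted by module A on module B

def sgn(x):
    return np.where(x >= 0, 1, -1)

def retrieve(s_a, s_b, steps=20):
    # Synchronous binary dynamics on both modules, each feeling the other's field.
    for _ in range(steps):
        s_a = sgn(J_aa @ s_a + J_ab @ s_b)
        s_b = sgn(J_bb @ s_b + J_ba @ s_a)
    return s_a, s_b

def overlaps(s, xi):
    return xi @ s / N                # overlap of a state with each stored pattern

# Cue module A with a corrupted version of pattern 0; leave module B unbiased.
cue = xi_a[0] * np.where(rng.random(N) < 0.85, 1, -1)   # roughly 15% of bits flipped
s_a, s_b = retrieve(cue, rng.choice([-1, 1], size=N))
print("overlaps in module A:", np.round(overlaps(s_a, xi_a), 2))
print("overlaps in module B:", np.round(overlaps(s_b, xi_b), 2))

With these loads, module A should settle into the cued pattern and, through the weak inter-modular couplings, should pull module B into the associated pattern, mirroring the qualitative observation above that stimulating a single module can initiate recall in the whole network.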
From kak at ee.lsu.edu Thu Jan 27 11:36:50 2000 From: kak at ee.lsu.edu (Subhash Kak) Date: Thu, 27 Jan 2000 10:36:50 -0600 (CST) Subject: Paper on instantaneously trained neural networks Message-ID: <200001271636.KAA00330@ee.lsu.edu> The following paper on instantaneous learning and its applications to time-series prediction and metasearch engine design is available at: http://www.ee.lsu.edu/kak/x5kak.lo.pdf ------------------------------ Subhash Kak, Faster web search and prediction using instantaneously trained neural networks, IEEE Intelligent Systems, vol. 14, pp. 79-82, November/December 1999. Abstract: Over the past few years, we have developed new neural network designs that model working memory in their ability to learn and generalize instantaneously. These networks are almost as good as backpropagation in the quality of their generalization. With their speed advantage, they can be used in many real-time applications of signal processing, data compression, forecasting, and pattern recognition. In this paper, we describe the networks and their applications to two problems: (1) prediction of time-series; (2) design of an intelligent metasearch engine for the Web. The description of these two applications will provide enough information to see how they could be used in other situations. From Leo.van.Hemmen at Physik.TU-Muenchen.DE Thu Jan 27 09:38:35 2000 From: Leo.van.Hemmen at Physik.TU-Muenchen.DE (J. Leo van Hemmen) Date: Thu, 27 Jan 2000 15:38:35 +0100 Subject: Biological Cybernetics' Welcome to 2000 Message-ID: Dear Friends: In the February issue 82/2 (2000) of ``Biological Cybernetics'', the first that was published this year, its Editors-in-Chief Gert Hauske and I have published an Editorial welcoming the new..., well, take whatever you like best: year, decade, century, or millennium. Since new publication formats have been introduced, we think it could make for interesting reading for most of you. We have therefore appended the text as a LaTeX file. If you don't have LaTeX, it is equally readable once you know that \emph{...} means that {...} should be italicized or, in LaTeX terminology, emphasized. Enjoy reading, Leo van Hemmen. >>>$<<< \documentclass[12pt]{article} \usepackage[]{} \begin{document} \pagestyle{empty} \section*{Editorial} As we begin the year 2000, it is time to step back for some historical perspective and to ask how a prominent journal in computational neuroscience and cybernetics might better serve its scientific community in the coming century. What has \emph{Biological Cybernetics} achieved as a long-standing forum for exchange of ideas in this field and where are we going next? It is fair to say that the first important papers in our field were published in this Journal, and that we continue to be a major conduit for influential literature in this domain. It is, however, also clear that progress in neuroscience and in our understanding of information processing in biological systems, in general, will accelerate into the next decade at an unprecedented rate. As we have stressed in our Editorial of last July (issue 81/1), such rapid growth of thought makes it more urgent than ever to facilitate the interaction between experimental reality and theoretical understanding. In the present context, our operational definition of `theory' is mathematical description of neurobiological reality. Theory, then, aims for more, viz., disclosure of underlying mathematical structures that, together, unify our understanding. 
To this end, we need fundamental concepts that give structure to the many particular observations we make as scientists, such as momentum and angular momentum in mechanics. As an everyday example of the insights gained from both experimental and theoretical understanding, consider how much better our knowledge of a bicycle is once we have both ridden one and studied it as a device for creating and conserving angular momentum. And so, we expect, it will be with progress in computational neuroscience as a marriage of theory and experiment.

In computational neuroscience the hunt for fundamental notions is open. Maybe there is none, which we doubt. A famous example underlining the usefulness of theoretical concepts is Hassenstein and Reichardt's velocity detector. In our opinion, it is a fascinating challenge to see what theoretical concepts look like and what they are. It is our aim that experiment and theory will join their efforts in advancing conceptual understanding and in so doing generate synergy that will benefit both.

As for \emph{Biological Cybernetics}, the discussion will evolve in three clearly delineated publication formats. First, important new results are welcomed as \emph{Letters to the Editor}. As a rule, results satisfying the three criteria `novel, important, and well-presented' will be published within three months after submission. Fast publication speed requires electronic submission. Letters can be up to eight pages in print, with no more than six pages of text and two pages of figures and references -- the shorter, the better. (See the ``Instructions for Authors'' for technical details.) Next, regular manuscripts will appear as \emph{Original Papers}. Finally, there are \emph{Reviews}, scholarly reports of rapid developments that the Editorial Board considers of key importance to information processing in neuronal systems. They can be either invited or unsolicited. We will also invite, and welcome, contributions submitted for publication as \emph{Prospects}, a novel form of `review' emphasizing future developments more than the typical Review does, and giving more license to personal speculation, provided it is clearly explained.

The Journal has seen a substantial increase in the number of submitted manuscripts since our previous editorial, which called for submissions across its intrinsic coverage: essentially all aspects of communication and control in biological information processing. In recognition of the need for a vigorous and wide-ranging exchange of ideas for progress in science, we start the new millennium for \emph{Biological Cybernetics} with the editorial setup described above. We are looking forward to your participation in this exchange both as a reader and as an author of papers, be they Letter, Original Paper, Review, or Prospects. Together with the Editorial Board, it is you who makes the Journal a vital medium of communication -- the more so at the beginning of a new millennium.

\vspace{0.7cm} \noindent Gert Hauske \\ J.\ Leo van Hemmen. \end{document} >>>$<<<

Prof. Dr. J.
Leo van Hemmen Physik Department TU M"unchen D-85747 Garching bei M"unchen Germany Phone: +49(89)289.12362 (office) and .12380 (secretary) Fax: +49(89)289.14656 e-mail: Leo.van.Hemmen at ph.tum.de From recruiting at phz.com Thu Jan 27 15:54:16 2000 From: recruiting at phz.com (PHZ Recruiting) Date: Thu, 27 Jan 2000 15:54:16 -0500 Subject: Financial Modeling/Trading Positions Available Message-ID: <200001272054.PAA23876@phz-9.phz.com> PHZ Capital Partners LP is a Boston area trading firm that manages client money using proprietary quantitative algorithms. Our models of the global financial markets are based on a cross-disciplinary blend of financial market theory, novel statistics, and advanced computing technology. PHZ's unique approach and strong trading performance to date have led to exceptional client interest and rapid asset growth. To further expand our business, PHZ is now looking for one or more talented, hard working people to join our research and trading team to work on our next generation of trading systems. Depending on candidate interests and skills, these positions will involve exploratory market and data analysis, development of cutting edge modeling, trading, and risk management algorithms and models, and execution of these strategies through live trading. The successful applicant for these positions will have a demonstrated knack for solving real world analytical problems. We are looking for candidates with a Ph.D. in computer science, statistics, finance, or a related field, or someone with 3-5 years of work experience in an applied research setting. Strong software engineering skills are required (esp. on PCs and Unix). Experience working with large real world numerical data sets and statistical modeling tools (e.g. Splus or SAS) is highly desirable. Applicants should have a keen interest in learning more about the world financial markets, although finance industry experience is not required. The growth potential of these positions is large, both in terms of responsibilities and compensation. Initial compensation will be competitive based on qualifications and will include a significant variable component linked to firm trading performance. PHZ was founded in 1993 and is partially owned by Goldman Sachs. Our clients include large institutions and high net worth individuals. Our staff is a group of highly motivated, friendly people, and we have a fun, comfortable working environment. We are located in a pleasant suburb 17 miles west of Boston. Interested applicants should fax resumes to Jim Hutchinson at 508-653-1745, or email resumes (plain ascii or MS Word format) to recruiting at phz.com. 
From moatl at cs.tu-berlin.de Sat Jan 29 11:28:05 2000 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Sat, 29 Jan 2000 17:28:05 +0100 Subject: EU ADVANCED COURSE IN COMPUTATIONAL NEUROSCIENCE: ANNOUNCEMENT Message-ID: <38931515.F0ECE310@cs.tu-berlin.de> EU ADVANCED COURSE IN COMPUTATIONAL NEUROSCIENCE (AN IBRO NEUROSCIENCE SCHOOL) AUGUST 21 - SEPTEMBER 15, 2000 INTERNATIONAL CENTRE FOR THEORETICAL PHYSICS, TRIESTE, ITALY DIRECTORS: Erik De Schutter (University of Antwerp, Belgium) Klaus Obermayer (Technical University Berlin, Germany) Alessandro Treves (SISSA, Trieste, Italy) Eilon Vaadia (Hebrew University, Jerusalem, Israel) The EU Advanced Course in Computational Neuroscience introduces students to the panoply of problems and methods of computational neuroscience, simultaneously addressing several levels of neural organisation, from subcellular processes to operations of the entire brain. The course consists of two complementary parts. A distinguished international faculty gives morning lectures on topics in experimental and computational neuroscience. The rest of the day is devoted to practicals, including learning how to use simulation software and how to implement a model of the system the student wishes to study on individual unix workstations. The first week of the course introduces students to essential neuro- biological concepts and to the most important techniques in modeling single cells, networks and neural systems. Students learn how to apply software packages like GENESIS, MATLAB, NEURON, XPP, etc. to the solution of their problems. During the following three weeks the lectures will cover specific brain functions. Each week topics ranging from modeling single cells and subcellular processes through the simulation of simple circuits, large neuronal networks and system level models of the brain will be covered. The course ends with a presentation of the students' projects. The EU Advanced Course in Computational Neuroscience is designed for advanced graduate students and postdoctoral fellows in a variety of disciplines, including neuroscience, physics, electrical engineering, computer science and psychology. Students are expected to have a basic background in neurobiology as well as some computer experience. A total of 32 students will be accepted. Students of any nationality can apply. About 20 students will be from the European Union and affiliated countries (Iceland, Israel, Liechtenstein and Norway plus all countries which are negotiating future membership with the EU). These students are supported by the European Commission and we specifically encourage applications from researchers who work in less-favoured regions of the EU, from women and from researchers from industry. IBRO and ICTP provide support for participation from students of non-Western countries, in particular countries from the former Soviet Union, Africa and Asia, while The Brain Science Foundation supports Japanese students. Students receiving support from the mentioned sources will receive travel grants and free full board at the Adriatico Guest House. More information and application forms can be obtained: - http://www.bbf.uia.ac.be/EU_course.shtml Please apply electronically using a web browser if possible. - email: eucourse at bbf.uia.ac.be - by mail: Prof. E. De Schutter Born-Bunge Foundation University of Antwerp - UIA, Universiteitsplein 1 B2610 Antwerp Belgium FAX: +32-3-8202669 APPLICATION DEADLINE: April 15, 2000. 
Applicants will be notified of the results of the selection procedures by May 31, 2000. COURSE FACULTY: Moshe Abeles (Hebrew University of Jerusalem, Israel), Carol Barnes (University of Arizona, USA), Avrama Blackwell (George Mason University, Washington, USA), Valentino Braitenberg (MPI Tuebingen, Germany), Jean Bullier (Universite Paul Sabatier, Toulouse, France), Ron Calabrese (Emory University, Atlanta, USA), Carol Colby (University Pittsburgh, USA), Virginia de Sa (University California San Francisco, USA), Alain Destexhe (Laval University, Canada), Opher Donchin (Hebrew University of Jerusalem, Israel), Karl J. Friston (Institute of Neurology, London, England), Bruce Graham (University of Edinburgh, Scotland), Julian J.B. Jack (Oxford University, England), Mitsuo Kawato (ATR HIP Labs, Kyoto, Japan), Jennifer Lund (University College London, England), Miguel Nicolelis (Duke University, Durham, USA), Klaus Obermayer (Technical University Berlin, Germany), Stefano Panzeri (University of Newcastle, England), Alex Pouget (University of Rochester, USA), John M. Rinzel (New York University, USA), Nicolas Schweighofer (ATR ERATO, Kyoto, Japan), Idan Segev (Hebrew University of Jerusalem, Israel), Terry Sejnowski (Salk Institute, USA), Haim Sompolinsky (Hebrew University of Jerusalem, Israel), Martin Stetter (Technical University Berlin, Germany), Shigeru Tanaka (RIKEN, Japan), Alex M. Thomson (Royal Free Hospital, London, England), Naftali Tishby (Hebrew University of Jerusalem, Israel), Alessandro Treves (SISSA, Trieste, Italy), Eilon Vaadia (Hebrew University of Jerusalem, Israel), Charlie Wilson (University of Texas, San Antonio, USA), More to be announced... The 2000 EU Advanced Course in Computational Neuroscience is supported by the European Commission (5th Framework program), by the International Centre for Theoretical Physics (Trieste), by the Boehringer Ingelheim Foundation, by the International Brain Research Organization and by The Brain Science Foundation (Tokyo). -- ---------------------------------------------------------------------- Dr. Martin Stetter phone: ++49-30-314-73117 FR2-1, Informatik fax: ++49-30-314-73121 Technische Universitaet Berlin web: http://www.ni.cs.tu-berlin.de Franklinstrasse 28/29 D-10587 Berlin, Germany ---------------------------------------------------------------------- From fmdist at hotmail.com Sun Jan 30 11:18:50 2000 From: fmdist at hotmail.com (Fionn Murtagh) Date: Sun, 30 Jan 2000 08:18:50 PST Subject: position - Internship, IBM, time series prediction Message-ID: <20000130161850.21940.qmail@hotmail.com> Internship starting May 2000, IBM TJ Watson research center, must have PhD and experience of wavelet/multiscale methods, also neural nets, for signal/time series modeling and prediction. Contact F Murtagh, f.murtagh at qub.ac.uk ______________________________________________________ Get Your Private, Free Email at http://www.hotmail.com From rod at dcs.gla.ac.uk Mon Jan 31 06:59:41 2000 From: rod at dcs.gla.ac.uk (Roderick Murray-Smith) Date: Mon, 31 Jan 2000 11:59:41 +0000 Subject: Ph.D. & Post-doc vacancies in European Network Message-ID: <3895792D.CD2290F0@dcs.gla.ac.uk> Several positions in this European Commission funded research network might be of interest to researchers who are active in statistically-oriented work, Bayesian networks, or stochastic simulation and who are interested in engineering applications, especially with dynamic systems. 
Multi-Agent Control: Probabilistic reasoning, optimal coordination, stability analysis and controller design for intelligent hybrid systems http://www.dcs.gla.ac.uk/mac/

Vacancies in the MAC network (deadline 1st March 2000): ------------------------------------------------------------------------ The Multi-Agent Control (MAC) network is a collaboration between the Universities of Glasgow, Strathclyde, Maynooth, NTNU, DTU and the Jozef Stefan Institute (participants). This project is funded by the European Commission as a Research Training Network. The University of Glasgow acts as the project coordinator. There are vacancies for researchers at each of the members of the network as follows.
1. Pre-doctoral position at University of Glasgow
2. Pre-doctoral position at University of Strathclyde
3. Pre-doctoral position at National University of Ireland, Maynooth
4. Post-doctoral position at Norwegian University of Science and Technology
5. Pre-doctoral position at Technical University of Denmark
6. Post-doctoral position at Institut Jozef Stefan, Ljubljana, Slovenia.

NOTE: To be eligible it is *essential* that applicants satisfy EU requirements (i.e. be a citizen/resident of an EU member or associated state - see below for further details).

Highlights of the programme for potential applicants are:
- Challenging programme of interdisciplinary research
- Industrially relevant research problems
- Excellent training programme
- Mobility between network nodes
- Industrial secondments
- Competitive salary and relocation package

Project Goals ---------------- The overall research objective of the network is to develop rigorous methods for the analysis and design of Multi-Agent Control systems. Due to the interdisciplinary nature of this objective, the network has been structured to include expertise from the relevant problem domains: probabilistic reasoning, optimisation, stability analysis, control theory and computing science. The specific design problems to be addressed are:
1. To develop probabilistic reasoning methods for design that accommodate the inherent uncertainty in the system's knowledge of the state of the world. The work will build on new developments in computationally intensive statistical inference tools for modelling complex physical systems and human control behaviour.
2. To develop tools for rigorously analysing the potentially very strong and safety-critical interactions between the outcome of controller decisions and the dynamic behaviour of the overall system.
3. To develop formal methods of design which incorporate, in a single framework, the design of the switching logic, coordination between multiple agents, and optimisation of performance within given constraints on the overall system behaviour.

The emphasis in this network is to develop a theory to support the design of computer-controlled systems where performance and safety are crucial. The efficacy of the research results will be evaluated using a number of test-bed industrial applications (aerospace, automotive, process and renewable energy fields). These applications will be supplied by a number of major European industrial companies, some of which are members of the network, and others that have expressed an interest in the scientific output of the network. Software tools developed during prototyping, as well as the scientific results, will be made available to the wider academic and industrial community.

Funding ---------- The project is funded by the European Commission under a Research Training Network.
The European Commission requires that the candidate is aged 35 years or less at the time of appointment and is a national of a Member State of the Community or of an Associated State (excluding the country in which you plan to work), or has resided in the Community for at least five years prior to the appointment. It is emphasised that these eligibility conditions are strict requirements. Note that pre-doctoral positions are essentially fully paid Ph.D. positions, where the candidate is expected to gain a Ph.D. by the end of the work period. Post-doctoral positions require the candidate to have qualified for a Ph.D. or equivalent before starting work.

Further information ----------------------- Please visit the project web site http://www.dcs.gla.ac.uk/mac/ for more information. Details about the individual vacancies can be found at http://www.dcs.gla.ac.uk/mac/vacancies.htm -- Roderick Murray-Smith Department of Computing Science Glasgow University Glasgow G12 8QQ Scotland http://www.dcs.gla.ac.uk/~rod

From nnsp2000 at ee.usyd.edu.au Mon Jan 31 23:38:57 2000 From: nnsp2000 at ee.usyd.edu.au (NNSP 2000) Date: Tue, 1 Feb 2000 15:38:57 +1100 Subject: IEEE NNSP 2000 Call for Papers Message-ID: <00ec01bf6c6f$2d7e8820$581b4e81@ee.usyd.edu.au.pe088>

***************************************************************** CALL FOR PAPERS 2000 IEEE Workshop on Neural Networks for Signal Processing December 11-13, 2000, Sydney, Australia Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council (pending) *****************************************************************

Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the tenth in a series of IEEE workshops on Neural Networks for Signal Processing will be held on the University of Sydney campus, Sydney, Australia. The workshop will feature keynote lectures, technical presentations, and panel discussions. Papers are solicited for, but not limited to, the following areas:

Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, generalization, design algorithms, optimization, parameter estimation, nonlinear signal processing, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets.

Applications: Speech processing, image processing, sonar and radar, data fusion, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, blind source separation, communications, sensors, system identification, and other general signal processing and pattern recognition applications.

Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies.

PAPER SUBMISSION PROCEDURE Prospective authors are invited to submit a full paper using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2000 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academic Publishers.
SCHEDULE
Submission of full paper: March 31, 2000
Notification of acceptance: May 31, 2000
Submission of photo-ready accepted paper: July 15, 2000
Advanced registration, before: September 15, 2000

ORGANIZATION
Honorary Chair Bernard WIDROW Stanford University
General Chairs Ling GUAN University of Sydney email: ling at ee.usyd.edu.au Kuldip PALIWA Griffith University email: kkp at shiva2.me.gu.edu.au
Program Chairs Tülay ADALI University of Maryland, Baltimore County email: adali at umbc.edu Jan LARSEN Technical University of Denmark email: jl at imm.dtu.dk
Finance Chair Raymond Hau-San WONG University of Sydney email: hswong at ee.usyd.edu.au
Proceedings Chairs Elizabeth J. WILSON Raytheon Co. email: bwilson at ed.ray.com Scott C. DOUGLAS Southern Methodist University email: douglas at seas.smu.edu
Publicity Chair Marc van HULLE Katholieke Universiteit, Leuven email: marc at neuro.kuleuven.ac.be
Registration and Local Arrangements Stuart PERRY Defense Science and Technology Organisation email: Stuart.Perry at dsto.defence.gov.au
Europe Liaison Jean-Francois CARDOSO ENST email: cardoso at sig.enst.fr
America Liaison Amir ASSADI University of Wisconsin at Madison email: ahassadi at facstaff.wisc.edu
Asia Liaison Andrew BACK Katestone Scientific email: andrew.back at usa.net

PROGRAM COMMITTEE: Amir Assadi Yianni Attikiouzel John Asenstorfer Andrew Back Geoff Barton Hervé Bourlard Andy Chalmers Zheru Chi Andrzej Cichocki Tharam Dillon Tom Downs Hsin Chia Fu Suresh Hangenahally Marwan Jabri Haosong Kong Shigeru Katagiri Anthony Kuh Yi Liu Fa-Long Luo David Miller Christophe Molina M Mohammadian Erkki Oja Soo-Chang Pei Jose Principe Ponnuthurai Suganthan Ah Chung Tsoi Marc Van Hulle A.N. Venetsanopoulos Yue Wang Wilson Wen