From pfbaldi at ics.uci.edu Thu Nov 2 09:46:46 2000 From: pfbaldi at ics.uci.edu (Pierre Baldi) Date: Thu, 2 Nov 2000 06:46:46 -0800 Subject: BIOINFORMATICS FACULTY POSITION AT UC IRVINE Message-ID: <006101c044db$b990b910$be04c380@time-slice.ics.uci.edu> Several tenure-track positions are open in the Department of Information and Computer Science at UC Irvine. In particular, we are looking for candidates in biological and medical informatics. Pierre Baldi Department of Information and Computer Science and Department of Biological Chemistry University of California, Irvine Irvine, CA 92697-3425 (949) 824-5809 (949) 824-4056 FAX www.ics.uci.edu/~pfbaldi =============================================== The Department of Information and Computer Science (ICS) at the University of California, Irvine (UCI) has a tenure-track position open in the area of bioinformatics, medical informatics, or computational biology. The Department has a strong presence in this area with 5 full-time faculty in biomedical informatics and many faculty from within the Department and throughout the University with whom they collaborate (see http://www.ics.uci.edu/~biomed/). The available position is at the assistant professor level, but exceptional candidates from all ranks will be considered. In all cases, we are looking for applicants with a Ph.D. degree in Computer Science, Medical Informatics, Bioinformatics or a related field, as well as strong research credentials as evidenced by scholarly publications. Applicants for senior positions must also demonstrate a proven track record in original research and teaching activities. The ICS department runs a concentration in Informatics in Biology and Medicine and there are outstanding opportunities for interdisciplinary collaborations at all levels with the School of Biological Sciences, the School of Physical Sciences, the Department of Biomedical Engineering and the College of Medicine at UCI. ICS faculty are also closely affiliated with the new Institute for Genomics and Bioinformatics (http://www.igb.uci.edu). The Department and UCI are poised for exceptional growth in the coming years. The ICS Department is organized as an independent campus unit reporting to the Executive Vice Chancellor. It is the largest computer science department within the University of California. It runs the second most popular major at UCI and has designed an undergraduate honors program that attracts the campus' most qualified students. External funding from government and industrial sponsors exceeded $10 million last year. The Department currently has 38 full-time faculty and 200 Ph.D. students involved in various research areas including analysis of algorithms and data structures, artificial intelligence and machine learning, hardware-software co-design, parallel and distributed processing, embedded systems, communication networks, middleware technology, security and cryptography, databases, information retrieval and visualization, computational biology and medical informatics, computer graphics, human computer interaction and computer supported cooperative work, programming languages, software development and advanced software technology. The faculty have productive interdisciplinary ties with colleagues in the arts, biology, cognitive science, engineering, management, medicine, and the social sciences. More information about the Department can be found at http://www.ics.uci.edu. Although UCI is a young university, it has attained remarkable stature in the past 3 decades. 
Two Nobel prizes were recently awarded to UCI faculty. UCI is located three miles from the Pacific Ocean near Newport Beach, approximately forty miles south of Los Angeles. The climate is ideal year-round avoiding extreme temperatures in winters and summers. Irvine is consistently ranked among the safest cities in the U.S. and has an exceptional public school system. The campus is surrounded by high-technology companies that participate in an active affiliates program. Both the campus and the area offer exciting professional and cultural opportunities. Mortgage and housing assistance are available including newly built, for-sale housing located on campus and within short walking distance from the department. Applicants should send a cover letter indicating their interest in area E (Bioinformatics or Medical Informatics), a CV, three sample papers and contact information for three or four references to recruit at ics.uci.edu (PDF, postscript, Word, or ASCII). Please cc also to: recruit-E at ics.uci.edu. Applicants are requested to ask their references to send letters of evaluation to recruit at ics.uci.edu by January 12, 2001. Those that insist upon sending hard copy may send it to: ICS Faculty Position [E] c/o Peggy Munhall Department of Information and Computer Science University of California, Irvine Irvine, CA 92697-3425 Application screening will begin immediately upon receipt of curriculum vitae. Maximum consideration will be given to applications received by January 5, 2001. The University of California is an Equal Opportunity Employer, committed to excellence through diversity. From mzib at ee.technion.ac.il Thu Nov 2 11:49:12 2000 From: mzib at ee.technion.ac.il (Michael Zibulevsky) Date: Thu, 2 Nov 2000 18:49:12 +0200 (IST) Subject: paper: "Multiresolution framework for blind source separation" (fwd) Message-ID: Announcing a paper ... Title: Multiresolution framework for blind source separation Authors: P. Kisilev, M. Zibulevsky, Y.Y. Zeevi, B.A. Pearlmutter ABSTRACT: The concern of the blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. It was discovered recently, that use of sparsity of sources in some signal dictionary dramatically improves the quality of separation. In this work we use the property of multiscale transforms, such as wavelet or wavelet packets, to decompose signals into sets of local features with various degrees of sparsity. We use this intrinsic property for selecting the best (most sparse) subsets of features for further separation. Experiments with simulated signals, musical sounds and images demonstrate further significant improvement of separation quality. URL of the ps file: http://ie.technion.ac.il/~mcib/multisepMP9a.ps.gz Contact: paulk at tx.technion.ac.il mzib at ee.technion.ac.il From garionis at luna.cs.uni-dortmund.de Fri Nov 3 06:11:18 2000 From: garionis at luna.cs.uni-dortmund.de (Ralf Garionis) Date: Fri, 03 Nov 2000 12:11:18 +0100 Subject: Professorship in Computer Science Message-ID: <200011031111.MAA17278@luna.cs.uni-dortmund.de> Readers of this list may be interested in the following post: The Department of Computer Science at the University of Dortmund seeks applicants for a professorship position (C3). Candidates should have a strong background in Neural Computation or Fuzzy Logic for conducting research and teaching in the specific area. 
Please see a copy of the official announcement http://dekanat.cs.uni-dortmund.de/JobMarkt/Professoren/C3Professur.jpg for further details (this page is available in German only). Informal enquiries may be made to Mr Decker, tel. +49-(0)231-755-2121, email decker at dek.cs.uni-dortmund.de. Closing date: Thursday 30 November 2000. From D.Willshaw at cns.ed.ac.uk Fri Nov 3 12:14:38 2000 From: D.Willshaw at cns.ed.ac.uk (David Willshaw) Date: Fri, 3 Nov 2000 17:14:38 +0000 (GMT) Subject: Electronic access to NETWORK: Computation in Neural Systems Message-ID: <14850.62078.170899.871274@gargle.gargle.HOWL> IOPP, publishers of NETWORK, of which I am Editor-in-Chief, have made their journals freely accessible electronically until December 22, 2000. For more details of this service, see http://www.iop.org/Physics/News/0259j I would like to invite you to use this opportunity to browse through the 11 volumes of NETWORK. For those of you who don't know, NETWORK is in its eleventh year of publication. Originally the journal published papers in Neural Networks and Computational Neuroscience; last year its focus was changed to concentrate on all aspects of Computational Neuroscience. Regards, David Willshaw ------------------------------------------------------- Professor David Willshaw Editor-in-Chief, NETWORK: Computation in Neural Systems Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh 5 Forrest Hill Edinburgh EH1 2QL Scotland, UK Tel: (+44) 131 650 4404/5 Fax: (+44) 131 650 4406 Email: neted at anc.ed.ac.uk ------------------------------------------------------- From terry at salk.edu Mon Nov 6 00:45:47 2000 From: terry at salk.edu (Terry Sejnowski) Date: Sun, 5 Nov 2000 21:45:47 -0800 (PST) Subject: Computational Neuroscience 2000 In-Reply-To: <14850.62078.170899.871274@gargle.gargle.HOWL> Message-ID: <200011060545.eA65jlc69337@hamlet.salk.edu> The following special supplement to Nature Neuroscience is available free at: http://www.nature.com/neuro/journal/v3/n11s/index.html Terry ----- Nature Neuroscience November 2000 Volume 3 Number Supp pp 1160 - 1211 Computational approaches to brain function p 1160 Charles Jennings Ph.D. Editor & Sandra Aamodt Ph.D. Senior Editor Computational neuroscience at the NIH pp 1161 - 1164 History The Hodgkin-Huxley theory of the action potential p 1165 Michael Heusser Half a century of Hebb p 1166 H. Sebastian Seung The basic unit of computation p 1167 Anthony Zador Models of motion detection p 1168 Alexander Borst The Pope and grandmother's frog's-eye view of theory p 1169 Kevan Martin Computation by neural networks p 1170 Geoffrey Hinton Reviews The role of single neurons in information processing pp 1171 - 1177 Christof Koch & Idan Segev Synaptic plasticity: taming the beast pp 1178 - 1183 L. Abbott & Sacha Nelson Neurocomputational models of working memory pp 1184 - 1191 Daniel Durstewitz, Jeremy Seamans & Terrence Sejnowski Computational approaches to sensorimotor transformations pp 1192 - 1198 Alexandre Pouget & Lawrence Snyder Models of object recognition pp 1199 - 1204 Maximilian Riesenhuber & Tomaso Poggio Computer simulation of cerebellar information processing pp 1205 - 1211 Javier Medina & Michael Mauk Computational principles of movement neuroscience pp 1212 - 1217 Daniel Wolpert & Zoubin Ghahramani Learning and selective attention pp 1218 - 1223 Peter Dayan, Sham Kakade & P. 
Read Montague Viewpoints Models are common; good theories are scarce p 1177 Charles Stevens In the brain, the model is the goal p 1183 Bartlett Mel Facilitating the science in computational neuroscience p 1191 Lyle Borg-Graham Models identify hidden assumptions p 1198 Eve Marder On theorists and data in computational neuroscience p 1204 J. Hopfield What does 'understanding' mean? p 1211 Gilles Laurent ----- From Nigel.Goddard at ed.ac.uk Tue Nov 7 08:42:00 2000 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Tue, 07 Nov 2000 13:42:00 +0000 Subject: Position in Research and System Support Message-ID: <3A0806A8.B45CBB9A@ed.ac.uk> Research and System Support Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh This is an outstanding opportunity for someone with system administration skills to be involved in exciting research projects related to brain function and neural computation, some using the most advanced high-performance computers. We are seeking an individual to administer a research network and to engage in a variety of research projects including computational modeling of brain function, functional MRI studies, probabilistic data modeling, and neuroinformatics. Effort is to be split about equally between system support and research projects. Please see http://www.personnel.ed.ac.uk/VACS/vac2.htm#job5, Post E for full details, and note the early deadline for applications. Informal enquiries about this position can be made to Andrew Gillies +44 (0)131 650 3096 -- ========================================================= Dr. Nigel Goddard Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh 5 Forrest Hill Edinburgh EH1 2QL Scotland Telephone: +44 (0)131 650 3087 Mobile: +44 (0)787 967 1811 email: Nigel.Goddard at ed.ac.uk web: http://anc.ed.ac.uk/~ngoddard FAX(UK) : +44 (0)870 063 3111 or +44 (0)870 130 5014 FAX(USA): +1 603 698 5854 Calendar: http://calendar.yahoo.com/public/nigel_goddard ========================================================= From d.mareschal at bbk.ac.uk Thu Nov 9 06:46:32 2000 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Thu, 9 Nov 2000 12:46:32 +0100 Subject: postdoctoral positions available Message-ID: Dear all, I would appreciate it if you could bring these positions to the attention of any interested people. They are part of a project exploring perceptual and cognitive development that aims to link behavioural experimental work with neural network modelling as tightly as possible. Although one position is experimental and the other is computational, the ideal candidate would have interests in both approaches to studying perception and cognition. Many thanks, Denis Mareschal *************** insert text *************** Postdoctoral Research Assistants in Psychology/Cognitive Science (Two-Year Fixed Term) We are seeking two researchers to work with Dr Denis Mareschal within the School of Psychology and the Centre for Brain and Cognitive Development. The posts are tenable from 1 February, 2001 or as soon as possible thereafter. 1. Infant Behavioural Testing (Ref: APS346) The post will consist mainly of testing categorisation in infants. You will use both visual preference and manipulation methodologies to assess categorisation in infants from age 3 to 24 months. Applicants should have a PhD, preferably in cognitive psychology or developmental psychology. 2. 
Connectionist Modelling (Ref: APS343) The post will consist mainly of applying connectionist/neural network techniques. You will help implement and design models of memory and categorisation in infancy. It is expected that applicants with advanced technical skills would have ample time to develop their own research programmes. Applicants should have a PhD, preferably in cognitive psychology, developmental psychology, AI, or Neural Computation. Preliminary details of these positions can be obtained by following links from my web page http://www.psyc.bbk.ac.uk/staff/dm.html. Final details can be obtained, in due course, from the personnel department at Birkbeck College either through the web (http://www.bbk.ac.uk) or by sending an A4 sae quoting the reference number to the Personnel Department, Birkbeck, Malet Street, Bloomsbury, London WC1E 7HX. Closing date: 14 December 2000 Informal enquiries for both positions can be directed to d.mareschal at bbk.ac.uk ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 020 7631-6582/6207 fax +44 020 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From cierina at vis.caltech.edu Mon Nov 6 17:45:54 2000 From: cierina at vis.caltech.edu (Cierina Reyes) Date: Mon, 06 Nov 2000 14:45:54 -0800 Subject: Announcement - Postdoctoral Position Message-ID: <5.0.0.25.0.20001103122118.00ab7160@vis.caltech.edu> THEORETICAL/COMPUTATIONAL POSTDOCTORAL FELLOWSHIP IN NEUROSCIENCE: Applications are invited for a postdoctoral research position available immediately jointly between the laboratories of P. Mitra at Bell Laboratories (Murray Hill, New Jersey) and Prof. R. Andersen at Caltech (Pasadena, California). The research projects will involve building a real-time system to transform measured in vivo neural signals into high level control signals for driving prosthetic limbs and the associated data analysis and algorithmic development. The successful applicant should have a Ph.D. in physics, mathematics, electrical engineering, applied mathematics, or an equivalent theoretical sciences background. Programming or electronics experience preferred. The geographical location (Caltech/Bell Labs) is flexible and will depend on the candidate's expertise. Applications should include a curriculum vitae and two letters of recommendation. This material should be sent to: Ms. Cierina Reyes, California Institute of Technology, MC 216-76, 1201 E. California Blvd., Pasadena, CA 91125. Caltech is an Equal Opportunity/Affirmative Action Employer. Women, minorities, veterans, and disabled persons are encouraged to apply. From stork at rsv.ricoh.com Tue Nov 7 01:36:56 2000 From: stork at rsv.ricoh.com (stork) Date: Mon, 06 Nov 2000 22:36:56 -0800 Subject: New book: Pattern Classification Message-ID: <3A07A301.4DC0F2F1@rsv.ricoh.com> Announcing a new book: Pattern Classification (2nd ed.) by R. O. Duda, P. E. Hart and D. G. Stork 654 pages, two-color printing (John Wiley and Sons, 2001) ISBN: 0-471-05669-3 This is a significant revision and expansion of the first half of Pattern Classification & Scene Analysis, R. O. Duda and P. E. Hart's influential 1973 book. 
The current book can serve as a textbook for a one- or two-semester graduate course in pattern recognition, machine learning, data mining and related fields offered in Electrical Engineering, Computer Science, Statistics, Operations Research, Cognitive Science, or Mathematics departments. Established researchers in any domain that uses pattern recognition can rely on the book as a reference on the foundations of their field. Table of Contents 1) Introduction 2) Bayesian Decision Theory 3) Maximum Likelihood and Bayesian Estimation 4) Nonparametric Techniques 5) Linear Discriminant Functions 6) Multilayer Neural Networks 7) Stochastic Methods 8) Nonmetric Methods 9) Algorithm-Independent Machine Learning 10) Unsupervised Learning and Clustering Mathematical Appendix Goals * Authoritative: The presentations are based on the best research and rigorous fundamental theory underlying proven techniques. * Complete: Every major topic in statistical, neural network and syntactic pattern recognition is presented, including all the topics that should be in the "toolbox" of designers of practical pattern recognition systems. * Up-to-date: The book includes the most recent proven techniques and developments in the theory of pattern recognition. * Clear: Every effort has been made to ensure that the text is clearly written and will not be misinterpreted. The manuscript was tested in over 100 courses worldwide, and numerous suggestions from students, teachers and established researchers have been incorporated. Every attempt has been made to give the deepest explanation, providing insight and understanding rather than a laundry list of techniques. * Logically organized: The book is organized to build upon concepts and techniques from previous chapters, so as to speed the learning of the material. * Problem motivated, not technique motivated: Some books focus on a particular technique or method, for instance neural nets. The drawback of such books is that they highlight the particular technique, often at the expense of other techniques. Readers are left wondering how the particular highlighted technique compares with others, and especially how to decide which technique is appropriate for which particular problem. Pattern Classification instead assumes that practitioners come first with a problem or class of problems, and seek a solution, using whichever technique is most appropriate. There are many pattern recognition problems for which neural networks (for instance) are ill-suited, and readers of alternative texts that focus on neural networks alone may be misled into believing neural networks are applicable to their problem. As the old saying goes, "if you're a hammer, every problem looks like a nail." Pattern Classification rather seeks to be a balanced and complete toolbox -- plus instructions on how to choose the right tool for the right job. * Long-lived: Every effort has been made to ensure the book will be useful for a long time, much as the first edition remained useful for over a quarter of a century. For instance, even if a technique has vocal proponents, if that technique has not found genuine use in a challenging problem domain, it is not discussed in depth in the book. Further, the notation and terminology are consistent and standardized as generally accepted in the field. New topics * Neural Networks, including Hessians and second-order training and pruning techniques, popular heuristics for training and initializing parameters, and recurrent networks. 
* Stochastic methods, including simulated annealing, genetic algorithms, Boltzmann learning, and Gibbs sampling. * Nonmetric methods, including tree classifiers such as CART, ID3 and their descendants, string matching, grammatical methods and rule learning. * Theory of learning, including the No Free Lunch theorems, Minimum Description Length (MDL) principle, Occam's principle, bias-variance in regression and classification, jackknife and bootstrap estimation, Bayesian model comparison and MLII, multi-classifier systems and resampling techniques such as boosting, bagging and cross validation. * Support Vector Machines, including the relationship between "primal" and "dual" representations. * Competitive learning and related methods, including Adaptive Resonance Theory (ART) networks and their relation to leader-follower clustering. * Self-organizing feature maps, including maps affected by the sampling density. New/improved features and resources * Solution Manual: A solution manual is available for faculty adopting the text. * New and redrawn figures: Every figure is carefully drawn (and all figures from the 1st edition have been updated and redrawn) using modern 3D graphics and plotting programs, all in order to illuminate ideas in a richer and more memorable way. Some (e.g., 3D Voronoi tessellations and novel renderings of stochastic search) appear in no other pattern recognition books and provide new insight into mathematical issues. A complete set of figures is available for non-commercial purposes from http://www.wiley.com/products/subject/engineering/electrical/software_supplem_elec_eng.html and ftp://ftp.wiley.com/public/sci_tech_med/pattern. * Two-color printing in figures and text: The use of red and black throughout allows more information to be conveyed in the figures, where color can for instance indicate different categories, or different classes of solution, or stages in the development of solutions. * Pseudocode: Key algorithms are illustrated in language-independent pseudocode. Thus students can implement the algorithms in their favorite computer language (a short illustrative sketch of this kind of translation follows this announcement). * Worked Examples: Several techniques are illustrated with worked examples, using data sets simple enough that students can readily follow the technical details. Such worked examples are particularly helpful to students tackling homework problems. * Extensive Bibliographies: Each chapter contains an extensive and up-to-date bibliography with detailed citation information, including the full names (first name and surname) of every author. * Chapter Summaries: Each chapter ends with a summary highlighting key points and terms. Such summaries reinforce the presentation in the text and facilitate rapid review of the material. * Homework problems: There are 380 homework problems, each keyed to its corresponding section in the text. * Computer Exercises: There are 102 language-independent computer exercises, each keyed to a corresponding section and in many cases also to explicit pseudocode in the text. * Starred sections: Some sections are starred to indicate that they may be skipped on a first reading, or in a one-semester course. * Key words listed in margins: Key words and topics are listed in the margins where they first appear, to highlight new terms and to speed subsequent search and retrieval of relevant information. 
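[Editorial note, not part of the original announcement] As a rough illustration of how language-independent pseudocode of this kind translates into a concrete program, here is a minimal k-nearest-neighbour classifier in Python. The sketch is not taken from the book; the nearest-neighbour rule is simply a standard example of the sort of algorithm covered under nonparametric techniques, and the toy data below are invented for this example.

import numpy as np

def knn_classify(X_train, y_train, x, k=3):
    # Classify a single point x by majority vote among its k nearest
    # training points, using Euclidean distance.
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority label among the neighbours

# toy usage with two 2-D classes
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_classify(X, y, np.array([0.1, 0.2])))   # -> 0
print(knn_classify(X, y, np.array([1.0, 0.9])))   # -> 1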
From S.Singh at exeter.ac.uk Tue Nov 7 10:25:56 2000 From: S.Singh at exeter.ac.uk (S.Singh@exeter.ac.uk) Date: Tue, 7 Nov 2000 15:25:56 +0000 (GMT Standard Time) Subject: PhD positions available Message-ID: UNIVERSITY OF EXETER, UK Department of Computer Science We now invite applications for the following two PhD studentships within our department. PhD Studentship "Adaptive and Autonomous Image Understanding Systems" Deadline for Application: 20th November, 2000 The project will explore intelligent algorithms for adaptive image understanding including context based reasoning. Generic methodology will be developed based on image processing and pattern recognition methods to allow automatic processing of scene analysis data using video sequences. The goal of the project is to develop a seamless system that continuously evolves in real-time with video images. PhD Studentship "Ultrasound based object recognition and integration with image analysis" Deadline for Application: 20th November, 2000 Only recently has it been demonstrated that ultrasound can be used to detect objects in controlled environments. It has been used for face recognition and obstacle avoidance. In this project we will develop techniques for ultrasound based object recognition and couple this ability with image processing on a mobile platform (robot). The ultrasound method will provide a cheaper, low-resolution front end that triggers the image processing component for more detailed analysis. The research will be tested in indoor and outdoor environments. For both studentships, it is expected that prospective applicants have a good mathematical and analytical background, with programming experience in C/C++ and Unix/Linux operating systems. Applicants need not necessarily have prior knowledge in these areas but should have at least an upper second class first degree and where possible a good Masters degree in computer science. Applicants should send a letter of application and CV including the names and addresses of two referees to Dr. Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK (email S.Singh at exeter.ac.uk). Applicants should ask their referees to send their references directly to the above address. Informal enquiries can be made on 01392-264053; further details of the Department's work may be found on the web at http://www.dcs.ex.ac.uk/research/pann. The studentships cover UK/EU fees and maintenance (currently £6,620 pa) for up to three years. The successful candidates should expect to take up the studentship no later than 1 December, 2000, or as soon as possible thereafter as negotiated with the department. ------------------------------------------- Sameer Singh Department of Computer Science University of Exeter Exeter EX4 4PT United Kingdom Tel: +44-1392-264053 Fax: +44-1392-264067 e-mail: s.singh at ex.ac.uk WEB: http://www.ex.ac.uk/academics/sameer/index.htm ------------------------------------------- From lpaulo at unirpnet.com.br Sat Nov 11 02:19:12 2000 From: lpaulo at unirpnet.com.br (Luis Paulo) Date: Sat, 11 Nov 2000 12:49:12 +0530 Subject: Invited Session Message-ID: <003201c04baf$b17579c0$2f60e6c8@lpaulo.www.unirpnet.com.br> An invited session called "Genetic Algorithms, Neural Networks and Applications" has been organized for The Fifth Multi-Conference on Systemics, Cybernetics and Informatics, which will be held in Orlando, Florida, USA, from July 22 - 25, 2001. 
This session intends to discuss prediction of molecular structure, cybernetics, macromolecular conformational search, prediction and pattern recognition. Those interested in participating should send a paper (Word format) to lpaulo at df.ibilce.unesp.br Best Regards Scott, Luis Paulo Department of Physics IBILCE - UNESP Brasil From sml at essex.ac.uk Mon Nov 13 09:48:28 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Mon, 13 Nov 2000 14:48:28 -0000 Subject: OCR Competition Announcement Message-ID: <6A8CC2D6487ED411A39F00D0B7847B66E77EF1@sernt14.essex.ac.uk> Dear All, We are now inviting entries for our OCR competition sponsored by the UK Post Office. For more details go to http://algoval.essex.ac.uk and follow the link to OCR Competition. This competition is challenging in several ways: 1. The PO Digits dataset (on which accuracy is judged) is sparse and quite variable. 2. The algorithm implementation plus any data it needs to store its trained state must occupy less than 50,000 bytes. 3. There are many other comparison criteria (see rules for more details) that apply in order to separate algorithms whose test set accuracy is not statistically separable. 4. There are strict limits on other performance aspects such as training and recognition time - see rules for more details. Enjoy! Simon Lucas ps. Note that at present algorithms must be implemented in Java. ------------------------------------------------ Dr. Simon Lucas Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom Email: sml at essex.ac.uk http://algoval.essex.ac.uk ------------------------------------------------- From taketani at med64.com Mon Nov 13 16:14:45 2000 From: taketani at med64.com (Makoto Taketani) Date: Mon, 13 Nov 2000 13:14:45 -0800 Subject: J Neurosci published a paper on hippocampal beta rhythm studied by array electrode In-Reply-To: Message-ID: Hi All, I forgot to mention that we will be happy to send the reprint to people who ask for it, when the reprint is available in mid December. Thank you. -makoto -----Original Message----- From: owner-mea-users at its.caltech.edu [mailto:owner-mea-users at its.caltech.edu]On Behalf Of Makoto Taketani Sent: Saturday, November 11, 2000 11:09 PM To: mea-users at cco.caltech.edu; Connectionists at cs.cmu.edu; cneuro at bbb.caltech.edu Subject: MEA: J Neurosci published a paper on hippocampal beta rhythm studied by array electrode The following recent paper may be of interest to those on this list interested in new methods to study in-vitro network operations. The Journal of Neuroscience, November 15, 2000, 20(22):8462-8473 Origins and Distribution of Cholinergically Induced Beta Rhythms in Hippocampal Slices. Ken Shimono, Fernando Brucher, Richard Granger, Gary Lynch, and Makoto Taketani Regional variations and substrates of high-frequency rhythmic activity induced by cholinergic stimulation were studied in hippocampal slices with 64-electrode recording arrays. (1) Carbachol triggered beta waves (17.6 +/- 5.7 Hz) in pyramidal regions of 75% of the slices. (2) The waves had phase shifts across the cell body layers and were substantially larger in the apical dendrites than in cell body layers or basal dendrites. (3) Continuous, two-dimensional current source density analyses indicated apical sinks associated with basal sources, lasting approximately 10 msec, followed by apical sources and basal sinks, lasting approximately 20 msec, in a repeating pattern with a period in the range of 15-25 Hz. 
(4) Carbachol-induced beta waves in the hippocampus were accompanied by 40 Hz (gamma) oscillations in deep layers of the entorhinal cortex. (5) Cholinergically elicited beta and gamma rhythms were eliminated by antagonists of either AMPA or GABA receptors. Benzodiazepines markedly enhanced beta activity and sometimes introduced a distinct gamma frequency peak. (6) Twenty Hertz activity after orthodromic activation of field CA3 was distributed in the same manner as carbachol-induced beta waves and was generated by a current source in the apical dendrites of CA3. This source was eliminated by high concentrations of GABA(A) receptor blockers. It is concluded that cholinergically driven beta rhythms arise independently in hippocampal subfields from oscillatory circuits involving (1) bursts of pyramidal cell discharges, (2) activation of a subset of feedback interneurons that project apically, and (3) production of a GABA(A)-mediated hyperpolarization in the outer portions of the apical dendrites of pyramidal neurons. SFN members can download the full article from http://www.jneurosci.org/cgi/content/abstract/20/22/8462 The movie showing current source density of beta rhythms can be downloaded from http://www.med64.com/publications.htm ------------------------------------------------------- Makoto Taketani, Ph.D. Technology Development Center Matsushita Electric Corporation of America Irvine, CA Net: taketani at med64.com http://www.med64.com ------------------------------------------------------- From: esann To: "Connectionists at cs.cmu.edu" References: From bogus@does.not.exist.com Tue Nov 14 03:33:04 2000 From: bogus@does.not.exist.com () Date: Tue, 14 Nov 2000 09:33:04 +0100 Subject: ESANN'2001 - special sessions Message-ID: ---------------------------------------------------- | | | ESANN'2001 | | | | 9th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 25-26-27, 2001 | | | | Call for papers: special sessions | | (deadline for submissions: 8 December 2000) | ---------------------------------------------------- The ESANN'2001 conference will include six special sessions organized by renowned scientists in their respective fields: - Neural networks in finance - Artificial neural networks and early vision processing - Artificial neural networks for Web computing - Dedicated hardware implementations: perspectives on systems and applications - Novel neural transfer functions - Neural networks and evolutionary/genetic algorithms: hybrid approaches You will find below a short description of these special sessions. Contributions to these special sessions are welcome, as on any other topic covered by the ESANN conference. For other details on topics, submission procedure, etc., please consult the Web pages of the conference (http://www.dice.ucl.ac.be/esann). The ESANN'2001 conference is technically co-sponsored by the IEEE Neural Networks Council (TBC), the IEEE Region 8, the IEEE Benelux Section, and the International Neural Networks Society. Neural networks in finance -------------------------- Organised by M. Cottrell, Univ. Paris I (France), E. de Bodt, Univ. Lille II (France) & UCL Louvain-la-Neuve (Belgium) Statistical and econometric tools have long been applied to data from financial markets, and it is not necessary to emphasize the economic dimension of this work. Being able to forecast a stock index, even over a very short term, would obviously be of great value to professionals. 
Such work has raised many questions. From morelock at Princeton.EDU Tue Nov 14 14:26:34 2000 From: morelock at Princeton.EDU (Wendy Morelock) Date: Tue, 14 Nov 2000 14:26:34 -0500 Subject: PostDoc available at Princeton Message-ID: <3A1191E9.81D89CAD@princeton.edu> POSTDOCTORAL POSITION AVAILABLE IN CONNECTIONIST/NEURAL NETWORK MODELING: Applications are invited for a Postdoctoral Fellowship in the newly established Silvio O. Conte Center for Neuroscience Research on the Cognitive and Neural Mechanisms of Conflict and Control, within the Center for Brain, Mind and Behavior at Princeton University. The position is available from December 1, 2000 for a renewable, one-year appointment. The Conte Center encompasses six projects, five experimental and one focussed on theory and mathematical and computer modelling. The individual will work with Jonathan Cohen (Psychology), Philip P. Holmes (Applied Mathematics) and John Hopfield (Molecular Biology), who collaborate on mathematical modelling projects to develop, analyse and test neural network (connectionist) models of decision-making, perceptual choice, memory recall, and attention, focussing on the relationship of conflict detection to control. The theoretical work will be conducted in close collaboration with behavioral and fMRI imaging experiments. A Ph.D. with experience in nonlinear dynamics and/or neural network modeling is required; some familiarity with models in neurobiology and/or experimental methods in psychology is also desirable. Further information about resources and affiliated faculty at the Center for Brain, Mind and Behavior is available at: http://www.csbmb.princeton.edu. Resumes or inquiries can be directed to Jonathan D. Cohen at jdc at princeton.edu. We will begin reviewing applications as they are received, continuing until the position is filled. Salary and rank are commensurate with experience. PU/EO/AAE -- Wendy Morelock Center Manager Center for the Study of Brain, Mind and Behavior morelock at princeton.edu (609) 258-0613 (609) 258-2574 fax From terry at salk.edu Wed Nov 15 01:54:52 2000 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 14 Nov 2000 22:54:52 -0800 (PST) Subject: Computational Neurobiology Graduate Training at UCSD Message-ID: <200011150654.eAF6sqp01254@purkinje.salk.edu> DEADLINE: JANUARY 6, 2001 COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM Department of Biology - University of California, San Diego http://www.biology.ucsd.edu/compneuro/ The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Financial support for students enrolled in this training program is available through a new NSF Integrative Graduate Education and Research Training (IGERT) award to UCSD. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. The three major themes in the training program are: 1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons. Using modern neuroanatomical, neuropharmacological and electrophysiological techniques. Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI). 2. 
Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA). 3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs. A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control. Participating Faculty include: * Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Systems at UCSD. * Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology. * Darwin Berg (Biology): Regulation synaptic components, assembly and localization, function and long-term stability. Former Chairman of Biology. * Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms. * Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research. * Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation. * Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels. * David Kleinfeld (Physics):Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity. Co-director, Analysis of Neural Data Workshop (MBL). * William Kristan (Biology): Computational Neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion. Director, Neurosciences Graduate Program at UCSD. * Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies. * Javier Movellan (Cognitive Science): Sensory fusion and learning algorithms for continuous stochastic systems. * Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects. * Sejnowski (Salk Institute/Biology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms. Director, Institute for Neural Computation. * Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques. * Nicholas Spitzer (Biology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of the Neurobiology Section in Biology. 
* Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons. * Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters. * Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior. * Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms. Requests for application materials should be sent to the Graduate Admissions Office, Division of Biology 0348, 9500 Gilman Drive, UCSD, La Jolla, CA, 92093-0348 [gradprog at biology.ucsd.edu]. The deadline for completed application materials, including letters of reference, is January 6, 2001. More information about applying to the UCSD Biology Graduate Program is available at http://www-biology.ucsd.edu/sa/Admissions.html. The Division of Biology home page is located at http://www-biology.ucsd.edu/. From paul at arti.vub.ac.be Wed Nov 15 07:59:40 2000 From: paul at arti.vub.ac.be (Paul Vogt) Date: Wed, 15 Nov 2000 13:59:40 +0100 Subject: PhD Thesis available: Lexicon Grounding on Mobile Robots Message-ID: <3A1288BB.C637B538@arti.vub.ac.be> Dear colleagues, I am pleased to announce that my PhD thesis, titled 'Lexicon Grounding on Mobile Robots' is now available at the web: http://arti.vub.ac.be/~paul/thesis.html Abstract: The thesis presents research that investigates how two mobile robots can develop a shared lexicon from scratch of which the meaning is grounded in the real world. It is shown how the robots can solve the symbol grounding problem in a particular experimental setup. The model by which the robots do so is explained in detail. The experimental results are presented and discussed. Long abstract: http://arti.vub.ac.be/~paul/abstract.html Best regards, Paul Vogt -- Paul Vogt tel: +32 2 629 37 05 VUB AI Lab fax: +32 2 629 37 29 Brussels URL: http://arti.vub.ac.be/~paul From mm at santafe.edu Wed Nov 15 16:43:54 2000 From: mm at santafe.edu (Melanie Mitchell) Date: Wed, 15 Nov 2000 14:43:54 -0700 (MST) Subject: 2001 Complex Systems Summer Schools Message-ID: <14867.922.687501.713279@aztec.santafe.edu> SANTA FE INSTITUTE Complex Systems Summer Schools Summer, 2001 SANTA FE SCHOOL: June 10 to July 7, 2001 in Santa Fe, New Mexico. Held on the campus of St. John's College in Santa Fe. Administered by the Santa Fe Institute. BUDAPEST SCHOOL: July 16 to August 10, 2001 in Budapest, Hungary. Held on the campus of Central European University in Budapest. Administered by Central European University and the Santa Fe Institute. GENERAL DESCRIPTION: An intensive introduction to complex behavior in mathematical, physical, living, and social systems for graduate students and postdoctoral fellows in the sciences and social sciences. Open to students in all countries. Students are expected to choose one school and attend the full four weeks. Week 1 will consist of an intensive series of lectures and laboratories introducing foundational ideas and tools of complex systems research. The topics will include nonlinear dynamics and pattern formation, statistical mechanics and stochastic processes, information theory and computation theory, adaptive computation, computer modeling tools, and specific applications of these core topics to various disciplines. 
Weeks 2 and 3 will consist of lectures and panel discussions on current research in complex systems. The topics this year are: -- Origin and Early Evolution of Life (Santa Fe and Budapest) -- Nonstandard Approaches to Computation (Santa Fe and Budapest) -- Geophysics and Climate Modeling (Santa Fe) -- Self-Organization and Collective Behavior (Budapest) Week 4 will be devoted to completion and presentation of student projects. WHO SHOULD APPLY: Applications are solicited from graduate students and postdoctoral fellows in any discipline, but with some background in science and mathematics at least at the undergraduate level (including calculus and linear algebra). An optional review of relevant mathematics will be given at the beginning of each school. Students may apply to either the Santa Fe School or the Budapest School, regardless of home country. COSTS: -- Santa Fe School: No tuition is charged. 100% of housing costs are provided for graduate students and 50% for postdoctoral fellows. (The remaining 50% is $700 for the four week school). Most students will provide their own travel funding. Some travel scholarships may be available, depending on need. -- Budapest School: No tuition is charged. 100% of housing costs are provided for all students. Some travel scholarships will be available, depending on need. HOUSING: Housing at both schools will be in single dormitory rooms, some with shared bathrooms. Telephone and computer network connectors will be available. For students with accompanying families, some family housing will be available. Travel support for families is not available. APPLICATION INSTRUCTIONS: Provide a current resume with publications list (if any), statement of current research interests, comments about why you want to attend the school, and two letters of recommendation from scientists who know your work. Include your e-mail address and fax number. Specify which school you want to attend (or which is your first choice if you are willing to attend either). Specify in your cover letter whether you wish to apply for a travel scholarship. (This will not affect our decision on your application.) Send only complete application packages by postal mail to: Summer Schools Santa Fe Institute 1399 Hyde Park Road Santa Fe, NM 87501 APPLICATION DEADLINE: February 5, 2001 Women, minorities, and students from developing countries are especially encouraged to apply. Further information at http://www.santafe.edu/sfi/education/indexCSSS.html or summerschool at santafe.edu. ------------------------------------------------------------------ 2001 SUMMER SCHOOL FACULTY Directors -- Santa Fe: Ray Goldstein, U. Arizona; Melanie Mitchell, SFI. Budapest: Imre Kondor, Eotvos Univ.; Melanie Mitchell, SFI. Partial List of Lecturers -- Santa Fe: Elizabeth Bradley, U. Colorado; Thomas Carter, Cal. State U.; Sean Elicker, U. New Mexico; Ray Goldstein, U. Arizona; Thomas Halsey, Exxon Research; Laura Landweber, Princeton; Seth Lloyd, MIT; Melanie Mitchell, SFI; Harold Morowitz, George Mason U.; Cosma Shalizi, SFI; Ken Steiglitz, Princeton; Eors Szathmary, Eotvos Univ.; Koen Visscher, U. Arizona; Lance Williams, U. New Mexico. Budapest: Imre Kondor, Eotvos Univ.; Andras Kroo, Renyi Inst. Math.; Melanie Mitchell, SFI; Cristopher Moore, U. New Mexico; Mark Newman, SFI; Zoltan Racz, Eotvos Univ.; Grzegorz Rozenberg, Leiden U.; Hava Siegleman, Technion; Erik Schultes, MIT; Peter Schuster, Univ. Vienna; Eors Szathmary, Eotvos Univ.; Gabor Vattay, Eotvos Univ.; Tamas Vicsek, Eotvos Univ.; Erik Winfree, MIT. From oreilly at grey.colorado.edu Wed Nov 15 16:57:30 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Wed, 15 Nov 2000 14:57:30 -0700 Subject: Paper on frontal cortex & basal ganglia available Message-ID: <200011152157.OAA29242@grey.colorado.edu> The following technical report is now available for downloading: ftp://grey.colorado.edu/pub/oreilly/papers/frankloughryoreilly00_fcbg_tr.pdf *or* ftp://grey.colorado.edu/pub/oreilly/papers/frankloughryoreilly00_fcbg_tr.ps Interactions Between Frontal Cortex and Basal Ganglia in Working Memory: A Computational Model Michael J. Frank, Bryan Loughry, and Randall C. O'Reilly Department of Psychology University of Colorado at Boulder ICS Technical Report 00-01 Abstract: The frontal cortex and basal ganglia interact via a relatively well-understood and elaborate system of interconnections. In the context of motor function, these interconnections can be understood as disinhibiting or ``releasing the brakes'' on frontal motor action plans --- the basal ganglia detect appropriate contexts for performing motor actions, and enable the frontal cortex to execute such actions at the appropriate time. We build on this idea in the domain of working memory through the use of computational neural network models of this circuit. In our model, the frontal cortex exhibits robust active maintenance, while the basal ganglia contribute a selective, dynamic gating function that enables frontal memory representations to be rapidly updated in a task-relevant manner. We apply the model to a novel version of the continuous performance task (CPT) that requires subroutine-like selective working memory updating, and compare and contrast our model with other existing models and theories of frontal cortex--basal ganglia interactions. - Randy +----------------------------------------------------------------+ | Dr. Randall C. O'Reilly | | | Assistant Professor | Phone: (303) 492-0054 | | Department of Psychology | Fax: (303) 492-2967 | | Univ. of Colorado Boulder | Home: (303) 448-1810 | | Muenzinger D251C | Cell: (720) 839-7751 | | 345 UCB | email: oreilly at psych.colorado.edu | | Boulder, CO 80309-0345 | www: psych.colorado.edu/~oreilly | +----------------------------------------------------------------+ From john at dcs.rhbnc.ac.uk Thu Nov 16 10:32:40 2000 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Thu, 16 Nov 2000 15:32:40 +0000 (GMT) Subject: Research assistant for intelligent text analysis Message-ID: Royal Holloway, University of London invites expressions of interest for a research assistant position in computer science. The post is a three year position starting from January 1st, 2001 or as soon as possible thereafter. The salary is very competitive and the work will be developing kernel based methods for the analysis of multi-media documents provided by Reuters, who are collaborators on the project. The project is financed by the EU and also involves partners in France (Xerox), Italy (Genova University, Milano University) and Israel (Hebrew University of Jerusalem). We are seeking researchers with experience in corpus based methods of information retrieval and document categorisation, and a strong programming background. Experience with kernel methods is desirable but not required. Please contact John Shawe-Taylor by email at jst at dcs.rhbnc.ac.uk for more information. John Shawe-Taylor and Nello Cristianini will also be available at the NIPS conference to answer any questions and meet potential applicants. 
************************************************************** John Shawe-Taylor J.Shawe-Taylor at dcs.rhbnc.ac.uk Dept of Computer Science, Royal Holloway, University of London Phone: +44 1784 443430 Fax: +44 1784 439786 ************************************************************** From ian.cloete at i-u.de Thu Nov 16 09:46:33 2000 From: ian.cloete at i-u.de (Ian Cloete) Date: Thu, 16 Nov 2000 15:46:33 +0100 Subject: AIM Special Issue Message-ID: *************************************************************************** CALL FOR PAPERS --- Please accept our apologies for multiple copies --- --- Please distribute this CFP to your colleagues --- *************************************************************************** ARTIFICIAL INTELLIGENCE in MEDICINE SPECIAL ISSUE on KNOWLEDGE - BASED NEUROCOMPUTING in MEDICINE *************************************************************************** The journal Artificial Intelligence in Medicine (http://www.elsevier.nl/locate/artmed) invites contributions for a Special Issue on Knowledge-Based Neurocomputing in Medicine. Full papers should be sent to the address below and are due by 30 April 2001. The journal's web site contains guidelines for authors. Knowledge-Based Neurocomputing, the topic of a recently published book (http://www.i-u.de/schools/cloete/book.htm) edited by Ian Cloete and Jacek M. Zurada, focuses on methods to encode prior knowledge and to extract, refine, and revise knowledge within a neurocomputing system. The journal calls for papers on knowledge-based artificial neural networks within a medical application area. A paper should address at least the following three components: neurocomputing, processing of knowledge, and a medical application area. Of course state-of-the-art research is required, however, the contributions may either be theoretical with a medical line of thought or medical application-oriented with results and a discussion. Each contribution may be 20 to 25 pages long. Contributors who would like to be considered for invited papers are requested to contact the guest editor, Ian Cloete (ian.cloete at i-u.de), for consideration. Please send an extended abstract of at least 3 A4 pages describing your proposed contribution, preferably in postscript or pdf format by email, or to the address given below, by 31 December 2000. Further information on the call for papers will be posted on the web page http://www.i-u.de/schools/cloete/aimcfp.htm Prof. Dr. Ian Cloete International University in Germany Campus 2 D-76646 Bruchsal Germany Email: ian.cloete at i-u.de ian.cloete at ieee.org Phone: +49-7251-700230 Fax: +49-7251-700250 Web: http://www.i-u.de/schools/cloete/index.htm IU Web: http://www.i-u.de/ From cjlin at csie.ntu.edu.tw Thu Nov 16 20:22:45 2000 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Fri, 17 Nov 2000 09:22:45 +0800 Subject: model selection software for support vector machines Message-ID: Dear Colleagues: We announce the release of the software looms, a leave-one-out model selection software for support vector machines (SVM). Automatic model selection is an important issue to make support vector machines (SVM) practically useful. Most existing approaches use the leave-one-out (loo) related estimators which are considered computationally expensive. looms uses some numerical tricks which lead to efficient calculation of loo rates of different models. Given a range of parameters, looms automatically returns the parameter and model with the best loo rate. 
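[Editorial note, not part of the original announcement] Before the tool's own example below, here is a rough conceptual sketch of the kind of leave-one-out grid search that looms automates. The sketch is not part of looms and does not use its fast numerical shortcuts; scikit-learn, the iris data, and the parameter grid are stand-ins chosen purely for illustration.

from itertools import product
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)   # stand-in dataset, not the heart_scale data

best = None
for C, gamma in product([1, 4, 16, 64], [0.004, 0.016, 0.064]):
    clf = SVC(C=C, kernel="rbf", gamma=gamma)   # RBF kernel: exp(-gamma*||x_i - x_j||^2)
    # brute-force leave-one-out accuracy: retrain on n-1 points for each held-out point
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    if best is None or acc > best[0]:
        best = (acc, C, gamma)

print("best loo accuracy %.3f at c=%g, gamma=%g" % best)

The point of looms is precisely to avoid the retraining loop above: the numerical tricks mentioned in the announcement compute the loo rates of the different models far more cheaply.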
For example, % looms heart_scale Optimal parameter: c=16.000000, gamma=0.016000, rate= 83.704% where c is the penalty parameter (that is, the upper bound on the dual variables in the SVM dual formulation) and gamma is the parameter of the RBF kernel: exp(-gamma*|x_i - x_j|^2). Currently we support only the RBF kernel. The current release (Version 1.0, by Jen-Hao Lee and Chih-Jen Lin) is available from http://www.csie.ntu.edu.tw/~cjlin/looms Details of looms are in the following paper: J.-H. Lee and C.-J. Lin, Automatic model selection for support vector machines http://www.csie.ntu.edu.tw/~cjlin/papers/modelselect.ps.gz Any comments are very welcome. Sincerely, Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan cjlin at csie.ntu.edu.tw From sugi at og.cs.titech.ac.jp Fri Nov 17 03:44:27 2000 From: sugi at og.cs.titech.ac.jp (SUGIYAMA Masashi) Date: Fri, 17 Nov 2000 17:44:27 +0900 Subject: Model Selection Paper Available ! Message-ID: <20001117174427B.sugi@og.cs.titech.ac.jp> Dear colleagues, I am pleased to announce the availability of our paper online. "Subspace Information Criterion for Model Selection" By Masashi Sugiyama and Hidemitsu Ogawa To appear in Neural Computation http://ogawa-www.cs.titech.ac.jp/~sugi/publications/sic.ps.gz Also we will give a talk about the above Subspace Information Criterion at NIPS*2000 Workshop "Cross-Validation, Bootstrap and Model Selection" Breckenridge, Colorado, USA, December 1, 2000. http://www.cs.cmu.edu/~rahuls/nips2000/ We would appreciate your questions and comments by e-mail or at the workshop. ABSTRACT The problem of model selection is of considerable importance for acquiring higher levels of generalization capability in supervised learning. In this paper, we propose a new criterion for model selection called the subspace information criterion (SIC), which is a generalization of Mallows' $C_L$. It is assumed that the learning target function belongs to a specified functional Hilbert space and the generalization error is defined as the Hilbert space squared norm of the difference between the learning result function and the target function. SIC gives an unbiased estimate of the generalization error so defined. SIC assumes the availability of an unbiased estimate of the target function and the noise covariance matrix, which are generally unknown. A practical calculation method of SIC for least mean squares learning is provided under the assumption that the dimension of the Hilbert space is less than the number of training examples. Finally, computer simulations in two examples show that SIC works well even when the number of training examples is small. Sincerely yours, Masashi Sugiyama. --------------------------------- Masashi Sugiyama Department of Computer Science, Graduate School of Information Science and Engineering, Tokyo Institute of Technology, 2-12-1, O-okayama, Meguro-ku, Tokyo, 152-8552, Japan. E-mail: sugi at og.cs.titech.ac.jp URL: http://ogawa-www.cs.titech.ac.jp/~sugi Tel: +81-3-5734-2190 Fax: +81-3-5734-2949 From Zoubin at gatsby.ucl.ac.uk Fri Nov 17 14:44:50 2000 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Fri, 17 Nov 2000 19:44:50 +0000 (GMT) Subject: Paper available on variational Bayesian learning Message-ID: <200011171944.TAA28344@cajal.gatsby.ucl.ac.uk> Dear Connectionists, The following paper on variational approximations for Bayesian learning with an application to linear dynamical systems will be presented at the NIPS 2000 conference. Comments are welcome. 
Gzipped postscript and PDF versions can be found at: http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips00beal.ps.gz http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips00beal.pdf Zoubin Ghahramani and Matt Beal ---------------------------------------------------------------------- Propagation algorithms for variational Bayesian learning Zoubin Ghahramani and Matthew J. Beal Gatsby Computational Neuroscience Unit University College London Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models we obtain a learning procedure that exploits the Kalman smoothing propagation, while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set. A revised version of this paper will appear in Advances in Neural Information Processing Systems 13, MIT Press. ---------------------------------------------------------------------- From dimi at ci.tuwien.ac.at Mon Nov 20 10:49:28 2000 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Mon, 20 Nov 2000 16:49:28 +0100 (CET) Subject: CI BibTeX Collection -- Update Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 4/1-4/3 IEEE Transactions on Fuzzy Systems, Volumes 8/2-8/5 IEEE Transactions on Neural Networks, Volumes 11/3-11/6 Machine Learning, Volumes 40/3-41/3 Neural Computation, Volumes 12/5-12/11 Neural Networks, Volumes 13/3-13/6 Neural Processing Letters, Volumes 11/3-12/2 Most files have been converted automatically from various source formats, please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut fuer Statistik * Tel: (+43 1) 58801 10773 * * Technische Universitaet Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ From gary at cs.ucsd.edu Mon Nov 20 12:20:09 2000 From: gary at cs.ucsd.edu (Gary Cottrell) Date: Mon, 20 Nov 2000 09:20:09 -0800 (PST) Subject: Faculty Position In Cognitive Science UCSD Message-ID: <200011201720.JAA13547@gremlin.ucsd.edu> >From: Gilles Fauconnier (by way of Joanna Mancusi) *NOTE closing date changed to January 15, 2001. Please circulate widely. 
__________________________________ FACULTY POSITION IN COGNITIVE SCIENCE UNIVERSITY OF CALIFORNIA, SAN DIEGO The Department of Cognitive Science at the University of California, San Diego invites applications for a faculty position at the assistant professor level (tenure-track) starting July 1, 2001, the salary commensurate with the experience of the successful applicant and based on the UC pay scale. The department of cognitive science at UCSD was the first of its kind in the world, and, as part of an exceptional scientific community, it remains a dominant influence in the field it helped to create. The department is truly interdisciplinary, with a faculty whose interests span anthropology, computer science, human development, linguistics, neuroscience, philosophy, psychology, and sociology. The department is looking for a top-caliber junior researcher in cognitive science. Applicants must have a Ph.D. (or ABD). A broad interdisciplinary perspective and experience with multiple methodologies will be highly valued. Women and minorities are encouraged to apply. The University of California, San Diego is an affirmative action/equal opportunity employer. All applications received by January 15, 2001 will receive thorough consideration until position is filled. Candidates should include a vita, reprints, a short letter describing their background and interests, and names and addresses of at least three references to: University of California, San Diego Faculty Search Committee Department of Cognitive Science 0515-EM 9500 Gilman Drive La Jolla, CA 92093-0515 From henkel at physik.uni-bremen.de Mon Nov 20 15:06:21 2000 From: henkel at physik.uni-bremen.de (Rolf D. Henkel) Date: Mon, 20 Nov 2000 21:06:21 +0100 Subject: TR: Synchronization, Coherence-Detection and Three-Dimensional Vision Message-ID: <00112021230000.23911@axon> Dear Connectionists, I'd like to invite you to download the technical report Title: Synchronization, Coherence-Detection and Three-Dimensional Vision Author: Rolf D. Henkel, Institute for Theoretical Physics ABSTRACT: A new functional role for spiking neurons is proposed, considered necessary to convert noisy sensory data into meaningful and stable perceptions. Percept creation and validation is performed by a dynamical process of coherence detection between neural signals. The crucial operational step of the network is the interaction of neural oscillators in the weak-coupling limit, which realizes coherence detection dynamically by selective synchronization of neural responses. A robust estimate of the incoming signals is transmitted as modulation frequency of the output current of the coherence-detecting layer, and a validation measure is given by the modulation depth of this current. As a real world example of these ideas, a neural network is presented solving the task of stereo vision with real image data. It combines operations of time-averaging, rate-coding neurons with integrate-and-fire-neurons, which calculate a disparity map of a scene by partially synchronizing their spike trains. The report is available on my website, as 1) PostScript http://axon.physik.uni-bremen.de/research/papers/coherence.ps.gz 2) PDF-File http://axon.physik.uni-bremen.de/research/papers/coherence.pdf 3) HTML, online http://axon.physik.uni-bremen.de/research/papers/coherence/ Comments and remarks are very welcome. Regards, Rolf Henkel -- Dr. 
Rolf Henkel Institute for Theoretical Neurophysics University Bremen Kufsteiner Strae 1, D-28359 Bremen Phone: +49-421-218-3688 henkel at physik.uni-bremen.de Fax: +49-421-218-4869 http://axon.physik.uni-bremen.de/ From evansdj at aston.ac.uk Tue Nov 21 06:22:13 2000 From: evansdj at aston.ac.uk (DJ EVANS) Date: Tue, 21 Nov 2000 11:22:13 +0000 Subject: JOB: ANALYSIS OF CARDIOLOGY BIOSIGNALS Message-ID: <3A1A5AE5.E20BEFD4@aston.ac.uk> Dear Connectionists, I have been asked to post this job advert to the list on behalf of Dr. Ian Nabney. For informal enquiries, please contact Dr. Nabney via e-mail (I.T.Nabney at aston.ac.uk). Regards, David Evans. ------------------------------------------------------------------------------ JOB ADVERT: To appear in jobs.ac.uk and on the NCRG web site Cardionetics Institute of Bioinformatics ---------------------------------------- School of Engineering and Applied Sciences Aston University, Birmingham, UK POSTDOCTORAL RESEARCH FELLOWSHIP -------------------------------- ANALYSIS OF CARDIOLOGY BIOSIGNALS --------------------------------- We are looking for a highly motivated individual for a 3 year postdoctoral research position in the area of analysis of clinical biosignals, primarily in cardiology. The emphasis of this research will be on developing and applying data modelling, visualisation and analysis algorithms to biosignals to generate clinically valuable information. The Institute is funded by Cardionetics Ltd, a UK company that has produced the world's first fully automatic portable electrocardiograph machine specifically for GP use. The aim of the Institute is to develop the technology that will underpin the next generation of products. It is located alongside the Neural Computing Research Group which has a worldwide reputation in practical and theoretical aspects of information analysis. Applicants should have strong mathematical and computational skills; a background in biosignal processing (particularly ECG) would be an advantage. Salaries will be up to point 9 on the RA 1A scale, currently 21435 UK pounds. The salary scale is subject to annual increments. If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to: Personnel Officer Aston University Birmingham B4 7ET, U.K. Tel: +44 (0)121 359 0870 Fax: +44 (0)121 359 6470 Full details at http://www.ncrg.aston.ac.uk/. Informal enquiries can be directed via e-mail to Dr. Ian Nabney (Institute Director) e-mail: I.T.Nabney at aston.ac.uk. Closing date: mid December 2000 ---------------------------------------------------------------------- FURTHER PARTICULARS In this research programme, we will be developing leading edge pattern analysis techniques for time series analysis, data fusion, visualisation and data mining. The technologies involved will include neural networks, deterministic time series modelling, Bayesian belief networks, temporal graphical models, component analysis and other related techniques. Development projects will normally progress to a proof of concept stage; typically this will apply research software to data samples sufficient to prove clinical reliability and relevance. 
Planned topics for research include: quantification of ventricular repolarisation; T wave modelling to improve offset detection; temporal analysis of arrhythmia patterns; data fusion for multiple channel ECG; on-line learning to provide systems for post myocardial infarction patients. The Cardionetics Institute of Bioinformatics (CIB) was established at Aston University in October 2000. The research will be carried out by two Research Fellows and a Director (Dr. Ian Nabney) with appropriate administrative and computing support. The Institute has been formed to act as the long term research arm of Cardionetics Ltd, a UK company that has produced the C.Net2000, the first fully automatic portable ECG machine for GP use. It won a Millennium award and was runner up From bert at mbfys.kun.nl Tue Nov 21 09:51:33 2000 From: bert at mbfys.kun.nl (Bert Kappen) Date: Tue, 21 Nov 2000 15:51:33 +0100 Subject: Paper available Message-ID: <3A1A8BF5.E33B14F4@mbfys.kun.nl> Dear all, The following paper will be presented at NIPS and is now available for previewing from my web page. Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: http://www.mbfys.kun.nl/~bert ---------- Second order approximations for probability models Bert Kappen, Wim Wiegerinck In this paper, we derive a second order mean field theory for directed graphical probability models. By using an information theoretic argument it is shown how this can be done in the absence of a partition function. This method is the direct generalisation of the well-known TAP approximation for Boltzmann Machines. In a numerical example, it is shown that the method greatly improves the first order mean field approximation. The computational complexity of the first (second) order method is linear (quadratic) in the network size and is exponential in the potential size. For a restricted class of graphical models, so-called single overlap graphs, the second order method has comparable complexity to the first order method. -- Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: http://www.mbfys.kun.nl/~bert From josh at vlsia.uccs.edu Tue Nov 21 10:06:03 2000 From: josh at vlsia.uccs.edu (Alspector) Date: Tue, 21 Nov 2000 08:06:03 -0700 (MST) Subject: research programmer position Message-ID: Research Programmer position at Personalogy, Inc. Personalogy, Inc. was founded to commercialize research done in Prof. Alspector's group at the University of Colorado at Colorado Springs. The company develops and applies state-of-the-art machine learning techniques to personalize information on the Internet. We are currently looking for a research programmer with the following credentials: -Master's or PhD degree in Computer Science, EE or related field -background in machine learning/neural networks/intelligent data mining -strong interests in information filtering/retrieval and user modeling -1-4 years of research and professional programming experience -strong programming skills in Perl/C/C++ -familiarity with web-server technology and Internet protocols -good knowledge of HTML/CGI/WML -comfortable with Windows NT/2K and Unix/Solaris/Linux platforms -very good communication and problem solving skills The person will be responsible for enhancing and optimizing the core algorithms of the company as well as researching novel ways of personalizing web-based content presentation. The person will also be involved in the overall system design. Personalogy offers a competitive salary, benefits and stock options.
The company is located in downtown Colorado Springs, Colorado. Please respond by sending a resume to recruitment at personalogy.net. Josh Alspector will be attending NIPS in Denver. -- Professor Joshua Alspector Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. Box 7150 Colorado Springs, CO 80933-7150 (719) 262 3510 (719) 262 3589 (fax) josh at eas.uccs.edu From nadine at wtc.ab.ca Wed Nov 22 01:46:33 2000 From: nadine at wtc.ab.ca (Nadine Gisler) Date: Tue, 21 Nov 2000 23:46:33 -0700 Subject: Welcome to the First ICSC Congress on Neuro-Fuzzy NF'2002 Message-ID: INTERNATIONAL COMPUTER SCIENCE CONVENTIONS Head Office: 5101C-50 Street, Wetaskiwin AB, T9A 1K1, Canada (Phone: +1-780-352-1912 / Fax: +1-780-352-1913) Email: or / Web-Site: http://www.icsc.ab.ca Welcome to the First International ICSC Congress on Neuro-Fuzzy NF'2002 to be held at The Capitolio de la Habana, Cuba January 15 - 18, 2002 We are sending this message requesting papers for this conference. http://www.icsc.ab.ca/NF2002.htm Organizing Committee: Honorary Chair: Prof. Hans-Juergen Zimmermann, Germany (zi at or.twth-aachen.de) General Chair: Hans-Heinrich Bothe, Denmark(hhb at it.dtu.dk) Special Scientific Events Chair : Alberto Ochoa, Cuba (aa8ar at yahoo.com) Scientific Program Chair: Hans Hellendoorn, Netherland (to be confirmed) Scientific Program Co-Chair : Pedro Gonzalez Lanza, Cuba (pedro at cidet.icmf.inf.cu) Local Committee Chair : Orestes Llanes-Santiago, Cuba (orestes at electrica.ispjae.edu.cu) Local Committee Co-Chair: Abelardo del Pozo Quintero, Cuba (pozo at cidet.icmf.inf.cu) Administration and Finance Chair : Ilkka Tervonen, Canada (operating at icsc.ab.ca) Publication Chair: Antonio Di Nola, Italy (dinola at unina.it) Introduction: During the past decade, paradigms and benefits from neuro fuzzy systems (NF) have been growing tremendously. Today, not only does NF solve scientific problems, but its applications are also appearing in our daily lives. In order to discuss the state of the art in NF and the future of these exciting topics; we are honored to invite you to Neuro-Fuzzy 2002. We believe it will be an excellent opportunity to share our knowledge on NF and contribute to its development in the next century. This major international conference will be held in a very enjoyable location: Havana, the Capital of Cuba, where we hope you will experience the famous Cuban hospitality. Sponsored/supported by: ISPJAE: Instituto Superior Politecnico Jos Antonio Echeverria ICIMAF: Instituto De Cibernetica, Matematica y Fisica UCLV: Universidad Central De Las Villas UO: Universidad De Oriente - RAC: Red de Automtica de Cuba - Ministerio de Educacin Superior de la Repblica de Cuba - Ministerio de la Informtica y las Comunicaciones de la Repblica de Cuba - Ministerio de Ciencias, Tecnologa y Medio Ambiente de la Repblica de Cuba. - ICSC/NAISO Canada/Switzerland Topics suggested (not limited to): - Advanced Neuro and Fuzzy Paradigms - Data Granulation and Fuzzy Rule Extraction - Advanced Training Algorithms - Evolutionary Computation (GA, GP, ET) and Graphical Models - Chaotic Behavior and Fractals Applications in signal processing, control, robotics, etc. 
Of particular interest are applications from the following fields: Sound and image processing, pattern recognition, image understanding, feature binding, perception, sensor fusion, controller design, state observation, motor control, mobile robotics, autonomous navigation, deliberation and planning, active anchoring, gain-scheduling, fault detection, hardware solutions, data mining, financing, e-commerce. International Steering/Program Committee (invitations sent to): Anderson P., USA Antonsson A.K., USA Baldwin J.F., U.K. Bandemer H., Germany Bezdek J., USA Bonnisone P., USA Bosc P., France Carlsson Ch., Finland Dubois D., France Esogbue A. O., USA Fyfe C., U.K. Gallard R., Argentina Gottwald S., Germany Grabisch M., France Halmague S.K., Australia Heiss-Czedik D., Austria Heiss M., Austria Hoehle U., Germany Jentzen Jan, Denmark Kalaykov Ivan, Sweden Kandel A.,Tampa, USA Klement E. P., Austria Kruse R., Germany Kuncheva L., U.K. Mamdani E., UK Marichal G.N., Spain Nauck Detlef, U.K. Pap E., Yugoslavia Pedrycz W., Canada, Roubens M., Belgium Runkler Th., Germany Ruspini ...., Belgium/ USA Steele N., U.K. Sugeno M., Japan Surmann H., Germany Takagi T., Japan Tuerksen I. B., Toronto, Canada Ulieru M., Canada Verdegay, J. L., Spain Zamarreo J., Spain SCIENTIFIC PROGRAM NF'2002 will include invited plenary talks, contributed sessions, invited sessions, workshops and tutorials. Updated details are available at http://www.icsc.ab.ca/NF2002.htm CALL FOR INVITED SESSIONS The organization of invited sessions is encouraged. Prospective organizers are requested to send a session proposal (consisting of 4-5 invited papers, the recommended session-chair and co-chair, as well as a short statement describing the title and the purpose of the session) to the respective symposium chair or the congress organizer. Invited sessions should preferably start with a tutorial paper. The registration fee of the session organizer will be waived, if at least 4 authors of invited papers register to the conference. POSTER PRESENTATIONS Poster presentations are encouraged for people who wish to receive peer feedback, and practical examples of applied research are particularly welcome. Poster sessions will allow the presentation and discussion of respective papers, which will also be included in the conference proceedings. CALL FOR TUTORIALS Pre-conference tutorials on specific relevant topics are planned. Proposals for a tutorial must include the title, topics covered, proposed speakers, targeted audience and estimated length (preferably 2 or 4 hours). The proposal must be submitted to the general chair, the scientific program chair and the congress organizer by May 31, 2001. Tutorial papers of max 15 pages can be included in the conference proceedings. CALL FOR WORKSHOPS Interested scientists are encouraged to organize a workshop on their particular field of research. Workshops consist of several presentations or open discussions on a specific subject. The proposal must include the title, the topics covered, the proposed speakers, the targeted audience and the estimated length. It should be submitted to the general chair, the scientific program chair and the congress organizer by May 31, 2001. Joint or edited workshop papers of max 35 pages can be included in the conference proceedings. SUBMISSION OF PAPERS Authors are requested to send an extended abstract, or the full paper of minimum 4 and maximum 7 pages for review, by the International Program Committee. 
All submissions must be written in English, starting with a succinct statement of the problem, the results achieved, their significance and a comparison with previous work. Submissions must be received by May 31, 2001. Regular papers, as well as poster presentations, tutorial papers and invited sessions are encouraged. The submission should also include: - Title of congress (NF'2002), - Type of paper (workshop, tutorial, invited, regular, poster) - Authors names, affiliations, addresses - Name of author to contact for correspondence - E-mail address and fax # of contact author and co-authors - Topics which best describe the paper (5 - 10 keywords) - Short CV of authors (recommended) Please submit your paper proposal electronically to the following address: operating at icsc.ab.ca Data formats: Word, Postscript, PDF. PROCEEDINGS AND PUBLICATIONS All accepted and invited papers will be included in the congress proceedings, published in print and on CD-ROM by ICSC Academic Press, Canada/Switzerland. A selected number of papers will be expanded and revised for possible inclusion in special issues of some prestigious journals. IMPORTANT DATES Submission Deadline: May 31, 2001 Notification of Acceptance: August 15, 2001 Delivery of Final Manuscripts: October 31, 2001 Conference NF'2002: January 15/18, 2002 CONGRESS ORGANIZER ICSC International Computer Science Conventions NAISO Natural and Artificial Intelligence Systems Organizations 5101C - 50 Street Wetaskiwin AB, T9A 1K1 / Canada Phone: +1-780-352-1912 Fax: +1-780-352-1913 Email Operating Division: operating at icsc.ab.ca Email Planning Division: planning at icsc.ab.ca connectionists at cs.cmu.edu ICSC From mpessk at guppy.mpe.nus.edu.sg Thu Nov 23 02:30:32 2000 From: mpessk at guppy.mpe.nus.edu.sg (S. Sathiya Keerthi) Date: Thu, 23 Nov 2000 15:30:32 +0800 (SGT) Subject: TR on fast computation of Leave-One-Out error in SVM algorithms In-Reply-To: Message-ID: A New Technical Report... Two Efficient Methods for Computing Leave-One-Out Error in SVM Algorithms S. Sathiya Keerthi, Chong Jin Ong and Martin M.S. Lee National University of Singapore Abstract: We propose two new methods for the efficient computation of the leave-one-out (LOO) error for SVMs. The first method is based on the idea of exact penalty functions while the second method uses a new initial choice for the alpha variables. These methods can also be extended to the recomputation of SVM solutions when more than one example is omitted and the computation of LOO error for nu-SVM. Recently, Lee and Lin pointed out that a loose stopping criteria can be exploited to speed up LOO computations for SVM. This fact, combined with our proposed methods allows for the efficient computation of the LOO error. To download a gzipped postscript file containing the report, go to: http://guppy.mpe.nus.edu.sg/~mpessk/svm.shtml From tnatschl at igi.tu-graz.ac.at Thu Nov 23 09:00:13 2000 From: tnatschl at igi.tu-graz.ac.at (Thomas Natschlaeger) Date: Thu, 23 Nov 2000 15:00:13 +0100 Subject: Papers on computational analysis of dynamic synapses Message-ID: <3A1D22ED.2D819E2E@igi.tu-graz.ac.at> Dear Connectionists, The following two papers on computational analysis of dynamic synapses will be presented at the NIPS 2000 conference. Comments are welcome. Gzipped postscript and PDF versions can be found at: http://www.igi.TUGraz.at/igi/tnatschl/publications.html Sincerely Thomas Natschlaeger ---------------------------------------------------------------------- FINDING THE KEY TO A SYNAPSE T. 
Natschlaeger and W. Maass URLs: http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-nips00.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-poster.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-poster.pdf ABSTRACT: Experimental data have shown that synapses are heterogeneous: different synapses respond with different sequences of amplitudes of postsynaptic responses to the same spike train. Neither the role of synaptic dynamics itself nor the role of the heterogeneity of synaptic dynamics for computations in neural circuits is well understood. We present in this article two computational methods that make it feasible to compute, for a given synapse with known synaptic parameters, the spike train that is optimally fitted to the synapse in a certain sense. One of these methods is based on dynamic programming (similar to reinforcement learning), the other one on sequential quadratic programming. With the help of these methods one can compute, for example, the temporal pattern of a spike train (with a given number of spikes) that produces the largest sum of postsynaptic responses for a specific synapse. Several other applications are also discussed. To our surprise we find that most of these optimally fitted spike trains match common firing patterns of specific types of neurons that are discussed in the literature. Furthermore, optimally fitted spike trains are rather specific to a certain synapse ("the key to this synapse") in the sense that they exhibit a substantially smaller postsynaptic response on any other of the major types of synapses reported in the literature. This observation provides the first glimpse at a possible functional role of the specific combinations of synapse types and neuron types that was recently found in (Gupta, Wang, Markram, Science, 2000). Our computational analysis provides the platform for a better understanding of the specific role of different parameters that control synaptic dynamics, because with the help of the computational techniques that we have introduced one can now see directly how the temporal structure of the optimal spike train for a synapse depends on the individual synaptic parameters. We believe that this inverse analysis is essential for understanding the computational role of neural circuits. ------------------------------------------------------------------------- PROCESSING OF TIME SERIES BY NEURAL CIRCUITS WITH BIOLOGICALLY REALISTIC SYNAPTIC DYNAMICS T. Natschlaeger, W. Maass, E. D. Sontag, and A. M. Zador URLs: http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-nips00.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-poster.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-poster.pdf ABSTRACT: Experimental data show that biological synapses behave quite differently from the symbolic synapses in common artificial neural network models. Biological synapses are dynamic, i.e., their ``weight'' changes on a short time scale by several hundred percent depending on the past input to the synapse. In this article we explore the consequences that this synaptic dynamics entails for the computational power and adaptive capability of feedforward neural networks. Our analytical results show that the class of nonlinear filters that can be approximated by neural networks with dynamic synapses, even with just a single hidden layer of sigmoidal neurons, is remarkably rich.
It contains every time invariant filter with fading memory, hence arguably every filter that is potentially useful for a biological organism. This result is robust with regard to various changes in the model for synaptic dynamics. Furthermore, we show that simple gradient descent suffices to approximate a given quadratic filter by a rather small neural network with dynamic synapses. The computer simulations we performed show that in fact their performance is slightly better than that of previously considered artificial neural networks that were designed for the purpose of yielding efficient processing of temporal signals, without aiming at biological realism. We have tested dynamic networks on tasks such as the learning of a randomly chosen quadratic filter, as well as on the system identification task used in (Back and Tsoi, 1993), to illustrate the potential of our new architecture. We also address the question of which synaptic parameters are essential for a network with dynamic synapses to be able to learn a particular target filter. We found that neither plasticity in the synaptic dynamics alone nor plasticity of the maximal amplitude alone yields satisfactory results. However, a simple gradient descent learning algorithm that tunes both types of parameters simultaneously yields good approximation capabilities. From rosi-ci0 at wpmail.paisley.ac.uk Thu Nov 23 10:02:45 2000 From: rosi-ci0 at wpmail.paisley.ac.uk (Roman Rosipal) Date: Thu, 23 Nov 2000 15:02:45 +0000 Subject: TR available Message-ID: Dear Connectionists, The following TR is now available at my home page: Kernel Principal Component Regression with EM Approach to Nonlinear Principal Components Extraction R. Rosipal, LJ Trejo, A. Cichocki Abstract In kernel based methods such as Support Vector Machines, Kernel PCA, Gaussian Processes or Regularization Networks the computational requirements scale as O(n^3), where n is the number of training points. In this paper we investigate Kernel Principal Component Regression (KPCR) with the Expectation Maximization approach for estimating the subset of p principal components (p < n) in a feature space defined by a positive definite kernel function. The computational requirements of the method are O(pn^2). Moreover, the algorithm can be implemented with memory requirements O(p^2) + O((p+1)n). We give a theoretical description explaining how, by proper selection of a subset of non-linear principal components, the desired generalization of KPCR is achieved. On two data sets we experimentally demonstrate this fact. Moreover, on a noisy chaotic Mackey-Glass time series prediction task the best performance is achieved with p << n, and experiments also suggest that in such cases we can also use significantly reduced training data sets to estimate the non-linear principal components. The theoretical relation and experimental comparison to Kernel Ridge Regression and epsilon-insensitive Support Vector Regression is also given. _______________ You can download gzipped postscript from http://cis.paisley.ac.uk/rosi-ci0/Papers/TR00_2.ps.gz Any comments and remarks are welcome. _______________ Roman Rosipal University of Paisley, CIS Department, Paisley, PA1 2BE Scotland, UK http://cis.paisley.ac.uk/rosi-ci0 e-mail: rosi-ci0 at paisley.ac.uk
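As a rough illustration of the KPCR pipeline described in the abstract above, the sketch below extracts p non-linear principal components with kernel PCA and regresses on them, using present-day scikit-learn; the RBF kernel, its width gamma and the number of components p are arbitrary illustrative choices, and the plain eigendecomposition inside KernelPCA stands in for the authors' O(pn^2) EM routine.

from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LinearRegression

def kpcr_fit_predict(X_train, y_train, X_test, p=10, gamma=0.1):
    # Step 1: extract p non-linear principal components in the kernel feature space.
    kpca = KernelPCA(n_components=p, kernel="rbf", gamma=gamma)
    Z_train = kpca.fit_transform(X_train)
    Z_test = kpca.transform(X_test)
    # Step 2: ordinary least-squares regression on those components.
    reg = LinearRegression().fit(Z_train, y_train)
    return reg.predict(Z_test)

Choosing p much smaller than n is what keeps both the computation and the regression well behaved, which is the regime the report argues for.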
From maass at igi.tu-graz.ac.at Thu Nov 23 13:34:13 2000 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Thu, 23 Nov 2000 19:34:13 +0100 Subject: Paper on Circuit Complexity for Sensory Processing Message-ID: <3A1D6325.F7A9E4@igi.tu-graz.ac.at> The following paper is now available online. It will be presented as a talk at NIPS 2000. FOUNDATIONS FOR A CIRCUIT COMPLEXITY THEORY OF SENSORY PROCESSING by Robert A. Legenstein and Wolfgang Maass Technische Universitaet Graz, Austria ABSTRACT: We introduce TOTAL WIRE LENGTH as a salient complexity measure for evaluating the biological feasibility of proposed circuit designs for sensory processing tasks, such as early vision. Biological data show that the total wire length of neural circuits in the cortex is in fact not very large compared with the number of neurons that they connect. This implies that many commonly proposed circuit architectures for sensory processing tasks are biologically unrealistic. In this paper we exhibit some alternative circuit design techniques for computational tasks that capture typical aspects of translation- and scale-invariant sensory processing. These techniques yield circuits with a total wire length that scales LINEARLY with the number of neurons in the circuit. ------------------------------------------------------------------------------ This paper, as well as an illustrated poster, is available online as # 122 at http://www.igi.TUGraz.at/igi/maass/publications.html Wolfgang Maass Tel: +43 (0)316 873 5822 Fax: +43 (0)316 873 5805 http://www.tu-graz.ac.at/igi/maass Professor of Computer Science, Institute for Theoretical Computer Science, Technische Universitaet Graz From Thorsten.Joachims at gmd.de Fri Nov 24 05:39:33 2000 From: Thorsten.Joachims at gmd.de (Thorsten Joachims) Date: Fri, 24 Nov 2000 11:39:33 +0100 (MET) Subject: New SVM-Light Release (V3.50) Message-ID: <200011241039.LAA02826@borneo.gmd.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 2399 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/405be3e0/attachment.ksh From nnsp01 at neuro.kuleuven.ac.be Fri Nov 24 10:56:16 2000 From: nnsp01 at neuro.kuleuven.ac.be (Neural Networks for Signal Processing 2001) Date: Fri, 24 Nov 2000 16:56:16 +0100 Subject: NNSP 2001, CALL FOR PAPERS Message-ID: <3A1E8FA0.D882DAB8@neuro.kuleuven.ac.be> --------------------------------------------------------------- 2001 IEEE WORKSHOP ON NEURAL NETWORKS FOR SIGNAL PROCESSING --------------------------------------------------------------- September 10-12, 2001 Falmouth, Massachusetts, USA NNSP'2001 homepage: http://eivind.imm.dtu.dk/nnsp2001 submission deadline: March 15, 2001 Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the eleventh in a series of IEEE workshops on Neural Networks for Signal Processing will be held in Falmouth, Massachusetts, at the SeaCrest Oceanfront Resort and Conference Center. The workshop will feature keynote lectures, technical presentations, and panel discussions.
Papers are solicited for, but not limited to, the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, generalization, design algorithms, optimization, parameter estimation, nonlinear signal processing, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, sonar and radar, data fusion, data mining, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, blind source separation, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. PAPER SUBMISSION PROCEDURE Prospective authors are invited to submit a full paper of up to ten pages using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2001 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academica Publishers. SCHEDULE Submission of full paper: March 15, 2001 Notification of acceptance: May 1, 2001 Submission of photo-ready accepted paper and author registration: June 1, 2001 Advance registration before: July 15, 2001 ORGANIZATION General Chairs David J. MILLER Tulay ADALI The Pennsylvania State University University of Maryland Baltimore County Program Chairs Jan LARSEN Marc VAN HULLE Technical University of Denmark Katholieke Universiteit, Leuven Finance Chair Publicity Chair Lian YAN Patrick DE MAZIERE Athene Software, Inc. Katholieke Universiteit, Leuven Proceedings Chair Registration and Local Arrangements Scott C. DOUGLAS Elizabeth J. WILSON Southern Methodist University Raytheon Co. America Liaison Asia Liaison Amir ASSADI H.C. FU University of Wisconsin at Madison National Chiao Tung University Europe Liaison Herve BOURLARD Swiss Federal Institute of Technology at Lausanne Program Committee Yianni Attikiouzel Andrew Back Herve Bourlard Andrzej Cichocki Jesus Cid-Sueiro Robert Dony Li Min Ling Guan Tzyy-Ping Jung Shigeru Katagiri Jens Kohlmorgen Fa Long Luo Danilo Mandic Elias Manolakos Michael Manry Takashi Matsumoto Christophe Molina Bernard Mulgrew Mahesan Niranjan Tomaso Poggio Kostas N. Plataniotis Jose Principe Phillip A. Regalia Joao-Marcos Romano Kenneth Rose Jonas Sjoberg Robert Snapp M. Kemal Sonmez Naonori Ueda Lizhong Wu Lian Yan Fernando Jose Von Zuben From diamond at sissa.it Fri Nov 24 11:35:07 2000 From: diamond at sissa.it (Mathew E. Diamond) Date: Fri, 24 Nov 2000 17:35:07 +0100 Subject: Neuroscience in Trieste Message-ID: <3.0.5.32.20001124173507.008692b0@shannon.sissa.it> Dear Neuroscientists, I would like to direct your attention to Neuroscience study and training opportunities now available in Trieste, Italy. For an overview of activities, please see the website: http://www.sissa.it/cns/www/neuroinfo.html For those interested in attending a College in Trieste on the Evolution of Intelligent Behavior please see the website: http://www.ictp.trieste.it/cgi- bin/ICTPsmr/mkhtml/smr2html.pl?smr1308/Announcement kind regards, Mathew E. 
Diamond (diamond at sissa.it) Cognitive Neuroscience Sector International School for Advanced Studies, Trieste ITALY From ahirose at info-dev.rcast.u-tokyo.ac.jp Mon Nov 27 02:08:27 2000 From: ahirose at info-dev.rcast.u-tokyo.ac.jp (ahirose@info-dev.rcast.u-tokyo.ac.jp) Date: Mon, 27 Nov 2000 16:08:27 +0900 (JST) Subject: CFP IEICE Trans Electron --Special Issue Message-ID: <200011270708.QAA16993@info-dev.rcast.u-tokyo.ac.jp> +----------------------------------------------------------------+ | Call for Papers | | ~~~~~~~~~~~~~~~ | | IEICE Trans. on Electronics Special Issue on | | New Technologies in Signal Processing for Electromagnetic-wave | | Sensing and Imaging | | Manuscript Deadline: March 31, 2001 | +----------------------------------------------------------------+ Topics include, for example, -Neural networks and other soft-computing techniques in classification, restoration and segmentation of radar images The IEICE (Institute of Electronics, Information and Communication Engineers, http://www.ieice.or.jp/) Transactions on Electronics announces a "Special Issue on New Technologies in Signal Processing for Electromagnetic- wave Sensing and Imaging" to be published in December 2001. Please visit: http://www.ee.t.u-tokyo.ac.jp/~ahirose/ieice-special-issue2001/ The Special Issue aims to publish articles on recent progress in science, theories, techniques, applications and systems concerning signal processing in electromagnetic-wave / lightwave sensing and imaging. The Editorial Committee solicits submissions of Papers and Letters in related wide research fields. 1. Scope Topics of interest include, but are not limited to, the following areas: Science and application of statistics in rough surface scattering Polarimetry / interferometry science and technology Phase unwrapping and residue analysis / elimination Speckle and interference noise reduction Inverse scattering and image reconstruction Neural network and soft-computing in classification, restoration and segmentation Wavelet transform and linear / nonlinear processing Migration method MUSIC method and high-resolution estimation Temporal and spatial coherence synthesis SAR theory and system Polarimetric / interferometric radar theory and system Near-field electromagnetic-wave imaging and system Remote sensing of the earth, ocean and atmosphere Compensation of fluctuation and distortion in imaging and sensing 2. Manuscript Deadline: March 31, 2001 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 3. Submission of Papers Prospective authors are requested to send four copies of their manuscript by March 31, 2001, to Dr. Akira Hirose, Guest Editor, at the address shown below. In the preparation, please refer to "Information for Authors" of the IEICE Transactions on Electronics for details. The information is also found at http://www.ieice.or.jp. Papers should not exceed 8 printed pages, whereas Letters 2 pages. When submitting, please indicate "Special Issue on New Technologies in Signal Processing for Electromagnetic-wave Sensing and Imaging" with red ink on the envelope and each front page of the four copies. Please note that the period for resubmission (normally 60 days) after review may be shortened due to the publication schedule. Submitted manuscripts (Papers / Letters) will be reviewed by referees according to the ordinary rules of the Transactions Editorial Committee. 4. 
Mailing address Four copies of the complete manuscript should be submitted to: Akira Hirose, Guest Editor Department of Frontier Informatics, Graduate School of Frontier Sciences, The University of Tokyo 4-6-1 Komaba, Meguro-ku, Tokyo 153-8904, Japan Phone: +81-3-5452-5155 Facsimile: +81-3-5452-5156 E-mail: ahirose at ee.t.u-tokyo.ac.jp 5. Special Issue Editorial Committee Guest Editor: Akira Hirose (Univ. of Tokyo, Tokyo) Members: Shane Cloude (Applied Electromagnetics, St. Andrews) George Georgiou (California State Univ., San Bernardino) Kazuo Hotate (Univ. of Tokyo, Tokyo) Eric Pottier (Univ. of Rennes 1, Rennes) Motoyuki Sato (Tohoku Univ. Sendai) Toru Sato (Kyoto Univ., Kyoto) Mitsuo Takeda (Univ. of Electro-Commun., Tokyo) Yoshio Yamaguchi (Niigata Univ., Niigata) * Please note that if accepted, all authors, including authors of invited papers, should pay for the page charges covering partial cost of publication. Authors will receive 100 copies of the reprint. Updated information will be posted at http://www.ee.t.u-tokyo.ac.jp/~ahirose/ieice-special-issue2001/ -- From dr+ at cs.cmu.edu Mon Nov 27 16:00:47 2000 From: dr+ at cs.cmu.edu (Douglas Rohde) Date: Mon, 27 Nov 2000 16:00:47 -0500 Subject: The Lens Neural Net Simulator Message-ID: <3A22CB7F.4BBBBDEB@cs.cmu.edu> Lens 2.3 is Now Available!! http://www.cs.cmu.edu/~dr/Lens I'm pleased to announce the first public release of the Lens neural network simulator. Lens is a general-purpose simulator that runs on Windows as well as most Unix platforms. It is primarily designed for training feed-forward and recurrent backpropagation networks, but can be easily adapted to handle other architectures and currently includes deterministic Boltzmann machines, Kohonen networks, and interactive-activation models. Lens provides a variety of graphical interfaces to aid in executing common commands, visualizing unit and link values, accessing internal parameters, and plotting data. However, it also includes a complete scripting language (Tcl), that enables networks to be constructed quickly and easily. Although Lens was designed with the serious modeler in mind, it would actually be a good choice for use in teaching introductory courses. Basic networks can be created in just a few simple commands and students will be up and running in minutes. The following are some of the main advantages of Lens over other neural network simulators I have used, including PDP++, SNNS, Xerion, RCS, Tlearn, and Matlab: -SPEED: Lens is up to 4 times faster than the other major simulators. This is mainly due to tight inner loops that minimize memory references and achieve good cache performance. Lens is also able to do batch parallel training on multiple processors. -EFFICIENCY: Lens makes efficient use of memory, so it can handle larger networks and larger example sets. -FLEXIBILITY: Because it includes a scripting language, the user has considerable flexibility in writing procedures to automatically build, train, or test customized networks. Unit input and output procedures can be composed from a variety of sub-procedures, allowing many group types to be created without the need to add additional code. -CUSTOMIZABILITY: Lens was designed around the principle that no simulator can satisfy all users out of the box. Sophisticated users will inevitably need to customize the simulator at the source code level. But altered code always causes problems if a new generic version of the source is released. 
Therefore, Lens includes facilities for easily extending network structures and creating new types, algorithms, and commands without altering the main code base. You can find additional information about downloading and using Lens at its homepage: http://www.cs.cmu.edu/~dr/Lens AVAILABILITY Lens is available free-of-charge to those conducting research at academic or non-profit institutions. Other users should contact me for licensing information at dr+lens at cs.cmu.edu. CAVEATS Lens is designed for simulating connectionist networks. It is not appropriate for compartmental membrane-level or differential equation-based modeling. From oreilly at grey.colorado.edu Tue Nov 28 17:32:08 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Tue, 28 Nov 2000 15:32:08 -0700 Subject: CU-Boulder Graduate Training in Cognitive Neuroscience Message-ID: <200011282232.PAA07733@grey.colorado.edu> Please forward this to relevant mailing lists, colleagues, potential students, etc. Cognitive Neuroscience University of Colorado Boulder Graduate Training Opportunities This is an invitation to apply for graduate training in Cognitive Neuroscience at the University of Colorado Boulder (CU). The CU Department of Psychology has a strong nucleus of cognitive neuroscientists with major interests in learning, memory, attention, executive functions, and emotion. Training is available in a variety of cognitive neuroscience methods including functional magnetic resonance imaging (fMRI), event-related brain potentials (ERP), computational cognitive neuroscience, human neuropsychology, and animal behavioral neuroscience methods. Faculty include Marie Banich (fMRI, human neuropsychology, attention and executive functions), Tim Curran (ERP, learning and memory), Tiffany Ito (ERP, social neuroscience, emotion, stereotyping), Akira Miyake (working memory and executive functions), Randall O'Reilly (computational cognitive neuroscience, learning and memory, executive functions), and Jerry Rudy (behavioral neuroscience, learning & memory). Several other members of the CU Psychology faculty are interested in cognitive neuroscience approaches to a variety of topics: Edward Craighead (prevention of major depression), Reid Hastie (judgment and decision making, memory and cognition, social psychology), Kent Hutchinson (substance abuse and dependence), Steve Maier (behavioral neuroscience, adaptation to challenge), David Miklowitz (adult psychopathology), Linda Watkins (behavioral neuroscience, adaptation to challenge), and Mark Whisman (cognitive therapy and depression). The cognitive neuroscientists in the CU Psychology Department are complemented by an outstanding community of scientists with similar interests. The CU Psychology Department, with graduate programs in Cognitive Psychology and Behavioral Neuroscience, is consistently in the top 20 of the US News & World Report rankings. The Institute of Cognitive Science includes scientists from computer science, education, linguistics, philosophy, and psychology. The Institute for Behavioral Genetics provides an unique resource for conducting and facilitating research on the genetic and environmental bases of individual differences in behavior. Associated hospitals provide the possibility of conducting research with patient populations. Our computational cognitive neuroscience laboratory particularly benefits from interactions from professors Mike Mozer (CU Computer Science) and Yuko Munakata (University of Denver (DU), Psychology). 
In addition to providing an exciting research environment and hosting the annual Neural Information Processing Systems conference, the greater Denver/Boulder area offers an exceptional quality of life. Spectacularly situated at the eastern edge of the Rockies, this area provides a wide variety of extraordinary outdoor activities, an average of 330 sunny days per year, and also affords a broad range of cultural activities. For more information, full lists of associated faculty, and instructions on applying to the graduate programs, see the following web sites: CU Overview Web Page: http://www.cs.colorado.edu/~mozer/resgroup.html CU Psychology: http://psych-www.colorado.edu/ CU Institute of Cognitive Science: http://psych-www.colorado.edu/ics/home.html CU Institute of Behavioral Genetics: http://ibgwww.colorado.edu/ CU Computer Science: http://www.cs.colorado.edu/ DU Psychology: http://www.du.edu/psychology/ One or more of the following faculty should be contacted for any further information. Marie Banich, CU Psych mbanich at psych.colorado.edu http://psych.colorado.edu/~mbanich/lab/ Tim Curran, CU Psych tcurran at psych.colorado.edu http://psych.colorado.edu/~tcurran/ Tiffany Ito, CU Psych tito at psych.colorado.edu http://psych.colorado.edu/~tito/ Akira Miyake, CU Psych miyake at psych.colorado.edu http://psych.colorado.edu/~miyake/ Michael Mozer, CU CS mozer at cs.colorado.edu http://www.cs.colorado.edu/~mozer/Home.html Yuko Munakata, DU Psych munakata at kore.psy.du.edu http://kore.psy.du.edu/munakata Randall O'Reilly, CU Psych oreilly at psych.colorado.edu http://psych.colorado.edu/~oreilly Jerry Rudy, CU Psych jrudy at clipr.Colorado.edu http://psych.colorado.edu/~jrudy/ From heiko.wersing at hre-ftr.f.rd.honda.co.jp Wed Nov 29 11:24:28 2000 From: heiko.wersing at hre-ftr.f.rd.honda.co.jp (Heiko Wersing) Date: Wed, 29 Nov 2000 17:24:28 +0100 Subject: Two PhD Studentships at HONDA R&D Europe Message-ID: <3A252DBC.C02F35B0@hre-ftr.f.rd.honda.co.jp> Two PhD Studentships at HONDA R&D Europe Systems, which support humans in their complex and dynamic environments and help to save natural resources, are the aim of the Future Technology Research Division of HONDA R&D EUROPE (Deutschland). In close cooperation with national and international research institutions, we develop intelligent systems based on neural architectures and evolutionary processes for the design and optimisation of technical systems. One example for the leading-edge fundamental research activities at HONDA is the development of the walking humanoid robots P3 (see http://www.honda-p3.com/) and Asimo. Our research centre is located in Offenbach/Main, next to Frankfurt/Main, the financial capital of Europe with its international flair. Candidates will work in a high-class team conducting leading-edge research in computational intelligence. As part of a national research project on learning systems, we are looking for two researchers to strengthen our young and dynamic team. The positions will focus on behaviour-based learning and evolutionary principles for structure adaptation with applications to robotics, man-machine interaction, and computer vision. Due to the nature of the project the positions are initially limited to a period of 3 years. The candidates will be supported to do a doctorate degree. Applicants should have a Diploma or Master degree in physics, computer science or electrical engineering and enjoy to work in a multidisciplinary research project in an international team. 
The candidates should be fluent in English or German. Please send your application to: Prof. Dr.-Ing habil. Edgar K?rner Future Technology Research Division HONDA R&D EUROPE (Deutschland) GmbH Carl-Legien-Strasse 30 63073 Offenbach/Main GERMANY Telephone : ++49 (0)69 890110 email : edgar.koerner at hre-ftr.f.rd.honda.co.jp From rogilmore at psu.edu Wed Nov 29 11:39:59 2000 From: rogilmore at psu.edu (Rick Gilmore) Date: Wed, 29 Nov 2000 11:39:59 -0500 Subject: Graduate Training at Penn State Message-ID: Please circulate the following graduate program announcement. Thanks, Rick Gilmore --------------------------------------------- Graduate Training in Psychobiology, Psychophysiology, and Neuroscience at Penn State The Psychology Department at Penn State invites students with excellent undergraduate academic records and research experience to apply for graduate training in psychobiology, psychophysiology, and neuroscience. Faculty in the department have wide-ranging interests and offer a variety of personalized training experiences for graduate students. Faculty research specialties include motion sickness, nausea and appetite; affective neuroscience; chronobiology (time analysis of behavior's rhythms); developmental and clinical cognitive neuroscience; attention and information processing; and visual cognition. The department has particular expertise in and excellent facilities for research using psychophysiological (EEG, ERP, EGG, EKG) and behavioral measures. Penn State's Psychology Department offers an exceptionally congenial and collaborative environment for graduate students who are seeking individualized and hands-on training. Incoming students are guaranteed a competitive multi-year financial package. The Department has collaborative relationships with other neuroscience researchers who are part of the university-wide Life Sciences Consortium based at University Park and at the College of Medicine in Hershey. In addition, the Department expects to add a senior neuroscientist to the faculty by Fall 2001. Penn State is located in an area known for its affordability and exceptional quality of life. For more information about graduate training at Penn State, visit the Department's web site (http://psych.la.psu.edu) or contact one of the faculty members directly. 
Frederick Brown, Chronobiology (time analysis of behavior's rhythms) f3b at psu.edu, http://psych.la.psu.edu/faculty/brown.htm Rick Gilmore, Perceptual development, developmental cognitive neuroscience rogilmore at psu.edu, http://psych.la.psu.edu/faculty/gilmore.htm Cathleen Moore, Visual cognition cmm15 at psu.edu, http://psych.la.psu.edu/faculty/moore.htm Toby Mordkoff, Attention and information processing jtm12 at psu.edu, http://psych.la.psu.edu/faculty/mordkoff.htm Karen Quigley, Affective neuroscience ksq1 at psu.edu, http://psych.la.psu.edu/faculty/quigley.htm William Ray, Clinical cognitive neuroscience wjr at psu.edu, http://psych.la.psu.edu/faculty/ray.htm Robert Stern, Motion sickness, nausea, and appetite rs3 at psu.edu, http://psych.la.psu.edu/faculty/stern.htm From ken at phy.ucsf.edu Thu Nov 30 01:00:06 2000 From: ken at phy.ucsf.edu (Ken Miller) Date: Wed, 29 Nov 2000 22:00:06 -0800 (PST) Subject: UCSF Postdoctoral/Graduate Fellowships in Theoretical Neurobiology Message-ID: <14885.60646.421523.8769@coltrane.ucsf.edu> FULL INFO: http://www.sloan.ucsf.edu/sloan/sloan-info.html PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR CONTACT ADDRESSES GIVEN BELOW The Sloan Center for Theoretical Neurobiology at UCSF solicits applications for pre- and post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in mathematics, theoretical or experimental physics, or computer science, and commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. The Sloan Center offers opportunities to combine theoretical and experimental approaches to understanding the operation of the intact brain. Young scientists with strong theoretical backgrounds will receive scientific training in experimental approaches to understanding the operation of the intact brain. They will learn to integrate their theoretical abilities with these experimental approaches to form a mature research program in integrative neuroscience. The research undertaken by the trainees may be theoretical, experimental, or a combination. 
Resident Faculty of the Sloan Center and their research interests include: Herwig Baier: Genetic analysis of the visual system William Bialek (25\% time): Information-theoretic and statistical characterization of, and physical limits to, neural coding and representation Allison Doupe: Development of song recognition and production in songbirds Stephen Lisberger: Learning and memory in a simple motor reflex, the vestibulo-ocular reflex, and visual guidance of smooth pursuit eye movements by the cerebral cortex Michael Merzenich: Experience-dependent plasticity underlying learning in the adult cerebral cortex, and the neurological bases of learning disabilities in children Kenneth Miller: Circuitry of the cerebral cortex: its structure, self-organization, and computational function (primarily using cat primary visual cortex as a model system) Philip Sabes: Sensorimotor coordination, adaptation and development of spatially guided behaviors, experience dependent cortical plasticity Christoph Schreiner: Cortical mechanisms of perception of complex sounds such as speech in adults, and plasticity of speech recognition in children and adults Michael Stryker: Mechanisms that guide development of the visual cortex There are also a number of visiting faculty, including Larry Abbott, Brandeis University; Sebastian Seung, MIT; David Sparks, Baylor University; Steve Zucker, Yale University. TO APPLY, please send a curriculum vitae, a statement of previous research and research goals, up to three relevant publications, and have two letters of recommendation sent to us. The application deadline is February 1, 2000. Send applications to: Sloan Center 2001 Admissions Sloan Center for Theoretical Neurobiology at UCSF Department of Physiology University of California 513 Parnassus Ave. San Francisco, CA 94143-0444 PRE-DOCTORAL applicants with strong theoretical training may seek admission into the UCSF Neuroscience Graduate Program as a first-year student. Applicants seeking such admission must apply by Jan. 5, 2000 to be considered for fall, 2000 admission. Application materials for the UCSF Neuroscience Program may be obtained from http://www.neuroscience.ucsf.edu/neuroscience/admission.html or from Pat Vietch Neuroscience Graduate Program Department of Physiology University of California San Francisco San Francisco, CA 94143-0444 neuroscience at phy.ucsf.edu Be sure to include your surface-mail address. The procedure is: make a normal application to the UCSF Neuroscience program; but also alert the Sloan Center of your application, by writing to Steve Lisberger at the address given above. If you need more information: -- Consult the Sloan Center WWW Home Page: http://www.sloan.ucsf.edu/sloan -- Send e-mail to sloan-info at phy.ucsf.edu -- See also the home page for the W.M. Keck Foundation Center for Integrative Neuroscience, in which the Sloan Center is housed: http://www.keck.ucsf.edu/ From henkel at physik.uni-bremen.de Thu Nov 30 03:33:48 2000 From: henkel at physik.uni-bremen.de (Rolf D. Henkel) Date: Thu, 30 Nov 2000 09:33:48 +0100 Subject: Follow-Up on TR "Sync & Coherence-Detection" Message-ID: <00113009425000.16709@axon> Dear Connectionists, some people had difficulties accessing the technical report Title: Synchronization, Coherence-Detection and Three-Dimensional Vision Author: Rolf D. Henkel, Institute for Theoretical Neurophysics Keywords: Synchronization, Coherence, Neural Code, Neural Computations, Robust Estimators, Three-dimensional Vision, Integrate-And-Fire-Neurons. 
in which a new operational mode for networks of spiking neurons is proposed. A more easily accessible webpage is now available at http://axon.physik.uni-bremen.de/~rdh/research/stereo/spiking/ Also, the principal idea of coherence detection can be tested online, with your own data, at http://axon.physik.uni-bremen.de/~rdh/online_calc/stereo/ Regards, Rolf Henkel -- Dr. Rolf Henkel Institute for Theoretical Neurophysics University Bremen Kufsteiner Straße 1, D-28359 Bremen Phone: +49-421-218-3688 henkel at physik.uni-bremen.de Fax: +49-421-218-4869 http://axon.physik.uni-bremen.de/
From garionis at luna.cs.uni-dortmund.de Fri Nov 3 06:11:18 2000 From: garionis at luna.cs.uni-dortmund.de (Ralf Garionis) Date: Fri, 03 Nov 2000 12:11:18 +0100 Subject: Professorship in Computer Science Message-ID: <200011031111.MAA17278@luna.cs.uni-dortmund.de> Readers of this list may be interested in the following post: The Department of Computer Science at the University of Dortmund seeks applicants for a professorship position (C3). Candidates should have a strong background in Neural Computation or Fuzzy Logic for conducting research and teaching in the specific area. Please see a copy of the official announcement http://dekanat.cs.uni-dortmund.de/JobMarkt/Professoren/C3Professur.jpg for further details (this page is available in German only). Informal enquiries may be made to Mr Decker, tel. +49-(0)231-755-2121, email decker at dek.cs.uni-dortmund.de. Closing date: Thursday 30 November 2000. From D.Willshaw at cns.ed.ac.uk Fri Nov 3 12:14:38 2000 From: D.Willshaw at cns.ed.ac.uk (David Willshaw) Date: Fri, 3 Nov 2000 17:14:38 +0000 (GMT) Subject: Electronic access to NETWORK: Computation in Neural Systems Message-ID: <14850.62078.170899.871274@gargle.gargle.HOWL> IOPP, publishers of NETWORK, of which I am Editor-in-Chief, have made their journals freely accessible electronically until December 22, 2000. For more details of this service, see http://www.iop.org/Physics/News/0259j I would like to invite you to use this opportunity to browse through the 11 volumes of NETWORK. For those of you who don't know, NETWORK is in its eleventh year of publication. Originally a journal publishing papers in both Neural Networks and Computational Neuroscience, NETWORK changed its focus last year to concentrate on all aspects of Computational Neuroscience. Regards, David Willshaw ------------------------------------------------------- Professor David Willshaw Editor-in-Chief, NETWORK: Computation in Neural Systems Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh 5 Forrest Hill Edinburgh EH1 2QL Scotland, UK Tel: (+44) 131 650 4404/5 Fax: (+44) 131 650 4406 Email: neted at anc.ed.ac.uk ------------------------------------------------------- From terry at salk.edu Mon Nov 6 00:45:47 2000 From: terry at salk.edu (Terry Sejnowski) Date: Sun, 5 Nov 2000 21:45:47 -0800 (PST) Subject: Computational Neuroscience 2000 In-Reply-To: <14850.62078.170899.871274@gargle.gargle.HOWL> Message-ID: <200011060545.eA65jlc69337@hamlet.salk.edu> The following special supplement to Nature Neuroscience is available free at: http://www.nature.com/neuro/journal/v3/n11s/index.html Terry ----- Nature Neuroscience November 2000 Volume 3 Number Supp pp 1160 - 1211 Computational approaches to brain function p 1160 Charles Jennings Ph.D. Editor & Sandra Aamodt Ph.D. Senior Editor Computational neuroscience at the NIH pp 1161 - 1164 History The Hodgkin-Huxley theory of the action potential p 1165 Michael Heusser Half a century of Hebb p 1166 H.
Sebastian Seung The basic unit of computation p 1167 Anthony Zador Models of motion detection p 1168 Alexander Borst The Pope and grandmother's frog's-eye view of theory p 1169 Kevan Martin Computation by neural networks p 1170 Geoffrey Hinton Reviews The role of single neurons in information processing pp 1171 - 1177 Christof Koch & Idan Segev Synaptic plasticity: taming the beast pp 1178 - 1183 L. Abbott & Sacha Nelson Neurocomputational models of working memory pp 1184 - 1191 Daniel Durstewitz, Jeremy Seamans & Terrence Sejnowski Computational approaches to sensorimotor transformations pp 1192 - 1198 Alexandre Pouget & Lawrence Snyder Models of object recognition pp 1199 - 1204 Maximilian Riesenhuber & Tomaso Poggio Computer simulation of cerebellar information processing pp 1205 - 1211 Javier Medina & Michael Mauk Computational principles of movement neuroscience pp 1212 - 1217 Daniel Wolpert & Zoubin Ghahramani Learning and selective attention pp 1218 - 1223 Peter Dayan, Sham Kakade & P. Read Montague Viewpoints Models are common; good theories are scarce p 1177 Charles Stevens In the brain, the model is the goal p 1183 Bartlett Mel Facilitating the science in computational neuroscience p 1191 Lyle Borg-Graham Models identify hidden assumptions p 1198 Eve Marder On theorists and data in computational neuroscience p 1204 J. Hopfield What does 'understanding' mean? p 1211 Gilles Laurent ----- From Nigel.Goddard at ed.ac.uk Tue Nov 7 08:42:00 2000 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Tue, 07 Nov 2000 13:42:00 +0000 Subject: Position in Research and System Support Message-ID: <3A0806A8.B45CBB9A@ed.ac.uk> Research and System Support Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh This is an outstanding opportunity for someone with system administration skills to be involved in exciting research projects related to brain function and neural computation, some using the most advanced high-performance computers. We are seeking an individual to administer a research network and to engage in a variety of research projects including computational modeling of brain function, functional MRI studies, probabilistic data modeling, and neuroinformatics. Effort is to be split about equally between system support and research projects. Please see http://www.personnel.ed.ac.uk/VACS/vac2.htm#job5, Post E for full details, and note the early deadline for applications. Informal enquiries about this position can be made to Andrew Gillies +44 (0)131 650 3096 -- ========================================================= Dr. Nigel Goddard Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh 5 Forrest Hill Edinburgh EH1 2QL Scotland Telephone: +44 (0)131 650 3087 Mobile: +44 (0)787 967 1811 email: Nigel.Goddard at ed.ac.uk web: http://anc.ed.ac.uk/~ngoddard FAX(UK) : +44 (0)870 063 3111 or +44 (0)870 130 5014 FAX(USA): +1 603 698 5854 Calendar: http://calendar.yahoo.com/public/nigel_goddard ========================================================= From d.mareschal at bbk.ac.uk Thu Nov 9 06:46:32 2000 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Thu, 9 Nov 2000 12:46:32 +0100 Subject: postdoctoral positions available Message-ID: Dear all, I would appreciate it if you could bring these positions to the attention of any interested people.
They are part of a project exploring perceptual and cognitive development that aims to link behavioural experimental work with neural network modelling as tightly as possible. Although one position is experimental and the other is computational, the ideal candidate would have interests in both approaches to studying perception and cognition. Many thanks, Denis Mareschal *************** insert text *************** Postdoctoral Research Assistants in Psychology/Cognitive Science (Two-Year Fixed Term) We are seeking two researchers to work with Dr Denis Mareschal within the School of Psychology and the Centre for Brain and Cognitive Development. The posts are tenable from 1 February, 2001 or as soon as possible thereafter. 1. Infant Behavioural Testing (Ref: APS346) The post will consist mainly of testing categorisation in infants. You will use both visual preference and manipulation methodologies to assess categorisation in infants from age 3 to 24 months. Applicants should have a PhD, preferably in cognitive psychology or developmental psychology. 2. Connectionist Modelling (Ref: APS343) The post will consist mainly of applying connectionist/neural network techniques. You will help implement and design models of memory and categorisation in infancy. It is expected that applicants with advanced technical skills would have ample time to develop their own research programmes. Applicants should have a PhD, preferably in cognitive psychology, developmental psychology, AI, or Neural Computation. Preliminary details of these positions can be obtained by following links from my web page http://www.psyc.bbk.ac.uk/staff/dm.html. Final details can be obtained, in due course, from the personnel department at Birkbeck College either through the web (http://www.bbk.ac.uk) or by sending an A4 sae quoting the reference number to the Personnel Department, Birkbeck, Malet Street, Bloomsbury, London WC1E 7HX. Closing date: 14 December 2000 Informal enquiries for both positions can be directed to d.mareschal at bbk.ac.uk ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 020 7631-6582/6207 fax +44 020 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From cierina at vis.caltech.edu Mon Nov 6 17:45:54 2000 From: cierina at vis.caltech.edu (Cierina Reyes) Date: Mon, 06 Nov 2000 14:45:54 -0800 Subject: Announcement - Postdoctoral Position Message-ID: <5.0.0.25.0.20001103122118.00ab7160@vis.caltech.edu> THEORETICAL/COMPUTATIONAL POSTDOCTORAL FELLOWSHIP IN NEUROSCIENCE: Applications are invited for a postdoctoral research position available immediately jointly between the laboratories of P. Mitra at Bell Laboratories (Murray Hill, New Jersey) and Prof. R. Andersen at Caltech (Pasadena, California). The research projects will involve building a real-time system to transform measured in vivo neural signals into high-level control signals for driving prosthetic limbs and the associated data analysis and algorithmic development. The successful applicant should have a Ph.D. in physics, mathematics, electrical engineering, applied mathematics, or an equivalent theoretical sciences background. Programming or electronics experience preferred. The geographical location (Caltech/Bell Labs) is flexible and will depend on the candidate's expertise.
Applications should include a curriculum vitae and two letters of recommendation. This material should be sent to: Ms. Cierina Reyes, California Institute of Technology, MC 216-76, 1201 E. California Blvd., Pasadena, CA 91125. Caltech is an Equal Opportunity/Affirmative Action Employer. Women, minorities, veterans, and disabled persons are encouraged to apply. From stork at rsv.ricoh.com Tue Nov 7 01:36:56 2000 From: stork at rsv.ricoh.com (stork) Date: Mon, 06 Nov 2000 22:36:56 -0800 Subject: New book: Pattern Classification Message-ID: <3A07A301.4DC0F2F1@rsv.ricoh.com> Announcing a new book: Pattern Classification (2nd ed.) by R. O. Duda, P. E. Hart and D. G. Stork 654 pages, two-color printing (John Wiley and Sons, 2001) ISBN: 0-471-05669-3 This is a significant revision and expansion of the first half of Pattern Classification & Scene Analysis, R. O. Duda and P. E. Hart's influential 1973 book. The current book can serve as a textbook for a one- or two-semester graduate course in pattern recognition, machine learning, data mining and related fields offered in Electrical Engineering, Computer Science, Statistics, Operations Research, Cognitive Science, or Mathematics departments. Established researchers in any domain relying on pattern recognition can use the book as a reference on the foundations of their field. Table of Contents 1) Introduction 2) Bayesian Decision Theory 3) Maximum Likelihood and Bayesian Estimation 4) Nonparametric Techniques 5) Linear Discriminant Functions 6) Multilayer Neural Networks 7) Stochastic Methods 8) Nonmetric Methods 9) Algorithm-Independent Machine Learning 10) Unsupervised Learning and Clustering Mathematical Appendix Goals * Authoritative: The presentations are based on the best research and rigorous fundamental theory underlying proven techniques. * Complete: Every major topic in statistical, neural network and syntactic pattern recognition is presented, including all the topics that should be in the "toolbox" of designers of practical pattern recognition systems. * Up-to-date: The book includes the most recent proven techniques and developments in the theory of pattern recognition. * Clear: Every effort has been made to ensure that the text is clearly written and will not be misinterpreted. The manuscript was tested in over 100 courses worldwide, and numerous suggestions from students, teachers and established researchers have been incorporated. Every attempt has been made to give the deepest explanation, providing insight and understanding rather than a laundry list of techniques. * Logically organized: The book is organized so as to build upon concepts and techniques from previous chapters, so as to speed the learning of the material. * Problem motivated, not technique motivated: Some books focus on a particular technique or method, for instance neural nets. The drawback of such books is that they highlight the particular technique, often at the expense of other techniques. Readers are left wondering how the particular highlighted technique compares with others, and especially how to decide which technique is appropriate for which particular problem. Pattern Classification instead assumes that practitioners start with a problem or class of problems and seek a solution, using whichever technique is most appropriate.
There are many pattern recognition problems for which neural networks (for instance) are ill-suited, and readers of alternative texts that focus on neural networks alone may be misled and believe neural networks are applicable to their problem. As the old saying goes, "if you're a hammer, every problem looks like a nail." Pattern Classification rather seeks to be a balanced and complete toolbox -- plus instructions on how to choose the right tool for the right job. * Long-lived: Every effort has been made to ensure the book will be useful for a long time, much as the first edition remained useful for over a quarter of a century. For instance, even if a technique has vocal proponents, if that technique has not found genuine use in a challenging problem domain, it is not discussed in depth in the book. Further, the notation and terminology are consistent and standardized as generally accepted in the field. New topics * Neural Networks, including Hessians and second-order training and pruning techniques, popular heuristics for training and initializing parameters, and recurrent networks. * Stochastic methods, including simulated annealing, genetic algorithms, Boltzmann learning, and Gibbs sampling. * Nonmetric methods, including tree classifiers such as CART, ID3 and their descendants, string matching, grammatical methods and rule learning. * Theory of learning, including the No Free Lunch theorems, Minimum Description Length (MDL) principle, Occam's principle, bias-variance in regression and classification, jackknife and bootstrap estimation, Bayesian model comparison and MLII, multi-classifier systems and resampling techniques such as boosting, bagging and cross validation. * Support Vector Machines, including the relationship between "primal" and "dual" representations. * Competitive learning and related methods, including Adaptive Resonance Theory (ART) networks and their relation to leader-follower clustering. * Self-organizing feature maps, including maps affected by the sampling density. New/improved features and resources * Solution Manual: A solution manual is available for faculty adopting the text. * New and redrawn figures: Every figure is carefully drawn (and all figures from the 1st edition have been updated and redrawn) using modern 3D graphics and plotting programs, all in order to illuminate ideas in a richer and more memorable way. Some (e.g., 3D Voronoi tessellations and novel renderings of stochastic search) appear in no other pattern recognition books and provide new insight into mathematical issues. A complete set of figures is available for non-commercial purposes from http://www.wiley.com/products/subject/engineering/electrical/software_supplem_elec_eng.html and ftp://ftp.wiley.com/public/sci_tech_med/pattern. * Two-color printing in figures and text: The use of red and black throughout allows more information to be conveyed in the figures, where color can for instance indicate different categories, or different classes of solution, or stages in the development of solutions. * Pseudocode: Key algorithms are illustrated in language-independent pseudocode. Thus students can implement the algorithms in their favorite computer language. * Worked Examples: Several techniques are illustrated with worked examples, using data sets simple enough that students can readily follow the technical details. Such worked examples are particularly helpful to students tackling homework problems.
* Extensive Bibliographies: Each chapter contains an extensive and up-to-date bibliography with detailed citation information, including the full names (first name and surname) of every author. * Chapter Summaries: Each chapter ends with a summary highlighting key points and terms. Such summaries reinforce the presentation in the text and facilitate rapid review of the material. * Homeworks: There are 380 homework problems, each keyed to its corresponding section in the text. * Computer Exercises: There are 102 language-independent computer exercises, each keyed to a corresponding section and in many cases also to explicit pseudocode in the text. * Starred sections: Some sections are starred to indicate that they may be skipped on a first reading, or in a one-semester course. * Key words listed in margins: Key words and topics are listed in the margins where they first appear, to highlight new terms and to speed subsequent search and retrieval of relevant information. From S.Singh at exeter.ac.uk Tue Nov 7 10:25:56 2000 From: S.Singh at exeter.ac.uk (S.Singh@exeter.ac.uk) Date: Tue, 7 Nov 2000 15:25:56 +0000 (GMT Standard Time) Subject: PhD positions available Message-ID: UNIVERSITY OF EXETER, UK Department of Computer Science We now invite applications for the following two PhD studentships within our department. PhD Studentship "Adaptive and Autonomous Image Understanding Systems" Deadline for Application: 20th November, 2000 The project will explore intelligent algorithms for adaptive image understanding including context-based reasoning. Generic methodology will be developed based on image processing and pattern recognition methods to allow automatic processing of scene analysis data using video sequences. The goal of the project is to develop a seamless system that continuously evolves in real-time with video images. PhD Studentship "Ultrasound based object recognition and integration with image analysis" Deadline for Application: 20th November, 2000 Only recently has it been demonstrated that ultrasound can be used to detect objects in controlled environments. It has been used for face recognition and obstacle avoidance. In this project we develop techniques for ultrasound-based object recognition and couple this ability with image processing on a mobile platform (robot). The ultrasound method will provide a cheaper, low-resolution front end, triggering the image processing component for more detailed analysis. The research will be tested in indoor and outdoor environments. For both studentships, it is expected that prospective applicants have a good mathematical and analytical background, with programming experience in C/C++ and Unix/Linux operating systems. Applicants need not necessarily have prior knowledge in these areas but should have at least an upper second class first degree and where possible a good Masters degree in computer science. Applicants should send a letter of application and CV including the names and addresses of two referees to Dr. Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK (email S.Singh at exeter.ac.uk). Applicants should ask their referees to send their references directly to the above address. Informal enquiries can be made on 01392-264053; further details of the Department's work may be found on the web at http://www.dcs.ex.ac.uk/research/pann. The studentships cover UK/EU fees and maintenance (currently £6,620 pa) for up to three years.
The successful candidates should expect to take up the studentship no later than 1 December, 2000 or as soon as possible when negotiated with the department. ------------------------------------------- Sameer Singh Department of Computer Science University of Exeter Exeter EX4 4PT United Kingdom Tel: +44-1392-264053 Fax: +44-1392-264067 e-mail: s.singh at ex.ac.uk WEB: http://www.ex.ac.uk/academics/sameer/index.htm ------------------------------------------- From lpaulo at unirpnet.com.br Sat Nov 11 02:19:12 2000 From: lpaulo at unirpnet.com.br (Luis Paulo) Date: Sat, 11 Nov 2000 12:49:12 +0530 Subject: Invited Session Message-ID: <003201c04baf$b17579c0$2f60e6c8@lpaulo.www.unirpnet.com.br> An invited session called "Genetic Algorithms, Neural Networks and Applications" has been organized for The Fifth Multi-Conference on Systemics, Cybernetics and Informatics, which will be held in Orlando, Florida, USA, from July 22 - 25, 2001. This session intends to discuss prediction of molecular structure, cybernetics, macromolecular conformational search, prediction, and pattern recognition. Those interested in participating should send a paper (Word format) to lpaulo at df.ibilce.unesp.br Best Regards Scott, Luis Paulo Department of Physics IBILCE - UNESP Brasil From sml at essex.ac.uk Mon Nov 13 09:48:28 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Mon, 13 Nov 2000 14:48:28 -0000 Subject: OCR Competition Announcement Message-ID: <6A8CC2D6487ED411A39F00D0B7847B66E77EF1@sernt14.essex.ac.uk> Dear All, We are now inviting entries for our OCR competition sponsored by the UK Post Office. For more details go to http://algoval.essex.ac.uk and follow the link to OCR Competition. This competition is challenging in several ways: 1. The PO Digits dataset (on which accuracy is judged) is sparse and quite variable. 2. The algorithm implementation plus any data it needs to store its trained state must occupy less than 50,000 bytes. 3. There are many other comparison criteria (see rules for more details) that apply in order to separate algorithms that have test set accuracy that is not statistically separable. 4. There are strict limits on other performance aspects such as training and recognition time - see rules for more details. Enjoy! Simon Lucas ps. Note that at present algorithms must be implemented in Java. ------------------------------------------------ Dr. Simon Lucas Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom Email: sml at essex.ac.uk http://algoval.essex.ac.uk ------------------------------------------------- From taketani at med64.com Mon Nov 13 16:14:45 2000 From: taketani at med64.com (Makoto Taketani) Date: Mon, 13 Nov 2000 13:14:45 -0800 Subject: J Neurosci published a paper on hippocampal beta rhythm studied by array electrode In-Reply-To: Message-ID: Hi All, I forgot to mention that we will be happy to send the reprint to people who ask for it, when the reprint is available in mid December. Thank you. -makoto -----Original Message----- From: owner-mea-users at its.caltech.edu [mailto:owner-mea-users at its.caltech.edu]On Behalf Of Makoto Taketani Sent: Saturday, November 11, 2000 11:09 PM To: mea-users at cco.caltech.edu; Connectionists at cs.cmu.edu; cneuro at bbb.caltech.edu Subject: MEA: J Neurosci published a paper on hippocampal beta rhythm studied by array electrode The following recent paper may be of interest to those on this list interested in new methods to study in-vitro network operations.
The Journal of Neuroscience, November 15, 2000, 20(22):8462-8473 Origins and Distribution of Cholinergically Induced Beta Rhythms in Hippocampal Slices. Ken Shimono, Fernando Brucher, Richard Granger, Gary Lynch, and Makoto Taketani Regional variations and substrates of high-frequency rhythmic activity induced by cholinergic stimulation were studied in hippocampal slices with 64-electrode recording arrays. (1) Carbachol triggered beta waves (17.6 +/- 5.7 Hz) in pyramidal regions of 75% of the slices. (2) The waves had phase shifts across the cell body layers and were substantially larger in the apical dendrites than in cell body layers or basal dendrites. (3) Continuous, two-dimensional current source density analyses indicated apical sinks associated with basal sources, lasting approximately 10 msec, followed by apical sources and basal sinks, lasting approximately 20 msec, in a repeating pattern with a period in the range of 15-25 Hz. (4) Carbachol-induced beta waves in the hippocampus were accompanied by 40 Hz (gamma) oscillations in deep layers of the entorhinal cortex. (5) Cholinergically elicited beta and gamma rhythms were eliminated by antagonists of either AMPA or GABA receptors. Benzodiazepines markedly enhanced beta activity and sometimes introduced a distinct gamma frequency peak. (6) Twenty Hertz activity after orthodromic activation of field CA3 was distributed in the same manner as carbachol-induced beta waves and was generated by a current source in the apical dendrites of CA3. This source was eliminated by high concentrations of GABA(A) receptor blockers. It is concluded that cholinergically driven beta rhythms arise independently in hippocampal subfields from oscillatory circuits involving (1) bursts of pyramidal cell discharges, (2) activation of a subset of feedback interneurons that project apically, and (3) production of a GABA(A)-mediated hyperpolarization in the outer portions of the apical dendrites of pyramidal neurons. SFN members can download the full article from http://www.jneurosci.org/cgi/content/abstract/20/22/8462 The movie showing current source density of beta rhythms can be downloaded from http://www.med64.com/publications.htm ------------------------------------------------------- Makoto Taketani, Ph.D. Technology Development Center Matsushita Electric Corporation of America Irvine, CA Net: taketani at med64.com http://www.med64.com ------------------------------------------------------- From: esann To: "Connectionists at cs.cmu.edu" References: From bogus@does.not.exist.com Tue Nov 14 03:33:04 2000 From: bogus@does.not.exist.com () Date: Tue, 14 Nov 2000 09:33:04 +0100 Subject: ESANN'2001 - special sessions Message-ID: ESANN'2001 - 9th European Symposium on Artificial Neural Networks - Bruges (Belgium), April 25-26-27, 2001 - Call for papers: special sessions (deadline for submissions: 8 December 2000) The ESANN'2001 conference will include six special sessions organized by renowned scientists in their respective fields: - Neural networks in finance - Artificial neural networks and early vision processing - Artificial neural networks for Web computing - Dedicated hardware implementations: perspectives on systems and applications - Novel neural transfer functions - Neural networks and evolutionary/genetic algorithms - hybrid approaches You will find below a short description of these special sessions.
Contributions to these special sessions are welcome, as on any other topic covered by the ESANN conference. For other details on topics, submission procedure, etc., please consult the Web pages of the conference (http://www.dice.ucl.ac.be/esann). The ESANN'2001 conference is technically co-sponsored by the IEEE Neural Networks Council (TBC), the IEEE Region 8, the IEEE Benelux Section, and the International Neural Networks Society. Neural networks in finance -------------------------- Organised by M. Cottrell, Univ. Paris I (France), E. de Bodt, Univ. Lille II (France) & UCL Louvain-la-Neuve (Belgium) For a long time, statistical and econometric tools have been applied to data from financial markets. The economic importance of this work needs little emphasis: being able to forecast a stock index, even over a very short horizon, would obviously be of great value to professionals. This work has raised many questions. From morelock at Princeton.EDU Tue Nov 14 14:26:34 2000 From: morelock at Princeton.EDU (Wendy Morelock) Date: Tue, 14 Nov 2000 14:26:34 -0500 Subject: PostDoc available at Princeton Message-ID: <3A1191E9.81D89CAD@princeton.edu> POSTDOCTORAL POSITION AVAILABLE IN CONNECTIONIST/NEURAL NETWORK MODELING: Applications are invited for a Postdoctoral Fellowship in the newly established Silvio O. Conte Center for Neuroscience Research on the Cognitive and Neural Mechanisms of Conflict and Control, within the Center for Brain, Mind and Behavior at Princeton University. The position is available from December 1, 2000 for a renewable, one-year appointment. The Conte Center encompasses six projects, five experimental and one focussed on theory and mathematical and computer modelling. The individual will work with Jonathan Cohen (Psychology), Philip P. Holmes (Applied Mathematics) and John Hopfield (Molecular Biology), who collaborate on mathematical modelling projects to develop, analyse, and test neural network (connectionist) models of decision-making, perceptual choice, memory recall, and attention, focussing on the relationship of conflict detection to control. The theoretical work will be conducted in close collaboration with behavioral and fMRI imaging experiments. A Ph.D. with experience in nonlinear dynamics and/or neural network modeling is required; some familiarity with models in neurobiology and/or experimental methods in psychology is also desirable. Further information about resources and affiliated faculty at the Center for Brain, Mind and Behavior is available at: http://www.csbmb.princeton.edu. Resumes or inquiries can be directed to Jonathan D. Cohen at jdc at princeton.edu. We will begin reviewing applications as they are received, continuing until the position is filled. Salary and rank are commensurate with experience.
PU/EO/AAE -- Wendy Morelock Center Manager Center for the Study of Brain, Mind and Behavior morelock at princeton.edu (609) 258-0613 (609) 258-2574 fax From terry at salk.edu Wed Nov 15 01:54:52 2000 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 14 Nov 2000 22:54:52 -0800 (PST) Subject: Computational Neurobiology Graduate Training at UCSD Message-ID: <200011150654.eAF6sqp01254@purkinje.salk.edu> DEADLINE: JANUARY 6, 2001 COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM Department of Biology - University of California, San Diego http://www.biology.ucsd.edu/compneuro/ The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Financial support for students enrolled in this training program is available through a new NSF Integrative Graduate Education and Research Training (IGERT) award to UCSD. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. The three major themes in the training program are: 1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons, using modern neuroanatomical, neuropharmacological and electrophysiological techniques; lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI). 2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA). 3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs. A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control. Participating Faculty include: * Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Systems at UCSD. * Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology. * Darwin Berg (Biology): Regulation of synaptic components, assembly and localization, function and long-term stability. Former Chairman of Biology. * Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms. * Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research. * Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation.
* Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels. * David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity. Co-director, Analysis of Neural Data Workshop (MBL). * William Kristan (Biology): Computational Neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion. Director, Neurosciences Graduate Program at UCSD. * Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies. * Javier Movellan (Cognitive Science): Sensory fusion and learning algorithms for continuous stochastic systems. * Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects. * Terrence Sejnowski (Salk Institute/Biology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms. Director, Institute for Neural Computation. * Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques. * Nicholas Spitzer (Biology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of the Neurobiology Section in Biology. * Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons. * Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters. * Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior. * Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms. Requests for application materials should be sent to the Graduate Admissions Office, Division of Biology 0348, 9500 Gilman Drive, UCSD, La Jolla, CA, 92093-0348 [gradprog at biology.ucsd.edu]. The deadline for completed application materials, including letters of reference, is January 6, 2001. More information about applying to the UCSD Biology Graduate Program is available at http://www-biology.ucsd.edu/sa/Admissions.html. The Division of Biology home page is located at http://www-biology.ucsd.edu/. From paul at arti.vub.ac.be Wed Nov 15 07:59:40 2000 From: paul at arti.vub.ac.be (Paul Vogt) Date: Wed, 15 Nov 2000 13:59:40 +0100 Subject: PhD Thesis available: Lexicon Grounding on Mobile Robots Message-ID: <3A1288BB.C637B538@arti.vub.ac.be> Dear colleagues, I am pleased to announce that my PhD thesis, titled 'Lexicon Grounding on Mobile Robots' is now available on the web: http://arti.vub.ac.be/~paul/thesis.html Abstract: The thesis presents research that investigates how two mobile robots can develop a shared lexicon from scratch whose meaning is grounded in the real world. It is shown how the robots can solve the symbol grounding problem in a particular experimental setup. The model by which the robots do so is explained in detail. The experimental results are presented and discussed.
Long abstract: http://arti.vub.ac.be/~paul/abstract.html Best regards, Paul Vogt -- Paul Vogt tel: +32 2 629 37 05 VUB AI Lab fax: +32 2 629 37 29 Brussels URL: http://arti.vub.ac.be/~paul From mm at santafe.edu Wed Nov 15 16:43:54 2000 From: mm at santafe.edu (Melanie Mitchell) Date: Wed, 15 Nov 2000 14:43:54 -0700 (MST) Subject: 2001 Complex Systems Summer Schools Message-ID: <14867.922.687501.713279@aztec.santafe.edu> SANTA FE INSTITUTE Complex Systems Summer Schools Summer, 2001 SANTA FE SCHOOL: June 10 to July 7, 2001 in Santa Fe, New Mexico. Held on the campus of St. John's College in Santa Fe. Administered by the Santa Fe Institute. BUDAPEST SCHOOL: July 16 to August 10, 2001 in Budapest, Hungary. Held on the campus of Central European University in Budapest. Administered by Central European University and the Santa Fe Institute. GENERAL DESCRIPTION: An intensive introduction to complex behavior in mathematical, physical, living, and social systems for graduate students and postdoctoral fellows in the sciences and social sciences. Open to students in all countries. Students are expected to choose one school and attend the full four weeks. Week 1 will consist of an intensive series of lectures and laboratories introducing foundational ideas and tools of complex systems research. The topics will include nonlinear dynamics and pattern formation, statistical mechanics and stochastic processes, information theory and computation theory, adaptive computation, computer modeling tools, and specific applications of these core topics to various disciplines. Weeks 2 and 3 will consist of lectures and panel discussions on current research in complex systems. The topics this year are: -- Origin and Early Evolution of Life (Santa Fe and Budapest) -- Nonstandard Approaches to Computation (Santa Fe and Budapest) -- Geophysics and Climate Modeling (Santa Fe) -- Self-Organization and Collective Behavior (Budapest) Week 4 will be devoted to completion and presentation of student projects. WHO SHOULD APPLY: Applications are solicited from graduate students and postdoctoral fellows in any discipline, but with some background in science and mathematics at least at the undergraduate level (including calculus and linear algebra). An optional review of relevant mathematics will be given at the beginning of each school. Students may apply to either the Santa Fe School or the Budapest School, regardless of home country. COSTS: -- Santa Fe School: No tuition is charged. 100% of housing costs are provided for graduate students and 50% for postdoctoral fellows. (The remaining 50% is $700 for the four week school). Most students will provide their own travel funding. Some travel scholarships may be available, depending on need. -- Budapest School: No tuition is charged. 100% of housing costs are provided for all students. Some travel scholarships will be available, depending on need. HOUSING: Housing at both schools will be in single dormitory rooms, some with shared bathrooms. Telephone and computer network connectors will be available. For students with accompanying families, some family housing will be available. Travel support for families is not available. APPLICATION INSTRUCTIONS: Provide a current resume with publications list (if any), statement of current research interests, comments about why you want to attend the school, and two letters of recommendation from scientists who know your work. Include your e-mail address and fax number. 
Specify which school you want to attend (or which is your first choice if you are willing to attend either). Specify in your cover letter whether you wish to apply for a travel scholarship. (This will not affect our decision on your application.) Send only complete application packages by postal mail to: Summer Schools Santa Fe Institute 1399 Hyde Park Road Santa Fe, NM 87501 APPLICATION DEADLINE: February 5, 2001 Women, minorities, and students from developing countries are especially encouraged to apply. Further information at http://www.santafe.edu/sfi/education/indexCSSS.html or summerschool at santafe.edu. ------------------------------------------------------------------ 2001 SUMMER SCHOOL FACULTY Directors: Santa Fe - Ray Goldstein (U. Arizona) and Melanie Mitchell (SFI); Budapest - Imre Kondor (Eotvos Univ.) and Melanie Mitchell (SFI). Partial List of Lecturers: Santa Fe - Elizabeth Bradley (U. Colorado), Thomas Carter (Cal. State U.), Sean Elicker (U. New Mexico), Ray Goldstein (U. Arizona), Thomas Halsey (Exxon Research), Laura Landweber (Princeton), Seth Lloyd (MIT), Melanie Mitchell (SFI), Harold Morowitz (George Mason U.), Cosma Shalizi (SFI), Ken Steiglitz (Princeton), Eors Szathmary (Eotvos Univ.), Koen Visscher (U. Arizona), Lance Williams (U. New Mexico). Budapest - Imre Kondor (Eotvos Univ.), Andras Kroo (Renyi Inst. Math.), Melanie Mitchell (SFI), Cristopher Moore (U. New Mexico), Mark Newman (SFI), Zoltan Racz (Eotvos Univ.), Grzegorz Rozenberg (Leiden U.), Hava Siegleman (Technion), Erik Schultes (MIT), Peter Schuster (Univ. Vienna), Eors Szathmary (Eotvos Univ.), Gabor Vattay (Eotvos Univ.), Tamas Vicsek (Eotvos Univ.), Erik Winfree (MIT). From oreilly at grey.colorado.edu Wed Nov 15 16:57:30 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Wed, 15 Nov 2000 14:57:30 -0700 Subject: Paper on frontal cortex & basal ganglia available Message-ID: <200011152157.OAA29242@grey.colorado.edu> The following technical report is now available for downloading: ftp://grey.colorado.edu/pub/oreilly/papers/frankloughryoreilly00_fcbg_tr.pdf *or* ftp://grey.colorado.edu/pub/oreilly/papers/frankloughryoreilly00_fcbg_tr.ps Interactions Between Frontal Cortex and Basal Ganglia in Working Memory: A Computational Model Michael J. Frank, Bryan Loughry, and Randall C. O'Reilly Department of Psychology University of Colorado at Boulder ICS Technical Report 00-01 Abstract: The frontal cortex and basal ganglia interact via a relatively well-understood and elaborate system of interconnections. In the context of motor function, these interconnections can be understood as disinhibiting or ``releasing the brakes'' on frontal motor action plans --- the basal ganglia detect appropriate contexts for performing motor actions, and enable the frontal cortex to execute such actions at the appropriate time. We build on this idea in the domain of working memory through the use of computational neural network models of this circuit. In our model, the frontal cortex exhibits robust active maintenance, while the basal ganglia contribute a selective, dynamic gating function that enables frontal memory representations to be rapidly updated in a task-relevant manner. We apply the model to a novel version of the continuous performance task (CPT) that requires subroutine-like selective working memory updating, and compare and contrast our model with other existing models and theories of frontal cortex--basal ganglia interactions. - Randy +----------------------------------------------------------------+ | Dr. Randall C.
O'Reilly | | | Assistant Professor | Phone: (303) 492-0054 | | Department of Psychology | Fax: (303) 492-2967 | | Univ. of Colorado Boulder | Home: (303) 448-1810 | | Muenzinger D251C | Cell: (720) 839-7751 | | 345 UCB | email: oreilly at psych.colorado.edu | | Boulder, CO 80309-0345 | www: psych.colorado.edu/~oreilly | +----------------------------------------------------------------+ From john at dcs.rhbnc.ac.uk Thu Nov 16 10:32:40 2000 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Thu, 16 Nov 2000 15:32:40 +0000 (GMT) Subject: Research assistant for intelligent text analysis Message-ID: Royal Holloway, University of London invites expressions of interest for a research assistant position in computer science. The post is a three year position starting from January 1st, 2001 or as soon as possible thereafter. The salary is very competitive and the work will be developing kernel based methods for the analysis of multi-media documents provided by Reuters, who are collaborators on the project. The project is financed by the EU and also involves partners in France (Xerox), Italy (Genova University, Milano University) and Israel (Hebrew University of Jerusalem). We are seeking researchers with experience in corpus based methods of information retrieval and document categorisation, and a strong programming background. Experience with kernel methods is desirable but not required. Please contact John Shawe-Taylor by email at jst at dcs.rhbnc.ac.uk for more information. John Shawe-Taylor and Nello Cristianini will also be available at the NIPS conference to answer any questions and meet potential applicants. ************************************************************** John Shawe-Taylor J.Shawe-Taylor at dcs.rhbnc.ac.uk Dept of Computer Science, Royal Holloway, University of London Phone: +44 1784 443430 Fax: +44 1784 439786 ************************************************************** From ian.cloete at i-u.de Thu Nov 16 09:46:33 2000 From: ian.cloete at i-u.de (Ian Cloete) Date: Thu, 16 Nov 2000 15:46:33 +0100 Subject: AIM Special Issue Message-ID: *************************************************************************** CALL FOR PAPERS --- Please accept our apologies for multiple copies --- --- Please distribute this CFP to your colleagues --- *************************************************************************** ARTIFICIAL INTELLIGENCE in MEDICINE SPECIAL ISSUE on KNOWLEDGE - BASED NEUROCOMPUTING in MEDICINE *************************************************************************** The journal Artificial Intelligence in Medicine (http://www.elsevier.nl/locate/artmed) invites contributions for a Special Issue on Knowledge-Based Neurocomputing in Medicine. Full papers should be sent to the address below and are due by 30 April 2001. The journal's web site contains guidelines for authors. Knowledge-Based Neurocomputing, the topic of a recently published book (http://www.i-u.de/schools/cloete/book.htm) edited by Ian Cloete and Jacek M. Zurada, focuses on methods to encode prior knowledge and to extract, refine, and revise knowledge within a neurocomputing system. The journal calls for papers on knowledge-based artificial neural networks within a medical application area. A paper should address at least the following three components: neurocomputing, processing of knowledge, and a medical application area. 
Of course, state-of-the-art research is required; however, contributions may be either theoretical with a medical line of thought, or medical application-oriented with results and a discussion. Each contribution may be 20 to 25 pages long. Contributors who would like to be considered for invited papers are requested to contact the guest editor, Ian Cloete (ian.cloete at i-u.de), for consideration. Please send an extended abstract of at least 3 A4 pages describing your proposed contribution, preferably in postscript or pdf format by email, or to the address given below, by 31 December 2000. Further information on the call for papers will be posted on the web page http://www.i-u.de/schools/cloete/aimcfp.htm Prof. Dr. Ian Cloete International University in Germany Campus 2 D-76646 Bruchsal Germany Email: ian.cloete at i-u.de ian.cloete at ieee.org Phone: +49-7251-700230 Fax: +49-7251-700250 Web: http://www.i-u.de/schools/cloete/index.htm IU Web: http://www.i-u.de/ From cjlin at csie.ntu.edu.tw Thu Nov 16 20:22:45 2000 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Fri, 17 Nov 2000 09:22:45 +0800 Subject: model selection software for support vector machines Message-ID: Dear Colleagues: We announce the release of the software looms, a leave-one-out model selection software for support vector machines (SVM). Automatic model selection is an important issue in making support vector machines (SVMs) practically useful. Most existing approaches use leave-one-out (loo) related estimators, which are considered computationally expensive. looms uses some numerical tricks which lead to efficient calculation of loo rates of different models. Given a range of parameters, looms automatically returns the parameter and model with the best loo rate. For example, % looms heart_scale Optimal parameter: c=16.000000, gamma=0.016000, rate= 83.704% where c is the penalty parameter (that is, the upper bound in the SVM dual formulation) and gamma is the parameter of the RBF kernel: exp(-gamma*|x_i - x_j|^2). Currently we support only the RBF kernel. The current release (Version 1.0, by Jen-Hao Lee and Chih-Jen Lin) is available from http://www.csie.ntu.edu.tw/~cjlin/looms Details of looms are in the following paper: J.-H. Lee and C.-J. Lin, Automatic model selection for support vector machines http://www.csie.ntu.edu.tw/~cjlin/papers/modelselect.ps.gz Any comments are very welcome. Sincerely, Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan cjlin at csie.ntu.edu.tw From sugi at og.cs.titech.ac.jp Fri Nov 17 03:44:27 2000 From: sugi at og.cs.titech.ac.jp (SUGIYAMA Masashi) Date: Fri, 17 Nov 2000 17:44:27 +0900 Subject: Model Selection Paper Available ! Message-ID: <20001117174427B.sugi@og.cs.titech.ac.jp> Dear colleagues, I am pleased to announce the availability of our paper online. "Subspace Information Criterion for Model Selection" By Masashi Sugiyama and Hidemitsu Ogawa To appear in Neural Computation http://ogawa-www.cs.titech.ac.jp/~sugi/publications/sic.ps.gz Also we will give a talk about the above Subspace Information Criterion at NIPS*2000 Workshop "Cross-Validation, Bootstrap and Model Selection" Breckenridge, Colorado, USA, December 1, 2000. http://www.cs.cmu.edu/~rahuls/nips2000/ We appreciate your questions and comments by e-mail or at the workshop. ABSTRACT The problem of model selection is of considerable importance for acquiring higher levels of generalization capability in supervised learning.
In this paper, we propose a new criterion for model selection called the subspace information criterion (SIC), which is a generalization of Mallows' $C_L$. It is assumed that the learning target function belongs to a specified functional Hilbert space and the generalization error is defined as the Hilbert space squared norm of the difference between the learning result function and target function. SIC gives an unbiased estimate of the generalization error so defined. SIC assumes the availability of an unbiased estimate of the target function and the noise covariance matrix, which are generally unknown. A practical calculation method of SIC for least mean squares learning is provided under the assumption that the dimension of the Hilbert space is less than the number of training examples. Finally, computer simulations in two examples show that SIC works well even when the number of training examples is small. Sincerely yours, Masashi Sugiyama. --------------------------------- Masashi Sugiyama Department of Computer Science, Graduate School of Information Science and Engineering, Tokyo Institute of Technology, 2-12-1, O-okayama, Meguro-ku, Tokyo, 152-8552, Japan. E-mail: sugi at og.cs.titech.ac.jp URL: http://ogawa-www.cs.titech.ac.jp/~sugi Tel: +81-3-5734-2190 Fax: +81-3-5734-2949 From Zoubin at gatsby.ucl.ac.uk Fri Nov 17 14:44:50 2000 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Fri, 17 Nov 2000 19:44:50 +0000 (GMT) Subject: Paper available on variational Bayesian learning Message-ID: <200011171944.TAA28344@cajal.gatsby.ucl.ac.uk> Dear Connectionists, The following paper on variational approximations for Bayesian learning with an application to linear dynamical systems will be presented at the NIPS 2000 conference. Comments are welcome. Gzipped postscript and PDF versions can be found at: http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips00beal.ps.gz http://www.gatsby.ucl.ac.uk/~zoubin/papers/nips00beal.pdf Zoubin Ghahramani and Matt Beal ---------------------------------------------------------------------- Propagation algorithms for variational Bayesian learning Zoubin Ghahramani and Matthew J. Beal Gatsby Computational Neuroscience Unit University College London Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models we obtain a learning procedure that exploits the Kalman smoothing propagation, while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set. A revised version of this paper will appear in Advances in Neural Information Processing Systems 13, MIT Press. 
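For readers who want to see the state-space machinery mentioned above in concrete form, the following short numpy sketch implements the standard Kalman filter together with the Rauch-Tung-Striebel smoother for a linear-Gaussian state-space model with fixed, known parameters. It only illustrates the "Kalman smoothing propagation" that the variational procedure builds on; it is not the authors' variational algorithm, and the matrices A, C, Q, R, the dimensions and the toy data are invented for the example.

import numpy as np

def kalman_smoother(y, A, C, Q, R, m0, P0):
    # Forward Kalman filter and backward Rauch-Tung-Striebel smoother for
    # x_t = A x_{t-1} + w_t (w_t ~ N(0,Q)),  y_t = C x_t + v_t (v_t ~ N(0,R)).
    T, n = y.shape[0], A.shape[0]
    mf = np.zeros((T, n)); Pf = np.zeros((T, n, n))   # filtered means/covariances
    mp = np.zeros((T, n)); Pp = np.zeros((T, n, n))   # one-step predictions
    m, P = m0, P0
    for t in range(T):
        mp[t] = A @ m if t > 0 else m0                # predict
        Pp[t] = A @ P @ A.T + Q if t > 0 else P0
        S = C @ Pp[t] @ C.T + R                       # update with observation y[t]
        K = Pp[t] @ C.T @ np.linalg.inv(S)
        mf[t] = mp[t] + K @ (y[t] - C @ mp[t])
        Pf[t] = Pp[t] - K @ C @ Pp[t]
        m, P = mf[t], Pf[t]
    ms, Ps = mf.copy(), Pf.copy()                     # backward smoothing pass
    for t in range(T - 2, -1, -1):
        J = Pf[t] @ A.T @ np.linalg.inv(Pp[t + 1])
        ms[t] = mf[t] + J @ (ms[t + 1] - mp[t + 1])
        Ps[t] = Pf[t] + J @ (Ps[t + 1] - Pp[t + 1]) @ J.T
    return ms, Ps

# Toy example with a 2-dimensional hidden state and 1-dimensional observations.
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]]); C = np.array([[1.0, 0.0]])
Q = 0.1 * np.eye(2); R = np.array([[0.5]])
x, ys = np.zeros(2), []
for _ in range(50):
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
    ys.append(C @ x + rng.multivariate_normal(np.zeros(1), R))
ms, Ps = kalman_smoother(np.array(ys), A, C, Q, R, np.zeros(2), np.eye(2))
print(ms[:3])

In the variational Bayesian setting described in the abstract, recursions of this general form are used in the inference step while the model parameters are integrated over rather than held fixed.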
---------------------------------------------------------------------- From dimi at ci.tuwien.ac.at Mon Nov 20 10:49:28 2000 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Mon, 20 Nov 2000 16:49:28 +0100 (CET) Subject: CI BibTeX Collection -- Update Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 4/1-4/3 IEEE Transactions on Fuzzy Systems, Volumes 8/2-8/5 IEEE Transactions on Neural Networks, Volumes 11/3-11/6 Machine Learning, Volumes 40/3-41/3 Neural Computation, Volumes 12/5-12/11 Neural Networks, Volumes 13/3-13/6 Neural Processing Letters, Volumes 11/3-12/2 Most files have been converted automatically from various source formats, please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut fuer Statistik * Tel: (+43 1) 58801 10773 * * Technische Universitaet Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ From gary at cs.ucsd.edu Mon Nov 20 12:20:09 2000 From: gary at cs.ucsd.edu (Gary Cottrell) Date: Mon, 20 Nov 2000 09:20:09 -0800 (PST) Subject: Faculty Position In Cognitive Science UCSD Message-ID: <200011201720.JAA13547@gremlin.ucsd.edu> >From: Gilles Fauconnier (by way of Joanna Mancusi) *NOTE closing date changed to January 15, 2001. Please circulate widely. __________________________________ FACULTY POSITION IN COGNITIVE SCIENCE UNIVERSITY OF CALIFORNIA, SAN DIEGO The Department of Cognitive Science at the University of California, San Diego invites applications for a faculty position at the assistant professor level (tenure-track) starting July 1, 2001, the salary commensurate with the experience of the successful applicant and based on the UC pay scale. The department of cognitive science at UCSD was the first of its kind in the world, and, as part of an exceptional scientific community, it remains a dominant influence in the field it helped to create. The department is truly interdisciplinary, with a faculty whose interests span anthropology, computer science, human development, linguistics, neuroscience, philosophy, psychology, and sociology. The department is looking for a top-caliber junior researcher in cognitive science. Applicants must have a Ph.D. (or ABD). A broad interdisciplinary perspective and experience with multiple methodologies will be highly valued. Women and minorities are encouraged to apply. The University of California, San Diego is an affirmative action/equal opportunity employer. All applications received by January 15, 2001 will receive thorough consideration until position is filled. Candidates should include a vita, reprints, a short letter describing their background and interests, and names and addresses of at least three references to: University of California, San Diego Faculty Search Committee Department of Cognitive Science 0515-EM 9500 Gilman Drive La Jolla, CA 92093-0515 From henkel at physik.uni-bremen.de Mon Nov 20 15:06:21 2000 From: henkel at physik.uni-bremen.de (Rolf D. 
Henkel) Date: Mon, 20 Nov 2000 21:06:21 +0100 Subject: TR: Synchronization, Coherence-Detection and Three-Dimensional Vision Message-ID: <00112021230000.23911@axon> Dear Connectionists, I'd like to invite you to download the technical report Title: Synchronization, Coherence-Detection and Three-Dimensional Vision Author: Rolf D. Henkel, Institute for Theoretical Neurophysics ABSTRACT: A new functional role for spiking neurons is proposed, which is considered necessary for converting noisy sensory data into meaningful and stable perceptions. Percept creation and validation are performed by a dynamical process of coherence detection between neural signals. The crucial operational step of the network is the interaction of neural oscillators in the weak-coupling limit, which realizes coherence detection dynamically by selective synchronization of neural responses. A robust estimate of the incoming signals is transmitted as the modulation frequency of the output current of the coherence-detecting layer, and a validation measure is given by the modulation depth of this current. As a real-world example of these ideas, a neural network is presented that solves the task of stereo vision with real image data. It combines operations of time-averaging, rate-coding neurons with integrate-and-fire neurons, which calculate a disparity map of a scene by partially synchronizing their spike trains. The report is available on my website, as 1) PostScript http://axon.physik.uni-bremen.de/research/papers/coherence.ps.gz 2) PDF-File http://axon.physik.uni-bremen.de/research/papers/coherence.pdf 3) HTML, online http://axon.physik.uni-bremen.de/research/papers/coherence/ Comments and remarks are very welcome. Regards, Rolf Henkel -- Dr. Rolf Henkel Institute for Theoretical Neurophysics University Bremen Kufsteiner Straße 1, D-28359 Bremen Phone: +49-421-218-3688 henkel at physik.uni-bremen.de Fax: +49-421-218-4869 http://axon.physik.uni-bremen.de/ From evansdj at aston.ac.uk Tue Nov 21 06:22:13 2000 From: evansdj at aston.ac.uk (DJ EVANS) Date: Tue, 21 Nov 2000 11:22:13 +0000 Subject: JOB: ANALYSIS OF CARDIOLOGY BIOSIGNALS Message-ID: <3A1A5AE5.E20BEFD4@aston.ac.uk> Dear Connectionists, I have been asked to post this job advert to the list on behalf of Dr. Ian Nabney. For informal enquiries, please contact Dr. Nabney via e-mail (I.T.Nabney at aston.ac.uk). Regards, David Evans. ------------------------------------------------------------------------------ JOB ADVERT: To appear in jobs.ac.uk and on the NCRG web site Cardionetics Institute of Bioinformatics ---------------------------------------- School of Engineering and Applied Sciences Aston University, Birmingham, UK POSTDOCTORAL RESEARCH FELLOWSHIP -------------------------------- ANALYSIS OF CARDIOLOGY BIOSIGNALS --------------------------------- We are looking for a highly motivated individual for a 3-year postdoctoral research position in the area of analysis of clinical biosignals, primarily in cardiology. The emphasis of this research will be on developing and applying data modelling, visualisation and analysis algorithms to biosignals to generate clinically valuable information. The Institute is funded by Cardionetics Ltd, a UK company that has produced the world's first fully automatic portable electrocardiograph machine specifically for GP use. The aim of the Institute is to develop the technology that will underpin the next generation of products.
It is located alongside the Neural Computing Research Group, which has a worldwide reputation in practical and theoretical aspects of information analysis. Applicants should have strong mathematical and computational skills; a background in biosignal processing (particularly ECG) would be an advantage. Salaries will be up to point 9 on the RA 1A scale, currently 21435 UK pounds. The salary scale is subject to annual increments. If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to: Personnel Officer Aston University Birmingham B4 7ET, U.K. Tel: +44 (0)121 359 0870 Fax: +44 (0)121 359 6470 Full details at http://www.ncrg.aston.ac.uk/. Informal enquiries can be directed via e-mail to Dr. Ian Nabney (Institute Director) e-mail: I.T.Nabney at aston.ac.uk. Closing date: mid December 2000 ---------------------------------------------------------------------- FURTHER PARTICULARS In this research programme, we will be developing leading-edge pattern analysis techniques for time series analysis, data fusion, visualisation and data mining. The technologies involved will include neural networks, deterministic time series modelling, Bayesian belief networks, temporal graphical models, component analysis and other related techniques. Development projects will normally progress to a proof-of-concept stage; typically this will apply research software to data samples sufficient to prove clinical reliability and relevance. Planned topics for research include: quantification of ventricular repolarisation; T wave modelling to improve offset detection; temporal analysis of arrhythmia patterns; data fusion for multiple channel ECG; on-line learning to provide systems for post myocardial infarction patients. The Cardionetics Institute of Bioinformatics (CIB) was established at Aston University in October 2000. The research will be carried out by two Research Fellows and a Director (Dr. Ian Nabney) with appropriate administrative and computing support. The Institute has been formed to act as the long-term research arm of Cardionetics Ltd, a UK company that has produced the C.Net2000, the first fully automatic portable ECG machine for GP use. It won a Millennium award and was runner up From bert at mbfys.kun.nl Tue Nov 21 09:51:33 2000 From: bert at mbfys.kun.nl (Bert Kappen) Date: Tue, 21 Nov 2000 15:51:33 +0100 Subject: Paper available Message-ID: <3A1A8BF5.E33B14F4@mbfys.kun.nl> Dear all, The following paper will be presented at NIPS and is now available for previewing from my web page. Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: http://www.mbfys.kun.nl/~bert ---------- Second order approximations for probability models Bert Kappen, Wim Wiegerinck In this paper, we derive a second order mean field theory for directed graphical probability models. By using an information theoretic argument it is shown how this can be done in the absence of a partition function. This method is the direct generalisation of the well-known TAP approximation for Boltzmann Machines. In a numerical example, it is shown that the method greatly improves the first order mean field approximation. The computational complexity of the first (second) order method is linear (quadratic) in the network size and is exponential in the potential size.
For a restricted class of graphical models, so-called single overlap graphs, the second order method has comparable complexity to the first order method. -- Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: http://www.mbfys.kun.nl/~bert From josh at vlsia.uccs.edu Tue Nov 21 10:06:03 2000 From: josh at vlsia.uccs.edu (Alspector) Date: Tue, 21 Nov 2000 08:06:03 -0700 (MST) Subject: research programmer position Message-ID: Research Programmer position at Personalogy, Inc. Personalogy, Inc. was founded to commercialize research done in Prof. Alspector's group at the University of Colorado at Colorado Springs. The company develops and applies state-of-the-art machine learning techniques to personalize information on the Internet. We are currently looking for a research programmer with the following credentials: -Master's or PhD degree in Computer Science, EE or related field -background in machine learning/neural networks/intelligent data mining -strong interests in information filtering/retrieval and user modeling -1-4 years in research and professional programming experience -strong programming skills in Perl/C/C++ -familiarity with web-server technology and Internet protocols -good knowledge HTML/CGI/WML -comfortable with Windows NT/2K and Unix/Solaris/Linux platforms -very good communication and problem solving skills The person will be responsible for enhancing and optimizing the core algorithms of the company as well as researching novel ways of personalizing web-based content presentation. The person will also be involved in the overall system design. Personalogy offers a competitive salary, benefits and stock options. The company is located in downtown Colorado Springs, Colorado. Please respond by sending a resume to recruitment at personalogy.net. Josh Alspector will be attending NIPS in Denver. -- Professor Joshua Alspector Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. Box 7150 Colorado Springs, CO 80933-7150 (719) 262 3510 (719) 262 3589 (fax) josh at eas.uccs.edu From nadine at wtc.ab.ca Wed Nov 22 01:46:33 2000 From: nadine at wtc.ab.ca (Nadine Gisler) Date: Tue, 21 Nov 2000 23:46:33 -0700 Subject: Welcome to the First ICSC Congress on Neuro-Fuzzy NF'2002 Message-ID: INTERNATIONAL COMPUTER SCIENCE CONVENTIONS Head Office: 5101C-50 Street, Wetaskiwin AB, T9A 1K1, Canada (Phone: +1-780-352-1912 / Fax: +1-780-352-1913) Email: or / Web-Site: http://www.icsc.ab.ca Welcome to the First International ICSC Congress on Neuro-Fuzzy NF'2002 to be held at The Capitolio de la Habana, Cuba January 15 - 18, 2002 We are sending this message requesting papers for this conference. http://www.icsc.ab.ca/NF2002.htm Organizing Committee: Honorary Chair: Prof. Hans-Juergen Zimmermann, Germany (zi at or.twth-aachen.de) General Chair: Hans-Heinrich Bothe, Denmark(hhb at it.dtu.dk) Special Scientific Events Chair : Alberto Ochoa, Cuba (aa8ar at yahoo.com) Scientific Program Chair: Hans Hellendoorn, Netherland (to be confirmed) Scientific Program Co-Chair : Pedro Gonzalez Lanza, Cuba (pedro at cidet.icmf.inf.cu) Local Committee Chair : Orestes Llanes-Santiago, Cuba (orestes at electrica.ispjae.edu.cu) Local Committee Co-Chair: Abelardo del Pozo Quintero, Cuba (pozo at cidet.icmf.inf.cu) Administration and Finance Chair : Ilkka Tervonen, Canada (operating at icsc.ab.ca) Publication Chair: Antonio Di Nola, Italy (dinola at unina.it) Introduction: During the past decade, paradigms and benefits from neuro fuzzy systems (NF) have been growing tremendously. 
Today, not only does NF solve scientific problems, but its applications are also appearing in our daily lives. In order to discuss the state of the art in NF and the future of these exciting topics, we are honored to invite you to Neuro-Fuzzy 2002. We believe it will be an excellent opportunity to share our knowledge on NF and contribute to its development in the next century. This major international conference will be held in a very enjoyable location: Havana, the Capital of Cuba, where we hope you will experience the famous Cuban hospitality. Sponsored/supported by: ISPJAE: Instituto Superior Politecnico José Antonio Echeverria ICIMAF: Instituto De Cibernetica, Matematica y Fisica UCLV: Universidad Central De Las Villas UO: Universidad De Oriente - RAC: Red de Automática de Cuba - Ministerio de Educación Superior de la República de Cuba - Ministerio de la Informática y las Comunicaciones de la República de Cuba - Ministerio de Ciencias, Tecnología y Medio Ambiente de la República de Cuba. - ICSC/NAISO Canada/Switzerland Topics suggested (not limited to): - Advanced Neuro and Fuzzy Paradigms - Data Granulation and Fuzzy Rule Extraction - Advanced Training Algorithms - Evolutionary Computation (GA, GP, ET) and Graphical Models - Chaotic Behavior and Fractals Applications in signal processing, control, robotics, etc. Of particular interest are applications from the following fields: Sound and image processing, pattern recognition, image understanding, feature binding, perception, sensor fusion, controller design, state observation, motor control, mobile robotics, autonomous navigation, deliberation and planning, active anchoring, gain-scheduling, fault detection, hardware solutions, data mining, financing, e-commerce. International Steering/Program Committee (invitations sent to): Anderson P., USA Antonsson A.K., USA Baldwin J.F., U.K. Bandemer H., Germany Bezdek J., USA Bonnisone P., USA Bosc P., France Carlsson Ch., Finland Dubois D., France Esogbue A. O., USA Fyfe C., U.K. Gallard R., Argentina Gottwald S., Germany Grabisch M., France Halmague S.K., Australia Heiss-Czedik D., Austria Heiss M., Austria Hoehle U., Germany Jentzen Jan, Denmark Kalaykov Ivan, Sweden Kandel A., Tampa, USA Klement E. P., Austria Kruse R., Germany Kuncheva L., U.K. Mamdani E., UK Marichal G.N., Spain Nauck Detlef, U.K. Pap E., Yugoslavia Pedrycz W., Canada, Roubens M., Belgium Runkler Th., Germany Ruspini ...., Belgium/ USA Steele N., U.K. Sugeno M., Japan Surmann H., Germany Takagi T., Japan Tuerksen I. B., Toronto, Canada Ulieru M., Canada Verdegay, J. L., Spain Zamarreño J., Spain SCIENTIFIC PROGRAM NF'2002 will include invited plenary talks, contributed sessions, invited sessions, workshops and tutorials. Updated details are available at http://www.icsc.ab.ca/NF2002.htm CALL FOR INVITED SESSIONS The organization of invited sessions is encouraged. Prospective organizers are requested to send a session proposal (consisting of 4-5 invited papers, the recommended session-chair and co-chair, as well as a short statement describing the title and the purpose of the session) to the respective symposium chair or the congress organizer. Invited sessions should preferably start with a tutorial paper. The registration fee of the session organizer will be waived if at least 4 authors of invited papers register for the conference. POSTER PRESENTATIONS Poster presentations are encouraged for people who wish to receive peer feedback, and practical examples of applied research are particularly welcome.
Poster sessions will allow the presentation and discussion of respective papers, which will also be included in the conference proceedings. CALL FOR TUTORIALS Pre-conference tutorials on specific relevant topics are planned. Proposals for a tutorial must include the title, topics covered, proposed speakers, targeted audience and estimated length (preferably 2 or 4 hours). The proposal must be submitted to the general chair, the scientific program chair and the congress organizer by May 31, 2001. Tutorial papers of max 15 pages can be included in the conference proceedings. CALL FOR WORKSHOPS Interested scientists are encouraged to organize a workshop on their particular field of research. Workshops consist of several presentations or open discussions on a specific subject. The proposal must include the title, the topics covered, the proposed speakers, the targeted audience and the estimated length. It should be submitted to the general chair, the scientific program chair and the congress organizer by May 31, 2001. Joint or edited workshop papers of max 35 pages can be included in the conference proceedings. SUBMISSION OF PAPERS Authors are requested to send an extended abstract, or the full paper of minimum 4 and maximum 7 pages for review, by the International Program Committee. All submissions must be written in English, starting with a succinct statement of the problem, the results achieved, their significance and a comparison with previous work. Submissions must be received by May 31, 2001. Regular papers, as well as poster presentations, tutorial papers and invited sessions are encouraged. The submission should also include: - Title of congress (NF'2002), - Type of paper (workshop, tutorial, invited, regular, poster) - Authors names, affiliations, addresses - Name of author to contact for correspondence - E-mail address and fax # of contact author and co-authors - Topics which best describe the paper (5 - 10 keywords) - Short CV of authors (recommended) Please submit your paper proposal electronically to the following address: operating at icsc.ab.ca Data formats: Word, Postscript, PDF. PROCEEDINGS AND PUBLICATIONS All accepted and invited papers will be included in the congress proceedings, published in print and on CD-ROM by ICSC Academic Press, Canada/Switzerland. A selected number of papers will be expanded and revised for possible inclusion in special issues of some prestigious journals. IMPORTANT DATES Submission Deadline: May 31, 2001 Notification of Acceptance: August 15, 2001 Delivery of Final Manuscripts: October 31, 2001 Conference NF'2002: January 15/18, 2002 CONGRESS ORGANIZER ICSC International Computer Science Conventions NAISO Natural and Artificial Intelligence Systems Organizations 5101C - 50 Street Wetaskiwin AB, T9A 1K1 / Canada Phone: +1-780-352-1912 Fax: +1-780-352-1913 Email Operating Division: operating at icsc.ab.ca Email Planning Division: planning at icsc.ab.ca connectionists at cs.cmu.edu ICSC From mpessk at guppy.mpe.nus.edu.sg Thu Nov 23 02:30:32 2000 From: mpessk at guppy.mpe.nus.edu.sg (S. Sathiya Keerthi) Date: Thu, 23 Nov 2000 15:30:32 +0800 (SGT) Subject: TR on fast computation of Leave-One-Out error in SVM algorithms In-Reply-To: Message-ID: A New Technical Report... Two Efficient Methods for Computing Leave-One-Out Error in SVM Algorithms S. Sathiya Keerthi, Chong Jin Ong and Martin M.S. Lee National University of Singapore Abstract: We propose two new methods for the efficient computation of the leave-one-out (LOO) error for SVMs. 
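To make concrete the quantity being computed, here is a minimal brute-force illustration of leave-one-out model selection for an SVM with an RBF kernel, retraining the machine once per held-out example for every (c, gamma) pair on a small grid. This naive approach needs n retrainings per parameter setting, which is the cost that efficient LOO methods such as those in this report (or the looms package announced earlier in this digest) aim to avoid. The sketch uses scikit-learn purely for illustration; it is not the authors' code, and the toy data and the parameter grid are invented.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Toy two-class data (invented); in practice X, y would come from a real data set.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(2, 1, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

best = None
for c in [1, 4, 16, 64]:                    # penalty parameter
    for gamma in [0.004, 0.016, 0.064]:     # RBF kernel: k(x,z) = exp(-gamma*|x-z|^2)
        clf = SVC(C=c, gamma=gamma, kernel="rbf")
        # Brute force: one retraining per left-out example.
        loo_rate = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
        if best is None or loo_rate > best[0]:
            best = (loo_rate, c, gamma)

print("Optimal parameter: c=%g, gamma=%g, rate=%.3f%%" % (best[1], best[2], 100 * best[0]))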
The first of the two methods is based on the idea of exact penalty functions, while the second uses a new initial choice for the alpha variables. These methods can also be extended to the recomputation of SVM solutions when more than one example is omitted and to the computation of LOO error for nu-SVM. Recently, Lee and Lin pointed out that a loose stopping criterion can be exploited to speed up LOO computations for SVM. This fact, combined with our proposed methods, allows for the efficient computation of the LOO error. To download a gzipped postscript file containing the report, go to: http://guppy.mpe.nus.edu.sg/~mpessk/svm.shtml From tnatschl at igi.tu-graz.ac.at Thu Nov 23 09:00:13 2000 From: tnatschl at igi.tu-graz.ac.at (Thomas Natschlaeger) Date: Thu, 23 Nov 2000 15:00:13 +0100 Subject: Papers on computational analysis of dynamic synapses Message-ID: <3A1D22ED.2D819E2E@igi.tu-graz.ac.at> Dear Connectionists, The following two papers on computational analysis of dynamic synapses will be presented at the NIPS 2000 conference. Comments are welcome. Gzipped postscript and PDF versions can be found at: http://www.igi.TUGraz.at/igi/tnatschl/publications.html Sincerely Thomas Natschlaeger ---------------------------------------------------------------------- FINDING THE KEY TO A SYNAPSE T. Natschlaeger and W. Maass URLs: http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-nips00.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-poster.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/synkey-poster.pdf ABSTRACT: Experimental data have shown that synapses are heterogeneous: different synapses respond with different sequences of amplitudes of postsynaptic responses to the same spike train. Neither the role of synaptic dynamics itself nor the role of the heterogeneity of synaptic dynamics for computations in neural circuits is well understood. We present in this article two computational methods that make it feasible to compute, for a given synapse with known synaptic parameters, the spike train that is optimally fitted to the synapse in a certain sense. One of these methods is based on dynamic programming (similar to that used in reinforcement learning), the other on sequential quadratic programming. With the help of these methods one can compute, for example, the temporal pattern of a spike train (with a given number of spikes) that produces the largest sum of postsynaptic responses for a specific synapse. Several other applications are also discussed. To our surprise we find that most of these optimally fitted spike trains match common firing patterns of specific types of neurons that are discussed in the literature. Furthermore, optimally fitted spike trains are rather specific to a certain synapse ("the key to this synapse") in the sense that they exhibit a substantially smaller postsynaptic response on any other of the major types of synapses reported in the literature. This observation provides the first glimpse at a possible functional role of the specific combinations of synapse types and neuron types that were recently found in (Gupta, Wang, Markram, Science, 2000). Our computational analysis provides the platform for a better understanding of the specific role of different parameters that control synaptic dynamics, because with the help of the computational techniques that we have introduced one can now see directly how the temporal structure of the optimal spike train for a synapse depends on the individual synaptic parameters.
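To make the heterogeneity referred to in this abstract concrete, the sketch below simulates a Markram-Tsodyks-style phenomenological model of a dynamic synapse, in which the amplitude of the k-th postsynaptic response is w*u_k*R_k, with a facilitation variable u and a resource-recovery variable R updated between spikes. The update equations and the parameter values are a generic illustration, not necessarily the exact formulation or parameters used in the papers above; the point is simply that the same spike train evokes different amplitude sequences at two differently parameterized synapses.

import numpy as np

def psp_amplitudes(spike_times, U, D, F, w=1.0):
    # Amplitudes of successive postsynaptic responses of one dynamic synapse
    # (u = utilization/facilitation variable, R = fraction of available resources).
    u, R = U, 1.0
    amps = [w * u * R]
    for dt in np.diff(spike_times):
        R = 1.0 + (R - u * R - 1.0) * np.exp(-dt / D)   # recovery from depression
        u = U + u * (1.0 - U) * np.exp(-dt / F)         # decay of facilitation
        amps.append(w * u * R)
    return np.array(amps)

# The same spike train "seen" by two differently parameterized (invented) synapses.
train = np.array([0.0, 0.02, 0.05, 0.09, 0.5, 0.52])    # spike times in seconds
print(psp_amplitudes(train, U=0.5, D=1.1, F=0.05))      # strongly depressing synapse
print(psp_amplitudes(train, U=0.16, D=0.045, F=0.38))   # facilitating synapse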
We believe that this inverse analysis is essential for understanding the computational role of neural circuits. ------------------------------------------------------------------------- PROCESSING OF TIME SERIES BY NEURAL CIRCUITS WITH BIOLOGICALLY REALISTIC SYNAPTIC DYNAMICS T. Natschlaeger, W. Maass, E. D. Sontag, and A. M. Zador URLs: http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-nips00.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-poster.ps.gz http://www.igi.TUGraz.at/igi/tnatschl/psfiles/dynsyn-poster.pdf ABSTRACT: Experimental data show that biological synapses behave quite differently from the symbolic synapses in common artificial neural network models. Biological synapses are dynamic, i.e., their ``weight'' changes on a short time scale by several hundred percent depending on the past input to the synapse. In this article we explore the consequences that these synaptic dynamics entail for the computational power and adaptive capability of feedforward neural networks. Our analytical results show that the class of nonlinear filters that can be approximated by neural networks with dynamic synapses, even with just a single hidden layer of sigmoidal neurons, is remarkably rich. It contains every time-invariant filter with fading memory, hence arguably every filter that is potentially useful for a biological organism. This result is robust with regard to various changes in the model for synaptic dynamics. Furthermore, we show that simple gradient descent suffices to approximate a given quadratic filter by a rather small neural network with dynamic synapses. The computer simulations we performed show that in fact their performance is slightly better than that of previously considered artificial neural networks that were designed for the purpose of yielding efficient processing of temporal signals, without aiming at biological realism. We have tested dynamic networks on tasks such as the learning of a randomly chosen quadratic filter, as well as on the system identification task used in (Back and Tsoi, 1993), to illustrate the potential of our new architecture. We also address the question of which synaptic parameters are essential for a network with dynamic synapses to be able to learn a particular target filter. We found that neither plasticity in the synaptic dynamics alone nor plasticity of the maximal amplitude alone yields satisfactory results. However, a simple gradient descent learning algorithm that tunes both types of parameters simultaneously yields good approximation capabilities. From rosi-ci0 at wpmail.paisley.ac.uk Thu Nov 23 10:02:45 2000 From: rosi-ci0 at wpmail.paisley.ac.uk (Roman Rosipal) Date: Thu, 23 Nov 2000 15:02:45 +0000 Subject: TR available Message-ID: Dear Connectionists, The following TR is now available at my home page: Kernel Principal Component Regression with EM Approach to Nonlinear Principal Components Extraction R. Rosipal, LJ Trejo, A. Cichocki Abstract In kernel-based methods such as Support Vector Machines, Kernel PCA, Gaussian Processes or Regularization Networks the computational requirements scale as O(n^3), where n is the number of training points. In this paper we investigate Kernel Principal Component Regression (KPCR) with an Expectation Maximization approach to estimating a subset of p principal components (p < n) in a feature space defined by a positive definite kernel function. The computational requirements of the method are O(pn^2).
Moreover, the algorithm can be implemented with memory requirements O(p^2) + O((p+1)n). We give a theoretical description explaining how the desired generalization of KPCR is achieved by proper selection of a subset of non-linear principal components. On two data sets we experimentally demonstrate this fact. Moreover, on a noisy chaotic Mackey-Glass time series prediction task the best performance is achieved with p << n, and experiments also suggest that in such cases significantly reduced training data sets can be used to estimate the non-linear principal components. The theoretical relation and an experimental comparison to Kernel Ridge Regression and epsilon-insensitive Support Vector Regression are also given. _______________ You can download gzipped postscript from http://cis.paisley.ac.uk/rosi-ci0/Papers/TR00_2.ps.gz Any comments and remarks are welcome. _______________ Roman Rosipal University of Paisley, CIS Department, Paisley, PA1 2BE Scotland, UK http://cis.paisley.ac.uk/rosi-ci0 e-mail: rosi-ci0 at paisley.ac.uk Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. -------------------------- From maass at igi.tu-graz.ac.at Thu Nov 23 13:34:13 2000 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Thu, 23 Nov 2000 19:34:13 +0100 Subject: Paper on Circuit Complexity for Sensory Processing Message-ID: <3A1D6325.F7A9E4@igi.tu-graz.ac.at> The following paper is now available online. It will be presented as a talk at NIPS 2000. FOUNDATIONS FOR A CIRCUIT COMPLEXITY THEORY OF SENSORY PROCESSING by Robert A. Legenstein and Wolfgang Maass Technische Universitaet Graz, Austria ABSTRACT: We introduce TOTAL WIRE LENGTH as a salient complexity measure for evaluating the biological feasibility of proposed circuit designs for sensory processing tasks, such as early vision. Biological data show that the total wire length of neural circuits in the cortex is in fact not very large compared with the number of neurons that they connect. This implies that many commonly proposed circuit architectures for sensory processing tasks are biologically unrealistic. In this paper we exhibit some alternative circuit design techniques for computational tasks that capture typical aspects of translation- and scale-invariant sensory processing. These techniques yield circuits with a total wire length that scales LINEARLY with the number of neurons in the circuit. ------------------------------------------------------------------------------ This paper, as well as an illustrated poster, is available online from # 122 on http://www.igi.TUGraz.at/igi/maass/publications.html Wolfgang Maass Tel/Fax: ++43 (0)316 873 5805 Tel (work): ++43 (0)316 873 5822 http://www.tu-graz.ac.at/igi/maass Technische Universitaet Graz, Institute for Theoretical Computer Science.
Professor of Computer Science From Thorsten.Joachims at gmd.de Fri Nov 24 05:39:33 2000 From: Thorsten.Joachims at gmd.de (Thorsten Joachims) Date: Fri, 24 Nov 2000 11:39:33 +0100 (MET) Subject: New SVM-Light Release (V3.50) Message-ID: <200011241039.LAA02826@borneo.gmd.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 2399 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/405be3e0/attachment-0001.ksh From nnsp01 at neuro.kuleuven.ac.be Fri Nov 24 10:56:16 2000 From: nnsp01 at neuro.kuleuven.ac.be (Neural Networks for Signal Processing 2001) Date: Fri, 24 Nov 2000 16:56:16 +0100 Subject: NNSP 2001, CALL FOR PAPERS Message-ID: <3A1E8FA0.D882DAB8@neuro.kuleuven.ac.be> --------------------------------------------------------------- 2001 IEEE WORKSHOP ON NEURAL NETWORKS FOR SIGNAL PROCESSING --------------------------------------------------------------- September 10-12, 2001 Falmouth, Massachusetts, USA NNSP'2001 homepage: http://eivind.imm.dtu.dk/nnsp2001 submission deadline: March 15, 2001 Thanks to the sponsorship of IEEE Signal Processing Society and the IEEE Neural Networks Council, the eleventh of a series of IEEE workshops on Neural Networks for Signal Processing will be held in Falmouth, Massachusetts, at the SeaCrest Oceanfront Resort and Conference Center. The workshop will feature keynote lectures, technical presentations, and panel discussions. Papers are solicited for, but not limited to, the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, generalization, design algorithms, optimization, parameter estimation, nonlinear signal processing, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, sonar and radar, data fusion, data mining, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, blind source separation, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. PAPER SUBMISSION PROCEDURE Prospective authors are invited to submit a full paper of up to ten pages using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2001 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academica Publishers. SCHEDULE Submission of full paper: March 15, 2001 Notification of acceptance: May 1, 2001 Submission of photo-ready accepted paper and author registration: June 1, 2001 Advance registration before: July 15, 2001 ORGANIZATION General Chairs David J. MILLER Tulay ADALI The Pennsylvania State University University of Maryland Baltimore County Program Chairs Jan LARSEN Marc VAN HULLE Technical University of Denmark Katholieke Universiteit, Leuven Finance Chair Publicity Chair Lian YAN Patrick DE MAZIERE Athene Software, Inc. Katholieke Universiteit, Leuven Proceedings Chair Registration and Local Arrangements Scott C. DOUGLAS Elizabeth J. WILSON Southern Methodist University Raytheon Co. America Liaison Asia Liaison Amir ASSADI H.C. 
FU University of Wisconsin at Madison National Chiao Tung University Europe Liaison Herve BOURLARD Swiss Federal Institute of Technology at Lausanne Program Committee Yianni Attikiouzel Andrew Back Herve Bourlard Andrzej Cichocki Jesus Cid-Sueiro Robert Dony Li Min Ling Guan Tzyy-Ping Jung Shigeru Katagiri Jens Kohlmorgen Fa Long Luo Danilo Mandic Elias Manolakos Michael Manry Takashi Matsumoto Christophe Molina Bernard Mulgrew Mahesan Niranjan Tomaso Poggio Kostas N. Plataniotis Jose Principe Phillip A. Regalia Joao-Marcos Romano Kenneth Rose Jonas Sjoberg Robert Snapp M. Kemal Sonmez Naonori Ueda Lizhong Wu Lian Yan Fernando Jose Von Zuben From diamond at sissa.it Fri Nov 24 11:35:07 2000 From: diamond at sissa.it (Mathew E. Diamond) Date: Fri, 24 Nov 2000 17:35:07 +0100 Subject: Neuroscience in Trieste Message-ID: <3.0.5.32.20001124173507.008692b0@shannon.sissa.it> Dear Neuroscientists, I would like to direct your attention to Neuroscience study and training opportunities now available in Trieste, Italy. For an overview of activities, please see the website: http://www.sissa.it/cns/www/neuroinfo.html For those interested in attending a College in Trieste on the Evolution of Intelligent Behavior please see the website: http://www.ictp.trieste.it/cgi- bin/ICTPsmr/mkhtml/smr2html.pl?smr1308/Announcement kind regards, Mathew E. Diamond (diamond at sissa.it) Cognitive Neuroscience Sector International School for Advanced Studies, Trieste ITALY From ahirose at info-dev.rcast.u-tokyo.ac.jp Mon Nov 27 02:08:27 2000 From: ahirose at info-dev.rcast.u-tokyo.ac.jp (ahirose@info-dev.rcast.u-tokyo.ac.jp) Date: Mon, 27 Nov 2000 16:08:27 +0900 (JST) Subject: CFP IEICE Trans Electron --Special Issue Message-ID: <200011270708.QAA16993@info-dev.rcast.u-tokyo.ac.jp> +----------------------------------------------------------------+ | Call for Papers | | ~~~~~~~~~~~~~~~ | | IEICE Trans. on Electronics Special Issue on | | New Technologies in Signal Processing for Electromagnetic-wave | | Sensing and Imaging | | Manuscript Deadline: March 31, 2001 | +----------------------------------------------------------------+ Topics include, for example, -Neural networks and other soft-computing techniques in classification, restoration and segmentation of radar images The IEICE (Institute of Electronics, Information and Communication Engineers, http://www.ieice.or.jp/) Transactions on Electronics announces a "Special Issue on New Technologies in Signal Processing for Electromagnetic- wave Sensing and Imaging" to be published in December 2001. Please visit: http://www.ee.t.u-tokyo.ac.jp/~ahirose/ieice-special-issue2001/ The Special Issue aims to publish articles on recent progress in science, theories, techniques, applications and systems concerning signal processing in electromagnetic-wave / lightwave sensing and imaging. The Editorial Committee solicits submissions of Papers and Letters in related wide research fields. 1. 
Scope Topics of interest include, but are not limited to, the following areas: Science and application of statistics in rough surface scattering Polarimetry / interferometry science and technology Phase unwrapping and residue analysis / elimination Speckle and interference noise reduction Inverse scattering and image reconstruction Neural network and soft-computing in classification, restoration and segmentation Wavelet transform and linear / nonlinear processing Migration method MUSIC method and high-resolution estimation Temporal and spatial coherence synthesis SAR theory and system Polarimetric / interferometric radar theory and system Near-field electromagnetic-wave imaging and system Remote sensing of the earth, ocean and atmosphere Compensation of fluctuation and distortion in imaging and sensing 2. Manuscript Deadline: March 31, 2001 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 3. Submission of Papers Prospective authors are requested to send four copies of their manuscript by March 31, 2001, to Dr. Akira Hirose, Guest Editor, at the address shown below. In the preparation, please refer to "Information for Authors" of the IEICE Transactions on Electronics for details. The information is also found at http://www.ieice.or.jp. Papers should not exceed 8 printed pages, whereas Letters 2 pages. When submitting, please indicate "Special Issue on New Technologies in Signal Processing for Electromagnetic-wave Sensing and Imaging" with red ink on the envelope and each front page of the four copies. Please note that the period for resubmission (normally 60 days) after review may be shortened due to the publication schedule. Submitted manuscripts (Papers / Letters) will be reviewed by referees according to the ordinary rules of the Transactions Editorial Committee. 4. Mailing address Four copies of the complete manuscript should be submitted to: Akira Hirose, Guest Editor Department of Frontier Informatics, Graduate School of Frontier Sciences, The University of Tokyo 4-6-1 Komaba, Meguro-ku, Tokyo 153-8904, Japan Phone: +81-3-5452-5155 Facsimile: +81-3-5452-5156 E-mail: ahirose at ee.t.u-tokyo.ac.jp 5. Special Issue Editorial Committee Guest Editor: Akira Hirose (Univ. of Tokyo, Tokyo) Members: Shane Cloude (Applied Electromagnetics, St. Andrews) George Georgiou (California State Univ., San Bernardino) Kazuo Hotate (Univ. of Tokyo, Tokyo) Eric Pottier (Univ. of Rennes 1, Rennes) Motoyuki Sato (Tohoku Univ. Sendai) Toru Sato (Kyoto Univ., Kyoto) Mitsuo Takeda (Univ. of Electro-Commun., Tokyo) Yoshio Yamaguchi (Niigata Univ., Niigata) * Please note that if accepted, all authors, including authors of invited papers, should pay for the page charges covering partial cost of publication. Authors will receive 100 copies of the reprint. Updated information will be posted at http://www.ee.t.u-tokyo.ac.jp/~ahirose/ieice-special-issue2001/ -- From dr+ at cs.cmu.edu Mon Nov 27 16:00:47 2000 From: dr+ at cs.cmu.edu (Douglas Rohde) Date: Mon, 27 Nov 2000 16:00:47 -0500 Subject: The Lens Neural Net Simulator Message-ID: <3A22CB7F.4BBBBDEB@cs.cmu.edu> Lens 2.3 is Now Available!! http://www.cs.cmu.edu/~dr/Lens I'm pleased to announce the first public release of the Lens neural network simulator. Lens is a general-purpose simulator that runs on Windows as well as most Unix platforms. 
It is primarily designed for training feed-forward and recurrent backpropagation networks, but can be easily adapted to handle other architectures and currently includes deterministic Boltzmann machines, Kohonen networks, and interactive-activation models. Lens provides a variety of graphical interfaces to aid in executing common commands, visualizing unit and link values, accessing internal parameters, and plotting data. However, it also includes a complete scripting language (Tcl), that enables networks to be constructed quickly and easily. Although Lens was designed with the serious modeler in mind, it would actually be a good choice for use in teaching introductory courses. Basic networks can be created in just a few simple commands and students will be up and running in minutes. The following are some of the main advantages of Lens over other neural network simulators I have used, including PDP++, SNNS, Xerion, RCS, Tlearn, and Matlab: -SPEED: Lens is up to 4 times faster than the other major simulators. This is mainly due to tight inner loops that minimize memory references and achieve good cache performance. Lens is also able to do batch parallel training on multiple processors. -EFFICIENCY: Lens makes efficient use of memory, so it can handle larger networks and larger example sets. -FLEXIBILITY: Because it includes a scripting language, the user has considerable flexibility in writing procedures to automatically build, train, or test customized networks. Unit input and output procedures can be composed from a variety of sub-procedures, allowing many group types to be created without the need to add additional code. -CUSTOMIZABILITY: Lens was designed around the principle that no simulator can satisfy all users out of the box. Sophisticated users will inevitably need to customize the simulator at the source code level. But altered code always causes problems if a new generic version of the source is released. Therefore, Lens includes facilities for easily extending network structures and creating new types, algorithms, and commands without altering the main code base. You can find additional information about downloading and using Lens at its homepage: http://www.cs.cmu.edu/~dr/Lens AVAILABILITY Lens is available free-of-charge to those conducting research at academic or non-profit institutions. Other users should contact me for licensing information at dr+lens at cs.cmu.edu. CAVEATS Lens is designed for simulating connectionist networks. It is not appropriate for compartmental membrane-level or differential equation-based modeling. From oreilly at grey.colorado.edu Tue Nov 28 17:32:08 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Tue, 28 Nov 2000 15:32:08 -0700 Subject: CU-Boulder Graduate Training in Cognitive Neuroscience Message-ID: <200011282232.PAA07733@grey.colorado.edu> Please forward this to relevant mailing lists, colleagues, potential students, etc. Cognitive Neuroscience University of Colorado Boulder Graduate Training Opportunities This is an invitation to apply for graduate training in Cognitive Neuroscience at the University of Colorado Boulder (CU). The CU Department of Psychology has a strong nucleus of cognitive neuroscientists with major interests in learning, memory, attention, executive functions, and emotion. 
Training is available in a variety of cognitive neuroscience methods including functional magnetic resonance imaging (fMRI), event-related brain potentials (ERP), computational cognitive neuroscience, human neuropsychology, and animal behavioral neuroscience. Faculty include Marie Banich (fMRI, human neuropsychology, attention and executive functions), Tim Curran (ERP, learning and memory), Tiffany Ito (ERP, social neuroscience, emotion, stereotyping), Akira Miyake (working memory and executive functions), Randall O'Reilly (computational cognitive neuroscience, learning and memory, executive functions), and Jerry Rudy (behavioral neuroscience, learning & memory). Several other members of the CU Psychology faculty are interested in cognitive neuroscience approaches to a variety of topics: Edward Craighead (prevention of major depression), Reid Hastie (judgment and decision making, memory and cognition, social psychology), Kent Hutchinson (substance abuse and dependence), Steve Maier (behavioral neuroscience, adaptation to challenge), David Miklowitz (adult psychopathology), Linda Watkins (behavioral neuroscience, adaptation to challenge), and Mark Whisman (cognitive therapy and depression). The cognitive neuroscientists in the CU Psychology Department are complemented by an outstanding community of scientists with similar interests. The CU Psychology Department, with graduate programs in Cognitive Psychology and Behavioral Neuroscience, is consistently in the top 20 of the US News & World Report rankings. The Institute of Cognitive Science includes scientists from computer science, education, linguistics, philosophy, and psychology. The Institute for Behavioral Genetics provides a unique resource for conducting and facilitating research on the genetic and environmental bases of individual differences in behavior. Associated hospitals provide the possibility of conducting research with patient populations. Our computational cognitive neuroscience laboratory particularly benefits from interactions with professors Mike Mozer (CU Computer Science) and Yuko Munakata (University of Denver (DU), Psychology). In addition to providing an exciting research environment and hosting the annual Neural Information Processing Systems conference, the greater Denver/Boulder area offers an exceptional quality of life. Spectacularly situated at the eastern edge of the Rockies, this area provides a wide variety of extraordinary outdoor activities, an average of 330 sunny days per year, and also affords a broad range of cultural activities. For more information, full lists of associated faculty, and instructions on applying to the graduate programs, see the following web sites: CU Overview Web Page: http://www.cs.colorado.edu/~mozer/resgroup.html CU Psychology: http://psych-www.colorado.edu/ CU Institute of Cognitive Science: http://psych-www.colorado.edu/ics/home.html CU Institute of Behavioral Genetics: http://ibgwww.colorado.edu/ CU Computer Science: http://www.cs.colorado.edu/ DU Psychology: http://www.du.edu/psychology/ One or more of the following faculty should be contacted for any further information.
Marie Banich, CU Psych mbanich at psych.colorado.edu http://psych.colorado.edu/~mbanich/lab/ Tim Curran, CU Psych tcurran at psych.colorado.edu http://psych.colorado.edu/~tcurran/ Tiffany Ito, CU Psych tito at psych.colorado.edu http://psych.colorado.edu/~tito/ Akira Miyake, CU Psych miyake at psych.colorado.edu http://psych.colorado.edu/~miyake/ Michael Mozer, CU CS mozer at cs.colorado.edu http://www.cs.colorado.edu/~mozer/Home.html Yuko Munakata, DU Psych munakata at kore.psy.du.edu http://kore.psy.du.edu/munakata Randall O'Reilly, CU Psych oreilly at psych.colorado.edu http://psych.colorado.edu/~oreilly Jerry Rudy, CU Psych jrudy at clipr.Colorado.edu http://psych.colorado.edu/~jrudy/ From heiko.wersing at hre-ftr.f.rd.honda.co.jp Wed Nov 29 11:24:28 2000 From: heiko.wersing at hre-ftr.f.rd.honda.co.jp (Heiko Wersing) Date: Wed, 29 Nov 2000 17:24:28 +0100 Subject: Two PhD Studentships at HONDA R&D Europe Message-ID: <3A252DBC.C02F35B0@hre-ftr.f.rd.honda.co.jp> Two PhD Studentships at HONDA R&D Europe Systems that support humans in their complex and dynamic environments and help to save natural resources are the aim of the Future Technology Research Division of HONDA R&D EUROPE (Deutschland). In close cooperation with national and international research institutions, we develop intelligent systems based on neural architectures and evolutionary processes for the design and optimisation of technical systems. One example of the leading-edge fundamental research activities at HONDA is the development of the walking humanoid robots P3 (see http://www.honda-p3.com/) and Asimo. Our research centre is located in Offenbach/Main, next to Frankfurt/Main, the financial capital of Europe with its international flair. Candidates will work in a high-class team conducting leading-edge research in computational intelligence. As part of a national research project on learning systems, we are looking for two researchers to strengthen our young and dynamic team. The positions will focus on behaviour-based learning and evolutionary principles for structure adaptation, with applications to robotics, man-machine interaction, and computer vision. Due to the nature of the project the positions are initially limited to a period of 3 years. The candidates will be supported in pursuing a doctoral degree. Applicants should have a Diploma or Master's degree in physics, computer science or electrical engineering and enjoy working in a multidisciplinary research project in an international team. The candidates should be fluent in English or German. Please send your application to: Prof. Dr.-Ing habil. Edgar Körner Future Technology Research Division HONDA R&D EUROPE (Deutschland) GmbH Carl-Legien-Strasse 30 63073 Offenbach/Main GERMANY Telephone : ++49 (0)69 890110 email : edgar.koerner at hre-ftr.f.rd.honda.co.jp From rogilmore at psu.edu Wed Nov 29 11:39:59 2000 From: rogilmore at psu.edu (Rick Gilmore) Date: Wed, 29 Nov 2000 11:39:59 -0500 Subject: Graduate Training at Penn State Message-ID: Please circulate the following graduate program announcement. Thanks, Rick Gilmore --------------------------------------------- Graduate Training in Psychobiology, Psychophysiology, and Neuroscience at Penn State The Psychology Department at Penn State invites students with excellent undergraduate academic records and research experience to apply for graduate training in psychobiology, psychophysiology, and neuroscience.
Faculty in the department have wide-ranging interests and offer a variety of personalized training experiences for graduate students. Faculty research specialties include motion sickness, nausea and appetite; affective neuroscience; chronobiology (time analysis of behavior's rhythms); developmental and clinical cognitive neuroscience; attention and information processing; and visual cognition. The department has particular expertise in and excellent facilities for research using psychophysiological (EEG, ERP, EGG, EKG) and behavioral measures. Penn State's Psychology Department offers an exceptionally congenial and collaborative environment for graduate students who are seeking individualized and hands-on training. Incoming students are guaranteed a competitive multi-year financial package. The Department has collaborative relationships with other neuroscience researchers who are part of the university-wide Life Sciences Consortium based at University Park and at the College of Medicine in Hershey. In addition, the Department expects to add a senior neuroscientist to the faculty by Fall 2001. Penn State is located in an area known for its affordability and exceptional quality of life. For more information about graduate training at Penn State, visit the Department's web site (http://psych.la.psu.edu) or contact one of the faculty members directly. Frederick Brown, Chronobiology (time analysis of behavior's rhythms) f3b at psu.edu, http://psych.la.psu.edu/faculty/brown.htm Rick Gilmore, Perceptual development, developmental cognitive neuroscience rogilmore at psu.edu, http://psych.la.psu.edu/faculty/gilmore.htm Cathleen Moore, Visual cognition cmm15 at psu.edu, http://psych.la.psu.edu/faculty/moore.htm Toby Mordkoff, Attention and information processing jtm12 at psu.edu, http://psych.la.psu.edu/faculty/mordkoff.htm Karen Quigley, Affective neuroscience ksq1 at psu.edu, http://psych.la.psu.edu/faculty/quigley.htm William Ray, Clinical cognitive neuroscience wjr at psu.edu, http://psych.la.psu.edu/faculty/ray.htm Robert Stern, Motion sickness, nausea, and appetite rs3 at psu.edu, http://psych.la.psu.edu/faculty/stern.htm From ken at phy.ucsf.edu Thu Nov 30 01:00:06 2000 From: ken at phy.ucsf.edu (Ken Miller) Date: Wed, 29 Nov 2000 22:00:06 -0800 (PST) Subject: UCSF Postdoctoral/Graduate Fellowships in Theoretical Neurobiology Message-ID: <14885.60646.421523.8769@coltrane.ucsf.edu> FULL INFO: http://www.sloan.ucsf.edu/sloan/sloan-info.html PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR CONTACT ADDRESSES GIVEN BELOW The Sloan Center for Theoretical Neurobiology at UCSF solicits applications for pre- and post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in mathematics, theoretical or experimental physics, or computer science, and commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. The Sloan Center offers opportunities to combine theoretical and experimental approaches to understanding the operation of the intact brain. Young scientists with strong theoretical backgrounds will receive scientific training in experimental approaches to understanding the operation of the intact brain. They will learn to integrate their theoretical abilities with these experimental approaches to form a mature research program in integrative neuroscience. 
The research undertaken by the trainees may be theoretical, experimental, or a combination of both.

Resident Faculty of the Sloan Center and their research interests include:

Herwig Baier: Genetic analysis of the visual system
William Bialek (25% time): Information-theoretic and statistical characterization of, and physical limits to, neural coding and representation
Allison Doupe: Development of song recognition and production in songbirds
Stephen Lisberger: Learning and memory in a simple motor reflex, the vestibulo-ocular reflex, and visual guidance of smooth pursuit eye movements by the cerebral cortex
Michael Merzenich: Experience-dependent plasticity underlying learning in the adult cerebral cortex, and the neurological bases of learning disabilities in children
Kenneth Miller: Circuitry of the cerebral cortex: its structure, self-organization, and computational function (primarily using cat primary visual cortex as a model system)
Philip Sabes: Sensorimotor coordination, adaptation and development of spatially guided behaviors, experience-dependent cortical plasticity
Christoph Schreiner: Cortical mechanisms of perception of complex sounds such as speech in adults, and plasticity of speech recognition in children and adults
Michael Stryker: Mechanisms that guide development of the visual cortex

There are also a number of visiting faculty, including Larry Abbott, Brandeis University; Sebastian Seung, MIT; David Sparks, Baylor University; and Steve Zucker, Yale University.

TO APPLY, please send a curriculum vitae, a statement of previous research and research goals, and up to three relevant publications, and have two letters of recommendation sent to us. The application deadline is February 1, 2001. Send applications to:

Sloan Center 2001 Admissions
Sloan Center for Theoretical Neurobiology at UCSF
Department of Physiology
University of California
513 Parnassus Ave.
San Francisco, CA 94143-0444

PRE-DOCTORAL applicants with strong theoretical training may seek admission into the UCSF Neuroscience Graduate Program as first-year students. Applicants seeking such admission must apply by Jan. 5, 2001 to be considered for fall 2001 admission. Application materials for the UCSF Neuroscience Program may be obtained from http://www.neuroscience.ucsf.edu/neuroscience/admission.html or from:

Pat Vietch
Neuroscience Graduate Program
Department of Physiology
University of California San Francisco
San Francisco, CA 94143-0444
neuroscience at phy.ucsf.edu

Be sure to include your surface-mail address. The procedure is: make a normal application to the UCSF Neuroscience Program, but also alert the Sloan Center of your application by writing to Steve Lisberger at the address given above.

If you need more information:
-- Consult the Sloan Center WWW Home Page: http://www.sloan.ucsf.edu/sloan
-- Send e-mail to sloan-info at phy.ucsf.edu
-- See also the home page for the W.M. Keck Foundation Center for Integrative Neuroscience, in which the Sloan Center is housed: http://www.keck.ucsf.edu/

From henkel at physik.uni-bremen.de Thu Nov 30 03:33:48 2000
From: henkel at physik.uni-bremen.de (Rolf D. Henkel)
Date: Thu, 30 Nov 2000 09:33:48 +0100
Subject: Follow-Up on TR "Sync & Coherence-Detection"
Message-ID: <00113009425000.16709@axon>

Dear Connectionists,

Some people had difficulties accessing the technical report

Title: Synchronization, Coherence-Detection and Three-Dimensional Vision
Author: Rolf D. Henkel, Institute for Theoretical Neurophysics
Keywords: Synchronization, Coherence, Neural Code, Neural Computations, Robust Estimators, Three-Dimensional Vision, Integrate-and-Fire Neurons

The report proposes a new operational mode for networks of spiking neurons. A more easily accessible webpage is now available at http://axon.physik.uni-bremen.de/~rdh/research/stereo/spiking/

Also, the principal idea of coherence detection can be tested online, with your own data, at http://axon.physik.uni-bremen.de/~rdh/online_calc/stereo/

Regards,
Rolf Henkel

--
Dr. Rolf Henkel
Institute for Theoretical Neurophysics, University of Bremen
Kufsteiner Straße 1, D-28359 Bremen
Phone: +49-421-218-3688
Fax: +49-421-218-4869
henkel at physik.uni-bremen.de
http://axon.physik.uni-bremen.de/