From qian at brahms.cpmc.columbia.edu Sun Apr 1 16:59:21 2001 From: qian at brahms.cpmc.columbia.edu (Ning Qian) Date: Sun, 1 Apr 2001 16:59:21 -0400 Subject: paper available: Modeling V1 Disparity Tuning to Time-varying Stimuli Message-ID: <200104012059.QAA29825@bach.cpmc.columbia.edu> Dear Colleagues, The following paper on "Modeling V1 Disparity Tuning to Time-varying Stimuli" is available at: http://brahms.cpmc.columbia.edu/publications/v1time.ps.gz Best regards, Ning ---------------------------------------------------- Modeling V1 Disparity Tuning to Time-varying Stimuli Yuzhi Chen, Yunjiu Wang, and Ning Qian, J. Neurophysiol. (in press). Abstract Most models of disparity selectivity consider only the spatial properties of binocular cells. However, the temporal response is an integral component of real neurons' activities, and time-varying stimuli are often used in disparity-tuning experiments. To understand the temporal dimension of V1 disparity representation, we incorporate a specific temporal response function into the disparity energy model, and demonstrate that the binocular interaction of complex cells is separable into a Gabor disparity function and a positive time function. We then investigate how the model simple and complex cells respond to widely used time-varying stimuli, including motion-in-depth patterns, drifting gratings, moving bars, moving random dot stereograms, and dynamic random dot stereograms. We find that both model simple and complex cells show more reliable disparity tuning to time-varying stimuli than to static stimuli, but similarities in the disparity tuning between simple and complex cells depend on the stimulus. Specifically, the disparity tuning curves of the two cell types are similar to each other for either drifting sinusoidal gratings or moving bars. 
In contrast, when the stimuli are dynamic random dot stereograms, the disparity tuning of simple cells is highly variable, whereas the tuning of complex cells remains reliable. Moreover, cells with similar motion preferences in the two eyes cannot be truly tuned to motion in depth, regardless of the stimulus type. These simulation results are consistent with a large body of extant physiological data, and provide some specific, testable predictions. From qian at brahms.cpmc.columbia.edu Mon Apr 2 15:20:35 2001 From: qian at brahms.cpmc.columbia.edu (Ning Qian) Date: Mon, 2 Apr 2001 15:20:35 -0400 Subject: postdoc position available at Columbia Message-ID: <200104021920.PAA30812@bach.cpmc.columbia.edu> Postdoctoral Position in Visual Psychophysics or Modeling Center for Neurobiology and Behavior Columbia University A postdoctoral position in visual psychophysics or computational modeling of vision is available immediately in Dr. Ning Qian's lab at Columbia. The postdoc will be working on vision-related projects including (but not restricted to) binocular depth perception and rivalry, motion perception, structure from motion, and visual perceptual learning. Details of the ongoing projects and recent publications can be found at the webpage: http://brahms.cpmc.columbia.edu Funding is available for at least three years. The initial appointment will be for one year, and will be renewable on a yearly basis. The candidate should have a strong background in either psychophysics or mathematical/computational modeling, as evidenced by first-authored publications. Programming skills in Matlab or C will be a plus. Please send CV, two to three letters of reference, and representative publications to: Dr. Ning Qian Ctr. Neurobiology & Behavior Columbia University Annex Rm 730 722 W. 168th St. New York, NY 10032, USA nq6 at columbia.edu 212-543-5213 E-mail applications and inquiries welcome. 
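As an aside for readers of the disparity-tuning paper announcement above: the disparity energy model it builds on can be sketched numerically in a few lines. The sketch below (all parameter values are illustrative, not taken from the paper) builds a model complex cell from a quadrature pair of binocular simple cells with Gabor receptive fields; its response to a bar stimulus peaks when the stimulus disparity matches the cell's preferred disparity.

```python
import numpy as np

def gabor(x, sigma=0.5, freq=2.0, phase=0.0):
    """1D Gabor receptive-field profile: Gaussian envelope times cosine carrier."""
    return np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x + phase)

def complex_cell_energy(stimulus_disparity, pref_disparity, x=None):
    """Disparity energy: sum of squared binocular simple-cell responses.

    Preferred disparity is encoded here as a position shift between the
    left- and right-eye receptive fields (the position-shift variant).
    """
    if x is None:
        x = np.linspace(-3, 3, 601)
    dx = x[1] - x[0]
    # A bright bar at position 0 in the left eye, shifted by the
    # stimulus disparity in the right eye.
    left_img = np.exp(-x**2 / 0.02)
    right_img = np.exp(-(x - stimulus_disparity)**2 / 0.02)
    energy = 0.0
    for phase in (0.0, np.pi / 2):        # quadrature pair of simple cells
        l = np.sum(gabor(x, phase=phase) * left_img) * dx
        r = np.sum(gabor(x - pref_disparity, phase=phase) * right_img) * dx
        energy += (l + r) ** 2            # squared binocular sum
    return energy

# Tuning curve over stimulus disparity for a cell preferring 0.25:
disparities = np.linspace(-1, 1, 81)
curve = [complex_cell_energy(d, pref_disparity=0.25) for d in disparities]
best = disparities[int(np.argmax(curve))]   # peaks at the preferred disparity
```

The paper's point about time-varying stimuli would add a temporal response function multiplying this spatial profile; the sketch covers only the static spatial core of the model.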
From zemel at cs.toronto.edu Mon Apr 2 16:40:44 2001 From: zemel at cs.toronto.edu (Richard Zemel) Date: Mon, 2 Apr 2001 16:40:44 -0400 Subject: NIPS*2001 Call For Papers Message-ID: <01Apr2.164049edt.453159-23025@jane.cs.toronto.edu> CALL FOR PAPERS -- NIPS*2001 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Dec. 3 -- Saturday, Dec. 8, 2001 Vancouver, British Columbia, Canada ========================================== This is the fifteenth meeting of an interdisciplinary conference that brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Dec. 3), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 7-8). Invited speakers this year will be Barbara Finlay (Departments of Psychology, and Neurobiology and Behavior, Cornell University), Alison Gopnik (Department of Psychology, University of California at Berkeley), Jon M. Kleinberg (Department of Computer Science, Cornell University), Shihab Shamma (Department of Electrical Engineering, University of Maryland), Judea Pearl (Department of Computer Science, UCLA), and Tom Knight (Artificial Intelligence Laboratory, MIT). Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. 
Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, kernel methods, committee models, graphical models, support vector machines, Gaussian processes, decision trees, factor analysis, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, web and network applications, intrusion and fraud detection, bio-informatics, medical diagnosis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music and video applications, animation, virtual environments. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, human learning and memory, conditioning, categorization, attention, language, reasoning, spatial cognition, emotional cognition, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, neuromorphic engineering, microelectromechanical systems, optical systems, vision chips, head-eye systems, cochlear implants, roving robots, computational sensors and actuators, molecular and quantum computing, novel neurodevices, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, population codes, temporal codes, spike train analysis, and experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, computational models of classical and operant conditioning, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, game-playing, Markov decision processes, multi-agent cooperative algorithms. 
Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory. Visual Processing: image processing, image coding, object recognition, face recognition, visual feature detection, visual psychophysics, stereopsis, optic flow algorithms, motion detection and tracking, spatial representations, spatial attention, scene analysis, visual search, visuo-spatial working memory. ---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within an 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. 
Authors are required to use the NIPS LaTeX style files obtainable from the web page listed below. The style files are unchanged from NIPS*2000. Submission Instructions: NIPS accepts only electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We will accept postscript and PDF documents, but we prefer postscript. The electronic submission page will be available on June 6, 2001. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT JUNE 20, 2001 PACIFIC DAYLIGHT TIME (08:00 GMT JUNE 21, 2001). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS For general inquiries or requests for registration material, send e-mail to nipsinfo at salk.edu or fax to (619)587-0417. NIPS*2001 Organizing Committee: General Chair, Tom Dietterich, Oregon State University; Program Chair, Sue Becker, McMaster University; Publications Chair, Zoubin Ghahramani, University College London; Tutorial Chair, Yoshua Bengio, University of Montreal; Workshops Co-Chairs, Virginia de Sa, Sloan Center for Theoretical Neurobiology, Barak Pearlmutter, University of New Mexico; Publicity Chair, Richard Zemel, University of Toronto; Volunteer Coordinator, Sidney Fels, University of British Columbia; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Alex Gray, Carnegie Mellon University, Xin Wang, Oregon State University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*2001 Program Committee: Sue Becker, McMaster University (chair); Gert Cauwenberghs, Johns Hopkins University; Bill Freeman, Mitsubishi Electric Research Lab; Thomas Hofmann, Brown University; Dan Lee, Bell Laboratories, Lucent Technologies; Sridhar Mahadevan, Michigan State University; Marina Meila-Predoviciu, University of Washington; Klaus Mueller, GMD First, Berlin; Klaus Obermayer, TU Berlin; Sam Roweis, Gatsby Computational Neuroscience Unit, UCL; John Shawe-Taylor, Royal Holloway, University of London; Josh Tenenbaum, Stanford University; Volker Tresp, Siemens, Munich; Richard Zemel, University of Toronto. PAPERS MUST BE SUBMITTED BY JUNE 20, 2001 From mieko at isd.atr.co.jp Mon Apr 2 22:15:13 2001 From: mieko at isd.atr.co.jp (Mieko Namba) Date: Tue, 3 Apr 2001 11:15:13 +0900 Subject: CALL FOR PAPERS [Neural Networks 2002 Special Issue] Message-ID: Dear members, We are glad to inform you that the Japanese Neural Networks Society will edit the NEURAL NETWORKS 2002 Special Issue as below. NEURAL NETWORKS is the official journal of the International Neural Networks Society, the European Neural Networks Society, and the Japanese Neural Networks Society. We are looking forward to receiving your contributions. Mitsuo Kawato Co-Editor-in-Chief Neural Networks (ATR) ****************************************************************** CALL FOR PAPERS Neural Networks 2002 Special Issue "Computational Models of Neuromodulation" ****************************************************************** Co-Editors Dr. Kenji Doya, ATR, Japan Dr. Peter Dayan, University College London, U.K. Professor Michael E. Hasselmo, Boston University, U.S.A. Submission Deadline for submission: September 30, 2001 Notification of acceptance: January 31, 2002 Format: as for normal papers in the journal (APA format) and no longer than 10,000 words Address for Papers Dr. Mitsuo Kawato ATR 2-2-2 Hikaridai, Seika-cho Soraku-gun, Kyoto 619-0288, Japan. 
MORE DETAIL: http://www.isd.atr.co.jp/nip/NNSP2002.html ****************************************************************** Neuromodulators such as acetylcholine, dopamine, norepinephrine and serotonin exert widespread and diverse computational influences, based on a range of subtle cellular effects and a non-specific and diffuse pattern of anatomical connectivity. The roles for neuromodulators can often be characterised in terms of meta-learning, that is, regulation of global parameters and the structure of a learning system. Their specific roles include the prediction of reward and punishment, the allocation of selective attention, the regulation of behavioral variability, and the control of memory acquisition. Drugs influencing these modulatory systems have profound effects on neural network dynamics and plasticity, and thus on cognition and behavior. Computational modeling is essential to understand the effects of such subtle cellular changes on the macroscopic function of neural networks. The Special Issue will focus on computational models of neuromodulation. Contributed articles covering the range of neuromodulatory effects are solicited, and integrative accounts will be especially welcome. The special issue will include theories of meta-learning, such as automatic tuning of learning rates and noise for exploration, and models of the roles of particular neuromodulators in particular systems, such as acetylcholine in the hippocampus and dopamine in the pre-frontal cortex. Models of the activities and interactions of cells releasing the neuromodulators, their roles in behavioral and cognitive functions, and models of invertebrate neuromodulation will also be welcome. ****************************************************************** end. 
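To make the meta-learning notion in the call above concrete, here is a toy sketch (entirely illustrative; the function names, update rule, and constants are our own, not from the call) in which a scalar gain, playing the role of a neuromodulatory signal, automatically tunes the learning rate of gradient descent: the rate grows while successive gradients agree in sign and shrinks when they disagree, a delta-bar-delta-style heuristic.

```python
import numpy as np

def adapt_lr_sgd(grad, w0, lr0=0.01, up=1.1, down=0.5, steps=200):
    """Gradient descent with a self-tuned learning rate (toy meta-learning).

    A scalar 'neuromodulatory' gain scales the step size: it grows when
    successive gradients agree in sign (smooth progress) and shrinks when
    they disagree (oscillation around the minimum).
    """
    w, lr = float(w0), lr0
    prev_g = 0.0
    for _ in range(steps):
        g = grad(w)
        lr *= up if g * prev_g > 0 else down if g * prev_g < 0 else 1.0
        lr = min(lr, 1.0)          # keep the step size bounded
        w -= lr * g
        prev_g = g
    return w, lr

# Minimize f(w) = (w - 3)^2; its gradient is 2*(w - 3).
w_star, final_lr = adapt_lr_sgd(lambda w: 2.0 * (w - 3.0), w0=-5.0)
```

The same two-level structure (a slow process regulating the parameters of a fast learning process) is what the call describes at the level of acetylcholine, dopamine, and the other modulators.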
From duch at phys.uni.torun.pl Tue Apr 3 06:02:08 2001 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Tue, 3 Apr 2001 11:02:08 +0100 Subject: CALL FOR PAPERS [TASK Quarterly 2002 Special Issue] Message-ID: Dear members, The Polish Neural Networks Society plans a special issue of the TASK Quarterly Journal on neural networks. Of particular interest to us are longer papers of a tutorial nature, perhaps extended versions of papers that you prepared for other journals or conferences but had to shorten, leaving some important details unpublished. ****************************************************************** CALL FOR PAPERS TASK Quarterly 2002 Special Issue "Neural Networks" ****************************************************************** This special issue will be devoted to neural and other computational intelligence methods and applications. Special emphasis will be put on: models that do more than classification and approximation; systems that go beyond the associative capabilities of simple networks, for example systems not based on vectors in feature spaces of fixed dimensions; understanding the data and building theories; modular networks; multi-strategy learning; general theories of learning; hybrid systems, such as the neuro-fuzzy systems combining neural and symbolic components; simulations of brain functions and models with spiking neurons. Deadline for submission: September 30, 2001 Notification of acceptance: January 31, 2002 Format: no restrictions on length, detailed instructions on the journal WWW page: http://www.task.gda.pl/quart/ Editors Prof. Wlodzislaw Duch, Nicholas Copernicus University, Poland. Prof. Danuta Rutkowska, Technical University of Czestochowa, Poland Send your papers via email to: duch at ieee.org ****************************************************************** The TASK Quarterly journal http://www.task.gda.pl/quart/ started in 1997 and maintains a high standard, with a strict refereeing system. 
The journal covers all subjects of a computational nature, and accepts and prints color illustrations (free of charge). It is covered by INSPEC abstracts and has about 400 regular subscribers (not bad for a scientific journal). So far two special issues on informatics in medicine have been published. The editors may send you sample copies if you send an email to the quarterly at task.gda.pl address. ****************************************************************** Wlodzislaw Duch Dept. of Computer Methods, N. Copernicus University http://www.phys.uni.torun.pl/~duch From duch at phys.uni.torun.pl Tue Apr 3 12:52:09 2001 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Tue, 3 Apr 2001 17:52:09 +0100 Subject: Review of PNL NN site Message-ID: Dear members, Very nice neural networks web information has been provided for some time at the PNL site http://www.emsl.pnl.gov:2080/proj/neuron/neural/ IEEE Transactions on Neural Networks intends to publish reviews of interesting neural sites and other media. Unfortunately, for more than a year I have not been able to get any response from the site developers. Please let me know if there are any volunteers to review this site. You are welcome to suggest other sites for review. I'd like to thank all book review volunteers who responded to my previous call. Rest assured that whenever an appropriate book arrives I'll contact you. Wlodzislaw Duch, duch at ieee.org Dept. of Computer Methods, N. Copernicus University http://www.phys.uni.torun.pl/~duch From robert at physik.uni-wuerzburg.de Tue Apr 3 08:03:21 2001 From: robert at physik.uni-wuerzburg.de (Robert Urbanczik) Date: Tue, 3 Apr 2001 14:03:21 +0200 (CEST) Subject: Paper on SVMs Message-ID: Dear Connectionists, the following paper (5 pages, to appear in Phys. Rev. Letts.) is available from: ftp://ftp.physik.uni-wuerzburg.de/pub/preprint/2001/WUE-ITP-2001-006.ps.gz M. Opper and R. 
Urbanczik Universal learning curves of support vector machines ABSTRACT: Using methods of Statistical Physics, we investigate the role of model complexity in learning with support vector machines (SVMs), which are an important alternative to neural networks. We show the advantages of using SVMs with kernels of infinite complexity on noisy target rules, which, in contrast to common theoretical beliefs, are found to achieve optimal generalization error although the training error does not converge to the generalization error. Moreover, we find universal asymptotic behavior of the learning curves, which depends only on the target rule and not on the SVM kernel. _________________________________________________________________________ R. Urbanczik Email: Inst. for Theoretical Physics III urbanczik at physik.uni-wuerzburg.de University Wuerzburg Phone: Am Hubland ++49 931 888 4908 97074 Wuerzburg Fax: Germany ++49 931 888 5141 _________________________________________________________________________ From Nigel.Goddard at ed.ac.uk Tue Apr 3 18:21:05 2001 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Tue, 03 Apr 2001 23:21:05 +0100 Subject: fMRI Research Associate and Ph.D Studentships Message-ID: <3ACA4CD1.687A9068@ed.ac.uk> OPPORTUNITIES IN FUNCTIONAL MRI & NEUROINFORMATICS AT EDINBURGH The Centre for Functional Imaging Studies has the following position and studentships available. For details see http://anc.ed.ac.uk/CFIS. 1. Research Associate in Functional MRI. An outstanding opportunity to gain a wide range of experience in all of the techniques used in fMRI studies and across a wide range of studies. Closing April 13th. Prospective applicants should consult the website and contact Nigel.Goddard at ed.ac.uk by email as soon as possible. 2. A possible MRC-funded Ph.D. studentship in memory impairment in schizophrenia. Closing April 29th. 3. An MRC-funded Ph.D. 
studentship with superior stipend, focused on methodology in the context of one or more of our ongoing fMRI studies. Closing May 1st. The Centre for Functional Imaging Studies at the University of Edinburgh has been established to provide a focal grouping for expertise and experience in the methodologies used in functional brain imaging. The Centre undertakes and assists with imaging-based studies of brain function, working with research groups at Edinburgh and elsewhere. Our current focus is studies of cognitive function using functional Magnetic Resonance Imaging in collaboration with the SHEFC Brain Imaging Research Centre at the Western General Hospital, which houses the research-dedicated MRI scanner (see the BIRC website at http://www.dcn.ed.ac.uk/bic). We have extensive facilities for fMRI studies including a parallel computer, high-speed networking, a large online data archive, state-of-the-art stimulus presentation and paradigm programming software and hardware, and a custom-built simulator. We have a wide range of experience in paradigm design, with ongoing projects including clinical studies of schizophrenia, depression, pain, conversion and sleep disorders; scientific studies in language, decision-making, memory and affect; and methodological studies in realtime fMRI, repeatability, and statistical data analysis. From P.J.Lisboa at livjm.ac.uk Wed Apr 4 07:33:01 2001 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Wed, 4 Apr 2001 12:33:01 +0100 Subject: Industrial use of safety-related artificial neural networks Message-ID: A contract research report on industrial use of safety-related artificial neural networks has been published on the web by the contractors, the UK's Health and Safety Executive. A link address and abstract are appended to this email. This is in the nature of a consultation paper, so feedback regarding any aspect of the paper is very welcome. Paulo Lisboa. 
http://www.hse.gov.uk/research/crr_pdf/2001/crr01327.pdf Abstract The overall objective of this study is to investigate to what extent neural networks are used, and are likely to be used in the near future, in safety-related applications. Neural network products are actively being marketed and some are routinely used in safety-related areas, including cancer screening and fire detection in office blocks. Some are medical devices already certified by the FDA. The commercial potential for this technology is evident from the extent of industry-led research, and safety benefits will arise. In the process industries, for instance, there is real potential for closer plant surveillance and consequently productive maintenance, including plant life extension. It is clear from the applications reviewed that the key to successful transfer of neural networks to the marketplace is successful integration with routine practice, rather than optimisation for the idealised environments where much of the current development effort takes place. This requires the ability to evaluate their empirically derived response using structured domain knowledge, as well as performance testing. In controller design, the scalability of solutions to production models, and the need to maintain safe and efficient operation under plant wear, have led to the integration of linear design methods with neural network architectures. Further research is necessary in two directions, first to systematise current best practice in the design of a wide range of quite different neural computing software models and hardware systems, then to formulate a unified perspective of high-complexity computation in safety-related applications. There is a need to develop guidelines for good practice, to educate non-specialist users and inform what is already a wide base of practitioners. 
Combined with a safety awareness initiative, this would be of as much benefit to the development of this commercially important new technology as to its safe use in safety-related applications. From patrick at neuro.kuleuven.ac.be Wed Apr 4 09:00:43 2001 From: patrick at neuro.kuleuven.ac.be (Patrick De Maziere) Date: Wed, 4 Apr 2001 15:00:43 +0200 (MET DST) Subject: JOINT PUBLICATION OF 2 BOOKS ON SELF-ORGANIZING TOPOGRAPHIC MAPS Message-ID: JOINT PUBLICATION OF 2 BOOKS ON SELF-ORGANIZING TOPOGRAPHIC MAPS ================================================================ Both books, one in English and another in Japanese, offer a new perspective on topographic map formation and the advantages of information-based learning. The complete learning algorithms and simulation details are given throughout, along with comparative performance analysis tables and extensive references. The books provide an excellent, eye-opening guide for neural network researchers and students, industrial scientists involved in data mining, and anyone interested in self-organization and topographic maps. Forewords are by Prof. Teuvo Kohonen and Prof. Helge Ritter. English version: Faithful Representations and Topographic Maps: From Distortion- to Information-based Self-organization by Marc M. Van Hulle, published by J. Wiley (New York) For more information, visit: http://catalog2.wiley.com/catalog/frameset/1,8279,,00.html (search via author name "Van Hulle") and to order it, visit: http://www.amazon.com/exec/obidos/ASIN/0471345075/qid=948382599/sr=1-1/002-0713799-7248240 (which includes several reviews) ------------------------- Japanese version: Self-organizing Maps: Theory, Design, and Application by Marc M. 
Van Hulle, Heizo Tokutaka, Kikuro Fujimura, published by Kaibundo (Tokyo) For more information, visit: http://member.nifty.ne.jp/kaibundo/syousai/ISBN4-303-73150-1.htm and to order it, visit: http://www.amazon.co.jp/exec/obidos/ASIN/4303731501/qid%3D986382452/249-5665754-6235519 ------------------------- Reviews: "I am convinced that this book marks an important contribution to the field of topographic map representations and that it will become a major reference for many years." (Ritter) "This book will provide a significant contribution to our theoretical understanding of the brain." (Kohonen) ------------------------- From jaap at murre.com Wed Apr 4 12:09:07 2001 From: jaap at murre.com (Jaap Murre) Date: Wed, 4 Apr 2001 18:09:07 +0200 Subject: Models of language acquisition Message-ID: <003301c0bd21$94852ca0$03000004@psy.uva.nl> Dear Connectionists, Recently two new books edited by us came out that may be of interest to you, in particular the first one. -- Jaap Murre Broeder, P., and J.M.J. Murre (Eds.) (2000). 'Models of Language Acquisition: Inductive and Deductive Approaches'. Oxford University Press. Broeder, P., and J.M.J. Murre (Eds.) (1999). 'Language and Thought in Development. Cross-Linguistic Studies'. Tuebingen: Gunter Narr. The contents of 'Models of Language Acquisition' are: Peter Broeder and Jaap Murre -- 1. Introduction to models of language acquisition Brian MacWhinney -- 2. Lexicalist connectionism Noel Sharkey, Amanda Sharkey, and Stuart Jackson -- 3. Are SRNs sufficient for modelling language acquisition? Antal van den Bosch and Walter Daelemans -- 4. A distributed, yet symbolic model for text-to-speech processing Steven Gillis, Walter Daelemans, and Gert Durieux -- 5. 'Lazy learning': a comparison of natural and machine learning of word stress Richard Shillcock, Paul Cairns, Nick Chater, and Joe Levy -- 6. Statistical and connectionist modelling of the development of speech segmentation Jeffrey Mark Siskind -- 7. 
Learning word-to-meaning mappings Gary Marcus -- 8. Children's overregularization and its implication for cognition Rainer Goebel and Peter Indefrey -- 9. The performance of a recurrent network with short term memory capacity learning the German -s plural Ramin Nakisa, Kim Plunkett, and Ulrike Hahn -- 10. A cross-linguistic comparison of single and dual-route models of inflectional morphology Partha Niyogi and Robert C. Berwick -- 12. Formal models for learning in the principles and parameters framework Loeki Elbers -- 13. An output-as-input hypothesis for language acquisition: arguments, model, evidence From Volker.Tresp at mchp.siemens.de Thu Apr 5 05:35:52 2001 From: Volker.Tresp at mchp.siemens.de (Volker Tresp) Date: Thu, 05 Apr 2001 11:35:52 +0200 Subject: NIPS Proceedings Available Online Message-ID: <3ACC3C78.BC7FE1A1@mchp.siemens.de> NIPS PROCEEDINGS AVAILABLE ONLINE The papers that will be published by MIT Press in Advances in Neural Information Processing Systems 13 (Proceedings of the 2000 Conference) edited by Todd K. Leen, Thomas G. Dietterich and Volker Tresp are available online. The URL is http://www.cs.cmu.edu/Web/Groups/NIPS/NIPS2000/00abstracts.html Best regards, Volker Tresp NIPS publication chair for NIPS*2000. From mel at lnc.usc.edu Thu Apr 5 17:00:35 2001 From: mel at lnc.usc.edu (Bartlett Mel) Date: Thu, 05 Apr 2001 14:00:35 -0700 Subject: Faculty Position(s) in Neural Engineering Message-ID: <3ACCDCF3.38D8F4F3@lnc.usc.edu> ************* Announcing ************** University of Southern California Department of Biomedical Engineering FACULTY POSITIONS The Department of Biomedical Engineering at USC is engaged in a major expansion of its research and educational programs, supported through the School of Engineering, the Alfred E. Mann Institute and a Special Opportunity Award from the Whitaker Foundation. 
For the first phase of this expansion we invite applications for tenure-track faculty positions at all levels in the areas of device/diagnostic technologies and neural systems. In the area of neural systems the successful candidate will be expected to complement the Department's existing strengths by establishing an independent research program in areas such as: computational neural science; sensory systems; motor control; neural/device interfaces; neural prostheses. The successful candidates in device/diagnostic technologies will provide leadership in establishing research and educational programs leading to the next generation of biomedical device and diagnostic technologies. Areas of interest include molecular and chemical sensing; biochemical, mechanical, optical microsystems; smart sensor and diagnostic technologies; implantable devices. Faculty will have the opportunity to work with the technology development professionals of the Alfred E. Mann Institute for Biomedical Engineering at USC, to translate their fundamental research discoveries into commercially viable biomedical technologies to improve human health and well-being. Applicants should submit a curriculum vitae and research/education statement along with suggested references to: David Z. D'Argenio, Chair Department of Biomedical Engineering University of Southern California Los Angeles, CA 90089-1451 Applicants are encouraged to visit the following web sites for details on current educational and research programs. BME Department - http://bme.usc.edu Center for Neural Engineering - http://www.usc.edu/dept/engineering/CNE Neuroscience Graduate Program - http://www.usc.edu/dept/nbio/ngp Neural Computation at USC - http://www-slab.usc.edu/neurocomp The University of Southern California is an Equal Opportunity/Affirmative Action Employer and Encourages Applications from Women and Minority Candidates. -- Bartlett W. 
Mel, Assoc Prof Biomed Engin, Neurosci Grad Prog USC, BME Dept, MC 1451, Los Angeles, CA 90089 mel at usc.edu, http://LNC.usc.edu voice: (213)740-0334, lab: -3397, fax: -0343 fedex: 3650 McClintock Ave, 500 Olin Hall From robert at physik.uni-wuerzburg.de Tue Apr 3 08:03:21 2001 From: robert at physik.uni-wuerzburg.de (Robert Urbanczik) Date: Tue, 3 Apr 2001 14:03:21 +0200 (CEST) Subject: Paper on SVMs Message-ID: Dear Connectionists, the following paper (5 pages, to appear in Phys. Rev. Letts.) is available from: ftp://ftp.physik.uni-wuerzburg.de/pub/preprint/2001/WUE-ITP-2001-006.ps.gz M. Opper and R. Urbanczik Universal learning curves of support vector machines ABSTRACT: Using methods of Statistical Physics, we investigate the role of model complexity in learning with support vector machines (SVMs), which are an important alternative to neural networks. We show the advantages of using SVMs with kernels of infinite complexity on noisy target rules, which, in contrast to common theoretical beliefs, are found to achieve optimal generalization error although the training error does not converge to the generalization error. Moreover, we find a universal asymptotics of the learning curves which depends only on the target rule and not on the SVM kernel. _________________________________________________________________________ R. Urbanczik Email: Inst.
for Theoretical Physics III urbanczik at physik.uni-wuerzburg.de University Wuerzburg Phone: Am Hubland ++49 931 888 4908 97074 Wuerzburg Fax: Germany ++49 931 888 5141 _________________________________________________________________________ From bokil at physics.bell-labs.com Fri Apr 6 18:48:44 2001 From: bokil at physics.bell-labs.com (Hemant Bokil) Date: Fri, 6 Apr 2001 18:48:44 -0400 (EDT) Subject: WAND 2001 Woods Hole Message-ID: WORKSHOP ON THE ANALYSIS OF NEURAL DATA Modern methods and open issues in the analysis and interpretation of multivariate time series and imaging data in the neurosciences August 20 -- September 1, 2001 Marine Biological Laboratory, Woods Hole, MA A working group of scientists committed to quantitative approaches to problems in neuroscience will meet again this summer to focus on theoretical and experimental issues related to the analysis of single and multichannel data sets. As in past years, we expect that a distinguishing feature of the work group will be a close collaboration between experimentalists and theorists with regard to the analysis of data. There will be a limited number of research lectures, supplemented by tutorials on relevant computational, experimental, and mathematical techniques. The topics covered will include the analysis of point process data (spike trains) as well as continuous processes (LFP, imaging data). It has become clear in recent years that issues relating to the "neural code" can be concretely investigated in the context of neural prosthetic devices. We are therefore planning two miniworkshops: (i) Neuronal control signals for prosthetic devices; (ii) Timing issues: departures of spike trains from rate-varying Poisson processes.
We will also have a third miniworkshop on (iii) Statistical inference for fMRI time series We should be able to accommodate about twenty-five participants, both experimentalists and theorists, and encourage graduate students, postdoctoral fellows, as well as senior researchers to apply. Experimentalists are encouraged to bring data records; appropriate computational facilities will be provided. PARTICIPANT FEE: $300 Participants will be provided with shared accommodations and board. Support: National Institutes of Health -- NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA and NINDS. Organizers: Partha P. Mitra (Bell Laboratories, Lucent Technologies), Emery Brown (Massachusetts General Hospital) and David Kleinfeld (UCSD) Website: www.vis.caltech.edu/~WAND Application: Send a copy of your c.v. together with a cover letter that contains a brief (ca. 200 word) paragraph on why you wish to attend the work group to: Ms. Jean B. Ainge Bell Laboratories, Lucent Technologies 700 Mountain Avenue 1D-427 Murray Hill, NJ 07974 908-582-4702 (fax) or Graduate students and postdoctoral fellows are encouraged to include a brief letter of support from their research advisor. Applications must be received before 25 May 2001. Participants will be confirmed on or before 1 June 2001
The meeting will be hosted by the Brain and Vision Research Laboratory Department of Biomedical Engineering Boston University and will be held May 23 (9am-6pm) Engineering Research Building (ERB) Room 203 44 Cummington Street Boston, MA 02215 REGISTRATION: DEADLINE MAY 10TH (registration is free) If you are interested in participating, please submit a one-page position paper by May 10th. Position papers should be sent by e-mail to Lucia Vaina, vaina at engc.bu.edu. If you are interested in attending the meeting but do not wish to present, please send an e-mail as well to RSVP. We encourage all participants to send suggestions for discussion points. Submissions received by May 10 will be posted on the Brain and Vision Research Laboratory website by May 15 together with a detailed schedule: http://www.bu.edu/eng/labs/bravi/ under OPTIC FLOW AND BEYOND. *************************************************************** Tentative schedule: 9am Coffee and bagels 10am-1pm Short presentations from those who have submitted position papers. (Presentation time will be EQUALLY divided among all speakers). 1pm-2:30pm Lunch 2:30pm-5:30pm Discussion of key points in current research 5:30pm-6pm Discuss the publication of a book with contributions from the participants. ************************************************************** FORMAT: The morning presentations will consist of short summaries of the key points of each speaker's research (Suggested length: ~3-5 slides). During the afternoon discussion session, slides may also be used. Please volunteer if you wish to lead a discussion topic! The room has approximately 50 seats together with a computer projector and audio-visual equipment. Slides and overhead projectors are available upon request. ACCOMMODATIONS: While we are unable to cover accommodations for those who attend, breakfast and lunch will be offered to all participants.
For additional information contact Lucia or Scott at 617-353-9144 or e-mail: vaina at engc.bu.edu We look forward to hearing from you, Scott, Simon and Lucia From cjlin at csie.ntu.edu.tw Sun Apr 8 20:02:53 2001 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Mon, 09 Apr 2001 08:02:53 +0800 Subject: a new paper on SVM Message-ID: Dear Colleagues: We announce a new paper on support vector machines: A comparison of methods for multi-class support vector machines by Chih-Wei Hsu and Chih-Jen Lin. http://www.csie.ntu.edu.tw/~cjlin/papers/multisvm.ps.gz Abstract: Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend them to multi-class classification is still an on-going research issue. Several methods have been proposed in which a multi-class classifier is typically constructed by combining several binary classifiers. Some authors have also proposed methods that consider all classes of data at once. As it is computationally more expensive to solve multi-class problems, comparisons of these methods on large-scale problems have not been seriously conducted. Especially for methods solving multi-class SVMs in one step, a much larger optimization problem is required, so up to now experiments have been limited to small data sets. In this paper we give a decomposition implementation for two such ``all-together" methods: (Vapnik 1998; Weston and Watkins 1998) and (Crammer and Singer 2000). We then compare their performance with three methods based on binary classification: ``one-against-all,'' ``one-against-one,'' and DAGSVM (Platt et al. 2000). Our experiments indicate that the ``one-against-one'' and DAG methods are more suitable for practical use than the other methods. Results also show that for large problems the methods that consider all data at once generally need fewer support vectors. Any comments are very welcome. Best Chih-Jen Lin Dept. of Computer Science National Taiwan Univ.
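The "one-against-one" scheme that the abstract above finds most practical is simple to prototype: train one binary SVM per pair of classes, then classify by majority vote. The sketch below is an illustration only, not the authors' decomposition implementation; it assumes scikit-learn's SVC as the binary learner, and the RBF kernel is an arbitrary choice.

```python
# Sketch of the "one-against-one" multi-class SVM scheme: for k classes,
# train k(k-1)/2 binary SVMs (one per class pair) and predict by voting.
# Illustrative only -- uses scikit-learn's SVC as the binary learner.
from itertools import combinations
import numpy as np
from sklearn.svm import SVC

class OneAgainstOneSVM:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.machines_ = {}
        for a, b in combinations(self.classes_, 2):
            mask = (y == a) | (y == b)          # keep only the two classes
            clf = SVC(kernel="rbf", gamma="scale")
            clf.fit(X[mask], y[mask])
            self.machines_[(a, b)] = clf
        return self

    def predict(self, X):
        index = {c: i for i, c in enumerate(self.classes_)}
        votes = np.zeros((len(X), len(self.classes_)), dtype=int)
        for clf in self.machines_.values():
            for row, label in enumerate(clf.predict(X)):
                votes[row, index[label]] += 1   # majority vote over pairs
        return self.classes_[votes.argmax(axis=1)]
```

For k classes this builds k(k-1)/2 pairwise machines; the DAGSVM method compared in the paper uses the same set of pairwise machines but evaluates them along a directed acyclic graph instead of tallying votes.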
From swilke at physik.uni-bremen.de Mon Apr 9 08:18:29 2001 From: swilke at physik.uni-bremen.de (Stefan Wilke) Date: Mon, 09 Apr 2001 14:18:29 +0200 Subject: Paper on Population Coding available Message-ID: <3AD1A894.701C4612@physik.uni-bremen.de> Dear Connectionists, the following preprint is available for downloading: http://www-neuro.physik.uni-bremen.de/institute/publications/download/swilke/WilkeEurich2001-NeuralComp.pdf Stefan D. Wilke & Christian W. Eurich Representational Accuracy of Stochastic Neural Populations. Neural Computation, in press. Abstract: Fisher information is used to analyze the accuracy with which a neural population encodes D stimulus features. It turns out that the form of response variability has a major impact on the encoding capacity and therefore plays an important role in the selection of an appropriate neural model. In particular, in the presence of baseline firing, the reconstruction error rapidly increases with D in the case of Poissonian noise, but not for additive noise. The existence of limited-range correlations of the type found in cortical tissue yields a saturation of the Fisher information content as a function of the population size only for an additive noise model. We also show that random variability in the correlation coefficient within a neural population, as found empirically, considerably improves the average encoding quality. Finally, the representational accuracy of populations with inhomogeneous tuning properties, either with variability in the tuning widths or fragmented into specialized subpopulations, is superior to the case of identical and radially symmetric tuning curves usually considered in the literature. -- Stefan D. 
Wilke Institut fuer Theoretische Physik (NW1) Universitaet Bremen Postfach 330 440 D-28334 Bremen, Germany Phone: +49 (421) 218 4524 WWW : http://www-neuro.physik.uni-bremen.de/~swilke From maggini at dii.unisi.it Tue Apr 10 03:42:48 2001 From: maggini at dii.unisi.it (Marco Maggini) Date: Tue, 10 Apr 2001 09:42:48 +0200 (CEST) Subject: CfP: LFTNC 2001 - Poster session Message-ID: **************************************************** * * * LFTNC 2001 * * * * NATO ADVANCED RESEARCH WORKSHOP ON * * LIMITATIONS AND FUTURE TRENDS * * IN NEURAL COMPUTATION * * * * SIENA, ITALY * * 22 - 24 OCTOBER 2001 * * * * Poster Session * * Call for papers * * * **************************************************** The poster session is an event within the NATO ARW, LFTNC 2001. It will be the opportunity for the participants to give their own contribution on the topics of the workshop. This contribution will complement the view on the future trends in neural computation that will be given by the key speakers who will present critical issues and proposals for new very promising research guidelines in the next few years. The topics for the contribution include: * limitations of neural computation * complexity issues in the continuum * continuous optimisation based learning * generalisation * real-world applications showing the benefits of neural approaches * integration of neural models with knowledge-based models The papers selected by the program committee will be published in separate proceedings titled "LFTNC-SC 2001 - 2001 NATO ARW on Limits and Future Trends of Neural Computing". Refer to the ARW web page for further details: http://www.ewh.ieee.org/soc/im/2001/lftnc/ Submission procedure -------------------- Authors should submit an extended abstract in English, not exceeding 8 pages (A4 paper, 12 pt size for text, double spacing) including tables, figures and references. Submission should include the authors' names and affiliations. 
The corresponding author should be clearly identified, providing his/her mail address, telephone and fax numbers, and email address. The extended abstract must be sent to Prof. Marco Maggini by e-mail (maggini at dii.unisi.it). The message body must contain the following information: title, list of authors and their affiliations, keywords, a short abstract, and contact information for the corresponding author. The subject of the e-mail should be "LFTNC2001 Submission". The paper should be attached to the message as a PostScript or PDF file. Long files should be compressed before emailing, by using compress, pkzip, or winzip. Please do not send Word or LaTeX files. Corresponding authors will be notified by email within one week after the submission is received. A paper number will be provided in the receipt to identify each paper. The list of received papers will also be posted on the web site of LFTNC 2001. Acceptance/rejection will be notified by JULY 8, 2001. A PostScript or PDF file of the final version of the accepted papers will be due by SEPTEMBER 8, 2001. Visit the author's kit page on the workshop web site for detailed instructions concerning the preparation of the final manuscript. Papers will be included in the proceedings only if at least one author has registered at the NATO ARW LFTNC 2001 by the given deadline. Important dates --------------- Submission of extended abstracts: June 8, 2001 Notification of acceptance: July 8, 2001 Author registration: July 31, 2001 Camera ready due: September 8, 2001 Contacts -------- Poster session chair Prof.
Marco Maggini Dipartimento di Ingegneria dell'Informazione Università di Siena Via Roma 56 I-53100 - Siena (Italy) Tel: +39 0577 233696 Fax: +39 0577 233602 e-mail: maggini at dii.unisi.it From d.mareschal at bbk.ac.uk Tue Apr 10 07:16:59 2001 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Tue, 10 Apr 2001 12:16:59 +0100 Subject: PhD position in psychology/cognitive science Message-ID: PLEASE BRING TO THE ATTENTION OF RELEVANT PEOPLE A 3-year PhD position funded by the European Commission has recently become available for a student interested in Implicit Learning and its relationship to conscious awareness. The project will be carried out under the supervision of Dr. Axel Cleeremans in Brussels. Interested candidates should contact Professor Robert FRENCH directly at the following address for more information: Robert M. French, Ph.D Quantitative Psychology and Cognitive Science Psychology Department (B32) University of Liege 4000 Liege, Belgium Tel: (32.4) 366.20.10 (work) FAX: (32.4) 366.28.59 email: rfrench at ulg.ac.be URL: http://www.fapse.ulg.ac.be/Lab/cogsci/rfrench.html Best Regards, Denis Mareschal ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 (0)20 7631-6582/6207 fax +44 (0)20 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From s.perkins at lanl.gov Tue Apr 10 19:05:24 2001 From: s.perkins at lanl.gov (Simon Perkins) Date: Tue, 10 Apr 2001 17:05:24 -0600 Subject: Post-doc job opening Message-ID: <3AD391B4.78480384@lanl.gov> POSTDOCTORAL POSITION IN MACHINE LEARNING THEORY AND APPLICATIONS Space and Remote Sensing Sciences Group Los Alamos National Laboratory Candidates are sought for a postdoctoral position in the Space and Remote Sensing Sciences Group at Los Alamos National Laboratory in New Mexico, USA.
The job will involve developing and applying state of the art machine learning techniques to practical problems in multispectral image feature identification, and in multichannel time series analysis. Prospective candidates should have a demonstrated ability to perform independent and creative research, and should have good mathematical skills. Familiarity with modern statistical machine learning techniques such as support vector machines, boosting, Gaussian processes or Bayesian methods is essential. Experience with other machine learning paradigms including neural networks and genetic algorithms is also desirable. The candidate should be able to program competently in a language such as C, C++ or Java. Experience with image or signal processing is a plus, and some knowledge of remote sensing or space physics could also be useful. The Space and Remote Sensing Sciences Group is part of the Nonproliferation and International Security Division at LANL. Its mission is to develop and apply remote sensing technologies to a variety of problems of national and international interest, including nonproliferation, detection of nuclear explosions, safeguarding nuclear materials, climate studies, environmental monitoring, volcanology, space sciences, and astrophysics. Los Alamos is a small and very friendly town situated 7200' up in the scenic Jemez mountains in northern New Mexico. The climate is very pleasant and opportunities for outdoor recreation are numerous (skiing, hiking, biking, climbing, etc). The Los Alamos public school system is excellent. LANL provides a very constructive working environment with abundant resources and support, and the opportunity to work with intelligent and creative people on a variety of interesting projects. Post-doc starting salaries are usually in the range $50-60K depending on experience, and generous assistance is provided with relocation expenses. 
The initial contract offered would be for two years, with good possibilities for contract extensions. The ability to get a US Department of Energy 'Q' clearance (which normally requires US citizenship) is helpful but not essential. Applicants must have received their PhD within the last three years. Interested candidates should contact Dr Simon Perkins by e-mail: s.perkins at lanl.gov; or snail mail: Los Alamos National Laboratory, Mail Stop D-436, Los Alamos, NM 87545, USA. Please include a full resume and a covering letter explaining why you think you would make a good candidate. E-mail Postscript/PDF/Word attachments are fine. The deadline for applications is Friday, May 4th, 2001. From Igor.Tetko at iphysiol.unil.ch Wed Apr 11 03:53:54 2001 From: Igor.Tetko at iphysiol.unil.ch (Igor Tetko) Date: Wed, 11 Apr 2001 09:53:54 +0200 Subject: Article: Associative Neural Network Message-ID: Dear Connectionists, the following paper (15 pages in PDF format) is available from the CogPrints archive http://cogprints.soton.ac.uk/documents/disk0/00/00/14/41/index.html (ID code: cog00001441) and also from http://www.lnh.unil.ch/~itetko/articles/asnn.pdf Best regards, Igor Tetko Igor V. Tetko Associative Neural Network ABSTRACT: An associative neural network (ASNN) is a combination of an ensemble of feed-forward neural networks and the k-nearest-neighbor technique. The introduced network uses the correlation between ensemble responses as a measure of distance among the analyzed cases for the nearest-neighbor technique, and improves prediction by correcting the bias of the neural network ensemble. An associative neural network has a memory that can coincide with the training set. If new data become available, the network further improves its predicting ability and can often provide a reasonable approximation of the unknown function without a need to retrain the neural network ensemble.
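Since the ASNN recipe in the abstract above is procedural (ensemble prediction, correlation-based neighbor search, local bias correction), a small sketch may help. This is one illustrative reading of the abstract, not the paper's code; the ensemble size, neighbor count, and use of scikit-learn's MLPRegressor are all assumptions made for the example.

```python
# Minimal ASNN-style sketch: an ensemble of feed-forward nets whose per-case
# response vectors define a correlation-based distance; the k nearest training
# cases under that distance supply a local bias correction.
# Illustrative assumptions: 5 nets, k=3, scikit-learn MLPRegressor.
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_asnn(X, y, n_nets=5, k=3):
    nets = [MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                         max_iter=2000, random_state=i).fit(X, y)
            for i in range(n_nets)]
    R_train = np.column_stack([n.predict(X) for n in nets])  # memory of responses
    resid = y - R_train.mean(axis=1)                         # ensemble bias per case

    def predict(Xq):
        Rq = np.column_stack([n.predict(Xq) for n in nets])
        out = []
        for r, base in zip(Rq, Rq.mean(axis=1)):
            # correlation between ensemble responses as the similarity measure
            corr = np.array([np.corrcoef(r, t)[0, 1] for t in R_train])
            nn = np.argsort(-corr)[:k]                       # k most similar cases
            out.append(base + resid[nn].mean())              # local bias correction
        return np.array(out)

    return predict
```

Because the correction only consults the stored memory (training responses and residuals), that memory could in principle be extended with new cases to refine predictions without retraining the nets, which is the property the abstract emphasizes.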
From schultz at cns.nyu.edu Wed Apr 11 10:14:09 2001 From: schultz at cns.nyu.edu (Simon Schultz) Date: Wed, 11 Apr 2001 10:14:09 -0400 Subject: preprint: neural spike trains Message-ID: <3AD466B1.AC399C2D@cns.nyu.edu> Dear Connectionists, The following preprint is available for downloading: S. R. Schultz and S. Panzeri (2001), Temporal correlations and neural spike train entropy. Physical Review Letters, in press. Abstract: Sampling considerations limit the experimental conditions under which information theoretic analyses of neurophysiological data yield reliable results. We develop a procedure for computing the full temporal entropy and information of ensembles of neural spike trains, which performs reliably for limited samples of data. This approach also yields insight into the role of correlations between spikes in temporal coding mechanisms. The method, when applied to recordings from complex cells of the monkey primary visual cortex, results in lower RMS error information estimates in comparison to a `brute force' approach. A preprint (4 pages in PDF format) can now be downloaded: http://www.cns.nyu.edu/~schultz/tempent.pdf It can also be obtained from the Los Alamos archive: http://arXiv.org/abs/physics/0001006 Cheers, Simon Schultz -- Dr. Simon R. Schultz Phone: +1-212 998 3775 Howard Hughes Medical Institute & Fax: +1-212 995 4011 Center for Neural Science, Email:schultz at cns.nyu.edu New York University, 4 Washington Place, New York NY 10003, U.S.A.
http://www.cns.nyu.edu/~schultz/ From cpoon at mit.edu Wed Apr 11 18:20:32 2001 From: cpoon at mit.edu (Chi-Sang Poon) Date: Wed, 11 Apr 2001 18:20:32 -0400 Subject: Hebbian feedback covariance learning control In-Reply-To: <20010323080111.23EBF2B229@endor.bbb.caltech.edu> Message-ID: The following paper is available for viewing/downloading (as PDF file) from the IEEE electronic archive: http://ieeexplore.ieee.org/lpdocs/epic03/RecentIssues.htm?punumber=3477 OR http://ieeexplore.ieee.org/iel5/3477/19768/00915341.pdf -------------------------------------------------------------------- A Hebbian Feedback Covariance Learning Paradigm for Self-Tuning Optimal Control D.L. Young and C.-S. Poon IEEE Trans. Systems, Man and Cybernetics, Part B, Volume: 31 Issue: 2, pp. 173-186, April 2001 We propose a novel adaptive optimal control paradigm inspired by Hebbian covariance synaptic adaptation, a preeminent model of learning and memory and other malleable functions in the brain. The adaptation is driven by the spontaneous fluctuations in the system input and output, the covariance of which provides useful information about the changes in the system behavior. The control structure represents a novel form of associative reinforcement learning in which the reinforcement signal is implicitly given by the covariance of the input-output signals. Theoretical foundations for the paradigm are derived using Lyapunov theory and are verified by means of computer simulations. The learning algorithm is applicable to a general class of non-linear adaptive control problems. This on-line direct adaptive control method benefits from a computationally straightforward design, proof of convergence, no need for complete system identification, robustness to noise and uncertainties, and the ability to optimize a general performance criterion in terms of system states and control signals. 
These attractive properties of Hebbian feedback covariance learning control lend themselves to future investigations into the computational functions of synaptic plasticity in biological neurons. From C.Campbell at bristol.ac.uk Thu Apr 12 10:50:42 2001 From: C.Campbell at bristol.ac.uk (Colin Campbell, Engineering Mathematics) Date: Thu, 12 Apr 2001 15:50:42 +0100 (GMT Daylight Time) Subject: Fixed term lectureship position In-Reply-To: <200103211315.OAA06453@mail.gmd.de> Message-ID: Lectureship in the Department of Engineering Mathematics Applications are invited for a 5-year Lectureship in the Department of Engineering Mathematics, University of Bristol, United Kingdom. Candidates should have an excellent track record in research related to, or complementing, that of the Artificial Intelligence and Computational Intelligence groups in the department. The Artificial Intelligence Research Group The Artificial Intelligence group has an international reputation for the development and use of innovative methods for handling uncertainty in real-world AI applications. Current research includes theories and applications of logic programming, reasoning with uncertainty, modelling with words, fuzzy sets and fuzzy logic. For more information on the Artificial Intelligence group please visit: http://www.enm.bris.ac.uk/ai The Computational Intelligence group The research of the Computational Intelligence group centres on sub-symbolic approaches to machine intelligence. On the theoretical side interests include statistical learning theory, support vector machines, neural networks and the design of learning algorithms. Applications include applying these methods to medical decision support, bioinformatics and machine vision datasets. For more information on the Computational Intelligence group please visit: http://lara.enm.bris.ac.uk/~cig The Artificial Intelligence and Computational Intelligence groups are part of the University's Advanced Computing Research Centre (ACRC).
The department runs its own degree programmes in Engineering Mathematics and Mathematics for Intelligent Systems in addition to providing mathematical, AI and theoretical computer science courses for undergraduate and masters degree programmes across the Faculty. The department achieved a score of 5 (research quality of international excellence) in the last Research Assessment Exercise and 23/24 for the HEFCE TQA assessment of its courses. For more information on this position and details of how to apply please visit the following web page: http://www.enm.bris.ac.uk/admin/vacancies/Lect_01.htm For more information on research and life at Bristol please visit the following web pages: Artificial Intelligence Group: http://www.enm.bris.ac.uk/ai Computational Intelligence group: http://lara.enm.bris.ac.uk/~cig Department of Engineering Mathematics: http://www.enm.bris.ac.uk Faculty of Engineering: http://www.fen.bris.ac.uk University of Bristol: http://www.bris.ac.uk -- -------------------------------------------------------------------------- Artificial Intelligence Group Tel: (+44) 117 9289743 Dept of Engineering Maths Fax: (+44) 117 9251154 University of Bristol Email: Jonathan.Rossiter at bris.ac.uk Bristol BS8 1TR UK http://eis.bris.ac.uk/~enjmr -------------------------------------------------------------------------- From dimi at ci.tuwien.ac.at Thu Apr 12 05:58:57 2001 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Thu, 12 Apr 2001 11:58:57 +0200 (CEST) Subject: CI BibTeX Collection -- Update In-Reply-To: Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 4/4 IEEE Transactions on Fuzzy Systems, Volumes 8/6 Machine Learning, Volumes 42/1-43/2 Neural Computation, Volumes 12/12-13/1 Neural Networks, Volumes 13/7-14/3 Neural Processing Letters, Volumes 12/3-13/1 Most files have been converted 
automatically from various source formats; please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut fuer Statistik * Tel: (+43 1) 58801 10773 * * Technische Universitaet Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ _______________________________________________ nn-at mailing list - nn-at at ci.tuwien.ac.at http://fangorn.ci.tuwien.ac.at/cgi-bin/mailman/listinfo/nn-at From bozinovs at rea.etf.ukim.edu.mk Fri Apr 13 15:39:16 2001 From: bozinovs at rea.etf.ukim.edu.mk (Stevo Bozinovski) Date: Fri, 13 Apr 2001 21:39:16 +0200 (DFT) Subject: neural cell genetics CFP Message-ID: -------------------------------------------------------------------- We apologize if you receive multiple copies of this message. Please feel free to distribute it to interested persons. -------------------------------------------------------------------- --- Call for Papers --- BIONICS OF PRODUCTION LINES: GENETICS, METABOLICS, AND FLEXIBLE MANUFACTURING Invited Session at Fifth Multiconference on Systemics, Cybernetics and Informatics July 22-25 Orlando, Florida Sheraton World Resort Background and Motivation Analogies between biological and technical systems have been explored since Wiener's formulation of Cybernetics. A successful example today is research in neural networks, natural and artificial. We believe that the analogy between biological and non-biological production lines is another interesting area to explore in the realm of cybernetics and bionics.
This session aims to be a forum for exchanging ideas and presenting facts about both cell production systems and modern concepts of adaptive manufacturing. Emphasis is put on exploring biological production lines in terms of the concepts relevant for flexible manufacturing systems, but also on recognizing solutions in biology relevant for human-made manufacturing systems. The list of relevant topics includes, but is not limited to: Genetics of manufacturing Control of protein biosynthesis Metabolic networks Biomolecular robots and biomolecular machines Autonomous manufacturing systems Unicellular systems Neural cell genetics Genes, sensors, behavior, adaptation Surviving strategies Flexible manufacturing systems Material processing in biological and human-made FMS Information processing in biological and human-made FMS Adaptive manufacturing systems Mobile factories on other planets Bionic manufacturing systems Submissions Potential participants should submit an extended abstract or paper draft of their work in the area. Submissions will be reviewed by independent referees, and should not exceed 2000 words for extended abstracts and 5000 words for paper drafts. Submissions should be sent via e-mail as an ASCII or PDF file to either of the session organizers. Accepted papers will be published in the Conference Proceedings. In addition, an effort will be made to publish the papers in a special volume of a major publisher. Important dates Submission of manuscripts: April 30, 2001 Notification of acceptance: May 13, 2001 Camera ready copy: May 23, 2001 Registration Attendees of the invited session must register for the main conference. There is no additional fee for the session. Please see the Conference web page http://www.iiis.org/sci/ for details.
Organizers Stevo Bozinovski Computer Science and Engineering Institute Center for Beings Research Electrical Engineering Department Sts Cyril and Methodius University Skopje, Macedonia bozinovs at rea.etf.ukim.edu.mk Ralf Hofestaedt Computer Science Department Otto-von-Guericke University Magdeburg, Germany hofestae at iti.cs.uni-magdeburg.de From greiner at cs.ualberta.ca Fri Apr 13 21:15:52 2001 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Fri, 13 Apr 2001 19:15:52 -0600 Subject: Call for Applications: PostDoctoral Research Scientists Message-ID: <20010414011552Z433495-3037+191@scapa.cs.ualberta.ca> POSTDOCTORAL RESEARCH SCIENTISTS Call for Applications Bioinformatics - Query Answering - Game Playing - Adaptive Agents Machine Learning - Probabilistic Modelling - Decision Support Just got your PhD and want to focus on pure curiosity-driven research before jumping into the tenure-track pressure cooker? We are looking for one or more postdoctoral research scientists (PDRS) to help us work on theoretical and applied research in various specific research projects -- related to the topics listed above and with the individual researchers below, in collaboration with various research-friendly companies, including BioTools, Chenomx, Electronic Arts, CELcorp, net-linx, Syncrude, ... These PostDoc positions improve on most positions as... * The salary will be competitive with tenure-track positions. * You will also be allowed/expected to spend 50% of your time on your own curiosity-driven agenda -- we hope in collaboration with various members of our faculty; see http://www.cs.ualberta.ca/~ai. You will be part of a team that has recently emerged as one of the strongest AI groups anywhere, with a number of world-class professors who include editors-in-chief of major journals, AAAI Fellows, Steacie Fellows, McCalla Fellows ... And we are continuing to grow and improve. Moreover, our department is known for its collegiality.
You will also help us show off our group in 2002, when we will host AAAI'02 http://aaai.org KDD'02 http://www.acm.org/sigkdd ISMB'02 http://www.iscb.org Applicants should * have a solid background in one or more of the areas described above * have good scientific skills * be good at writing software to implement and evaluate algorithms Successful applicants will have the opportunity to do sessional teaching in the department of Computing Science. Applicants should EMAIL a CV, the email addresses of 3 references, and a short description of their research interests and goals as a postdoc (ascii format, < 500 words) to Russ Greiner (greiner at cs.ualberta.ca) You are encouraged to *also* post additional information on http://www.cs.ualberta.ca/jobs/postdoc.html We are very flexible with time commitments; applicants should indicate how long they would like to remain as a PDRS -- typical stay is between 1 and 3 years. We are an equal opportunity employer, eagerly seeking applicants from Canada or any other country. For more information, see http://www.cs.ualberta.ca/~greiner/PostDoc.html | R Greiner Phone: (780) 492-5461 | | Dep't of Computing Science FAX: (780) 492-1071 | | University of Alberta Email: greiner at cs.ualberta.ca | | Edmonton, AB T6G 2E8 Canada http://www.cs.ualberta.ca/~greiner/ | From terry at salk.edu Tue Apr 17 16:45:29 2001 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 17 Apr 2001 13:45:29 -0700 (PDT) Subject: NEURAL COMPUTATION 13:5 In-Reply-To: <200103072248.f27MmVH58010@kepler.salk.edu> Message-ID: <200104172045.f3HKjTA44914@purkinje.salk.edu> Neural Computation - Contents - Volume 13, Number 5 - May 1, 2001 ARTICLE Patterns of Synchrony in Neural Networks with Spike Adaptation C. van Vreeswijk and D. Hansel NOTE Bayesian Analysis of Mixtures of Factor Analyzers Akio Utsugi and Toru Kumagai LETTERS Synchronization in Relaxation Oscillator Networks with Conduction Delays Jeffrey J. Fox, Ciriyam Jayaprakash, DeLiang Wang and Shannon R. 
Campbell
Predictions of the Spontaneous Symmetry-Breaking Theory for Visual Code Completeness and Spatial Scaling in Single-Cell Learning Rules
Chris J. S. Webber
Localist Attractor Networks
Richard S. Zemel and Michael C. Mozer
Stochastic Organization of Output Codes in Multiclass Learning Problems
Wolfgang Utschick and Werner Weichselberger
Predictive Approaches for Choosing Hyperparameters in Gaussian Processes
S. Sundararajan and S.S. Keerthi
Architecture-Independent Approximation of Functions
Vicente Ruiz de Angulo and Carme Torras
Analyzing Holistic Parsers: Implications for Robust Parsing and Systematicity
Edward Kei Shiu Ho and Lai Wan Chan
The Computational Exploration of Visual Word Recognition in a Split Model
Richard Shillcock and Padraic Monaghan
-----
ON-LINE - http://neco.mitpress.org/
SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES
                  USA     Canada*   Other Countries
Student/Retired   $60     $64.20    $108
Individual        $88     $94.16    $136
Institution       $460    $492.20   $508
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From mschmitt at lmi.ruhr-uni-bochum.de Tue Apr 17 07:31:43 2001 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Tue, 17 Apr 2001 13:31:43 +0200 Subject: Preprint on Multiplicative Neural Networks Message-ID: <3ADC299F.DD35F42B@lmi.ruhr-uni-bochum.de> Dear Colleagues, a preprint of the paper "On the complexity of computing and learning with multiplicative neural networks" by Michael Schmitt, to appear in Neural Computation, is available on-line from http://www.ruhr-uni-bochum.de/lmi/mschmitt/multiplicative.ps.gz (63 pages gzipped PostScript). Regards, Michael Schmitt ------------------------------------------------------------ TITLE: On the Complexity of Computing and Learning with Multiplicative Neural Networks AUTHOR: Michael Schmitt ABSTRACT In a great variety of neuron models neural inputs are combined using the summing operation. 
We introduce the concept of multiplicative neural networks that contain units which multiply their inputs instead of summing them and, thus, allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well studied network types as higher-order networks and product unit networks. We investigate the complexity of computing and learning for multiplicative neural networks. In particular, we derive upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo dimension for various types of networks with multiplicative units. As the most general case, we consider feedforward networks consisting of product and sigmoidal units, showing that their pseudo dimension is bounded from above by a polynomial with the same order of magnitude as the currently best known bound for purely sigmoidal networks. Moreover, we show that this bound holds even in the case when the unit type, product or sigmoidal, may be learned. Crucial for these results are calculations of solution set components bounds for new network classes. As to lower bounds we construct product unit networks of fixed depth with superlinear VC dimension. For sigmoidal networks of higher order we establish polynomial bounds that, in contrast to previous results, do not involve any restriction of the network order. We further consider various classes of higher-order units, also known as sigma-pi units, that are characterized by connectivity constraints. In terms of these we derive some asymptotically tight bounds. Multiplication plays an important role both in neural modeling of biological behavior and in computing and learning with artificial neural networks. We briefly survey research in biology and in applications where multiplication is considered an essential computational element. The results we present here provide new tools for assessing the impact of multiplication on the computational power and the learning capabilities of neural networks. 
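To make the distinction between summing and multiplicative units concrete, here is a minimal sketch (not taken from the paper); the function names, weights, and inputs are hypothetical, and the product unit is assumed to raise positive inputs to real-valued weight powers before multiplying, as in standard product unit networks:

```python
import numpy as np

def summing_unit(x, w, b=0.0):
    """Conventional unit: a weighted sum of inputs passed through a squashing function."""
    return np.tanh(np.dot(w, x) + b)

def product_unit(x, w):
    """Product unit: each (positive) input is raised to a real-valued weight
    and the results are multiplied, so inputs interact nonlinearly."""
    return np.prod(np.power(x, w))

x = np.array([2.0, 3.0])
w = np.array([1.0, 2.0])
print(product_unit(x, w))  # 2**1 * 3**2 = 18.0
print(summing_unit(x, w))
```

Note that with integer weights a product unit reduces to a higher-order (sigma-pi style) monomial term, which is why higher-order networks fall inside the multiplicative class described in the abstract.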
-- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: +49 234 32-23209, Fax: +49 234 32-14465 http://www.ruhr-uni-bochum.de/lmi/mschmitt/ From ceesvl at brain.riken.go.jp Wed Apr 18 00:29:28 2001 From: ceesvl at brain.riken.go.jp (Cees van Leeuwen) Date: Wed, 18 Apr 2001 13:29:28 +0900 Subject: pre- and postdoc positions available Message-ID: <001a01c0c7c0$27f32bb0$b4a6a086@CEESVDESKTOP> Post and pre-doctoral Positions, and Technical Assistant Positions in Experimental and Computational Psychology Perceptual Dynamics Laboratory RIKEN BSI Several pre- and postdoctoral research positions, and technical assistant positions, are available immediately in the newly established Perceptual Dynamics Laboratory (head: Cees van Leeuwen) at the RIKEN Brain Science Institute, Japan. The RIKEN BSI is the primary government-funded basic research institute in Japan. The working language is English. The researchers will be working on interdisciplinary projects relating to perceptual integration and the perception of visual objects and scenes. The focus of the laboratory is the application of complex (i.e. nonlinear, nonstationary) dynamical systems to visual perception. We are looking for candidates who are interested in computational and experimental approaches. Candidates should have a strong background in one or more of the following: computational or mathematical modeling of neural information processes, cognitive neuroscience, cognitive psychology, psychophysiology, psychophysics, or related disciplines. Technical assistant positions include: a software engineer to further develop an interactive research environment for the numerical simulation of complex dynamical systems. The preferred computer languages are Matlab and C++. We will also consider those who are interested in developing applications for running and analyzing experiments, including multi-channel EEG and eye-movement recording. 
Candidates should have an M.Sc. or equivalent. Those interested in qualifying for a higher degree will be considered. The laboratory offers excellent research facilities for conducting computer simulations and experiments, travel support for conferences, and an attractive international academic environment, and is located within the Tokyo metropolitan area. Housing facilities are available for an initial period. Competitive salaries are offered. The initial appointment will be for one year, and will be renewable on an annual basis. Recruitment continues until the positions are filled. Applicants should send curriculum vitae, statement of research interests, two letters of reference, and representative publications to: Prof. Cees van Leeuwen Perceptual Dynamics Laboratory 2-1 Hirosawa, Wakoshi Saitama, 351-0198 Japan ceesvl at brain.riken.go.jp www.brain.riken.go.jp From jfgf at cs.berkeley.edu Wed Apr 18 18:34:42 2001 From: jfgf at cs.berkeley.edu (Nando de Freitas) Date: Wed, 18 Apr 2001 15:34:42 -0700 Subject: Particle filtering website Message-ID: <3ADE1682.14AF02F3@cs.berkeley.edu> Dear Connectionists, For those of you interested in sequential data analysis and tracking using particle filters (aka condensation, survival of the fittest, sequential Monte Carlo, ...), there is a website at http://www-sigproc.eng.cam.ac.uk/smc/index.html and mirrored at http://www.cs.berkeley.edu/~jfgf/smc/ that has a list of people, papers, links and software in this field. We encourage you to submit your papers in this area so as to strengthen the links between related work in the fields of control, signal processing, physics, AI, vision, econometrics and statistics. 
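For readers new to the topic, the core loop of a bootstrap (sampling-importance-resampling) particle filter can be sketched for a toy 1-D random-walk model observed in Gaussian noise. This is an illustrative sketch in the spirit of the methods catalogued on the site, not code from it; the model, parameters, and function names are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=1000,
                    process_std=1.0, obs_std=1.0):
    """Bootstrap particle filter for a 1-D random-walk state observed
    in Gaussian noise. Returns the filtered posterior mean per step."""
    particles = rng.normal(0.0, 1.0, n_particles)  # samples from the prior
    means = []
    for y in observations:
        # 1. Propagate each particle through the state-transition model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # 2. Weight particles by the observation likelihood p(y | x).
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()
        means.append(np.sum(weights * particles))
        # 3. Resample in proportion to weight ("survival of the fittest").
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(means)

obs = np.array([0.1, 0.3, 0.2, 0.5])
print(particle_filter(obs))
```

The resampling step is what distinguishes this family from plain importance sampling: it concentrates computation on high-likelihood regions of the state space, at the cost of some Monte Carlo variance.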
Best, Nando -- Computer Science Division | Phone : (510) 642-4979 387 Soda Hall | Fax : (510) 642-5775 University of California, Berkeley | E-mail: jfgf at cs.berkeley.edu Berkeley, CA 94720-1776 USA | URL : http://www.cs.berkeley.edu/~jfgf From zemel at cs.toronto.edu Thu Apr 19 10:39:08 2001 From: zemel at cs.toronto.edu (Richard Zemel) Date: Thu, 19 Apr 2001 10:39:08 -0400 Subject: NIPS*2001 Announcement Message-ID: <01Apr19.103910edt.453179-20163@jane.cs.toronto.edu> ***** NIPS is moving to Vancouver in 2001 ***** CALL FOR PAPERS -- NIPS*2001 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Dec. 3 -- Saturday, Dec. 8, 2001 Vancouver, British Columbia, Canada Whistler Ski Resort ========================================== This is the fifteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Dec. 3), and following it there will be two days of focused workshops on topical issues at Whistler Ski Resort (Dec. 7-8). Invited speakers this year will be: Barbara Finlay (Departments of Psychology, and Neurobiology and Behavior, Cornell University) Alison Gopnik (Department of Psychology, University of California at Berkeley) Jon M. Kleinberg (Department of Computer Science, Cornell University) Shihab Shamma (Department of Electrical Engineering University of Maryland) Judea Pearl (Department of Computer Science, UCLA) Tom Knight (Artificial Intelligence Laboratory, MIT) Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. 
Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, kernel methods, committee models, graphical models, support vector machines, Gaussian processes, decision trees, factor analysis, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, web and network applications, intrusion and fraud detection, bio-informatics, medical diagnosis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music and video applications, animation, virtual environments. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, human learning and memory, conditioning, categorization, attention, language, reasoning, spatial cognition, emotional cognition, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, neuromorphic engineering, microelectromechanical systems, optical systems, vision chips, head-eye systems, neural prostheses, roving robots, computational sensors and actuators, molecular and quantum computing, novel neurodevices, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, population codes, temporal codes, spike train analysis, and experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, computational models of classical and operant conditioning, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, game-playing, Markov decision processes, multi-agent cooperative algorithms. 
Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory. Visual Processing: image processing, image coding, object recognition, face recognition, visual feature detection, visual psychophysics, stereopsis, optic flow algorithms, motion detection and tracking, spatial representations, spatial attention, scene analysis, visual search, visuo-spatial working memory. ---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within an 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. 
Authors are required to use the NIPS LaTeX style files obtainable from the web page listed below. The style files are unchanged from NIPS*2000. Submission Instructions: NIPS accepts only electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We will accept postscript and PDF documents, but we prefer postscript. The electronic submission page will be available on June 6, 2001 Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT JUNE 20, 2001 PACIFIC DAYLIGHT TIME (08:00 GMT JUNE 21, 2001). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS For general inquiries send e-mail to nipsinfo at salk.edu. NIPS*2001 Organizing Committee: General Chair, Tom Dietterich, Oregon State University; Program Chair, Sue Becker, McMaster University; Publications Chair, Zoubin Ghahramani, University College London; Tutorial Chair, Yoshua Bengio, University of Montreal; Workshops Co-Chairs, Virginia de Sa, Sloan Center for Theoretical Neurobiology, Barak Pearlmutter, University of New Mexico; Publicity Chair, Richard Zemel, University of Toronto; Volunteer Coordinator, Sidney Fels, University of British Columbia; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Alex Gray, Carnegie Mellon University, Xin Wang, Oregon State University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*2001 Program Committee: Sue Becker, McMaster University (chair); Gert Cauwenberghs, Johns Hopkins University; Bill Freeman, Mitsubishi Electric Research Lab; Thomas Hofmann, Brown University; Dan Lee, Bell Laboratories, Lucent Technologies; Sridhar Mahadevan, Michigan State University; Marina Meila-Predoviciu, University of Washington; Klaus Mueller, GMD First, Berlin; Klaus Obermayer, TU Berlin; Sam Roweis, Gatsby Computational Neuroscience Unit, UCL; John Shawe-Taylor, Royal Holloway, University of London; Josh Tenenbaum, Stanford University; Volker Tresp, Siemens, Munich; Richard Zemel, University of Toronto. PAPERS MUST BE SUBMITTED BY JUNE 20, 2001 From ingber at ingber.com Thu Apr 19 18:20:51 2001 From: ingber at ingber.com (Lester Ingber) Date: Thu, 19 Apr 2001 17:20:51 -0500 Subject: Computational Finance Position Message-ID: <20010419172051.A23252@ingber.com> If you have very strong credentials for the position described below, please send your resume to: Prof. Lester Ingber Director Research & Development DRW Investments LLC 311 S Wacker Dr Ste 900 Chicago, IL 60606 Email (preferred) ingber at ingber.com COMPUTATIONAL FINANCE: Experienced programmer in Java, C and/or C++. Previous financial experience preferred. Excellent background in Physics, Math, or similar disciplines, at least at PhD level. The R&D group works directly with other traders and develops its own automated trading systems. See www.ingber.com for papers on some current projects. -- Prof. 
Lester Ingber http://www.ingber.com/ PO Box 06440 Sears Tower Chicago IL 60606-0440 http://www.alumni.caltech.edu/~ingber/ From bis at prip.tuwien.ac.at Thu Apr 19 13:05:12 2001 From: bis at prip.tuwien.ac.at (Horst Bischof) Date: Thu, 19 Apr 2001 19:05:12 +0200 Subject: ICANN-WS on Kernel based Methods for Computer Vision Message-ID: <3ADF1AC8.48A4949@prip.tuwien.ac.at> ICANN 2001 Workshop on Kernel & Subspace Methods for Computer Vision http://www.prip.tuwien.ac.at/~bis/kernelws/ Call for Papers Workshop Co-organizers: Ales Leonardis Horst Bischof Scope of the workshop: This half-day workshop will be held in conjunction with ICANN 2001 on August 25, 2001 in Vienna. In recent years, we have witnessed vivid developments of sophisticated kernel and subspace methods in the neural network and pattern recognition communities on the one hand, and extensive use of these methods in the area of computer vision on the other. These methods seem to be especially relevant for object and scene recognition. The purpose of the workshop is to bring together scientists from the neural network (pattern recognition) and computer vision communities to analyze new developments, identify open problems, and discuss possible solutions in the area of kernel & subspace methods such as: Support Vector Machines Independent Component Analysis Principal Component Analysis Canonical Correlation Analysis, etc. for computer vision problems such as: Object Recognition Navigation and Robotics 3D Vision, etc. Contributions in the above mentioned areas are welcome. The program will consist of invited and selected contributed papers. The papers selected for the workshop will appear in a Workshop Proceedings which will be distributed among the workshop participants. It is planned that selected papers from the workshop will be published in a journal. 
Important dates: Submission Deadline: 31.5.2001 Notification of Acceptance: 29.6.2001 Final Papers Due: 3.8.2001 Submission instructions: A complete paper, not longer than 12 pages including figures and references, should be submitted in the LNCS page format. The layout of final papers must adhere strictly to the guidelines set out in the Instructions for the Preparation of Camera-Ready Contributions to LNCS Proceedings. Authors are asked to follow these instructions exactly. In order to reduce the handling effort of papers, we allow only for electronic submissions by ftp (either ps or pdf files). ftp ftp.prip.tuwien.ac.at [anonymous ftp, i.e.: Name: ftp Password: ] cd kernelws binary put .ext quit Workshop Registration: Registration for the Workshop can be done at the ICANN 2001 Homepage http://www.ai.univie.ac.at/icann/ From kap-listman at wkap.nl Thu Apr 19 20:06:17 2001 From: kap-listman at wkap.nl (kap-listman@wkap.nl) Date: Fri, 20 Apr 2001 02:06:17 +0200 (METDST) Subject: New Issue: Neural Processing Letters. Vol. 13, Issue 2 Message-ID: <200104200006.CAA10491@wkap.nl> Kluwer ALERT, the free notification service from Kluwer Academic/PLENUM Publishers and Kluwer Law International ------------------------------------------------------------ Neural Processing Letters ISSN 1370-4621 http://www.wkap.nl/issuetoc.htm/1370-4621+13+2+2001 Vol. 13, Issue 2, April 2001. TITLE: Using a New Model of Recurrent Neural Network for Control AUTHOR(S): L. Boquete, L. M. Bergasa, R. Barea, R. Garcia, M. Mazo KEYWORD(S): intelligent control, Lyapunov stability, radial basis function, recurrent neural network. PAGE(S): 101-113 TITLE: Multi-step Learning Rule for Recurrent Neural Models: An Application to Time Series Forecasting AUTHOR(S): Ines M. Galvan, Pedro Isasi KEYWORD(S): multi-step prediction, neural networks, time series, time series modelling. 
PAGE(S): 115-133 TITLE: Finite-Sample Convergence Properties of the LVQ1 Algorithm and the Batch LVQ1 Algorithm AUTHOR(S): Sergio Bermejo, Joan Cabestany KEYWORD(S): LVQ1 algorithm, asymptotic convergence, online gradient descent, finite-sample properties, BLVQ1 algorithm, Newton optimisation. PAGE(S): 135-157 TITLE: Learning with Nearest Neighbour Classifiers AUTHOR(S): Sergio Bermejo, Joan Cabestany KEYWORD(S): nearest neighbour classifiers, online gradient descent, Learning Vector Quantization, hand-written character recognition. PAGE(S): 159-181 TITLE: Generalizations of the Hamming Associative Memory AUTHOR(S): Paul Watta, Mohamad H. Hassoun KEYWORD(S): artificial neural network, associative memory, capacity, error correction, Hamming net. PAGE(S): 183-194 -------------------------------------------------------------- Thank you for your interest in Kluwer's books and journals. NORTH, CENTRAL AND SOUTH AMERICA Kluwer Academic Publishers Order Department, PO Box 358 Accord Station, Hingham, MA 02018-0358 USA Telephone (781) 871-6600 Fax (781) 681-9045 E-Mail: kluwer at wkap.com EUROPE, ASIA AND AFRICA Kluwer Academic Publishers Distribution Center PO Box 322 3300 AH Dordrecht The Netherlands Telephone 31-78-6392392 Fax 31-78-6546474 E-Mail: orderdept at wkap.nl From jf218 at hermes.cam.ac.uk Fri Apr 20 05:56:43 2001 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Fri, 20 Apr 2001 10:56:43 +0100 (BST) Subject: six papers on modelling single neuron and SVM are available Message-ID: Dear All, Five papers on modelling single neuron and one on SVM (see below for abstracts) are available on my home-page http://www.cogs.susx.ac.uk/users/jianfeng the best Jianfeng -------------------------------------------------------------------------- Titles: [54] Feng J. (2001) Is the integrate-and-fire model good enough? --a review Neural Networks (in press) [53] Feng J., Brown D., Wei G., and Tirozzi B. 
(2001) Detectable And Undetectable Input Signals For The Integrate-and-fire Model? J. Phys. A. vol. 34, 1637-1648 [52] Feng J., and Zhang P. (2001) The Behaviour of Integrate-and-fire and Hodgkin-Huxley Models With Correlated Inputs Phys. Rev. E. (in press). [51] Feng J., Li, G.B., Brown D., and Buxton H. (2001) Balance between four types of synaptic input for the integrate-and-fire model J. Theor. Biol. vol. 203, 61-79 [50] Feng J., and Li G. (2001) Neuronal models with current inputs J. Phys. A. vol. 34, 1649-1664 [55] Feng J., and Williams P. M. (2001) The generalization error of the symmetric and scaled Support Vector Machines IEEE T. Neural Networks (in press). -------------------------------------------------------------------------- Abstracts: [54] Feng J. (2001) Is the integrate-and-fire model good enough? --a review Neural Networks (in press) We review some recent results on the behaviour of the integrate-and-fire (IF) model, the FitzHugh-Nagumo (FHN) model, a simplified version of the FHN (IF-FHN) model [11] and the Hodgkin-Huxley (HH) model with correlated inputs. The effect of inhibitory inputs on the model behaviour is also taken into account. Here inputs exclusively take the form of diffusion approximation, and correlated inputs mean correlated synaptic inputs (Sections 2 and 3). It is found that the IF and HH models respond to correlated inputs in totally opposite ways, but the IF-FHN model shows similar behaviour to the HH model. Increasing inhibitory input to a single neuronal model, such as the FHN model or the HH model, can sometimes increase its firing rate, which we term inhibition-boosted firing (IBF). Using the IF model and IF-FHN model, we theoretically explore how and when IBF can happen. 
The computational complexity of the IF-FHN model is very similar to that of the conventional IF model, but the former captures some interesting and essential features of biophysical models and could serve as a better model for spiking neuron computation. [53] Feng J., Brown D., Wei G., and Tirozzi B. (2001) Detectable And Undetectable Input Signals For The Integrate-and-fire Model? J. Phys. A. vol. 34, 1637-1648 We consider the integrate-and-fire model with non-stationary, stochastic inputs and address the following issue: what are the conditions on the input currents that make the input signal undetectable? A novel theoretical approach to tackle the problem for the model with non-stationary inputs is introduced. When the noise strength is independent of the deterministic component of the synaptic input, an expression for the critical input signal is given. If the input signal is weaker than the critical input signal, the neuron ultimately stops firing, i.e. is not able to detect the input signal; otherwise it fires with probability one. Similar results are established for Poisson-type inputs where the strength of the noise is proportional to the deterministic component of the synaptic input. [52] Feng J., and Zhang P. (2001) The Behaviour of Integrate-and-fire and Hodgkin-Huxley Models With Correlated Inputs Phys. Rev. E. (in press). We assess, both numerically and theoretically, how positively correlated Poisson inputs affect the output of the integrate-and-fire and Hodgkin-Huxley models. For the integrate-and-fire model the variability of efferent spike trains is an increasing function of input correlation, and of the ratio between inhibitory and excitatory inputs. Interestingly, for the Hodgkin-Huxley model, the variability of efferent spike trains is a decreasing function of input correlations, and for fixed input correlation it is almost independent of the ratio between inhibitory and excitatory inputs. 
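As a concrete illustration of the leaky integrate-and-fire dynamics these abstracts analyze, here is a minimal Euler-Maruyama simulation driven by a diffusion-approximation (Gaussian white noise) input current. All parameter values are hypothetical round numbers, and this sketch is not the authors' code:

```python
import numpy as np

def lif_spike_times(i_mean, i_std, dt=0.1, t_max=1000.0,
                    tau=20.0, v_reset=0.0, v_thresh=20.0, seed=1):
    """Leaky integrate-and-fire neuron with a diffusion-approximation
    input: dV = (-V/tau + i_mean) dt + i_std dW. Returns spike times (ms)."""
    rng = np.random.default_rng(seed)
    v, spikes = v_reset, []
    for step in range(int(t_max / dt)):
        noise = i_std * np.sqrt(dt) * rng.standard_normal()
        v += dt * (-v / tau + i_mean) + noise  # Euler-Maruyama update
        if v >= v_thresh:                      # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset                        # reset after firing
    return spikes

spikes = lif_spike_times(i_mean=1.2, i_std=1.0)
print(len(spikes), "spikes in 1 s")
```

With these numbers the drift alone pushes the membrane toward tau * i_mean = 24 mV, above the 20 mV threshold, so the neuron fires regularly; shrinking i_mean below threshold leaves firing entirely noise-driven, which is the regime where questions of signal detectability (as in paper [53]) become interesting.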
In terms of the signal-to-noise ratio of efferent spike trains, the IF model works better in an environment of asynchronous inputs, but the Hodgkin-Huxley model has an advantage for more synchronous (correlated) inputs. In conclusion, the integrate-and-fire and HH models respond to correlated inputs in totally opposite ways. [51] Feng J., Li, G.B., Brown D., and Buxton H. (2001) Balance between four types of synaptic input for the integrate-and-fire model J. Theor. Biol. vol. 203, 61-79 We consider the integrate-and-fire model with AMPA, NMDA, GABA_A and GABA_B synaptic inputs, with model parameters based upon experimental data. An analytical approach is presented to determine when a post-synaptic balance between excitation and inhibition can be achieved. Secondly, we compare the model behaviour subject to these four types of input with its behaviour subject to conventional point process inputs. We conclude that point processes are not a good approximation, even away from exact presynaptic balance. Thirdly, numerical simulations are presented which demonstrate that we can treat NMDA and GABA_B as DC currents. Finally, we conclude that a balanced input is plausible neither presynaptically nor postsynaptically for the model and parameters we employed. [50] Feng J., and Li G. (2001) Neuronal models with current inputs J. Phys. A. vol. 34, 1649-1664 For the integrate-and-fire model and the HH model, we consider how current inputs including alpha-wave and square-wave affect their outputs. Firstly, the usual approximation is employed to approximate the models with current inputs, which reveals the difference between instantaneous and non-instantaneous (current) inputs. When the rising time of alpha-wave inputs is long or the ratio between the inhibitory and excitatory inputs is close to one, the usual approximation fails to approximate the alpha-wave inputs in the integrate-and-fire model. 
For the Hodgkin-Huxley model, the usual approach in general gives an unsatisfactory approximation. A novel approach based upon a superposition of 'coloured' and 'white' noise is then proposed to replace the usual approximation. Numerical results show that the novel approach substantially improves the approximation within widely physiologically reasonable regions of the rising time of alpha-wave inputs. [55] Feng J., and Williams P. M. (2001) The generalization error of the symmetric and scaled Support Vector Machines IEEE T. Neural Networks (in press). It is generally believed that the support vector machine (SVM) optimises the generalisation error and outperforms other learning machines. We show analytically, by concrete examples in the one-dimensional case, that the support vector machine does improve the mean and standard deviation of the generalisation error by a constant factor, compared to the worst learning machine. Our approach is in terms of extreme value theory, and both the mean and variance of the generalisation error are calculated exactly for all cases considered. We propose a new version of the SVM (scaled SVM) which can further reduce the mean of the generalisation error of the SVM. From paolo at dsi.unifi.it Fri Apr 20 07:07:21 2001 From: paolo at dsi.unifi.it (Paolo Frasconi) Date: Fri, 20 Apr 2001 13:07:21 +0200 Subject: Special issue on connectionist models for learning in structured domains Message-ID: The members of this list may be interested in the most recent issue of the IEEE Transactions on Knowledge and Data Engineering, which is a Special Issue on Connectionist Models for Learning in Structured Domains. IEEE Transactions on Knowledge and Data Engineering Vol. 13, No. 
2, March/April 2001 SPECIAL SECTION ON CONNECTIONIST MODELS FOR LEARNING IN STRUCTURED DOMAINS Abstracts can be found at http://www.dsi.unifi.it/neural/tkde-datas.html Full text is available to subscribers from the IEEE TKDE home page http://computer.org/tkde/index.htm Guest Editorial Introduction to the Special Section P. Frasconi, M. Gori, and A. Sperduti Simple Strategies to Encode Tree Automata in Sigmoid Recursive Neural Networks R.C. Carrasco and M.L. Forcada Integrating Linguistic Primitives in Learning Context-Dependent Representation S.W.K. Chan Symbolic vs. Connectionist Learning: An Experimental Comparison in a Structured Domain P. Foggia, R. Genna, and M. Vento Generalization Ability of Folding Networks B. Hammer Hierarchical Growing Cell Structures: TreeGCS V.J. Hodge and J. Austin Incremental Syntactic Parsing of Natural Language Corpora with Simple Synchrony Networks P.C.R. Lane and J.B. Henderson Learning Distributed Representations of Concepts Using Linear Relational Embedding A. Paccanaro and G.E. Hinton Clustering and Classification in Structured Data Domains Using Fuzzy Lattice Neurocomputing (FLN) V. Petridis and V.G. Kaburlasos Representation and Processing of Structures with Binary Sparse Distributed Codes D.A. Rachkovskij [Sorry, I can provide no hardcopies - for electronic copies, please contact the authors directly]. 
From Nigel.Goddard at ed.ac.uk Fri Apr 20 13:30:55 2001 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Fri, 20 Apr 2001 18:30:55 +0100 Subject: Multilevel Modeling and Simulation Workshop Message-ID: <3AE0724F.504F8147@ed.ac.uk> WORKSHOP ON MULTILEVEL NEURONAL MODELLING AND SIMULATION a Maxwell Neuroinformatics Workshop Call for Participation May 21-25, 2001, Edinburgh, Scotland http://www.anc.ed.ac.uk/workshops An emerging theme in modelling and understanding brain processes is that understanding processes at one level can be greatly enhanced by considering the process embedded in its context and by consideration of the complexities of processes operating at a much finer level of spatiotemporal resolution. The aim of this workshop is to bring together scientists with experimental, computational and theoretical approaches spanning multiple levels to provide an opportunity for interaction between methodological and phenomenological foci. One goal is to explore how abstractions at different levels are related, from molecular to system levels, with reference to both natural and artificial systems. A second goal is to discuss the nature of the computational tools needed to support effective modelling across abstractions and levels. The meeting is being organized in a small workshop style with emphasis on short presentations from invited speakers and from participants, round table discussions, and open debates on emerging topics. Time is scheduled for informal, self-organised, small-group activities. Computers will be available to support explorative work and demonstrations. In addition to the invited speakers, a limited number of places will be available to interested scientists, who will be chosen on the basis of the contribution they can make to the workshop. 
A number of places are reserved for junior faculty, postdoctoral researchers and senior graduate students who are early in a research career in the areas covered by the workshop and who could gain significantly from exposure to the workshop presentations and discussions. We expect to have travel/accommodation stipends for some of these participants who do not have their own funding to participate. Registration is via the developing Neuroinformatics portal at http://www.neuroinf.org, and further information can be found at the workshop site: http://www.anc.ed.ac.uk/workshops/Workshop1.html From psarroa at wmin.ac.uk Tue Apr 24 08:43:50 2001 From: psarroa at wmin.ac.uk (Dr Alexandra Psarrou) Date: Tue, 24 Apr 2001 12:43:50 -0000 Subject: PhD studentships in behaviour modelling and content-based indexing Message-ID: <012c01c0ccbc$36c0d920$0300a8c0@as7400> PhD STUDENTSHIPS FOR BEHAVIOUR MODELLING & CONTENT-BASED INDEXING UNIVERSITY OF WESTMINSTER HARROW SCHOOL OF COMPUTER SCIENCE Department of Artificial Intelligence and Interactive Multimedia Applications are invited for studentships leading to a PhD in the areas of behaviour modelling and content-based indexing. Starting date for the position is September/October 2001. The aims of the studentships are to pursue research in developing dynamic face and behavioural models from temporal information and their applications in indexing image and video databases. The candidates should ideally have a good first degree in one of the following subjects: Computer Science, Electronic Engineering, Mathematics or Physics. Normally, candidates are also required to have an appropriate master's degree, although exceptions can be made. Programming experience in C/C++ is essential, and knowledge of computer vision and statistical learning would be advantageous. Each post carries a bursary of 9000 pounds per annum plus home (and EU) postgraduate fees and is tenable for three years. 
There may also be opportunities to supplement the bursary income by undertaking tutorial work within the School. Applicants should send their resume and letters of recommendation to: Dr Alexandra Psarrou, Att: PhD Studentships Harrow School of Computer Science University of Westminster, Harrow Campus Watford Road, Northwick Park, Harrow HA1 3TP, UK Telephone: ++44 (020) 7911 5904 Email: psarroa at wmin.ac.uk From eric at research.nj.nec.com Tue Apr 24 16:29:28 2001 From: eric at research.nj.nec.com (Eric Baum) Date: Tue, 24 Apr 2001 16:29:28 -0400 (EDT) Subject: Postdoctoral Research Opportunity Message-ID: <15077.57737.609146.245904@yin.nj.nec.com> Postdoctoral Research Opportunity A post-doctoral position is available in the CS Division of the NEC Research Institute in Princeton NJ, USA (http://www.neci.nj.nec.com). This position is for work on machine and reinforcement learning. One project will extend ideas of the Hayek Machine (cf. papers at http://www.neci.nj.nec.com/homepages/eric/) on evolving artificial economies of agents that learn by reinforcement to perform web search and automatically personalize computing. The position is a one-year term position, possibly renewable subject to mutual agreement and funding. Candidates should have a Ph.D. in computer science or a related field, a strong background in machine learning or genetic algorithms/programming, programming experience in the Unix/C(++) environment, and should have a keen interest in building high-performance AI systems. If interested please contact Eric Baum by email (see below). 
Include - CV - List of Publications - Three selected papers - Names & addresses of three scientists who could act as reference (ascii, ps, or pdf files welcome, no MS-Word files please) -- Eric Baum | eric at research.nj.nec.com NEC Research Institute | http://www.neci.nj.nec.com/homepages/eric/ 4 Independence Way | Tel: +1 (609) 951-2712 Princeton NJ 08540 | Fax: +1 (609) 951-2488 From mschmitt at lmi.ruhr-uni-bochum.de Wed Apr 25 06:20:19 2001 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Wed, 25 Apr 2001 12:20:19 +0200 Subject: Preprints on Spiking and Product Unit Neural Networks Message-ID: <3AE6A4E3.6D02B45F@lmi.ruhr-uni-bochum.de> Dear Colleagues, the following two preprints are available on-line: "Complexity of learning for networks of spiking neurons with nonlinear synaptic interactions" http://www.ruhr-uni-bochum.de/lmi/mschmitt/nonlinear.ps.gz (8 pages gzipped PostScript), "Product unit neural networks with constant depth and superlinear VC dimension" http://www.ruhr-uni-bochum.de/lmi/mschmitt/superlinear.ps.gz (9 pages gzipped PostScript). Both papers are going to be presented in talks at the International Conference on Artificial Neural Networks ICANN 2001, August 21-25, 2001, Vienna, Austria. Regards, Michael Schmitt ------------------------------------------------------------ TITLE: Complexity of Learning for Networks of Spiking Neurons with Nonlinear Synaptic Interactions AUTHOR: Michael Schmitt ABSTRACT We study model networks of spiking neurons where synaptic inputs interact in terms of nonlinear functions. These nonlinearities are used to represent the spatial grouping of synapses on the dendrites and to model the computations performed at local branches. We analyze the complexity of learning in these networks in terms of the VC dimension and the pseudo dimension. Polynomial upper bounds on these dimensions are derived for various types of synaptic nonlinearities. 
------------------------------------------------------------ TITLE: Product Unit Neural Networks with Constant Depth and Superlinear VC Dimension AUTHOR: Michael Schmitt ABSTRACT It has remained an open question whether there exist product unit networks with constant depth that have superlinear VC dimension. In this paper we give an answer by constructing two-hidden-layer networks with this property. We further show that the pseudo dimension of a single product unit is linear. These results bear witness to the cooperative effects on the computational capabilities of product unit networks as they are used in practice. -- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: +49 234 32-23209 , Fax: +49 234 32-14465 http://www.ruhr-uni-bochum.de/lmi/mschmitt/ From cindy at cns.bu.edu Wed Apr 25 14:22:18 2001 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 25 Apr 2001 14:22:18 -0400 Subject: Neural Networks 14(4/5) Message-ID: <200104251822.OAA01897@mattapan.bu.edu> NEURAL NETWORKS 14(4/5) Contents - Volume 14, Numbers 4/5 - 2001 ------------------------------------------------------------------ CONTRIBUTED ARTICLES: ***** Psychology and Cognitive Science ***** Quantitative examinations for multi joint arm trajectory planning: Using a robust calculation algorithm of the minimum commanded torque change trajectory Yasuhiro Wada, Yuichi Kaneko, Eri Nakano, Reiko Osu, and Mitsuo Kawato ***** Neuroscience and Neuropsychology ***** Solving the binding problem of the brain with bi-directional functional connectivity Masataka Watanabe, Kousaku Nakanishi, and Kazuyuki Aihara ***** Mathematical and Computational Analysis ***** Learning from noisy information in FasArt and FasBack neuro-fuzzy systems Jose Manuel Cano Izquierdo, Yannis A. 
Dimitriadis, Eduardo Gomez Sanchez, and Juan Lopez Coronado Comparing Bayesian neural network algorithms for classifying segmented outdoor images Francesco Vivarelli and Christopher K.I. Williams Three learning phases for radial-basis-function networks Friedhelm Schwenker, Hans A. Kestler, and Gunther Palm Noise suppression in training examples for improving generalization capability Akiko Nakashima and Hidemitsu Ogawa Networks with trainable amplitude of activation functions Edmondo Trentin A model with an intrinsic property of learning higher order correlations Marifi Guler ***** Engineering and Design ***** S-TREE: Self-organizing trees for data clustering and online vector quantization Marcos M. Campos and Gail A. Carpenter The constraint based decomposition (CBD) training architecture Sorin Draghici ***** Technology and Applications ***** Life-long learning cell structures: Continuously learning without catastrophic interference Fred H. Hamker Pattern classification by a condensed neural network A. Mitiche and M. Lebidoff ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. 
Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type        INNS           ENNS             JNNS
----------------------------------------------------------------------------
Membership with        $80            660 SEK          Y 15,000
Neural Networks                                        [including 2,000 entrance fee]
  Student              $55            460 SEK          Y 13,000
                                                       [including 2,000 entrance fee]
----------------------------------------------------------------------------
Membership without     $30            200 SEK          Not available to non-students
Neural Networks                                        (subscribe through another society)
  Student                                              Y 5,000
                                                       [including 2,000 entrance fee]
----------------------------------------------------------------------------
Institutional rates    $1132          2230 NLG         Y 149,524
----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment:
[ ] Check or money order enclosed, payable to INNS or ENNS
OR
[ ] Charge my VISA or MasterCard
    card number ____________________________
    expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde P.O. 
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From cindy at cns.bu.edu Wed Apr 25 15:56:18 2001 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 25 Apr 2001 15:56:18 -0400 Subject: 5th ICCNS: Call for Registration Message-ID: <200104251956.PAA15444@retina.bu.edu> Apologies if you receive this more than once. ***** CALL FOR REGISTRATION ***** ***** AND ***** ***** FINAL INVITED PROGRAM ***** FIFTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS Tutorials: May 30, 2001 Meeting: May 31 - June 2, 2001 Boston University http://www.cns.bu.edu/meetings/ This interdisciplinary conference focuses on two fundamental questions: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? A single oral or poster session enables all presented work to be highly visible. Contributed talks will be presented on each of the three conference days. Three-hour poster sessions with no conflicting events will be held on two of the conference days. All posters will be up all day, and can also be viewed during breaks in the talk schedule. CONFIRMED INVITED SPEAKERS TUTORIAL SPEAKERS: Wednesday, May 30, 2001 Ted Adelson: The perception of surface properties Yiannis Aloimonos: What geometry and statistics tell us about the motion pathway Gail A. 
Carpenter: Adaptive resonance theory Michael Jordan: Inference and learning in graphical models INVITED SPEAKERS Thursday, May 31, 2001 Larry Abbott: Spike-timing effects in Hebbian synaptic plasticity Wulfram Gerstner: Rapid signal transmission by populations of spiking neurons Nancy Kopell: Rhythms and cell assemblies in the nervous system Wolfgang Maass: Liquid state machines: A new framework for understanding neural computation on spike trains Henry Markram: Neocortical microcircuits of perception, attention, and memory Victor Lamme: The role of recurrent processing in visual awareness Wolf Singer: Neuronal synchrony in cerebral cortex and its functional implications (keynote lecture) Friday, June 1, 2001 Ralph D. Freeman: Organization of receptive fields of neurons in the primary visual cortex Nikos Logothetis: On bistable perception David J. Heeger: Attention and sensory signals in primary visual cortex Maggie Shiffrar: The visual analysis of moving bodies Stephen Grossberg: The complementary brain: Unifying brain dynamics and modularity Allen Waxman: Multi-sensor 3D image fusion technologies Saturday, June 2, 2001 Peter L. 
Strick: Basal ganglia and cerebellar "loops" with the cerebral cortex: Motor and cognitive circuits Richard Ivry: Timing, temporal coupling, and response selection Daniel Bullock: Action selection and reinforcement learning in a model of laminar frontal cortex and the basal ganglia Christoph Schreiner: Temporal correlation and information transfer in the auditory thalamo-cortical system Rochel Gelman: Continuity and discontinuity in cognitive development: Numerical cognition as a case Maja Mataric: From what you see to what you do: Imitation in humans and humanoid robots Leon Cooper: Bi-directionally modifiable synapses: From theoretical fantasy to experimental fact (keynote lecture) REGISTRATION FORM Fifth International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 30, 2001 Meeting: May 31 - June 2, 2001 FAX: (617) 353-7755 http://www.cns.bu.edu/meetings/ (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. 
CHECK ONE: ( ) $75 Conference plus Tutorial (Regular) ( ) $50 Conference plus Tutorial (Student) ( ) $50 Conference Only (Regular) ( ) $35 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ From shastri at ICSI.Berkeley.EDU Fri Apr 27 22:07:10 2001 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Fri, 27 Apr 2001 19:07:10 PDT Subject: Episodic Memory Formation via Cortico-Hippocampal Interactions Message-ID: <200104280207.TAA06655@dill.ICSI.Berkeley.EDU> Dear Connectionists, The following article may be of interest to you. Best wishes, -- Lokendra Shastri -------------------------------------------------------------------------- http://www.icsi.berkeley.edu/~shastri/psfiles/shastri_em.pdf From Transient Patterns to Persistent Structures: A model of episodic memory formation via cortico-hippocampal interactions Lokendra Shastri International Computer Science Institute Berkeley, CA 94704 Abstract We readily remember events and situations in our daily lives and rapidly acquire memories of specific events by watching a telecast or reading a newspaper. 
There is a broad consensus that the hippocampal system (HS), consisting of the hippocampal formation and neighboring cortical areas, plays a critical role in the encoding and retrieval of such ``episodic'' memories. But how the HS subserves this mnemonic function is not fully understood. This article presents a computational model, SMRITI, that demonstrates how a cortically expressed transient pattern of activity representing an event can be transformed rapidly into a persistent and robust memory trace as a result of long-term potentiation within structures whose architecture and circuitry resemble those of the HS. Memory traces formed by the model respond to partial cues, and at the same time, reject similar but erroneous cues. During retrieval these memory traces, acting in concert with cortical circuits encoding semantic, causal, and procedural knowledge, can recreate activation-based representations of memorized events. The model explicates the representational requirements of encoding episodic memories, and suggests that the idiosyncratic architecture of the HS is well matched to the representational problems it must solve in order to support the episodic memory function. The model predicts the nature of memory deficits that would result from insult to specific HS components and to cortical circuits projecting to the HS. It also identifies the sorts of memories that must remain encoded in the HS for the long-term, and helps delineate the semantic and episodic memory distinction. (Submitted to Behavioral and Brain Sciences) From E.Koning at elsevier.nl Fri Apr 13 03:53:59 2001 From: E.Koning at elsevier.nl (Koning, Esther (ELS)) Date: Fri, 13 Apr 2001 09:53:59 +0200 Subject: CITE: Elsevier abstracts and journals available online Message-ID: <4FAD455E0BA3D31196270008C784DAE202B3B85A@elsamssonyx.elsevier.nl> Announcement: New user interface of CITE, The Computational Intelligence platform. 
CITE integrates contents and services, covering all subject areas in the field of computational intelligence. Visit CITE at: http://www.elsevier.com/cite CITE offers: - Access to the major journals in computational intelligence: Neural Networks, Artificial Intelligence, Biosystems, Fuzzy Sets and Systems, and Pattern Recognition. - An abstracts database covering recent citations and abstracts from more than 60 key journals of Elsevier Science and other publishers. - A book list, which provides you with information and reviews of new books. - Events list on forthcoming events world-wide. - Links to publishers' sites providing information on additional contents of journals in the area of computational intelligence. Note: Access to abstracts, tables of content of 60 journals, information on events, books, bibliographies and related sites is free to everyone. Your personal or your institution's subscription to Elsevier Science journals in CITE allows you to access the full-text articles of those journals. Contact: Esther Koning mailto:e.koning at elsevier.nl From zemel at cs.toronto.edu Mon Apr 2 16:40:44 2001 From: zemel at cs.toronto.edu (Richard Zemel) Date: Mon, 2 Apr 2001 16:40:44 -0400 Subject: NIPS*2001 Call For Papers Message-ID: <01Apr2.164049edt.453159-23025@jane.cs.toronto.edu> CALL FOR PAPERS -- NIPS*2001 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Dec. 3 -- Saturday, Dec. 8, 2001 Vancouver, British Columbia, Canada ========================================== This is the fifteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Dec. 
3), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 7-8). Invited speakers this year will be Barbara Finlay (Departments of Psychology, and Neurobiology and Behavior, Cornell University), Alison Gopnik (Department of Psychology, University of California at Berkeley), Jon M. Kleinberg (Department of Computer Science, Cornell University), Shihab Shamma (Department of Electrical Engineering University of Maryland), Judea Pearl (Department of Computer Science, UCLA), and Tom Knight (Artificial Intelligence Laboratory, MIT). Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, kernel methods, committee models, graphical models, support vector machines, Gaussian processes, decision trees, factor analysis, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, web and network applications, intrusion and fraud detection, bio-informatics, medical diagnosis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music and video applications, animation, virtual environments. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, human learning and memory, conditioning, categorization, attention, language, reasoning, spatial cognition, emotional cognition, neurophilosophy, problem solving and planning. 
Implementations: analog and digital VLSI, neuromorphic engineering, microelectromechanical systems, optical systems, vision chips, head-eye systems, cochlear implants, roving robots, computational sensors and actuators, molecular and quantum computing, novel neurodevices, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, population codes, temporal codes, spike train analysis, and experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, computational models of classical and operant conditioning, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, game-playing, Markov decision processes, multi-agent cooperative algorithms. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory. Visual Processing: image processing, image coding, object recognition, face recognition, visual feature detection, visual psychophysics, stereopsis, optic flow algorithms, motion detection and tracking, spatial representations, spatial attention, scene analysis, visual search, visuo-spatial working memory. ---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. 
Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within an 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are required to use the NIPS LaTeX style files obtainable from the web page listed below. The style files are unchanged from NIPS*2000. Submission Instructions: NIPS accepts only electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We will accept postscript and PDF documents, but we prefer postscript. The electronic submission page will be available on June 6, 2001. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT JUNE 20, 2001 PACIFIC DAYLIGHT TIME (08:00 GMT JUNE 21, 2001). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS For general inquiries or requests for registration material, send e-mail to nipsinfo at salk.edu or fax to (619)587-0417. 
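[Editor's illustration, not part of the original posting.] The page-format constraints stated in the CFP above could be approximated in a plain LaTeX preamble as follows; this is illustrative only, since the CFP requires the official NIPS style files, and the lengths here are simply the stated limits, not the style files' actual settings:

```latex
% Illustrative sketch -- NOT a substitute for the required NIPS*2001
% style files. Approximates the stated limits: 10pt minimum font,
% text confined to a 5in-wide by 8.25in-high rectangle.
\documentclass[10pt]{article}
\setlength{\textwidth}{5in}
\setlength{\textheight}{8.25in}
\begin{document}
% Body: up to seven pages, including figures and references.
\end{document}
```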
NIPS*2001 Organizing Committee: General Chair, Tom Dietterich, Oregon State University; Program Chair, Sue Becker, McMaster University; Publications Chair, Zoubin Ghahramani, University College London; Tutorial Chair, Yoshua Bengio, University of Montreal; Workshops Co-Chairs, Virginia de Sa, Sloan Center for Theoretical Neurobiology, Barak Pearlmutter, University of New Mexico; Publicity Chair, Richard Zemel, University of Toronto; Volunteer Coordinator, Sidney Fels, University of British Columbia; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Alex Gray, Carnegie Mellon University, Xin Wang, Oregon State University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. NIPS*2001 Program Committee: Sue Becker, McMaster University (chair); Gert Cauwenberghs, Johns Hopkins University; Bill Freeman, Mitsubishi Electric Research Lab; Thomas Hofmann, Brown University; Dan Lee, Bell Laboratories, Lucent Technologies; Sridhar Mahadevan, Michigan State University; Marina Meila-Predoviciu, University of Washington; Klaus Mueller, GMD First, Berlin; Klaus Obermayer, TU Berlin; Sam Roweis, Gatsby Computational Neuroscience Unit, UCL; John Shawe-Taylor, Royal Holloway, University of London; Josh Tenenbaum, Stanford University; Volker Tresp, Siemens, Munich; Richard Zemel, University of Toronto. PAPERS MUST BE SUBMITTED BY JUNE 20, 2001 From mieko at isd.atr.co.jp Mon Apr 2 22:15:13 2001 From: mieko at isd.atr.co.jp (Mieko Namba) Date: Tue, 3 Apr 2001 11:15:13 +0900 Subject: CALL FOR PAPERS [Neural Networks 2002 Special Issue] Message-ID: Dear members, We are glad to inform you that the Japanese Neural Networks Society will edit the NEURAL NETWORKS 2002 Special Issue as below. 
NEURAL NETWORKS is the official journal of the International Neural Network Society, the European Neural Network Society, and the Japanese Neural Network Society. We look forward to receiving your contributions. Mitsuo Kawato Co-Editor-in-Chief Neural Networks (ATR) ****************************************************************** CALL FOR PAPERS Neural Networks 2002 Special Issue "Computational Models of Neuromodulation" ****************************************************************** Co-Editors Dr. Kenji Doya, ATR, Japan Dr. Peter Dayan, University College London, U.K. Professor Michael E. Hasselmo, Boston University, U.S.A. Submission Deadline for submission: September 30, 2001 Notification of acceptance: January 31, 2002 Format: as for normal papers in the journal (APA format) and no longer than 10,000 words Address for Papers Dr. Mitsuo Kawato ATR 2-2-2 Hikaridai, Seika-cho Soraku-gun, Kyoto 619-0288, Japan. MORE DETAIL: http://www.isd.atr.co.jp/nip/NNSP2002.html ****************************************************************** Neuromodulators such as acetylcholine, dopamine, norepinephrine and serotonin exert widespread and diverse computational influences, based on a range of subtle cellular effects and a non-specific and diffuse pattern of anatomical connectivity. The roles of neuromodulators can often be characterised in terms of meta-learning, that is, regulation of global parameters and the structure of a learning system. Their specific roles include the prediction of reward and punishment, the allocation of selective attention, the regulation of behavioral variability, and the control of memory acquisition. Drugs influencing these modulatory systems have profound effects on neural network dynamics and plasticity, and thus on cognition and behavior. Computational modeling is essential to understand the effects of such subtle cellular changes on the macroscopic function of neural networks. 
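As a concrete illustration of the meta-learning view described above (this sketch is not part of the call itself), the reward-prediction role of a neuromodulator is commonly formalized as a temporal-difference (TD) error, with the learning rate acting as a globally regulated meta-parameter. A minimal, self-contained sketch in Python; all names are hypothetical:

```python
# Illustrative sketch only: TD(0) value learning on a reward chain.
# The TD error delta plays the "reward prediction" role the call describes,
# and the global learning rate alpha stands in for a neuromodulatory
# meta-parameter that scales plasticity everywhere in the network.

def td0_chain(n_states=5, alpha=0.1, gamma=0.9, episodes=2000):
    """Learn state values on a chain s0 -> s1 -> ... with reward 1 at the end."""
    V = [0.0] * n_states
    for _ in range(episodes):
        for s in range(n_states):
            r = 1.0 if s == n_states - 1 else 0.0        # reward only at the last state
            v_next = 0.0 if s == n_states - 1 else V[s + 1]
            delta = r + gamma * v_next - V[s]            # reward-prediction (TD) error
            V[s] += alpha * delta                        # alpha = global meta-parameter
    return V

V = td0_chain()
# After learning, V[s] approaches gamma**(n_states - 1 - s): each state
# predicts the discounted future reward, and the TD error shrinks toward zero.
```

Theories of this kind map global parameters such as the learning rate, the exploration noise, and the discount factor onto specific neuromodulatory systems; the sketch shows only the simplest case, tuning of value predictions under a fixed meta-parameter.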
The Special Issue will focus on computational models of neuromodulators. Contributed articles covering the range of neuromodulatory effects are solicited, and integrative accounts will be especially welcome. The special issue will include theories of meta-learning, such as automatic tuning of learning rates and noise for exploration, and models of the roles of particular neuromodulators in particular systems, such as acetylcholine in the hippocampus and dopamine in the pre-frontal cortex. Models of the activities and interactions of cells releasing the neuromodulators, their roles in behavioral and cognitive functions, and models of invertebrate neuromodulation will also be welcome. ****************************************************************** end. From duch at phys.uni.torun.pl Tue Apr 3 06:02:08 2001 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Tue, 3 Apr 2001 11:02:08 +0100 Subject: CALL FOR PAPERS [TASK Quarterly 2002 Special Issue] Message-ID: Dear members, The Polish Neural Networks Society plans a special issue of the TASK Quarterly Journal on neural networks. Of particular interest to us are longer papers of a tutorial nature, perhaps extended versions of papers that you prepared for other journals or conferences but had to shorten, leaving some important details unpublished. ****************************************************************** CALL FOR PAPERS TASK Quarterly 2002 Special Issue "Neural Networks" ****************************************************************** This special issue will be devoted to neural and other computational intelligence methods and applications. 
Special emphasis will be put on: models that do more than classification and approximation; systems that go beyond the associative capabilities of simple networks, for example systems not based on vectors in feature spaces of fixed dimensions; understanding the data and building theories; modular networks; multi-strategy learning; general theories of learning; hybrid systems, such as neuro-fuzzy systems combining neural and symbolic components; and simulations of brain functions and models with spiking neurons. Deadline for submission: September 30, 2001 Notification of acceptance: January 31, 2002 Format: no restrictions on length, detailed instructions on the journal WWW page: http://www.task.gda.pl/quart/ Editors Prof. Wlodzislaw Duch, Nicholas Copernicus University, Poland. Prof. Danuta Rutkowska, Technical University of Czestochowa, Poland. Send your papers via email to: duch at ieee.org ****************************************************************** The TASK Quarterly journal http://www.task.gda.pl/quart/ started in 1997 and is maintained at a very decent level, with a strict refereeing system. The journal covers all subjects of a computational nature, and accepts and prints color illustrations (free of charge). It is covered by INSPEC abstracts and has a regular subscription base of about 400 (not bad for a scientific journal). So far two special issues on informatics in medicine have been published. The editors can send you sample copies if you send an email to the quarterly at task.gda.pl address. ****************************************************************** Wlodzislaw Duch Dept. of Computer Methods, N. 
Copernicus University http://www.phys.uni.torun.pl/~duch From duch at phys.uni.torun.pl Tue Apr 3 12:52:09 2001 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Tue, 3 Apr 2001 17:52:09 +0100 Subject: Review of PNL NN site Message-ID: Dear members, Very nice neural networks web information has been provided for some time at the PNL site http://www.emsl.pnl.gov:2080/proj/neuron/neural/ IEEE Transactions on Neural Networks intends to publish reviews of interesting neural sites and other media. Unfortunately, for more than a year I was not able to get any response from the site developers. Please let me know if there are any volunteers to review this site. You are welcome to suggest other sites for reviewing. I'd like to thank all book review volunteers who responded to my previous call. Please be assured that whenever an appropriate book arrives I'll contact you. Wlodzislaw Duch, duch at ieee.org Dept. of Computer Methods, N. Copernicus University http://www.phys.uni.torun.pl/~duch From robert at physik.uni-wuerzburg.de Tue Apr 3 08:03:21 2001 From: robert at physik.uni-wuerzburg.de (Robert Urbanczik) Date: Tue, 3 Apr 2001 14:03:21 +0200 (CEST) Subject: Paper on SVMs Message-ID: Dear Connectionists, the following paper (5 pages, to appear in Phys. Rev. Letts.) is available from: ftp://ftp.physik.uni-wuerzburg.de/pub/preprint/2001/WUE-ITP-2001-006.ps.gz M. Opper and R. Urbanczik Universal learning curves of support vector machines ABSTRACT: Using methods of Statistical Physics, we investigate the role of model complexity in learning with support vector machines (SVMs), which are an important alternative to neural networks. We show the advantages of using SVMs with kernels of infinite complexity on noisy target rules, which, in contrast to common theoretical beliefs, are found to achieve optimal generalization error although the training error does not converge to the generalization error. 
Moreover, we find universal asymptotics of the learning curves, which depend only on the target rule and not on the SVM kernel. _________________________________________________________________________ R. Urbanczik, Inst. for Theoretical Physics III, University Wuerzburg, Am Hubland, 97074 Wuerzburg, Germany. Email: urbanczik at physik.uni-wuerzburg.de Phone: ++49 931 888 4908 Fax: ++49 931 888 5141 _________________________________________________________________________ From Nigel.Goddard at ed.ac.uk Tue Apr 3 18:21:05 2001 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Tue, 03 Apr 2001 23:21:05 +0100 Subject: fMRI Research Associate and Ph.D. Studentships Message-ID: <3ACA4CD1.687A9068@ed.ac.uk> OPPORTUNITIES IN FUNCTIONAL MRI & NEUROINFORMATICS AT EDINBURGH The Centre for Functional Imaging Studies has the following position and studentships available. For details see http://anc.ed.ac.uk/CFIS. 1. Research Associate in Functional MRI. An outstanding opportunity to gain experience in all of the techniques used in fMRI, across a wide range of studies. Closing April 13th. Intending applicants should consult the website and contact Nigel.Goddard at ed.ac.uk by email as soon as possible. 2. A possible MRC-funded Ph.D. studentship in memory impairment in schizophrenia. Closing April 29th. 3. An MRC-funded Ph.D. studentship with a superior stipend, focused on methodology in the context of one or more of our ongoing fMRI studies. Closing May 1st. The Centre for Functional Imaging Studies at the University of Edinburgh has been established to provide a focal grouping for expertise and experience in the methodologies used in functional brain imaging. The Centre undertakes and assists with imaging-based studies of brain function, working with research groups at Edinburgh and elsewhere. 
Our current focus is studies of cognitive function using functional Magnetic Resonance Imaging in collaboration with the SHEFC Brain Imaging Research Centre at the Western General Hospital, which houses the research-dedicated MRI scanner (see the BIRC website at http://www.dcn.ed.ac.uk/bic). We have extensive facilities for fMRI studies including a parallel computer, high-speed networking, a large online data archive, state-of-the-art stimulus presentation and paradigm programming software and hardware, and a custom-built simulator. We have a wide range of experience in paradigm design, with ongoing projects including clinical studies of schizophrenia, depression, pain, conversion and sleep disorders; scientific studies in language, decision-making, memory and affect; and methodological studies in real-time fMRI, repeatability, and statistical data analysis. From P.J.Lisboa at livjm.ac.uk Wed Apr 4 07:33:01 2001 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Wed, 4 Apr 2001 12:33:01 +0100 Subject: Industrial use of safety-related artificial neural networks Message-ID: A contract research report on the industrial use of safety-related artificial neural networks has been published on the web by the contractors, the UK's Health and Safety Executive. A link address and abstract are appended to this email. This is in the nature of a consultation paper, so feedback regarding any aspect of the paper is very welcome. Paulo Lisboa. http://www.hse.gov.uk/research/crr_pdf/2001/crr01327.pdf Abstract The overall objective of this study is to investigate to what extent neural networks are used, and are likely to be used in the near future, in safety-related applications. Neural network products are actively being marketed and some are routinely used in safety-related areas, including cancer screening and fire detection in office blocks. Some are medical devices already certified by the FDA. 
The commercial potential for this technology is evident from the extent of industry-led research, and safety benefits will arise. In the process industries, for instance, there is real potential for closer plant surveillance and consequently productive maintenance, including plant life extension. It is clear from the applications reviewed that the key to the successful transfer of neural networks to the marketplace is successful integration with routine practice, rather than optimisation for the idealised environments where much of the current development effort takes place. This requires the ability to evaluate their empirically derived response using structured domain knowledge, as well as performance testing. In controller design, the scalability of solutions to production models, and the need to maintain safe and efficient operation under plant wear, have led to the integration of linear design methods with neural network architectures. Further research is necessary in two directions: first, to systematise current best practice in the design of a wide range of quite different neural computing software models and hardware systems; then, to formulate a unified perspective of high-complexity computation in safety-related applications. There is a need to develop guidelines for good practice, to educate non-specialist users, and to inform what is already a wide base of practitioners. Combined with a safety awareness initiative, this would be of as much benefit to the development of this commercially important new technology as to its safe use in safety-related applications. 
From patrick at neuro.kuleuven.ac.be Wed Apr 4 09:00:43 2001 From: patrick at neuro.kuleuven.ac.be (Patrick De Maziere) Date: Wed, 4 Apr 2001 15:00:43 +0200 (MET DST) Subject: JOINT PUBLICATION OF 2 BOOKS ON SELF-ORGANIZING TOPOGRAPHIC MAPS Message-ID: JOINT PUBLICATION OF 2 BOOKS ON SELF-ORGANIZING TOPOGRAPHIC MAPS ================================================================ Both books, one in English and one in Japanese, offer a new perspective on topographic map formation and the advantages of information-based learning. The complete learning algorithms and simulation details are given throughout, along with comparative performance analysis tables and extensive references. The books provide an excellent, eye-opening guide for neural network researchers and students, industrial scientists involved in data mining, and anyone interested in self-organization and topographic maps. Forewords are by Prof. Teuvo Kohonen and Prof. Helge Ritter. English version: Faithful Representations and Topographic Maps: From Distortion- to Information-based Self-organization by Marc M. Van Hulle, published by J. Wiley (New York) For more information, visit: http://catalog2.wiley.com/catalog/frameset/1,8279,,00.html (search via author name "Van Hulle") and to order it, visit: http://www.amazon.com/exec/obidos/ASIN/0471345075/qid=948382599/sr=1-1/002-0713799-7248240 (which includes several reviews) ------------------------- Japanese version: Self-organizing Maps: Theory, Design, and Application by Marc M. 
Van Hulle, Heizo Tokutaka, Kikuro Fujimura, published by Kaibundo (Tokyo) For more information, visit: http://member.nifty.ne.jp/kaibundo/syousai/ISBN4-303-73150-1.htm and to order it, visit: http://www.amazon.co.jp/exec/obidos/ASIN/4303731501/qid%3D986382452/249-5665754-6235519 ------------------------- Reviews: "I am convinced that this book marks an important contribution to the field of topographic map representations and that it will become a major reference for many years." (Ritter) "This book will provide a significant contribution to our theoretical understanding of the brain." (Kohonen) ------------------------- From jaap at murre.com Wed Apr 4 12:09:07 2001 From: jaap at murre.com (Jaap Murre) Date: Wed, 4 Apr 2001 18:09:07 +0200 Subject: Models of language acquisition Message-ID: <003301c0bd21$94852ca0$03000004@psy.uva.nl> Dear Connectionists, Recently two new books edited by us came out that may be of interest to you, in particular the first one. -- Jaap Murre Broeder, P., and J.M.J. Murre (Eds.) (2000). 'Models of Language Acquisition: Inductive and Deductive Approaches'. Oxford University Press. Broeder, P., and J.M.J. Murre (Eds.) (1999). 'Language and Thought in Development. Cross-Linguistic Studies'. Tuebingen: Gunter Narr. The contents of 'Models of Language Acquisition' is: Peter Broeder and Jaap Murre -- 1. Introduction to models of language acquisition Brian MacWhinney -- 2. Lexicalist connectionism Noel Sharkey, Amanda Sharkey, and Stuart Jackson -- 3. Are SRNs sufficient for modelling language acquisition? Antal van den Bosch and Walter Daelemans -- 4. A distributed, yet symbolic model for text-to-speech processing Steven Gillis, Walter Daelemans, and Gert Durieux -- 5. 'Lazy learning': a comparison of natural and machine learning of word stress Richard Shillcock, Paul Cairns, Nick Chater, and Joe Levy -- 6. Statistical and connectionist modelling of the development of speech segmentation Jeffrey Mark Siskind -- 7. 
Learning word-to-meaning mappings Gary Marcus -- 8. Children's overregularization and its implication for cognition Rainer Goebel and Peter Indefrey -- 9. The performance of a recurrent network with short term memory capacity learning the German -s plural Ramin Nakisa, Kim Plunkett, and Ulrike Hahn -- 10. A cross-linguistic comparison of single and dual-route models of inflectional morphology Partha Niyogi and Robert C. Berwick -- 12. Formal models for learning in the principles and parameters framework Loeki Elbers -- 13. An output-as-input hypothesis for language acquisition: arguments, model, evidence From Volker.Tresp at mchp.siemens.de Thu Apr 5 05:35:52 2001 From: Volker.Tresp at mchp.siemens.de (Volker Tresp) Date: Thu, 05 Apr 2001 11:35:52 +0200 Subject: NIPS Proceedings Available Online Message-ID: <3ACC3C78.BC7FE1A1@mchp.siemens.de> NIPS PROCEEDINGS AVAILABLE ONLINE The papers that will be published by MIT Press in Advances in Neural Information Processing Systems 13 (Proceedings of the 2000 Conference) edited by Todd K. Leen, Thomas G. Dietterich and Volker Tresp are available online. The URL is http://www.cs.cmu.edu/Web/Groups/NIPS/NIPS2000/00abstracts.html Best regards, Volker Tresp NIPS publication chair for NIPS*2000. From mel at lnc.usc.edu Thu Apr 5 17:00:35 2001 From: mel at lnc.usc.edu (Bartlett Mel) Date: Thu, 05 Apr 2001 14:00:35 -0700 Subject: Faculty Position(s) in Neural Engineering Message-ID: <3ACCDCF3.38D8F4F3@lnc.usc.edu> ************* Announcing ************** University of Southern California Department of Biomedical Engineering FACULTY POSITIONS The Department of Biomedical Engineering at USC is engaged in a major expansion of its research and educational programs, supported through the School of Engineering, the Alfred E. Mann Institute and a Special Opportunity Award from the Whitaker Foundation. 
For the first phase of this expansion we invite applications for tenure-track faculty positions at all levels in the areas of device/diagnostic technologies and neural systems. In the area of neural systems the successful candidate will be expected to complement the Department's existing strengths by establishing an independent research program in areas such as: computational neural science; sensory systems; motor control; neural/device interfaces; neural prostheses. The successful candidates in device/diagnostic technologies will provide leadership in establishing research and educational programs leading to the next generation of biomedical device and diagnostic technologies. Areas of interest include molecular and chemical sensing; biochemical, mechanical, optical microsystems; smart sensor and diagnostic technologies; implantable devices. Faculty will have the opportunity to work with the technology development professionals of the Alfred E. Mann Institute for Biomedical Engineering at USC, to translate their fundamental research discoveries into commercially viable biomedical technologies to improve human health and well-being. Applicants should submit a curriculum vitae and research/education statement along with suggested references to: David Z. D'Argenio, Chair Department of Biomedical Engineering University of Southern California Los Angeles, CA 90089-1451 Applicants are encouraged to visit the following web sites for details on current educational and research programs. BME Department - http://bme.usc.edu Center for Neural Engineering - http://www.usc.edu/dept/engineering/CNE Neuroscience Graduate Program - http://www.usc.edu/dept/nbio/ngp Neural Computation at USC - http://www-slab.usc.edu/neurocomp The University of Southern California is an Equal Opportunity/Affirmative Action Employer and Encourages Applications from Women and Minority Candidates. -- Bartlett W. 
Mel, Assoc Prof Biomed Engin, Neurosci Grad Prog USC, BME Dept, MC 1451, Los Angeles, CA 90089 mel at usc.edu, http://LNC.usc.edu voice: (213)740-0334, lab: -3397, fax: -0343 fedex: 3650 McClintock Ave, 500 Olin Hall From bokil at physics.bell-labs.com Fri Apr 6 18:48:44 2001 From: bokil at physics.bell-labs.com (Hemant Bokil) Date: Fri, 6 Apr 2001 18:48:44 -0400 (EDT) Subject: WAND 2001 Woods Hole Message-ID: WORKSHOP ON THE ANALYSIS OF NEURAL DATA Modern methods and open issues in the analysis and interpretation of multivariate time series and imaging data in the neurosciences August 20 -- September 1, 2001 Marine Biological Laboratory, Woods Hole, MA A working group of scientists committed to quantitative approaches to problems in neuroscience will meet again this summer to focus on theoretical and experimental issues related to the analysis of single and multichannel data sets. As in past years, we expect that a distinguishing feature of the work group will be a close collaboration between experimentalists and theorists with regard to the analysis of data. There will be a limited number of research lectures, supplemented by tutorials on relevant computational, experimental, and mathematical techniques. The topics covered will include the analysis of point process data (spike trains) as well as continuous processes (LFP, imaging data). It has become clear in recent years that issues relating to the "neural code" can be concretely investigated in the context of neural prosthetic devices. We are therefore planning two miniworkshops: (i) Neuronal control signals for prosthetic devices. (ii) Timing issues: departures of spike trains from rate-varying Poisson processes. 
We will also have a third miniworkshop on (iii) Statistical inference for fMRI time series. We should be able to accommodate about twenty-five participants, both experimentalists and theorists, and we encourage graduate students, postdoctoral fellows, and senior researchers to apply. Experimentalists are encouraged to bring data records; appropriate computational facilities will be provided. PARTICIPANT FEE: $300 Participants will be provided with shared accommodations and board. Support: National Institutes of Health -- NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA and NINDS. Organizers: Partha P. Mitra (Bell Laboratories, Lucent Technologies), Emery Brown (Massachusetts General Hospital) and David Kleinfeld (UCSD) Website: www.vis.caltech.edu/~WAND Application: Send a copy of your c.v. together with a cover letter that contains a brief (ca. 200 word) paragraph on why you wish to attend the work group to: Ms. Jean B. Ainge Bell Laboratories, Lucent Technologies 700 Mountain Avenue 1D-427 Murray Hill, NJ 07974 908-582-4702 (fax) or Graduate students and postdoctoral fellows are encouraged to include a brief letter of support from their research advisor. Applications must be received before 25 May 2001. Participants will be confirmed on or before 1 June 2001. From vaina at engc.bu.edu Sun Apr 8 19:53:31 2001 From: vaina at engc.bu.edu (Lucia M. Vaina) Date: Sun, 8 Apr 2001 19:53:31 -0400 Subject: OPTIC FLOW AND BEYOND: A Boston Area Meeting/May 23 Message-ID: OPTIC FLOW AND BEYOND: A Boston Area Meeting May 23, 2001 Organizers: Lucia M. Vaina, Scott A. Beardsley, Simon Rushton The intention of this one-day meeting is to bring together people working in the areas of optic flow and/or visually guided locomotion to discuss current research and key issues in the field. 
The meeting will be hosted by the Brain and Vision Research Laboratory, Department of Biomedical Engineering, Boston University, and will be held May 23 (9am-6pm) Engineering Research Building (ERB) Room 203 44 Cummington Street Boston, MA 02215 REGISTRATION: DEADLINE MAY 10TH (registration is free) If you are interested in participating please submit a one-page position paper by May 10th. Position papers should be sent by e-mail to Lucia Vaina, vaina at engc.bu.edu. If you are interested in attending the meeting but do not wish to present, please send an e-mail as well to RSVP. We encourage all participants to send suggestions for discussion points. Submissions received by May 10 will be posted on the Brain and Vision Research Laboratory website by May 15, together with a detailed schedule: http://www.bu.edu/eng/labs/bravi/ under OPTIC FLOW AND BEYOND. *************************************************************** Tentative schedule: 9am Coffee and bagels 10am-1pm Short presentations from those who have submitted position papers. (Presentation time will be EQUALLY divided among all speakers.) 1pm-2:30pm Lunch 2:30pm-5:30pm Discussion of key points in current research 5:30pm-6pm Discuss the publication of a book with contributions from the participants. ************************************************************** FORMAT: The morning presentations will consist of short summaries of the key points of each speaker's research (suggested length: ~3-5 slides). During the afternoon discussion section, slides may also be used. Please volunteer if you wish to lead a discussion topic! The room has approximately 50 seats, together with a computer projector and audio-visual equipment. Slides and overhead projectors are available upon request. ACCOMMODATIONS: While we are unable to cover accommodations for those who attend, breakfast and lunch will be offered to all participants. 
For additional information contact Lucia or Scott at 617-353-9144 or e-mail: vaina at engc.bu.edu We look forward to hearing from you, Scott, Simon and Lucia From cjlin at csie.ntu.edu.tw Sun Apr 8 20:02:53 2001 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Mon, 09 Apr 2001 08:02:53 +0800 Subject: a new paper on SVM Message-ID: Dear Colleagues: We announce a new paper on support vector machines: A comparison of methods for multi-class support vector machines by Chih-Wei Hsu and Chih-Jen Lin. http://www.csie.ntu.edu.tw/~cjlin/papers/multisvm.ps.gz Abstract: Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend them for multi-class classification is still an on-going research issue. Several methods have been proposed in which a multi-class classifier is typically constructed by combining several binary classifiers. Some authors have also proposed methods that consider all classes of data at once. As solving multi-class problems is computationally more expensive, comparisons of these methods on large-scale problems have not been seriously conducted. Especially for methods solving multi-class SVM in one step, a much larger optimization problem is required, so up to now experiments have been limited to small data sets. In this paper we give decomposition implementations for two such ``all-together'' methods: (Vapnik 1998; Weston and Watkins 1998) and (Crammer and Singer 2000). We then compare their performance with three methods based on binary classification: ``one-against-all,'' ``one-against-one,'' and DAGSVM (Platt et al. 2000). Our experiments indicate that the ``one-against-one'' and DAG methods are more suitable for practical use than the other methods. Results also show that for large problems the method of considering all data at once in general needs fewer support vectors. Any comments are very welcome. Best, Chih-Jen Lin Dept. of Computer Science National Taiwan Univ. 
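The "one-against-one" strategy that the abstract finds most suitable for practical use can be illustrated in a few lines. The sketch below is not from the paper: it keeps only the combination logic, substituting a trivial nearest-centroid rule for the binary SVM solver. One binary classifier is trained per class pair, and prediction is by majority vote; all names are hypothetical.

```python
# Minimal sketch of the one-against-one multi-class scheme: train one binary
# classifier per pair of classes, predict by majority vote over the pairs.
# A nearest-centroid rule stands in for the binary SVM; the voting logic is
# the point, not the base classifier.
from itertools import combinations

def centroid(points):
    d = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(d)]

def train_ovo(X, y):
    """Return {(a, b): (centroid_a, centroid_b)} for every class pair a < b."""
    classes = sorted(set(y))
    models = {}
    for a, b in combinations(classes, 2):
        Xa = [x for x, lab in zip(X, y) if lab == a]
        Xb = [x for x, lab in zip(X, y) if lab == b]
        models[(a, b)] = (centroid(Xa), centroid(Xb))
    return models

def predict_ovo(models, x):
    votes = {}
    for (a, b), (ca, cb) in models.items():
        da = sum((xi - ci) ** 2 for xi, ci in zip(x, ca))
        db = sum((xi - ci) ** 2 for xi, ci in zip(x, cb))
        winner = a if da <= db else b            # binary decision for this pair
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=lambda c: votes[c])    # majority vote over all pairs

# Three well-separated 2-D classes:
X = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
y = [0, 0, 1, 1, 2, 2]
m = train_ovo(X, y)
print(predict_ovo(m, (0.2, 0.5)))   # -> 0
```

For k classes this scheme trains k(k-1)/2 binary problems, but each one uses only the data of two classes, which is part of what makes it attractive on large problems compared with solving one big optimization over all classes at once.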
From swilke at physik.uni-bremen.de Mon Apr 9 08:18:29 2001 From: swilke at physik.uni-bremen.de (Stefan Wilke) Date: Mon, 09 Apr 2001 14:18:29 +0200 Subject: Paper on Population Coding available Message-ID: <3AD1A894.701C4612@physik.uni-bremen.de> Dear Connectionists, the following preprint is available for downloading: http://www-neuro.physik.uni-bremen.de/institute/publications/download/swilke/WilkeEurich2001-NeuralComp.pdf Stefan D. Wilke & Christian W. Eurich Representational Accuracy of Stochastic Neural Populations. Neural Computation, in press. Abstract: Fisher information is used to analyze the accuracy with which a neural population encodes D stimulus features. It turns out that the form of response variability has a major impact on the encoding capacity and therefore plays an important role in the selection of an appropriate neural model. In particular, in the presence of baseline firing, the reconstruction error rapidly increases with D in the case of Poissonian noise, but not for additive noise. The existence of limited-range correlations of the type found in cortical tissue yields a saturation of the Fisher information content as a function of the population size only for an additive noise model. We also show that random variability in the correlation coefficient within a neural population, as found empirically, considerably improves the average encoding quality. Finally, the representational accuracy of populations with inhomogeneous tuning properties, either with variability in the tuning widths or fragmented into specialized subpopulations, is superior to the case of identical and radially symmetric tuning curves usually considered in the literature. -- Stefan D. 
Wilke Institut fuer Theoretische Physik (NW1) Universitaet Bremen Postfach 330 440 D-28334 Bremen, Germany Phone: +49 (421) 218 4524 WWW: http://www-neuro.physik.uni-bremen.de/~swilke From maggini at dii.unisi.it Tue Apr 10 03:42:48 2001 From: maggini at dii.unisi.it (Marco Maggini) Date: Tue, 10 Apr 2001 09:42:48 +0200 (CEST) Subject: CfP: LFTNC 2001 - Poster session Message-ID:

****************************************************
*                                                  *
*                    LFTNC 2001                    *
*                                                  *
*        NATO ADVANCED RESEARCH WORKSHOP ON        *
*          LIMITATIONS AND FUTURE TRENDS           *
*              IN NEURAL COMPUTATION               *
*                                                  *
*                   SIENA, ITALY                   *
*               22 - 24 OCTOBER 2001               *
*                                                  *
*                  Poster Session                  *
*                 Call for papers                  *
*                                                  *
****************************************************

The poster session is an event within the NATO ARW LFTNC 2001. It will give participants the opportunity to present their own contributions on the topics of the workshop. These contributions will complement the view of future trends in neural computation given by the key speakers, who will present critical issues and proposals for promising research directions in the coming years. Topics for contributions include:

* limitations of neural computation
* complexity issues in the continuum
* continuous optimisation based learning
* generalisation
* real-world applications showing the benefits of neural approaches
* integration of neural models with knowledge-based models

The papers selected by the program committee will be published in separate proceedings titled "LFTNC-SC 2001 - 2001 NATO ARW on Limits and Future Trends of Neural Computing". Refer to the ARW web page for further details: http://www.ewh.ieee.org/soc/im/2001/lftnc/

Submission procedure
--------------------
Authors should submit an extended abstract in English, not exceeding 8 pages (A4 paper, 12 pt text, double spacing) including tables, figures and references. Submissions should include the authors' names and affiliations.
The corresponding author should be clearly identified, providing his/her mail address, telephone and fax numbers, and email address. The extended abstract must be sent to Prof. Marco Maggini by e-mail (maggini at dii.unisi.it). The message body must contain the following information: title, list of authors and their affiliations, keywords, a short abstract, and contact information for the corresponding author. The subject of the e-mail should be "LFTNC2001 Submission". The paper should be attached to the message as a postscript or pdf file. Long files should be compressed before emailing, using compress, pkzip, or winzip. Please do not send Word or LaTeX files. Corresponding authors will be notified by email within one week of receipt of the submission. A paper number will be provided in the receipt to identify each paper. The list of received papers will also be posted on the web site of LFTNC 2001. Acceptance/rejection will be notified by JULY 8, 2001. A postscript or pdf file of the final version of the accepted papers will be due by SEPTEMBER 8, 2001. Visit the author's kit page on the workshop web site for detailed instructions concerning the preparation of the final manuscript. Papers will be included in the proceedings only if at least one author has registered for the NATO ARW LFTNC 2001 by the given deadline.

Important dates
---------------
Submission of extended abstracts: June 8, 2001
Notification of acceptance: July 8, 2001
Author registration: July 31, 2001
Camera ready due: September 8, 2001

Contacts
--------
Poster session chair
Prof.
Marco Maggini Dipartimento di Ingegneria dell'Informazione Università di Siena Via Roma 56 I-53100 - Siena (Italy) Tel: +39 0577 233696 Fax: +39 0577 233602 e-mail: maggini at dii.unisi.it From d.mareschal at bbk.ac.uk Tue Apr 10 07:16:59 2001 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Tue, 10 Apr 2001 12:16:59 +0100 Subject: PhD position in psychology/cognitive science Message-ID: PLEASE BRING TO THE ATTENTION OF RELEVANT PEOPLE A 3-year PhD position funded by the European Commission has recently become available for a student interested in implicit learning and its relationship to conscious awareness. The project will be carried out under the supervision of Dr. Axel Cleeremans in Brussels. Interested candidates should contact Professor Robert FRENCH directly at the following address for more information: Robert M. French, Ph.D Quantitative Psychology and Cognitive Science Psychology Department (B32) University of Liege 4000 Liege, Belgium Tel: (32.4) 366.20.10 (work) FAX: (32.4) 366.28.59 email: rfrench at ulg.ac.be URL: http://www.fapse.ulg.ac.be/Lab/cogsci/rfrench.html Best Regards, Denis Mareschal ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 (0)20 7631-6582/6207 fax +44 (0)20 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From s.perkins at lanl.gov Tue Apr 10 19:05:24 2001 From: s.perkins at lanl.gov (Simon Perkins) Date: Tue, 10 Apr 2001 17:05:24 -0600 Subject: Post-doc job opening Message-ID: <3AD391B4.78480384@lanl.gov> POSTDOCTORAL POSITION IN MACHINE LEARNING THEORY AND APPLICATIONS Space and Remote Sensing Sciences Group Los Alamos National Laboratory Candidates are sought for a postdoctoral position in the Space and Remote Sensing Sciences Group at Los Alamos National Laboratory in New Mexico, USA.
The job will involve developing and applying state of the art machine learning techniques to practical problems in multispectral image feature identification, and in multichannel time series analysis. Prospective candidates should have a demonstrated ability to perform independent and creative research, and should have good mathematical skills. Familiarity with modern statistical machine learning techniques such as support vector machines, boosting, Gaussian processes or Bayesian methods is essential. Experience with other machine learning paradigms including neural networks and genetic algorithms is also desirable. The candidate should be able to program competently in a language such as C, C++ or Java. Experience with image or signal processing is a plus, and some knowledge of remote sensing or space physics could also be useful. The Space and Remote Sensing Sciences Group is part of the Nonproliferation and International Security Division at LANL. Its mission is to develop and apply remote sensing technologies to a variety of problems of national and international interest, including nonproliferation, detection of nuclear explosions, safeguarding nuclear materials, climate studies, environmental monitoring, volcanology, space sciences, and astrophysics. Los Alamos is a small and very friendly town situated 7200' up in the scenic Jemez mountains in northern New Mexico. The climate is very pleasant and opportunities for outdoor recreation are numerous (skiing, hiking, biking, climbing, etc). The Los Alamos public school system is excellent. LANL provides a very constructive working environment with abundant resources and support, and the opportunity to work with intelligent and creative people on a variety of interesting projects. Post-doc starting salaries are usually in the range $50-60K depending on experience, and generous assistance is provided with relocation expenses. 
The initial contract offered would be for two years, with good possibilities for contract extensions. The ability to get a US Department of Energy 'Q' clearance (which normally requires US citizenship) is helpful but not essential. Applicants must have received their PhD within the last three years. Interested candidates should contact Dr Simon Perkins by e-mail: s.perkins at lanl.gov; or snail mail: Los Alamos National Laboratory, Mail Stop D-436, Los Alamos, NM 87545, USA. Please include a full resume and a covering letter explaining why you think you would make a good candidate. E-mail Postscript/PDF/Word attachments are fine. The deadline for applications is Friday, May 4th, 2001. From Igor.Tetko at iphysiol.unil.ch Wed Apr 11 03:53:54 2001 From: Igor.Tetko at iphysiol.unil.ch (Igor Tetko) Date: Wed, 11 Apr 2001 09:53:54 +0200 Subject: Article: Associative Neural Network Message-ID: Dear Connectionists, the following paper (15 pages in PDF format) is available from the CogPrints archive http://cogprints.soton.ac.uk/documents/disk0/00/00/14/41/index.html ID code: cog00001441, and also from http://www.lnh.unil.ch/~itetko/articles/asnn.pdf Best regards, Igor Tetko Igor V. Tetko Associative Neural Network ABSTRACT: An associative neural network (ASNN) is a combination of an ensemble of feed-forward neural networks and the k-nearest-neighbor technique. The network uses the correlation between ensemble responses as a measure of distance between the analyzed cases for the nearest-neighbor technique, and improves prediction by correcting the bias of the neural network ensemble. An associative neural network has a memory that can coincide with the training set. If new data become available, the network further improves its predictive ability and can often provide a reasonable approximation of the unknown function without the need to retrain the neural network ensemble.
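In outline, the ASNN described above corrects an ensemble's mean prediction using the residuals of the k training cases whose ensemble-response profiles correlate best with the query's. A minimal sketch under stated assumptions (random-feature least-squares regressors stand in for the paper's trained feed-forward networks; all names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ensemble(X, y, n_members=10, n_hidden=20):
    """Stand-in ensemble: each member is a least-squares fit on random
    tanh features (the paper's members are trained feed-forward nets)."""
    members = []
    for _ in range(n_members):
        W = rng.normal(size=(X.shape[1], n_hidden))
        beta = np.linalg.lstsq(np.tanh(X @ W), y, rcond=None)[0]
        members.append((W, beta))
    return members

def responses(members, X):
    """Per-member predictions: one column per ensemble member."""
    return np.column_stack([np.tanh(X @ W) @ beta for W, beta in members])

def asnn_predict(members, X_train, y_train, X_query, k=5):
    """ASNN step: correct the ensemble mean by the mean residual of the
    k training cases whose response profiles correlate best with the query's."""
    R_tr, R_q = responses(members, X_train), responses(members, X_query)
    resid = y_train - R_tr.mean(axis=1)
    preds = []
    for r in R_q:
        corr = np.array([np.corrcoef(r, t)[0, 1] for t in R_tr])
        preds.append(r.mean() + resid[np.argsort(-corr)[:k]].mean())
    return np.array(preds)

X_train = np.linspace(-3, 3, 50).reshape(-1, 1)
y_train = np.sin(X_train).ravel()
X_query = np.array([[0.5], [1.5], [-2.0]])
members = fit_ensemble(X_train, y_train)
preds = asnn_predict(members, X_train, y_train, X_query)
print(preds)
```

The "memory that can coincide with the training set" is the (X_train, y_train, residuals) store: appending new cases to it improves the correction without refitting the ensemble.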
From schultz at cns.nyu.edu Wed Apr 11 10:14:09 2001 From: schultz at cns.nyu.edu (Simon Schultz) Date: Wed, 11 Apr 2001 10:14:09 -0400 Subject: preprint: neural spike trains Message-ID: <3AD466B1.AC399C2D@cns.nyu.edu> Dear Connectionists, The following preprint is available for downloading: S. R. Schultz and S. Panzeri (2001), Temporal correlations and neural spike train entropy. Physical Review Letters, in press. Abstract: Sampling considerations limit the experimental conditions under which information theoretic analyses of neurophysiological data yield reliable results. We develop a procedure for computing the full temporal entropy and information of ensembles of neural spike trains, which performs reliably for limited samples of data. This approach also yields insight into the role of correlations between spikes in temporal coding mechanisms. When applied to recordings from complex cells of the monkey primary visual cortex, the method yields information estimates with lower RMS error than a `brute force' approach. A preprint (4 pages in PDF format) can now be downloaded: http://www.cns.nyu.edu/~schultz/tempent.pdf It can also be obtained from the Los Alamos archive: http://arXiv.org/abs/physics/0001006 Cheers, Simon Schultz -- Dr. Simon R. Schultz Phone: +1-212 998 3775 Howard Hughes Medical Institute & Fax: +1-212 995 4011 Center for Neural Science, Email: schultz at cns.nyu.edu New York University, 4 Washington Place, New York NY 10003, U.S.A.
http://www.cns.nyu.edu/~schultz/ From cpoon at mit.edu Wed Apr 11 18:20:32 2001 From: cpoon at mit.edu (Chi-Sang Poon) Date: Wed, 11 Apr 2001 18:20:32 -0400 Subject: Hebbian feedback covariance learning control In-Reply-To: <20010323080111.23EBF2B229@endor.bbb.caltech.edu> Message-ID: The following paper is available for viewing/downloading (as PDF file) from the IEEE electronic archive: http://ieeexplore.ieee.org/lpdocs/epic03/RecentIssues.htm?punumber=3477 OR http://ieeexplore.ieee.org/iel5/3477/19768/00915341.pdf -------------------------------------------------------------------- A Hebbian Feedback Covariance Learning Paradigm for Self-Tuning Optimal Control D.L. Young and C.-S. Poon IEEE Trans. Systems, Man and Cybernetics, Part B, Volume: 31 Issue: 2, pp. 173-186, April 2001 We propose a novel adaptive optimal control paradigm inspired by Hebbian covariance synaptic adaptation, a preeminent model of learning and memory and other malleable functions in the brain. The adaptation is driven by the spontaneous fluctuations in the system input and output, the covariance of which provides useful information about the changes in the system behavior. The control structure represents a novel form of associative reinforcement learning in which the reinforcement signal is implicitly given by the covariance of the input-output signals. Theoretical foundations for the paradigm are derived using Lyapunov theory and are verified by means of computer simulations. The learning algorithm is applicable to a general class of non-linear adaptive control problems. This on-line direct adaptive control method benefits from a computationally straightforward design, proof of convergence, no need for complete system identification, robustness to noise and uncertainties, and the ability to optimize a general performance criterion in terms of system states and control signals. 
These attractive properties of Hebbian feedback covariance learning control lend themselves to future investigations into the computational functions of synaptic plasticity in biological neurons. From C.Campbell at bristol.ac.uk Thu Apr 12 10:50:42 2001 From: C.Campbell at bristol.ac.uk (Colin Campbell, Engineering Mathematics) Date: Thu, 12 Apr 2001 15:50:42 +0100 (GMT Daylight Time) Subject: Fixed term lectureship position In-Reply-To: <200103211315.OAA06453@mail.gmd.de> Message-ID: Lectureship in the Department of Engineering Mathematics Applications are invited for a 5-year Lectureship in the Department of Engineering Mathematics, University of Bristol, United Kingdom. Candidates should have an excellent track record in research areas related to, or complementing, those of the Artificial Intelligence and Computational Intelligence groups in the department. The Artificial Intelligence Research Group The Artificial Intelligence group has an international reputation for the development and use of innovative methods for handling uncertainty in real-world AI applications. Current research includes theories and applications of logic programming, reasoning with uncertainty, modelling with words, fuzzy sets and fuzzy logic. For more information on the Artificial Intelligence group please visit: http://www.enm.bris.ac.uk/ai The Computational Intelligence group The research of the Computational Intelligence group centres on sub-symbolic approaches to machine intelligence. On the theoretical side, interests include statistical learning theory, support vector machines, neural networks and the design of learning algorithms. These methods are applied to medical decision support, bioinformatics and machine vision datasets. For more information on the Computational Intelligence group please visit: http://lara.enm.bris.ac.uk/~cig The Artificial Intelligence and Computational Intelligence groups are part of the University's Advanced Computing Research Centre (ACRC).
The department runs its own degree programmes in Engineering Mathematics and Mathematics for Intelligent Systems in addition to providing mathematical, AI and theoretical computer science courses for undergraduate and masters degree programmes across the Faculty. The department achieved a score of 5 (research quality of international excellence) in the last Research Assessment Exercise and 23/24 for the HEFCE TQA assessment of its courses. For more information on this position and details of how to apply please visit the following web page: http://www.enm.bris.ac.uk/admin/vacancies/Lect_01.htm For more information on research and life at Bristol please visit the following web pages: Artificial Intelligence Group: http://www.enm.bris.ac.uk/ai Computational Intelligence group: http://lara.enm.bris.ac.uk/~cig Department of Engineering Mathematics: http://www.enm.bris.ac.uk Faculty of Engineering: http://www.fen.bris.ac.uk University of Bristol: http://www.bris.ac.uk -- -------------------------------------------------------------------------- Artificial Intelligence Group Tel: (+44) 117 9289743 Dept of Engineering Maths Fax: (+44) 117 9251154 University of Bristol Email: Jonathan.Rossiter at bris.ac.uk Bristol BS8 1TR UK http://eis.bris.ac.uk/~enjmr -------------------------------------------------------------------------- From dimi at ci.tuwien.ac.at Thu Apr 12 05:58:57 2001 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Thu, 12 Apr 2001 11:58:57 +0200 (CEST) Subject: CI BibTeX Collection -- Update In-Reply-To: Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 4/4 IEEE Transactions on Fuzzy Systems, Volumes 8/6 Machine Learning, Volumes 42/1-43/2 Neural Computation, Volumes 12/12-13/1 Neural Networks, Volumes 13/7-14/3 Neural Processing Letters, Volumes 12/3-13/1 Most files have been converted 
automatically from various source formats; please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut fuer Statistik * Tel: (+43 1) 58801 10773 * * Technische Universitaet Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ _______________________________________________ nn-at mailing list - nn-at at ci.tuwien.ac.at http://fangorn.ci.tuwien.ac.at/cgi-bin/mailman/listinfo/nn-at From bozinovs at rea.etf.ukim.edu.mk Fri Apr 13 15:39:16 2001 From: bozinovs at rea.etf.ukim.edu.mk (Stevo Bozinovski) Date: Fri, 13 Apr 2001 21:39:16 +0200 (DFT) Subject: neural cell genetics CFP Message-ID: -------------------------------------------------------------------- We apologize if you receive multiple copies of this message Please, feel free to distribute it to interested persons -------------------------------------------------------------------- --- Call for Papers --- BIONICS OF PRODUCTION LINES: GENETICS, METABOLICS, AND FLEXIBLE MANUFACTURING Invited Session at Fifth Multiconference on Systemics, Cybernetics and Informatics July 22-25 Orlando, Florida Sheraton World Resort Background and Motivation Analogies between biological and technical systems have been explored since Wiener's founding of cybernetics. A successful example today is research on neural networks, natural and artificial. We believe that the analogy between biological and non-biological production lines is another interesting area to explore in the realm of cybernetics and bionics.
This session aims to be a forum for exchanging ideas and presenting facts about both cellular production systems and modern concepts of adaptive manufacturing. Emphasis is placed on exploring biological production lines in terms of the concepts relevant to flexible manufacturing systems, but also on recognizing solutions in biology relevant to human-made manufacturing systems. The list of relevant topics includes, but is not limited to:

Genetics of manufacturing
Control of protein biosynthesis
Metabolic networks
Biomolecular robots and biomolecular machines
Autonomous manufacturing systems
Unicellular systems
Neural cell genetics
Genes, sensors, behavior, adaptation
Surviving strategies
Flexible manufacturing systems
Material processing in biological and human-made FMS
Information processing in biological and human-made FMS
Adaptive manufacturing systems
Mobile factories on other planets
Bionic manufacturing systems

Submissions
Potential participants should submit an extended abstract or paper draft of their work in the area. Submissions will be reviewed by independent referees, and should not exceed 2000 words for extended abstracts and 5000 words for paper drafts. Submissions should be sent via e-mail as an ASCII or PDF file to either of the session organizers. Accepted papers will be published in the Conference Proceedings. In addition, an effort will be made to publish the papers in a special volume of a major publisher.

Important dates
Submission of manuscripts: April 30, 2001
Notification of acceptance: May 13, 2001
Camera ready copy: May 23, 2001

Registration
Attendees of the invited session must register for the main conference. There is no additional fee for the session. Please see the Conference web page http://www.iiis.org/sci/ for details.
Organizers Stevo Bozinovski Computer Science and Engineering Institute Center for Beings Research Electrical Engineering Department Sts Cyril and Methodius University Skopje, Macedonia bozinovs at rea.etf.ukim.edu.mk Ralf Hofestaedt Computer Science Department Otto-von-Guericke University Magdeburg, Germany hofestae at iti.cs.uni-magdeburg.de From greiner at cs.ualberta.ca Fri Apr 13 21:15:52 2001 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Fri, 13 Apr 2001 19:15:52 -0600 Subject: Call for Applications: PostDoctoral Research Scientists Message-ID: <20010414011552Z433495-3037+191@scapa.cs.ualberta.ca> POSTDOCTORAL RESEARCH SCIENTISTS Call for Applications Bioinformatics - Query Answering - Game Playing - Adaptive Agents Machine Learning - Probabilistic Modelling - Decision Support Just got your PhD and want to focus on pure curiosity-driven research before jumping into the tenure-track pressure cooker? We are looking for one or more postdoctoral research scientists (PDRS) to help us work on theoretical and applied research in various specific research projects -- related to the topics listed above and with the individual researchers below, in collaboration with various research-friendly companies, including BioTools, Chenomx, Electronic Arts, CELcorp, net-linx, Syncrude, ... These PostDoc positions improve on most positions as... * The salary will be competitive with tenure-track positions. * You will also be allowed/expected to spend 50% of your time on your own curiosity-driven agenda -- we hope in collaboration with various members of our faculty; see http://www.cs.ualberta.ca/~ai. You will be part of a team that has recently emerged as one of the strongest AI groups anywhere, with a number of world-class professors that include editors-in-chief of major journals, AAAI Fellows, Steacie Fellows, McCalla Fellows ... And we are continuing to grow and improve. Moreover, our department is known for its collegiality.
You will also help us show off our group in 2002, when we will host AAAI'02 http://aaai.org KDD'02 http://www.acm.org/sigkdd ISMB'02 http://www.iscb.org Applicants should * have a solid background in one or more of the areas described above * have good scientific skills * be good at writing software to implement and evaluate algorithms Successful applicants will have the opportunity to do sessional teaching in the department of Computing Science. Applicants should EMAIL a CV, the email addresses of 3 references, and a short description of their research interests and goals as a postdoc (ascii format, < 500 words) to Russ Greiner (greiner at cs.ualberta.ca) You are encouraged to *also* post additional information on http://www.cs.ualberta.ca/jobs/postdoc.html We are very flexible with time commitments; applicants should indicate how long they would like to remain as a PDRS -- typical stay is between 1 and 3 years. We are an equal opportunity employer, eagerly seeking applicants from Canada or any other country. For more information, see http://www.cs.ualberta.ca/~greiner/PostDoc.html | R Greiner Phone: (780) 492-5461 | | Dep't of Computing Science FAX: (780) 492-1071 | | University of Alberta Email: greiner at cs.ualberta.ca | | Edmonton, AB T6G 2E8 Canada http://www.cs.ualberta.ca/~greiner/ | From terry at salk.edu Tue Apr 17 16:45:29 2001 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 17 Apr 2001 13:45:29 -0700 (PDT) Subject: NEURAL COMPUTATION 13:5 In-Reply-To: <200103072248.f27MmVH58010@kepler.salk.edu> Message-ID: <200104172045.f3HKjTA44914@purkinje.salk.edu> Neural Computation - Contents - Volume 13, Number 5 - May 1, 2001 ARTICLE Patterns of Synchrony in Neural Networks with Spike Adaptation C. van Vreeswijk and D. Hansel NOTE Bayesian Analysis of Mixtures of Factor Analyzers Akio Utsugi and Toru Kumagai LETTERS Synchronization in Relaxation Oscillator Networks with Conduction Delays Jeffrey J. Fox, Ciriyam Jayaprakash, DeLiang Wang and Shannon R. 
Campbell Predictions of the Spontaneous Symmetry-Breaking Theory for Visual Code Completeness and Spatial Scaling in Single-Cell Learning Rules Chris J. S. Webber Localist Attractor Networks Richard S. Zemel and Michael C. Mozer Stochastic Organization of Output Codes in Multiclass Learning Problems Wolfgang Utschick and Werner Weichselberger Predictive Approaches for Choosing Hyperparameters in Gaussian Processes S. Sundararajan and S.S. Keerthi Architecture-Independent Approximation of Functions Vicente Ruiz de Angulo and Carme Torras Analyzing Holistic Parsers: Implications for Robust Parsing and Systematicity Edward Kei Shiu Ho and Lai Wan Chan The Computational Exploration of Visual Word Recognition in a Split Model Richard Shillcock and Padraic Monaghan

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES

                  USA     Canada*   Other Countries
Student/Retired   $60     $64.20    $108
Individual        $88     $94.16    $136
Institution       $460    $492.20   $508

* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu

-----

From mschmitt at lmi.ruhr-uni-bochum.de Tue Apr 17 07:31:43 2001 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Tue, 17 Apr 2001 13:31:43 +0200 Subject: Preprint on Multiplicative Neural Networks Message-ID: <3ADC299F.DD35F42B@lmi.ruhr-uni-bochum.de> Dear Colleagues, a preprint of the paper "On the complexity of computing and learning with multiplicative neural networks" by Michael Schmitt, to appear in Neural Computation, is available on-line from http://www.ruhr-uni-bochum.de/lmi/mschmitt/multiplicative.ps.gz (63 pages gzipped PostScript). Regards, Michael Schmitt ------------------------------------------------------------ TITLE: On the Complexity of Computing and Learning with Multiplicative Neural Networks AUTHOR: Michael Schmitt ABSTRACT In a great variety of neuron models neural inputs are combined using the summing operation.
We introduce the concept of multiplicative neural networks that contain units which multiply their inputs instead of summing them and, thus, allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well-studied network types as higher-order networks and product unit networks. We investigate the complexity of computing and learning for multiplicative neural networks. In particular, we derive upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo dimension for various types of networks with multiplicative units. As the most general case, we consider feedforward networks consisting of product and sigmoidal units, showing that their pseudo dimension is bounded from above by a polynomial with the same order of magnitude as the currently best known bound for purely sigmoidal networks. Moreover, we show that this bound holds even in the case when the unit type, product or sigmoidal, may be learned. Crucial for these results are calculations of solution set components bounds for new network classes. As to lower bounds, we construct product unit networks of fixed depth with superlinear VC dimension. For sigmoidal networks of higher order we establish polynomial bounds that, in contrast to previous results, do not involve any restriction of the network order. We further consider various classes of higher-order units, also known as sigma-pi units, that are characterized by connectivity constraints. In terms of these we derive some asymptotically tight bounds. Multiplication plays an important role both in neural modeling of biological behavior and in computing and learning with artificial neural networks. We briefly survey research in biology and in applications where multiplication is considered an essential computational element. The results we present here provide new tools for assessing the impact of multiplication on the computational power and the learning capabilities of neural networks.
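The two multiplicative unit types surveyed above are easy to state concretely: a product unit computes a weighted product of its inputs, and a sigma-pi (higher-order) unit computes a weighted sum of input monomials. A small illustration with toy numbers of our own choosing (positive inputs assumed for the product unit, so the log trick applies):

```python
import numpy as np

def product_unit(x, w):
    """A product unit computes prod_i x_i**w_i = exp(sum_i w_i * log x_i),
    so inputs interact multiplicatively (positive inputs assumed here)."""
    return np.exp(np.sum(w * np.log(x)))

def sigma_pi_unit(x, terms):
    """A sigma-pi (higher-order) unit: a weighted sum of input monomials;
    each term is (coefficient, tuple of input indices to multiply)."""
    return sum(c * np.prod(x[list(idx)]) for c, idx in terms)

x = np.array([2.0, 3.0, 4.0])
print(product_unit(x, np.array([1.0, 2.0, 0.5])))      # 2 * 9 * 2 = 36
print(sigma_pi_unit(x, [(1.0, (0, 1)), (0.5, (2,))]))  # 2*3 + 0.5*4 = 8.0
```

In the log domain a product unit is a linear unit with exponentiated output, which is exactly what lets inputs interact nonlinearly while keeping the parameters learnable by gradient methods.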
-- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: +49 234 32-23209, Fax: +49 234 32-14465 http://www.ruhr-uni-bochum.de/lmi/mschmitt/ From ceesvl at brain.riken.go.jp Wed Apr 18 00:29:28 2001 From: ceesvl at brain.riken.go.jp (Cees van Leeuwen) Date: Wed, 18 Apr 2001 13:29:28 +0900 Subject: pre- and postdoc positions available Message-ID: <001a01c0c7c0$27f32bb0$b4a6a086@CEESVDESKTOP> Post- and Pre-doctoral Positions, and Technical Assistant Positions in Experimental and Computational Psychology Perceptual Dynamics Laboratory RIKEN BSI Several pre- and postdoctoral research positions and technical assistant positions are available immediately in the newly established Perceptual Dynamics Laboratory (head: Cees van Leeuwen) at the RIKEN Brain Science Institute, Japan. The RIKEN BSI is the primary government-funded basic research institute in Japan. The working language is English. The researchers will be working on interdisciplinary projects relating to perceptual integration and the perception of visual objects and scenes. The focus of the laboratory is the application of complex (i.e. nonlinear, nonstationary) dynamical systems to visual perception. We are looking for candidates who are interested in computational and experimental approaches. Candidates should have a strong background in one or more of the following: computational or mathematical modeling of neural information processes, cognitive neuroscience, cognitive psychology, psychophysiology, psychophysics, or related disciplines. Technical assistant positions include: a software engineer to further develop an interactive research environment for the numerical simulation of complex dynamical systems. The preferred computer languages are Matlab and C++. We will also consider those who are interested in developing applications for running and analyzing experiments, including multi-channel EEG and eye-movement recording.
Candidates should have an M.Sc. or equivalent. Those interested in qualifying for a higher degree will be considered. The laboratory offers excellent research facilities for conducting computer simulations and experiments, travel support for conferences, and an attractive international academic environment, and is located within the Tokyo metropolitan area. Housing facilities are available for an initial period. Competitive salaries are offered. The initial appointment will be for one year, and will be renewable on an annual basis. Recruitment continues until the positions are filled. Applicants should send curriculum vitae, statement of research interests, two letters of reference, and representative publications to: Prof. Cees van Leeuwen Perceptual Dynamics Laboratory 2-1 Hirosawa, Wakoshi Saitama, 351-0198 Japan ceesvl at brain.riken.go.jp www.brain.riken.go.jp From jfgf at cs.berkeley.edu Wed Apr 18 18:34:42 2001 From: jfgf at cs.berkeley.edu (Nando de Freitas) Date: Wed, 18 Apr 2001 15:34:42 -0700 Subject: Particle filtering website Message-ID: <3ADE1682.14AF02F3@cs.berkeley.edu> Dear Connectionists, For those of you interested in sequential data analysis and tracking using particle filters (aka condensation, survival of the fittest, sequential Monte Carlo, ...), there is a website at http://www-sigproc.eng.cam.ac.uk/smc/index.html and mirrored at http://www.cs.berkeley.edu/~jfgf/smc/ that has a list of people, papers, links and software in this field. We encourage you to submit your papers in this area so as to strengthen the link between the related work in the fields of control, signal processing, physics, AI, vision, econometrics and statistics.
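For readers new to the area, the bootstrap filter (the simplest member of the particle-filter family listed above) alternates three steps: propagate particles through the transition model, weight them by the observation likelihood, and resample. A minimal sketch for a toy one-dimensional random-walk model (the model, parameters, and names are our own illustration, not taken from any of the referenced papers):

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_filter(obs, n_particles=1000, q=1.0, r=1.0):
    """Minimal bootstrap particle filter (a.k.a. condensation) for the toy
    model x_t = x_{t-1} + N(0, q), y_t = x_t + N(0, r)."""
    particles = rng.normal(0.0, 1.0, n_particles)
    means = []
    for y in obs:
        # 1. propagate each particle through the transition model
        particles = particles + rng.normal(0.0, np.sqrt(q), n_particles)
        # 2. weight by the observation likelihood (log-domain for stability)
        logw = -0.5 * (y - particles) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * particles))
        # 3. resample in proportion to the weights ("survival of the fittest")
        particles = particles[rng.choice(n_particles, n_particles, p=w)]
    return np.array(means)

true_x = np.cumsum(rng.normal(0.0, 1.0, 50))   # hidden random walk
obs = true_x + rng.normal(0.0, 1.0, 50)        # noisy observations
est = bootstrap_filter(obs)
rmse = np.sqrt(np.mean((est - true_x) ** 2))
print(rmse)
```

For this linear-Gaussian toy model a Kalman filter would be exact; the point of the particle machinery is that the same three-step loop works unchanged for nonlinear, non-Gaussian models.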
Best, Nando -- Computer Science Division | Phone : (510) 642-4979 387 Soda Hall | Fax : (510) 642-5775 University of California, Berkeley | E-mail: jfgf at cs.berkeley.edu Berkeley, CA 94720-1776 USA | URL : http://www.cs.berkeley.edu/~jfgf From zemel at cs.toronto.edu Thu Apr 19 10:39:08 2001 From: zemel at cs.toronto.edu (Richard Zemel) Date: Thu, 19 Apr 2001 10:39:08 -0400 Subject: NIPS*2001 Announcement Message-ID: <01Apr19.103910edt.453179-20163@jane.cs.toronto.edu> ***** NIPS is moving to Vancouver in 2001 ***** CALL FOR PAPERS -- NIPS*2001 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Dec. 3 -- Saturday, Dec. 8, 2001 Vancouver, British Columbia, Canada Whistler Ski Resort ========================================== This is the fifteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Dec. 3), and following it there will be two days of focused workshops on topical issues at Whistler Ski Resort (Dec. 7-8). Invited speakers this year will be: Barbara Finlay (Departments of Psychology, and Neurobiology and Behavior, Cornell University) Alison Gopnik (Department of Psychology, University of California at Berkeley) Jon M. Kleinberg (Department of Computer Science, Cornell University) Shihab Shamma (Department of Electrical Engineering, University of Maryland) Judea Pearl (Department of Computer Science, UCLA) Tom Knight (Artificial Intelligence Laboratory, MIT) Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. 
Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, kernel methods, committee models, graphical models, support vector machines, Gaussian processes, decision trees, factor analysis, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, web and network applications, intrusion and fraud detection, bio-informatics, medical diagnosis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music and video applications, animation, virtual environments. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, human learning and memory, conditioning, categorization, attention, language, reasoning, spatial cognition, emotional cognition, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, neuromorphic engineering, microelectromechanical systems, optical systems, vision chips, head-eye systems, neural prostheses, roving robots, computational sensors and actuators, molecular and quantum computing, novel neurodevices, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, population codes, temporal codes, spike train analysis, and experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, computational models of classical and operant conditioning, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, game-playing, Markov decision processes, multi-agent cooperative algorithms. 
Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory. Visual Processing: image processing, image coding, object recognition, face recognition, visual feature detection, visual psychophysics, stereopsis, optic flow algorithms, motion detection and tracking, spatial representations, spatial attention, scene analysis, visual search, visuo-spatial working memory. ---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within an 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. 
Authors are required to use the NIPS LaTeX style files obtainable from the web page listed below. The style files are unchanged from NIPS*2000. Submission Instructions: NIPS accepts only electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We will accept postscript and PDF documents, but we prefer postscript. The electronic submission page will be available on June 6, 2001. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT JUNE 20, 2001 PACIFIC DAYLIGHT TIME (08:00 GMT JUNE 21, 2001). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS For general inquiries send e-mail to nipsinfo at salk.edu. NIPS*2001 Organizing Committee: General Chair, Tom Dietterich, Oregon State University; Program Chair, Sue Becker, McMaster University; Publications Chair, Zoubin Ghahramani, University College London; Tutorial Chair, Yoshua Bengio, University of Montreal; Workshops Co-Chairs, Virginia de Sa, Sloan Center for Theoretical Neurobiology, Barak Pearlmutter, University of New Mexico; Publicity Chair, Richard Zemel, University of Toronto; Volunteer Coordinator, Sidney Fels, University of British Columbia; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Alex Gray, Carnegie Mellon University, Xin Wang, Oregon State University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*2001 Program Committee: Sue Becker, McMaster University (chair); Gert Cauwenberghs, Johns Hopkins University; Bill Freeman, Mitsubishi Electric Research Lab; Thomas Hofmann, Brown University; Dan Lee, Bell Laboratories, Lucent Technologies; Sridhar Mahadevan, Michigan State University; Marina Meila-Predoviciu, University of Washington; Klaus Mueller, GMD First, Berlin; Klaus Obermayer, TU Berlin; Sam Roweis, Gatsby Computational Neuroscience Unit, UCL; John Shawe-Taylor, Royal Holloway, University of London; Josh Tenenbaum, Stanford University; Volker Tresp, Siemens, Munich; Richard Zemel, University of Toronto. PAPERS MUST BE SUBMITTED BY JUNE 20, 2001 From ingber at ingber.com Thu Apr 19 18:20:51 2001 From: ingber at ingber.com (Lester Ingber) Date: Thu, 19 Apr 2001 17:20:51 -0500 Subject: Computational Finance Position Message-ID: <20010419172051.A23252@ingber.com> If you have very strong credentials for the position described below, please send your resume to: Prof. Lester Ingber Director Research & Development DRW Investments LLC 311 S Wacker Dr Ste 900 Chicago, IL 60606 Email (preferred) ingber at ingber.com COMPUTATIONAL FINANCE: Experienced programmer in Java, C and/or C++. Previous financial experience preferred. Excellent background in Physics, Math, or similar disciplines, at least at PhD level. The R&D group works directly with other traders and develops its own automated trading systems. See www.ingber.com for papers on some current projects. -- Prof. 
Lester Ingber http://www.ingber.com/ PO Box 06440 Sears Tower Chicago IL 60606-0440 http://www.alumni.caltech.edu/~ingber/ From bis at prip.tuwien.ac.at Thu Apr 19 13:05:12 2001 From: bis at prip.tuwien.ac.at (Horst Bischof) Date: Thu, 19 Apr 2001 19:05:12 +0200 Subject: ICANN-WS on Kernel based Methods for Computer Vision Message-ID: <3ADF1AC8.48A4949@prip.tuwien.ac.at> ICANN 2001 Workshop on Kernel & Subspace Methods for Computer Vision http://www.prip.tuwien.ac.at/~bis/kernelws/ Call for Papers Workshop Co-organizers: Ales Leonardis Horst Bischof Scope of the workshop: This half-day workshop will be held in conjunction with ICANN 2001 on August 25, 2001 in Vienna. In recent years, we have witnessed vivid developments of sophisticated kernel and subspace methods in the neural network and pattern recognition communities on the one hand, and extensive use of these methods in the area of computer vision on the other. These methods seem to be especially relevant for object and scene recognition. The purpose of the workshop is to bring together scientists from the neural network (pattern recognition) and computer vision communities to analyze new developments, identify open problems, and discuss possible solutions in the area of kernel & subspace methods such as: Support Vector Machines Independent Component Analysis Principal Component Analysis Canonical Correlation Analysis, etc. for computer vision problems such as: Object Recognition Navigation and Robotics 3D Vision, etc. Contributions in the above mentioned areas are welcome. The program will consist of invited and selected contributed papers. The papers selected for the workshop will appear in a Workshop Proceedings which will be distributed among the workshop participants. It is planned that selected papers from the workshop will be published in a journal. 
Important dates: Submission Deadline: 31.5.2001 Notification of Acceptance: 29.6.2001 Final Papers Due: 3.8.2001 Submission instructions: A complete paper, not longer than 12 pages including figures and references, should be submitted in the LNCS page format. The layout of final papers must adhere strictly to the guidelines set out in the Instructions for the Preparation of Camera-Ready Contributions to LNCS Proceedings. Authors are asked to follow these instructions exactly. In order to reduce the handling effort, we accept only electronic submissions by ftp (either ps or pdf files). ftp ftp.prip.tuwien.ac.at [anonymous ftp, i.e.: Name: ftp Password: ] cd kernelws binary put .ext quit Workshop Registration: Registration for the Workshop can be done at the ICANN 2001 Homepage http://www.ai.univie.ac.at/icann/ From kap-listman at wkap.nl Thu Apr 19 20:06:17 2001 From: kap-listman at wkap.nl (kap-listman@wkap.nl) Date: Fri, 20 Apr 2001 02:06:17 +0200 (METDST) Subject: New Issue: Neural Processing Letters. Vol. 13, Issue 2 Message-ID: <200104200006.CAA10491@wkap.nl> Kluwer ALERT, the free notification service from Kluwer Academic/PLENUM Publishers and Kluwer Law International ------------------------------------------------------------ Neural Processing Letters ISSN 1370-4621 http://www.wkap.nl/issuetoc.htm/1370-4621+13+2+2001 Vol. 13, Issue 2, April 2001. TITLE: Using a New Model of Recurrent Neural Network for Control AUTHOR(S): L. Boquete, L. M. Bergasa, R. Barea, R. Garcia, M. Mazo KEYWORD(S): intelligent control, Lyapunov stability, radial basis function, recurrent neural network. PAGE(S): 101-113 TITLE: Multi-step Learning Rule for Recurrent Neural Models: An Application to Time Series Forecasting AUTHOR(S): Ines M. Galvan, Pedro Isasi KEYWORD(S): multi-step prediction, neural networks, time series, time series modelling. 
PAGE(S): 115-133 TITLE: Finite-Sample Convergence Properties of the LVQ1 Algorithm and the Batch LVQ1 Algorithm AUTHOR(S): Sergio Bermejo, Joan Cabestany KEYWORD(S): LVQ1 algorithm, asymptotic convergence, online gradient descent, finite-sample properties, BLVQ1 algorithm, Newton optimisation. PAGE(S): 135-157 TITLE: Learning with Nearest Neighbour Classifiers AUTHOR(S): Sergio Bermejo, Joan Cabestany KEYWORD(S): nearest neighbour classifiers, online gradient descent, Learning Vector Quantization, hand-written character recognition. PAGE(S): 159-181 TITLE: Generalizations of the Hamming Associative Memory AUTHOR(S): Paul Watta, Mohamad H. Hassoun KEYWORD(S): artificial neural network, associative memory, capacity, error correction, Hamming net. PAGE(S): 183-194 -------------------------------------------------------------- Thank you for your interest in Kluwer's books and journals. NORTH, CENTRAL AND SOUTH AMERICA Kluwer Academic Publishers Order Department, PO Box 358 Accord Station, Hingham, MA 02018-0358 USA Telephone (781) 871-6600 Fax (781) 681-9045 E-Mail: kluwer at wkap.com EUROPE, ASIA AND AFRICA Kluwer Academic Publishers Distribution Center PO Box 322 3300 AH Dordrecht The Netherlands Telephone 31-78-6392392 Fax 31-78-6546474 E-Mail: orderdept at wkap.nl From jf218 at hermes.cam.ac.uk Fri Apr 20 05:56:43 2001 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Fri, 20 Apr 2001 10:56:43 +0100 (BST) Subject: six papers on modelling single neuron and SVM are available Message-ID: Dear All, Five papers on modelling single neuron and one on SVM (see below for abstracts) are available on my home-page http://www.cogs.susx.ac.uk/users/jianfeng the best Jianfeng -------------------------------------------------------------------------- Titles: [54] Feng J. (2001) Is the integrate-and-fire model good enough? --a review Neural Networks (in press) [53] Feng J., Brown D., Wei G., and Tirozzi B. 
(2001) Detectable And Undetectable Input Signals For The Integrate-and-fire Model? J. Phys. A. vol. 34, 1637-1648 [52] Feng J., and Zhang P. (2001) The Behaviour of Integrate-and-fire and Hodgkin-Huxley Models With Correlated Inputs Phys. Rev. E. (in press). [51] Feng J., Li G.B., Brown D., and Buxton H. (2001) Balance between four types of synaptic input for the integrate-and-fire model J. Theor. Biol. vol. 203, 61-79 [50] Feng J., and Li G. (2001) Neuronal models with current inputs J. Phys. A. vol. 34, 1649-1664 [55] Feng J., and Williams P. M. (2001) The generalization error of the symmetric and scaled Support Vector Machines IEEE T. Neural Networks (in press). -------------------------------------------------------------------------- Abstracts: [54] Feng J. (2001) Is the integrate-and-fire model good enough? --a review Neural Networks (in press) We review some recent results on the behaviour of the integrate-and-fire (IF) model, the FitzHugh-Nagumo (FHN) model, a simplified version of the FHN (IF-FHN) model [11], and the Hodgkin-Huxley (HH) model with correlated inputs. The effect of inhibitory inputs on the model behaviour is also taken into account. Here inputs exclusively take the form of a diffusion approximation, and correlated inputs mean correlated synaptic inputs (Sections 2 and 3). It is found that the IF and HH models respond to correlated inputs in totally opposite ways, but the IF-FHN model shows similar behaviour to the HH model. Increasing inhibitory input to a single neuronal model, such as the FHN model or the HH model, can sometimes increase its firing rate, which we term inhibition-boosted firing (IBF). Using the IF model and IF-FHN model, we theoretically explore how and when IBF can happen. 
The computational complexity of the IF-FHN model is very similar to that of the conventional IF model, but the former captures some interesting and essential features of biophysical models and could serve as a better model for spiking neuron computation. [53] Feng J., Brown D., Wei G., and Tirozzi B. (2001) Detectable And Undetectable Input Signals For The Integrate-and-fire Model? J. Phys. A. vol. 34, 1637-1648 We consider the integrate-and-fire model with non-stationary, stochastic inputs and address the following issue: what are the conditions on the input currents that make the input signal undetectable? A novel theoretical approach to tackle the problem for the model with non-stationary inputs is introduced. When the noise strength is independent of the deterministic component of the synaptic input, an expression for the critical input signal is given. If the input signal is weaker than the critical input signal, the neuron ultimately stops firing, i.e. is not able to detect the input signal; otherwise it fires with probability one. Similar results are established for Poisson-type inputs where the strength of the noise is proportional to the deterministic component of the synaptic input. [52] Feng J., and Zhang P. (2001) The Behaviour of Integrate-and-fire and Hodgkin-Huxley Models With Correlated Inputs Phys. Rev. E. (in press). We assess, both numerically and theoretically, how positively correlated Poisson inputs affect the output of the integrate-and-fire and Hodgkin-Huxley models. For the integrate-and-fire model the variability of efferent spike trains is an increasing function of input correlation, and of the ratio between inhibitory and excitatory inputs. Interestingly, for the Hodgkin-Huxley model the variability of efferent spike trains is a decreasing function of input correlation, and for fixed input correlation it is almost independent of the ratio between inhibitory and excitatory inputs. 
In terms of the signal-to-noise ratio of efferent spike trains, the IF model works better in an environment of asynchronous inputs, but the Hodgkin-Huxley model has an advantage for more synchronous (correlated) inputs. In conclusion, the integrate-and-fire and HH models respond to correlated inputs in totally opposite ways. [51] Feng J., Li G.B., Brown D., and Buxton H. (2001) Balance between four types of synaptic input for the integrate-and-fire model J. Theor. Biol. vol. 203, 61-79 We consider the integrate-and-fire model with AMPA, NMDA, GABA_A and GABA_B synaptic inputs, with model parameters based upon experimental data. An analytical approach is presented to determine when a post-synaptic balance between excitation and inhibition can be achieved. Secondly, we compare the model behaviour subject to these four types of input with its behaviour subject to conventional point process inputs. We conclude that point processes are not a good approximation, even away from exact presynaptic balance. Thirdly, numerical simulations are presented which demonstrate that we can treat NMDA and GABA_B as DC currents. Finally, we conclude that a balanced input is plausible neither presynaptically nor postsynaptically for the model and parameters we employed. [50] Feng J., and Li G. (2001) Neuronal models with current inputs J. Phys. A. vol. 34, 1649-1664 For the integrate-and-fire model and the HH model, we consider how current inputs, including alpha-wave and square-wave inputs, affect their outputs. Firstly, the usual approximation is employed to approximate the models with current inputs, which reveals the difference between instantaneous and non-instantaneous (current) inputs. When the rising time of alpha-wave inputs is long or the ratio between the inhibitory and excitatory inputs is close to one, the usual approximation fails to approximate the alpha-wave inputs in the integrate-and-fire model. 
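[As background for the integrate-and-fire model that runs through these abstracts, here is a minimal Python sketch of a leaky IF neuron driven by a diffusion-approximation (drift plus Gaussian noise) input. All parameter values are hypothetical and chosen only for illustration; they are not taken from the papers above.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical leaky integrate-and-fire parameters (illustrative only).
tau = 20.0        # membrane time constant (ms)
v_th = 1.0        # firing threshold
v_reset = 0.0     # reset potential after a spike
dt = 0.1          # integration time step (ms)
mu, sigma = 0.06, 0.25   # drift and noise strength of the diffusion input

def simulate(duration_ms):
    """Integrate dv = (-v/tau + mu) dt + sigma dW; emit a spike and
    reset whenever v crosses threshold. Returns spike times in ms."""
    steps = int(duration_ms / dt)
    v, spike_times = 0.0, []
    for i in range(steps):
        v += (-v / tau + mu) * dt + sigma * np.sqrt(dt) * rng.normal()
        if v >= v_th:
            spike_times.append(i * dt)
            v = v_reset
    return spike_times

spikes = simulate(2000.0)
```

With these values the mean drive (mu * tau = 1.2) sits above threshold, so the model fires regularly; the noise term is what produces the spike-train variability analysed in the abstracts.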
For the Hodgkin-Huxley model, the usual approximation in general gives an unsatisfying approximation. A novel approach based upon a superposition of 'coloured' and 'white' noise is then proposed to replace the usual approximation. Numerical results show that the novel approach substantially improves the approximation within widely physiologically reasonable regions of the rising time of alpha-wave inputs. [55] Feng J., and Williams P. M. (2001) The generalization error of the symmetric and scaled Support Vector Machines IEEE T. Neural Networks (in press). It is generally believed that the support vector machine (SVM) optimises the generalisation error and outperforms other learning machines. We show analytically, by concrete examples in the one-dimensional case, that the support vector machine does improve the mean and standard deviation of the generalisation error by a constant factor, compared to the worst learning machine. Our approach is in terms of extreme value theory, and both the mean and variance of the generalisation error are calculated exactly for all cases considered. We propose a new version of the SVM (scaled SVM) which can further reduce the mean of the generalisation error of the SVM. From paolo at dsi.unifi.it Fri Apr 20 07:07:21 2001 From: paolo at dsi.unifi.it (Paolo Frasconi) Date: Fri, 20 Apr 2001 13:07:21 +0200 Subject: Special issue on connectionist models for learning in structured domains Message-ID: The members of this list may be interested in the most recent issue of the IEEE Transactions on Knowledge and Data Engineering, which is a Special Issue on Connectionist Models for Learning in Structured Domains. IEEE Transactions on Knowledge and Data Engineering Vol. 13, No. 
2, March/April 2001 SPECIAL SECTION ON CONNECTIONIST MODELS FOR LEARNING IN STRUCTURED DOMAINS Abstracts can be found at http://www.dsi.unifi.it/neural/tkde-datas.html Full text is available to subscribers from the IEEE TKDE home page http://computer.org/tkde/index.htm Guest Editorial Introduction to the Special Section P. Frasconi, M. Gori, and A. Sperduti Simple Strategies to Encode Tree Automata in Sigmoid Recursive Neural Networks R.C. Carrasco and M.L. Forcada Integrating Linguistic Primitives in Learning Context-Dependent Representation S.W.K. Chan Symbolic vs. Connectionist Learning: An Experimental Comparison in a Structured Domain P. Foggia, R. Genna, and M. Vento Generalization Ability of Folding Networks B. Hammer Hierarchical Growing Cell Structures: TreeGCS V.J. Hodge and J. Austin Incremental Syntactic Parsing of Natural Language Corpora with Simple Synchrony Networks P.C.R. Lane and J.B. Henderson Learning Distributed Representations of Concepts Using Linear Relational Embedding A. Paccanaro and G.E. Hinton Clustering and Classification in Structured Data Domains Using Fuzzy Lattice Neurocomputing (FLN) V. Petridis and V.G. Kaburlasos Representation and Processing of Structures with Binary Sparse Distributed Codes D.A. Rachkovskij [Sorry, I can provide no hardcopies - for electronic copies, please contact the authors directly]. 
From Nigel.Goddard at ed.ac.uk Fri Apr 20 13:30:55 2001 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Fri, 20 Apr 2001 18:30:55 +0100 Subject: Multilevel Modeling and Simulation Workshop Message-ID: <3AE0724F.504F8147@ed.ac.uk> WORKSHOP ON MULTILEVEL NEURONAL MODELLING AND SIMULATION a Maxwell Neuroinformatics Workshop Call for Participation May 21-25, 2001, Edinburgh, Scotland http://www.anc.ed.ac.uk/workshops An emerging theme in modelling and understanding brain processes is that understanding processes at one level can be greatly enhanced by considering the process embedded in its context and by consideration of the complexities of processes operating at a much finer level of spatiotemporal resolution. The aim of this workshop is to bring together scientists with experimental, computational and theoretical approaches spanning multiple levels to provide an opportunity for interaction between methodological and phenomenological foci. One goal is to explore how abstractions at different levels are related, from molecular to system levels, with reference to both natural and artificial systems. A second goal is to discuss the nature of the computational tools needed to support effective modelling across abstractions and levels. The meeting is being organized in a small workshop style with emphasis on short presentations from invited speakers and from participants, round table discussions, and open debates on emerging topics. Time is scheduled for informal, self-organised, small-group activities. Computers will be available to support explorative work and demonstrations. In addition to the invited speakers, a limited number of places will be available to interested scientists, who will be chosen on the basis of the contribution they can make to the workshop. 
A number of places are reserved for junior faculty, postdoctoral researchers and senior graduate students who are early in a research career in the areas covered by the workshop and who could gain significantly from exposure to the workshop presentations and discussions. We expect to have some travel/accommodation stipends for those participants who do not have access to their own funding. Registration is via the developing Neuroinformatics portal at http://www.neuroinf.org, and further information can be found at the workshop site: http://www.anc.ed.ac.uk/workshops/Workshop1.html From psarroa at wmin.ac.uk Tue Apr 24 08:43:50 2001 From: psarroa at wmin.ac.uk (Dr Alexandra Psarrou) Date: Tue, 24 Apr 2001 12:43:50 -0000 Subject: PhD studentships in behaviour modelling and content-based indexing Message-ID: <012c01c0ccbc$36c0d920$0300a8c0@as7400> PhD STUDENTSHIPS FOR BEHAVIOUR MODELLING & CONTENT-BASED INDEXING UNIVERSITY OF WESTMINSTER HARROW SCHOOL OF COMPUTER SCIENCE Department of Artificial Intelligence and Interactive Multimedia Applications are invited for studentships leading to a PhD in the areas of behaviour modelling and content-based indexing. The starting date for the position is September/October 2001. The aims of the studentships are to pursue research in developing dynamic face and behavioural models from temporal information and their applications in indexing image and video databases. The candidates should ideally have a good first degree in one of the following subjects: Computer Science, Electronic Engineering, Mathematics or Physics. Normally, candidates are also required to have an appropriate master's degree, although exceptions can be made. Programming experience in C/C++ is essential and knowledge of computer vision and statistical learning would be advantageous. Each post carries a bursary of 9000 pounds per annum plus home (and EU) postgraduate fees and is tenable for three years. 
There may also be opportunities to supplement the bursary income by undertaking tutorial work within the School. Applicants should send their resume and letters of recommendation to: Dr Alexandra Psarrou, Att: PhD Studentships Harrow School of Computer Science University of Westminster, Harrow Campus Watford Road, Northwick Park, Harrow HA1 3TP, UK Telephone: ++44 (020) 7911 5904 Email: psarroa at wmin.ac.uk From eric at research.nj.nec.com Tue Apr 24 16:29:28 2001 From: eric at research.nj.nec.com (Eric Baum) Date: Tue, 24 Apr 2001 16:29:28 -0400 (EDT) Subject: Postdoctoral Research Opportunity Message-ID: <15077.57737.609146.245904@yin.nj.nec.com> Postdoctoral Research Opportunity A post-doctoral position is available in the CS Division of the NEC Research Institute in Princeton NJ, USA (http://www.neci.nj.nec.com). This position is for work on machine and reinforcement learning. One project will extend ideas of the Hayek Machine (cf. papers at http://www.neci.nj.nec.com/homepages/eric/) on evolving artificial economies of agents that learn by reinforcement, with applications to web search and automatically personalized computing. The position is a one-year term position, possibly renewable subject to mutual agreement and funding. Candidates should have a Ph.D. in computer science or a related field, a strong background in machine learning or genetic algorithms/programming, programming experience in the Unix/C(++) environment, and a keen interest in building high-performance AI systems. If interested, please contact Eric Baum by email (see below). 
Include - CV - List of Publications - Three selected papers - Names & addresses of three scientists who could act as reference (ascii, ps, or pdf files welcome, no MS-Word files please) -- Eric Baum | eric at research.nj.nec.com NEC Research Institute | http://www.neci.nj.nec.com/homepages/eric/ 4 Independence Way | Tel: +1 (609) 951-2712 Princeton NJ 08540 | Fax: +1 (609) 951-2488 From mschmitt at lmi.ruhr-uni-bochum.de Wed Apr 25 06:20:19 2001 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Wed, 25 Apr 2001 12:20:19 +0200 Subject: Preprints on Spiking and Product Unit Neural Networks Message-ID: <3AE6A4E3.6D02B45F@lmi.ruhr-uni-bochum.de> Dear Colleagues, the following two preprints are available on-line: "Complexity of learning for networks of spiking neurons with nonlinear synaptic interactions" http://www.ruhr-uni-bochum.de/lmi/mschmitt/nonlinear.ps.gz (8 pages gzipped PostScript), "Product unit neural networks with constant depth and superlinear VC dimension" http://www.ruhr-uni-bochum.de/lmi/mschmitt/superlinear.ps.gz (9 pages gzipped PostScript). Both papers are going to be presented in talks at the International Conference on Artificial Neural Networks ICANN 2001, August 21-25, 2001, Vienna, Austria. Regards, Michael Schmitt ------------------------------------------------------------ TITLE: Complexity of Learning for Networks of Spiking Neurons with Nonlinear Synaptic Interactions AUTHOR: Michael Schmitt ABSTRACT We study model networks of spiking neurons where synaptic inputs interact in terms of nonlinear functions. These nonlinearities are used to represent the spatial grouping of synapses on the dendrites and to model the computations performed at local branches. We analyze the complexity of learning in these networks in terms of the VC dimension and the pseudo dimension. Polynomial upper bounds on these dimensions are derived for various types of synaptic nonlinearities. 
------------------------------------------------------------
TITLE: Product Unit Neural Networks with Constant Depth and Superlinear VC Dimension
AUTHOR: Michael Schmitt

ABSTRACT
It has remained an open question whether there exist product unit networks with constant depth that have superlinear VC dimension. In this paper we give an answer by constructing two-hidden-layer networks with this property. We further show that the pseudo dimension of a single product unit is linear. These results bear witness to the cooperative effects on the computational capabilities of product unit networks as they are used in practice.

--
Michael Schmitt
LS Mathematik & Informatik, Fakultaet fuer Mathematik
Ruhr-Universitaet Bochum, D-44780 Bochum, Germany
Phone: +49 234 32-23209, Fax: +49 234 32-14465
http://www.ruhr-uni-bochum.de/lmi/mschmitt/

From cindy at cns.bu.edu Wed Apr 25 14:22:18 2001
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Wed, 25 Apr 2001 14:22:18 -0400
Subject: Neural Networks 14(4/5)
Message-ID: <200104251822.OAA01897@mattapan.bu.edu>

NEURAL NETWORKS 14(4/5)
Contents - Volume 14, Numbers 4/5 - 2001
------------------------------------------------------------------
CONTRIBUTED ARTICLES:

***** Psychology and Cognitive Science *****
Quantitative examinations for multi-joint arm trajectory planning: Using a robust calculation algorithm of the minimum commanded torque change trajectory
  Yasuhiro Wada, Yuichi Kaneko, Eri Nakano, Reiko Osu, and Mitsuo Kawato

***** Neuroscience and Neuropsychology *****
Solving the binding problem of the brain with bi-directional functional connectivity
  Masataka Watanabe, Kousaku Nakanishi, and Kazuyuki Aihara

***** Mathematical and Computational Analysis *****
Learning from noisy information in FasArt and FasBack neuro-fuzzy systems
  Jose Manuel Cano Izquierdo, Yannis A. Dimitriadis, Eduardo Gomez Sanchez, and Juan Lopez Coronado
Comparing Bayesian neural network algorithms for classifying segmented outdoor images
  Francesco Vivarelli and Christopher K.I. Williams
Three learning phases for radial-basis-function networks
  Friedhelm Schwenker, Hans A. Kestler, and Gunther Palm
Noise suppression in training examples for improving generalization capability
  Akiko Nakashima and Hidemitsu Ogawa
Networks with trainable amplitude of activation functions
  Edmondo Trentin
A model with an intrinsic property of learning higher order correlations
  Marifi Guler

***** Engineering and Design *****
S-TREE: Self-organizing trees for data clustering and online vector quantization
  Marcos M. Campos and Gail A. Carpenter
The constraint based decomposition (CBD) training architecture
  Sorin Draghici

***** Technology and Applications *****
Life-long learning cell structures: Continuously learning without catastrophic interference
  Fred H. Hamker
Pattern classification by a condensed neural network
  A. Mitiche and M. Lebidoff

------------------------------------------------------------------
Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, news, tables of contents, etc. Those at institutions that subscribe to Neural Networks get access to full article text as part of the institutional subscription.
Sample copies can be requested for free, and back issues ordered, through the Elsevier customer support offices:
nlinfo-f at elsevier.nl
usinfo-f at elsevier.com
info at elsevier.co.jp

------------------------------

INNS/ENNS/JNNS Membership includes a subscription to Neural Networks:

The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies.

Application forms should be sent to all the societies you want to apply to (for example, to one as a member with subscription and to the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type          INNS             ENNS               JNNS
----------------------------------------------------------------------------
Membership with          $80              660 SEK            Y 15,000
Neural Networks                                              [incl. Y 2,000
                                                             entrance fee]
  (student)              $55              460 SEK            Y 13,000
                                                             [incl. Y 2,000
                                                             entrance fee]
----------------------------------------------------------------------------
Membership without       $30              200 SEK            not available to
Neural Networks                                              non-students
                                                             (subscribe through
                                                             another society)
  (student)                                                  Y 5,000
                                                             [incl. Y 2,000
                                                             entrance fee]
----------------------------------------------------------------------------
Institutional rates      $1132            2230 NLG           Y 149,524
----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment:
[ ] Check or money order enclosed, payable to INNS or ENNS
    OR
[ ] Charge my VISA or MasterCard
    card number ____________________________
    expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408
531 28 Skovde
Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns

JNNS Membership
c/o Professor Tsukada
Faculty of Engineering
Tamagawa University
6-1-1, Tamagawa Gakuen, Machida-city
Tokyo 113-8656 Japan
81 42 739 8431 (phone)
81 42 739 8858 (fax)
jnns at jnns.inf.eng.tamagawa.ac.jp
http://jnns.inf.eng.tamagawa.ac.jp/home-j.html

-----------------------------------------------------------------

From cindy at cns.bu.edu Wed Apr 25 15:56:18 2001
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Wed, 25 Apr 2001 15:56:18 -0400
Subject: 5th ICCNS: Call for Registration
Message-ID: <200104251956.PAA15444@retina.bu.edu>

Apologies if you receive this more than once.

***** CALL FOR REGISTRATION AND FINAL INVITED PROGRAM *****

FIFTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS
Tutorials: May 30, 2001
Meeting: May 31 - June 2, 2001
Boston University
http://www.cns.bu.edu/meetings/

This interdisciplinary conference focuses on two fundamental questions: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? A single oral or poster session enables all presented work to be highly visible. Contributed talks will be presented on each of the three conference days. Three-hour poster sessions with no conflicting events will be held on two of the conference days. All posters will be up all day, and can also be viewed during breaks in the talk schedule.

CONFIRMED INVITED SPEAKERS

TUTORIAL SPEAKERS: Wednesday, May 30, 2001
Ted Adelson: The perception of surface properties
Yiannis Aloimonos: What geometry and statistics tell us about the motion pathway
Gail A. Carpenter: Adaptive resonance theory
Michael Jordan: Inference and learning in graphical models

INVITED SPEAKERS

Thursday, May 31, 2001
Larry Abbott: Spike-timing effects in Hebbian synaptic plasticity
Wulfram Gerstner: Rapid signal transmission by populations of spiking neurons
Nancy Kopell: Rhythms and cell assemblies in the nervous system
Wolfgang Maass: Liquid state machines: A new framework for understanding neural computation on spike trains
Henry Markram: Neocortical microcircuits of perception, attention, and memory
Victor Lamme: The role of recurrent processing in visual awareness
Wolf Singer: Neuronal synchrony in cerebral cortex and its functional implications (keynote lecture)

Friday, June 1, 2001
Ralph D. Freeman: Organization of receptive fields of neurons in the primary visual cortex
Nikos Logothetis: On bistable perception
David J. Heeger: Attention and sensory signals in primary visual cortex
Maggie Shiffrar: The visual analysis of moving bodies
Stephen Grossberg: The complementary brain: Unifying brain dynamics and modularity
Allen Waxman: Multi-sensor 3D image fusion technologies

Saturday, June 2, 2001
Peter L. Strick: Basal ganglia and cerebellar "loops" with the cerebral cortex: Motor and cognitive circuits
Richard Ivry: Timing, temporal coupling, and response selection
Daniel Bullock: Action selection and reinforcement learning in a model of laminar frontal cortex and the basal ganglia
Christoph Schreiner: Temporal correlation and information transfer in the auditory thalamo-cortical system
Rochel Gelman: Continuity and discontinuity in cognitive development: Numerical cognition as a case
Maja Mataric: From what you see to what you do: Imitation in humans and humanoid robots
Leon Cooper: Bi-directionally modifiable synapses: From theoretical fantasy to experimental fact (keynote lecture)

REGISTRATION FORM
Fifth International Conference on Cognitive and Neural Systems
Department of Cognitive and Neural Systems
Boston University
677 Beacon Street
Boston, Massachusetts 02215
Tutorials: May 30, 2001
Meeting: May 31 - June 2, 2001
FAX: (617) 353-7755
http://www.cns.bu.edu/meetings/
(Please Type or Print)

Mr/Ms/Dr/Prof: _____________________________________________________
Name: ______________________________________________________________
Affiliation: _______________________________________________________
Address: ___________________________________________________________
City, State, Postal Code: __________________________________________
Phone and Fax: _____________________________________________________
Email: _____________________________________________________________

The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks.
CHECK ONE:
( ) $75 Conference plus Tutorial (Regular)
( ) $50 Conference plus Tutorial (Student)
( ) $50 Conference Only (Regular)
( ) $35 Conference Only (Student)
( ) $25 Tutorial Only (Regular)
( ) $15 Tutorial Only (Student)

METHOD OF PAYMENT (please fax or mail):
[ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges.
[ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only).

Name as it appears on the card: _____________________________________
Type of card: _______________________________________________________
Account number: _____________________________________________________
Expiration date: ____________________________________________________
Signature: __________________________________________________________

From shastri at ICSI.Berkeley.EDU Fri Apr 27 22:07:10 2001
From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri)
Date: Fri, 27 Apr 2001 19:07:10 PDT
Subject: Episodic Memory Formation via Cortico-Hippocampal Interactions
Message-ID: <200104280207.TAA06655@dill.ICSI.Berkeley.EDU>

Dear Connectionists,

The following article may be of interest to you.

Best wishes,

-- Lokendra Shastri

--------------------------------------------------------------------------
http://www.icsi.berkeley.edu/~shastri/psfiles/shastri_em.pdf

From Transient Patterns to Persistent Structures: A Model of Episodic Memory Formation via Cortico-Hippocampal Interactions
Lokendra Shastri
International Computer Science Institute
Berkeley, CA 94704

Abstract
We readily remember events and situations in our daily lives and rapidly acquire memories of specific events by watching a telecast or reading a newspaper.
There is a broad consensus that the hippocampal system (HS), consisting of the hippocampal formation and neighboring cortical areas, plays a critical role in the encoding and retrieval of such "episodic" memories. But how the HS subserves this mnemonic function is not fully understood. This article presents a computational model, SMRITI, that demonstrates how a cortically expressed transient pattern of activity representing an event can be transformed rapidly into a persistent and robust memory trace as a result of long-term potentiation within structures whose architecture and circuitry resemble those of the HS. Memory traces formed by the model respond to partial cues and, at the same time, reject similar but erroneous cues. During retrieval, these memory traces, acting in concert with cortical circuits encoding semantic, causal, and procedural knowledge, can recreate activation-based representations of memorized events. The model explicates the representational requirements of encoding episodic memories, and suggests that the idiosyncratic architecture of the HS is well matched to the representational problems it must solve in order to support the episodic memory function. The model predicts the nature of memory deficits that would result from insult to specific HS components and to cortical circuits projecting to the HS. It also identifies the sorts of memories that must remain encoded in the HS for the long term, and helps delineate the distinction between semantic and episodic memory.

(Submitted to Behavioral and Brain Sciences)

From E.Koning at elsevier.nl Fri Apr 13 03:53:59 2001
From: E.Koning at elsevier.nl (Koning, Esther (ELS))
Date: Fri, 13 Apr 2001 09:53:59 +0200
Subject: CITE: Elsevier abstracts and journals available online
Message-ID: <4FAD455E0BA3D31196270008C784DAE202B3B85A@elsamssonyx.elsevier.nl>

Announcement: New user interface of CITE, the Computational Intelligence platform.
CITE integrates content and services covering all subject areas in the field of computational intelligence.

Visit CITE at: http://www.elsevier.com/cite

CITE offers:
- Access to the major journals in computational intelligence: Neural Networks, Artificial Intelligence, Biosystems, Fuzzy Sets and Systems, and Pattern Recognition.
- An abstracts database covering recent citations and abstracts from more than 60 key journals of Elsevier Science and other publishers.
- A book list providing information on, and reviews of, new books.
- An events list of forthcoming events world-wide.
- Links to publishers' sites with information on additional journal content in the area of computational intelligence.

Note: Access to abstracts, tables of contents of 60 journals, and information on events, books, bibliographies, and related sites is free to everyone. Your personal or your institution's subscription to Elsevier Science journals in CITE allows you to access the full-text articles of those journals.

Contact: Esther Koning
mailto:e.koning at elsevier.nl