From lawrence at research.nj.nec.com Sun Nov 1 17:22:08 1998
From: lawrence at research.nj.nec.com (Steve Lawrence)
Date: Sun, 1 Nov 1998 17:22:08 -0500
Subject: ------ Scientist/RA position in learning/information retrieval -------
Message-ID: <19981101172208.B19809@research.nj.nec.com>

------ Scientist/RA position in learning/information retrieval -------

The NEC Research Institute has an immediate opening for a Scientist/Research Associate in the area of learning and/or information retrieval. NEC Research Institute is a basic research laboratory located in Princeton, NJ, a short drive from Princeton University. For more information, see http://www.neci.nj.nec.com/

This research focuses on basic issues in learning and information retrieval and dissemination, with an emphasis on the World Wide Web. A sample of recent research publications includes:

S. Lawrence, C.L. Giles, Searching the World Wide Web, Science, 280, p. 98, 1998.
S. Lawrence, C.L. Giles, Context and Page Analysis for Improved Web Search, IEEE Internet Computing, Volume 2, Number 4, pp. 38-46, 1998.
C.L. Giles, K. Bollacker, S. Lawrence, CiteSeer: An Automatic Citation Indexing System, The 3rd ACM Conference on Digital Libraries, pp. 89-98, 1998 [shortlisted for best paper award].
S. Lawrence, C.L. Giles, A.C. Tsoi, A.D. Back, Face Recognition: A Convolutional Neural Network Approach, IEEE Transactions on Neural Networks, 8, 1, pp. 98-113, 1997.
S. Lawrence, A.D. Back, A.C. Tsoi, C.L. Giles, On the Distribution of Performance from Multiple Neural Network Trials, IEEE Transactions on Neural Networks, 8, 6, pp. 1507-1517, 1997.

These and other papers are available from: http://www.neci.nj.nec.com/homepages/lawrence/

See also:
http://www.neci.nj.nec.com/homepages/lawrence/citeseer.html
http://www.neci.nj.nec.com/homepages/lawrence/websize.html

Candidates must have experience in research and be able to effectively communicate research results through publications.
Successful candidates will have experience with information retrieval and/or machine learning (e.g. neural networks). Proficiency with Unix/Linux and the software implementation of algorithms is a must. Tasks will also involve code maintenance, modification and enhancement as required by the research program. The Institute provides an outstanding research environment with many recognized experts and excellent resources, plus a competitive salary and a strong emphasis on open publication of results. The successful candidate will combine a genuine desire to excel in research with the above attributes.

Interested applicants should send their resumes with names of references by email, mail or fax to:

Dr. Steve Lawrence
Computer Science
NEC Research Institute
4 Independence Way
Princeton NJ 08540
Phone: (609) 951 2676
Fax: (609) 951 2482
lawrence at research.nj.nec.com
http://www.neci.nj.nec.com/homepages/lawrence/

The position will remain open until filled. Applicants must show documentation of eligibility for employment. NEC is an equal opportunity employer. EOE

--
Steve Lawrence - http://www.neci.nj.nec.com/homepages/lawrence/

From ucganlb at ucl.ac.uk Mon Nov 2 11:00:53 1998
From: ucganlb at ucl.ac.uk (Neil Burgess - Anatomy UCL London)
Date: Mon, 02 Nov 1998 16:00:53 +0000
Subject: New Book
Message-ID: <19536.199811021600@socrates-a.ucl.ac.uk>

THE HIPPOCAMPAL AND PARIETAL FOUNDATIONS OF SPATIAL COGNITION
Neil Burgess, Kate Jeffery, John O'Keefe (eds.)
Oxford University Press, 1998.
Hardback ISBN: 0-19-852453-6, UK Price: 60.00 pounds
Paperback ISBN: 0-19-852452-8, UK Price: 29.50 pounds

For ordering see: http://www.oup.co.uk/ or the OUP stand at the Society for Neuroscience meeting

Preface: Striking recent progress has been made towards an understanding of the neural basis of spatial cognition, centred on two areas of the brain: the hippocampal formation and the parietal cortex.
This book includes a comprehensive sample of recent research into how these two areas work, either alone or in cooperation with each other, to support spatial cognition. The research presented here is necessarily interdisciplinary, including consideration of the effects of brain damage in humans, functional imaging of the human brain, electrophysiological recording of single neurones, and computer simulation of the action of the networks of neurons in these brain areas. The first chapter of the book provides an overall introduction to the field and to the substance of each of the remaining chapters. In this introductory chapter we also present a framework in which to consider the seemingly diverse spatial and mnemonic functions of the hippocampal formation and parietal cortex. This book should provide a useful starting point and reference for researchers and students of neuroscience, psychology or cognitive science who have an interest in spatial cognition.

Contents:

INTRODUCTION
1. Integrating hippocampal and parietal functions: a spatial point of view. N Burgess, KJ Jeffery, J O'Keefe. pp. 3-31

PARIETAL CORTEX
2. Spatial frames of reference and somatosensory processing: a neuropsychological perspective. GI Vallar. 33-49
3. Spatial orientation and the representation of space with parietal lobe lesions. HO Karnath. 50-66
4. Egocentric and object-based visual neglect. J Driver. 67-89
5. Multimodal integration for the representation of space in the posterior parietal cortex. RA Andersen. 90-103
6. Parietal cortex constructs action-oriented spatial representations. CL Colby. 104-126
7. A new view of hemineglect based on the response properties of parietal neurones. A Pouget, TJ Sejnowski. 127-147

THE HIPPOCAMPAL FORMATION
8. Robotic and neuronal simulation of the hippocampus and rat navigation. N Burgess, JG Donnett, KJ Jeffery, J O'Keefe. 149-166
9. Dissociation of exteroceptive and idiothetic orientation cues: effect on hippocampal place cells and place navigation. J Bures, AA Fenton, Y Kaminski, J Rossier, B Sacchetti, L Zinyuk. 167-185
10. Variable place-cell coupling to a continuously viewed stimulus: evidence that the hippocampus acts as a perceptual system. A Rotenberg, RU Muller. 186-202
11. Separating hippocampal maps. AD Redish, DS Touretzky. 203-219
12. Hippocampal synaptic plasticity: role in spatial learning or the automatic encoding of attended experience? RGM Morris, U Frey. 220-246
13. Right medial temporal-lobe contribution to object-location memory. B Milner, I Johnsrude, J Crane. 247-258
14. The hippocampus and spatial memory in humans. RG Morris, JA Nunn, S Abrahams, JD Feigenbaum, M Recce. 259-289
15. Hierarchical organisation of cognitive memory. M Mishkin, WA Suzuki, DG Gadian, F Vargha-Khadem. 290-303

INTERACTIONS BETWEEN PARIETAL AND HIPPOCAMPAL SYSTEMS IN SPACE AND MEMORY
16. Memory reprocessing in corticocortical and hippocampocortical neuronal ensembles. YL Qin, BL McNaughton, WE Skaggs, CA Barnes. 305-319
17. The representation of space in the primate hippocampus, and its role in memory. ET Rolls. 320-344
18. Amnesia and neglect: beyond the Delay-Brion system and the Hebb synapse. D Gaffan, J Hornak. 345-358
19. Representation of allocentric space in the monkey frontal lobe. CR Olson, SN Gettner, L Tremblay. 359-380
20. Parietal and hippocampal contribution to topokinetic and topographic memory. A Berthoz. 381-403
21. Hippocampal involvement in human topographical memory: evidence from functional imaging. EA Maguire. 404-415
22. Parietal cortex and hippocampus: from visual affordances to the world graph. MA Arbib. 416-442
23.
Visuospatial processing in a pure case of visual-form agnosia. AD Milner, HC Dijkerman, DP Carey. 443-466

INDEX

From l.s.smith at cs.stir.ac.uk Mon Nov 2 04:38:19 1998
From: l.s.smith at cs.stir.ac.uk (Dr L S Smith (Staff))
Date: Mon, 2 Nov 98 09:38:19 GMT
Subject: ICANN 99 - Ninth International Conference on Artificial Neural Networks
Message-ID: <199811020938.JAA04776@tinker.cs.stir.ac.uk>

ICANN 99 - Ninth International Conference on Artificial Neural Networks
incorporating the IEE Conference on Artificial Neural Networks
University of Edinburgh, Scotland, UK: 7-10 September 1999

First Call for Papers

The conference aims to bring together researchers from academia, industry and commerce in the broad field of neural computation, spanning disciplines from computational neurobiology to engineering, in what is the largest European event ever to be held in this field. It is intended to create a focus for European research and to foster dialogue between academic researchers and industrial/commercial users of a still developing technology.

Scope:
Theory and Algorithms
Neurobiology and Computational Neuroscience
Cognitive Modelling
Industrial, Commercial and Medical Applications
Hardware and Neuromorphic Engineering
Control, Robotics and Adaptive Behaviour

Important dates:
1 February 1999: Deadline for the receipt of papers for assessment (6 pages)
29 March 1999: Notification of acceptance to authors
14 May 1999: Final camera-ready papers must be received

Papers can be submitted electronically: see the WWW page for details.
Scientific Committee Co-Chairs:
Professor David Willshaw, University of Edinburgh
Professor Alan Murray, University of Edinburgh

For more information see http://www.iee.org.uk/Conf/ICANN or email icann99 at iee.org.uk

(Leslie Smith, Department of Computing Science, University of Stirling, Scotland)

From mousset at sedal.usyd.edu.au Tue Nov 3 00:48:08 1998
From: mousset at sedal.usyd.edu.au (Eric Mousset)
Date: Tue, 03 Nov 1998 16:48:08 +1100
Subject: Workshop 'Understanding the Brain and Engineering Models'
Message-ID: <363E9918.678CBF50@sedal.usyd.edu.au>

WORKSHOP ANNOUNCEMENT
UNDERSTANDING THE BRAIN AND ENGINEERING MODELS
University of Sydney, Australia
January 18th and 19th, 1999

OUTLINE
=======
Over the last decade, research activities in the domain of neuromorphic engineering and brain function modelling have significantly increased. This workshop seeks to bring together a diverse group of researchers to critically examine the progress made so far in this challenging research area. We intend to address the following issues:
1. Sensorimotor integration in a broad sense and, more particularly,
2. Computational models of brain centres such as the cerebellum, basal ganglia and the superior colliculus,
3. Computational models of the visual and auditory pathways,
4. Microelectronic and other real-time implementations of such models and their incorporation in roving systems.
KEYNOTE SPEAKER
===============
Terrence Sejnowski
Howard Hughes Medical Institute, Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, CA, USA

TENTATIVE LIST OF SPEAKERS
==========================
Abdesselam Bouzerdoum - Edith Cowan University, Perth, WA, Australia
Simon Carlile - Department of Physiology, Sydney University, NSW, Australia
Olivier Coenen - The Salk Institute for Biological Studies, CA, USA & ANRL Children's Hospital
Ralph Etienne-Cummings - Department of Electrical and Computer Engineering, Johns Hopkins University, MD, USA
Evian Gordon - Westmead Hospital, The University of Sydney, NSW, Australia
Marwan Jabri - School of Electrical and Information Engineering, Sydney University, NSW, Australia
Craig Jin - School of Electrical and Information Engineering, Sydney University, NSW, Australia
Eric Mousset - School of Electrical and Information Engineering, Sydney University, NSW, Australia
Andre Van Schaik - Department of Physiology, Sydney University, NSW, Australia

WHO SHOULD ATTEND?
==================
The following groups of participants are targeted:
(a) Visual/auditory psychophysicists and neurobiologists;
(b) Computational modellers and engineers, including those interested in real-time software and/or VLSI implementation.

REGISTRATION FEE & FORM
=======================
The registration fee is AU$100 (normal) / AU$50 for students. It includes coffee, the workshop barbecue and workshop handouts. A registration form is available at http://www.ee.usyd.edu.au/events_news/ubem99reg.html

FURTHER INFORMATION
===================
Further information (registration form, detailed program, accommodation arrangements, transport, etc.) is available at http://www.sedal.usyd.edu.au/~mousset/UBEM99/
Queries can be directed to Eric Mousset: mousset at ee.usyd.edu.au or +61-2-93517208

ORGANISERS
==========
Eric Mousset (mousset at ee.usyd.edu.au), Sydney University
Marwan Jabri (marwan at ee.usyd.edu.au), Sydney University
Ralph Etienne-Cummings (etienne at ece.jhu.edu), Johns Hopkins University

From mieko at hip.atr.co.jp Thu Nov 5 02:03:32 1998
From: mieko at hip.atr.co.jp (Mieko Namba)
Date: Thu, 5 Nov 1998 16:03:32 +0900
Subject: Submission Deadline is SOON [Neural Networks 1999 Special Issue]
Message-ID: <199811050703.QAA11163@mailhost.hip.atr.co.jp>

Dear members,

DEADLINE for submission: December 1st, 1998

We would like to draw your attention to the quickly approaching submission deadline for the "Neural Networks 1999 Special Issue". This year, the publication will be edited by the Japanese Neural Networks Society. We plan to publish many originally contributed articles in addition to invited articles. We are looking forward to receiving your contributions.

Mitsuo Kawato
Co-Editor-in-Chief, Neural Networks
(ATR Human Information Proc. Res. Labs.)

******************************************************************
CALL FOR PAPERS
******************************************************************
Neural Networks 1999 Special Issue
"Organisation of Computation in Brain-like Systems"
******************************************************************

Submission:
Deadline for submission: December 1st, 1998
Notification of acceptance: March 1st, 1999
Format: as for normal papers in the journal (APA format) and no longer than 10,000 words

Co-Editors:
Professor Gen Matsumoto, BSI, RIKEN, Japan
Professor Edgar Koerner, HONDA R&D, Europe
Dr. Mitsuo Kawato, ATR Human Information Processing Res. Labs., Japan

Address for Papers:
Dr. Mitsuo Kawato
ATR Human Information Processing Research Laboratories
2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan.
******************************************************************

In recent years, neuroscience has made a big leap forward regarding both investigation methodology and insights into local mechanisms for processing sensory information in the brain. Yet we still do not know much better than before what happens in the brain when one recognises a familiar person, or moves around, navigating seemingly effortlessly through a busy street; this points to the fact that our models still do not describe essential aspects of how the brain organises computation. The investigation of the behaviour of fairly homogeneous ANS (artificial neural systems) composed of simple elementary nodes fostered the awareness that architecture matters: the algorithms implemented by the respective neural system are expressed by its architecture. Consequently, the focus is shifting to a better understanding of the architecture of the brain and of its subsystems, since the structure of those highly modularised systems represents the way the brain organises computation. Approaching the algorithms expressed by those architectures may offer us the capability not only to understand the representation of knowledge in a neural system made under well-defined constraints, but to understand the control that forces the neural system to make representations of behaviourally relevant knowledge by generating dynamic constraints. This special issue will bring together invited papers and contributed articles that illustrate the shifting emphasis in neural systems modelling towards more neuroarchitecture-motivated systems that include this type of control architecture. Local and global control algorithms for the organisation of computation in brain-like systems cover a wide field of topics. Abduction of the control principles inherent in the architectures that mediate interaction within the cortex, between cortex and thalamus, cortex and hippocampus, and other parts of the limbic system is one of the targets.
Of particular importance are the rapid access to stored knowledge and the management of conflicts in response to sensory input, the coding and representation in a basically asynchronous mode of processing, the decomposition of problems into a reasonable number of simpler sub-problems, and the control of learning -- including the control which specifies what should be learned, and how to integrate the new knowledge into the relational architecture of the already acquired knowledge representation. Another target of this approach is the attempt to understand how these controls and the respective architectures emerged in the process of self-organisation during phylogenetic and ontogenetic development. Placing the cognitive behaviour of neural systems at the focus of investigation is a prerequisite for the described approach, which will promote both the creation of computational hypotheses for neurobiology and the implementation of robust and flexible computation in ANS.

******************************************************************
end.

=========================================================
Mieko Namba
Secretary to Dr. Mitsuo Kawato
Editorial Administrator of NEURAL NETWORKS
ATR Human Information Processing Research Laboratories
2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
TEL +81-774-95-1058
FAX +81-774-95-1008
E-MAIL mieko at hip.atr.co.jp
=========================================================

From austin at minster.cs.york.ac.uk Thu Nov 5 09:17:53 1998
From: austin at minster.cs.york.ac.uk (Jim Austin)
Date: Thu, 5 Nov 1998 14:17:53 +0000
Subject: Lectureship in Neural networks
Message-ID: <9811051417.ZM6169@minster.cs.york.ac.uk>

Lectureship in Neural Networks and/or Computer Vision

Applications are invited for a post of Lecturer in the Department of Computer Science, University of York, UK. The Department is one of the premier Computer Science Departments in the UK, with the highest research and teaching gradings.
The Lecturer will be expected to be an active researcher in neural networks and/or computer vision, with a strong interest in teaching the design and construction of microcomputer systems or in networks and distributed systems. The successful applicant will join one of the UK's strongest groups undertaking research in neural networks, computer vision and the advanced architectures that support them. Housed in a new building, the work of the 30-strong research and lecturing team can be found at the web page http://www.cs.york.ac.uk/arch/

The post is available immediately on Lecturer scale grade A (16,655 to 21,815 pounds sterling per annum) or grade B (22,726 to 29,048 per annum), depending on age and experience. Informal enquiries may be made to Prof. Jim Austin (Head of the Advanced Computer Architectures Group, austin at cs.york.ac.uk, 01904 432734) or Dr. Keith Mander (Head of Department, mander at cs.york.ac.uk, 01904 432734).

Six copies of applications with full curriculum vitae and the names of three referees should be sent by Monday 23 November 1998 to the Personnel Officer, University of York, Heslington, York YO10 5DD. Applicants should consult the further particulars, which are available from http://www.cs.york.ac.uk/~mander/partics.html. Applicants should quote the reference number X/3074.

--
Professor Jim Austin, Department of Computer Science, University of York, York, YO1 5DD, UK.
Tel: 01904 43 2734
Fax: 01904 43 2767
web pages: http://www.cs.york.ac.uk/arch/

From ogawa at cs.titech.ac.jp Wed Nov 4 06:04:50 1998
From: ogawa at cs.titech.ac.jp (Hidemitsu Ogawa)
Date: Wed, 04 Nov 1998 20:04:50 +0900
Subject: Ninth T.I.T. Brain Research Symposium
Message-ID: <199811041104.UAA01071@hilbert.cs.titech.ac.jp>

Appended below is the program for the Ninth T.I.T. Brain Research Symposium.

Hidemitsu Ogawa
The Ninth T.I.T.
Brain Research Symposium General Chair
--------------------------------------------------------------------
*************************************************************
The Ninth T.I.T. Brain Research Symposium
*************************************************************
December 11, 1998
Tokyo Institute of Technology
Ferrite Room of Centennial Memorial Hall, Ookayama Campus
--------------------------------------------------------------------
10:00-10:10 Opening
Hidemitsu Ogawa (T.I.T. Graduate School of Information Science and Engineering)
10:10-11:10 Synapse mechanism of pheromonal memory (Invited)
Masumi Ichikawa (Tokyo Metropolitan Institute for Neuroscience)
11:10-11:50 Manipulation technology of individual molecules for brain research: A challenge in the nanoworld
Atsushi Ikai (T.I.T. Faculty of Biosciences and Biotechnology)
--- Lunch ---
13:00-13:40 Study of a learning circuit for an odor sensor using a 1-bit-stream data processing circuit
Takamichi Nakamoto, Satoshi Kawamura, and Toyosaka Moriizumi (T.I.T. Faculty of Engineering)
13:40-14:20 Holographic neural computation with weighted sum performed by wave interference: Simulation by software models and hardware implementation
Itsuo Kumazawa (T.I.T. Graduate School of Information Science and Engineering)
14:20-15:00 Neural networks and motivation: What will make it to do?
Yukio Kosugi (T.I.T. Interdisciplinary Graduate School of Science and Engineering)
--- Break ---
15:20-16:00 Recent advances in probabilistic analysis for complex learning machines
Sumio Watanabe (T.I.T. Precision and Intelligence Laboratory)
16:00-17:00 Dynamic independent component analysis in application to biomedical signal processing: Recent results and open problems (Invited)
Andrzej Cichocki (RIKEN Brain Science Institute)
(admission: free)
-------------------------------------------------------------------------
General Chair: Prof.
Hidemitsu Ogawa (Tokyo Institute of Technology, Graduate School of Information Science and Engineering)
Secretariat: Dr. Yukio Kosugi / Email: kosugi at pms.titech.ac.jp
-------------------------------------------------------------------------

From seung at mit.edu Wed Nov 4 01:36:52 1998
From: seung at mit.edu (H. Sebastian Seung)
Date: Wed, 4 Nov 1998 01:36:52 -0500 (EST)
Subject: Postdoctoral research positions available
Message-ID: <199811040636.BAA22159@life.ai.mit.edu>

Postdoctoral research positions:
The Seung Lab for Theoretical Neurobiology
Department of Brain and Cognitive Sciences
The Massachusetts Institute of Technology

The Seung Lab invites applications for postdoctoral research positions, to be filled between January and September 1999. Applicants should ideally have experience in theoretical neurobiology or machine learning, and strong training in the physical sciences or engineering. The Seung Lab specializes in the theoretical study of learning and memory in neural networks, both biological and artificial. More information about our activities can be found at

Applications should include a CV, research statement, copies of relevant publications, and three letters of recommendation. Send applications to:

Prof. Sebastian Seung
Dept. of Brain & Cognitive Sciences
MIT, E25-210
Cambridge, MA 02139
seung at mit.edu

From a_browne at europa.nene.ac.uk Tue Nov 3 03:37:53 1998
From: a_browne at europa.nene.ac.uk (Tony Browne)
Date: Tue, 03 Nov 1998 08:37:53 +0000
Subject: CFP: Symbol Processing
Message-ID: <74415643E3@europa.mmb.nene.ac.uk>

Call for Papers: Special Issue on Connectionist Symbol Processing
For the journal: Expert Systems: The International Journal of Knowledge Engineering and Neural Networks

The processing of symbols and symbolic structures has long been a challenge to connectionists. This special issue will bring together a broad range of contributed articles that explore the areas of representation, variable binding and inference.
Papers are sought which present recent results in this field, or which discuss fundamental theoretical concepts related to the performance of symbolic processing with connectionist networks. All papers will be peer-reviewed.

Special Issue Editor: Antony Browne

Submission Details:
Deadline for Submission: 30th April 1999
Notification of Acceptance: 31st July 1999
Format: As for normal papers to the journal (please see the attachment to this message for instructions to authors)
Length: No longer than 10,000 words

Address for Papers:
Dr. Antony Browne
School of Information Systems
University College Northampton
Northampton NN2 7AL
UK

Further questions should be addressed to Tony Browne at: antony.browne at nene.ac.uk

NOTES FOR CONTRIBUTORS

The International Journal of Knowledge Engineering and Neural Networks is a quarterly technical journal devoted to all aspects of the development and use of advanced computing. Papers are published on the condition that authors are prepared to assign the copyright to Blackwell Publishers Ltd. The journal is not equipped to deal with LaTeX, so please write papers using a commonly used Microsoft Windows based word-processing package such as 'Word' or 'WordPerfect'. Three hard copies of the paper and figures should be supplied. The paper should be laser-printed, double-spaced, on white A4 or white US equivalent paper. If your submission is accepted, you will be asked to submit a disk with the soft version. Each page should be numbered. The first page of the manuscript should bear only the names, titles and full addresses of the authors. Where there is more than one author, please indicate to whom correspondence should be sent, as well as the contact address, phone and fax numbers, and e-mail address. Illustrations should be on separate pages, attached to the end of the manuscript. Figures should be neatly printed in black on a good white base on separate pages. Screen dumps do not always reproduce well; please make screen dumps as clear as possible.
Each figure should be clearly identified for the printer (e.g. Figure 1). The position of each figure should be clearly marked and referred to in the text (e.g. Figure 1 about here). Each diagram or table should be clearly captioned (e.g. Figure 1: ...). Lettering and numbering should be large enough to be legible if reduced, which may be by up to 50%. Figures should not be marked in any way, as we may reproduce them directly. Authors are responsible for obtaining permission to use figures borrowed from other works.

Papers should always begin with an abstract and an introduction, and end with a conclusion and bibliography. Appendices may be used if appropriate. Papers and articles should be written in gender-free language.

Heading structure: usually three levels: 1. Main heading, 1.1. Subheading, 1.1.1. Sub-subheading.

Every reference in the text should be given in full in the bibliography. References should be complete and correct - this is the author's responsibility. References in the text should give, in parentheses, the author's name and year of publication: for example: (Browne et al., 1996; Niklasson & Boden, 1997; Browne, 1998). References in the bibliography at the end of the text should use the appropriate format, described below (the Harvard system).

Books/Book Chapters: author name, author initials, date of publication, title of chapter (where applicable), full name of editor (where applicable), full title of book, page numbers, publisher, publisher's location. For example:

NIKLASSON, L. and BODEN, M. (1997). Representing structure and structured representations in connectionist networks, in A. Browne (Ed.), Neural Network Perspectives on Cognition and Adaptive Robotics, pp. 20-50. Institute of Physics Press, Bristol, UK.

Journal Papers: author name, author initials, date of publication, full title of paper, full title of journal, journal volume number, journal issue number, page numbers of paper. For example:

BROWNE, A. (1998).
Detecting systematic structure in distributed representations. Neural Networks, 11(5), 815-824.

Conference proceedings: similar to the format above. If the proceedings have been published externally, please give the name of the publisher and the publisher's location. For example:

BROWNE, A., PASCALIS, R. and AMIN, S. (1996). Signal and image processing with neural networks. Proceedings of Circuits, Systems and Computers 96, Vol. 1, 335-339.

References not mentioned in the text should be listed separately as 'Further reading.'

From l.s.smith at cs.stir.ac.uk Wed Nov 4 08:29:51 1998
From: l.s.smith at cs.stir.ac.uk (Dr L S Smith (Staff))
Date: Wed, 4 Nov 98 13:29:51 GMT
Subject: 2nd European Workshop on Neuromorphic Systems: Call for Papers
Message-ID: <199811041329.NAA05529@tinker.cs.stir.ac.uk>

First Call For Papers

EWNS2 - 2nd European Workshop on Neuromorphic Systems
3-5 September 1999
University of Stirling, Stirling, Scotland

Neuromorphic systems are implementations in silicon of systems whose architecture and design are based on neurobiological systems. This growing area proffers exciting possibilities, such as sensory systems which can compete with human senses, pattern recognition systems that can run in real time, and neuron models that can truly emulate living neurons. Neuromorphic systems are at the intersection of neurophysiology, computer science and electrical engineering.

The meeting builds on the success of EWNS1, held in Stirling in August 1997. The meeting is intended both for the reporting of results and for discussion about the way forward in neuromorphic systems: What should the role of neuromorphic systems be? Learning about neurobiological systems by rebuilding them, or engineering new solutions to problems in sensory perception using what we know about animal sensory perception? Can biologically-inspired techniques provide the basis for real improvements in auditory/visual/olfactory/sensorimotor systems or prostheses? How should neuromorphic systems be implemented?
Purely in hardware, or as a mixture of software and hardware? As dedicated VLSI devices? In analogue or digital hardware? Should particular methodologies or specific transistor characteristics be used? Can what neuromorphic systems have to tell us about sensory perception and coding inform cognitive science?

Papers are requested in the following areas:
Design issues in sensorineural neuromorphic systems: auditory, visual, olfactory, proprioceptory, sensorimotor systems
Designs for silicon implementations of neurons or neural systems
Theoretical aspects of the above areas

The meeting is being held just before ICANN'99, which will be in Edinburgh, 7-10 September 1999.

Submission of papers: Papers not exceeding 8 A4 pages are requested; these should be sent to Dr. Leslie Smith, Department of Computing Science, University of Stirling, Stirling FK9 4LA, Scotland, email: lss at cs.stir.ac.uk, FAX (44) 1786 464551, Tel (44) 1786 467435.

We also propose to hold a number of discussion sessions on some of the questions above. Short position papers (up to 4 pages) are also requested. We hope to publish the proceedings in book form after the meeting. We are particularly keen to encourage submissions by research students.

Key Dates:
Submission Deadline: April 6th 1999
Notification of Acceptance: June 4th 1999

Organising Committee:
Leslie S. Smith, Department of Computing Science and Mathematics, University of Stirling.
Alister Hamilton, Department of Electrical Engineering, University of Edinburgh.
Catherine Breslin, Department of Computing Science and Mathematics, University of Stirling.

WWW page for conference: http://www.cs.stir.ac.uk/EWNS2/

Dr Leslie S.
Smith
Dept of Computing Science and Mathematics, Univ of Stirling
Stirling FK9 4LA, Scotland
l.s.smith at cs.stir.ac.uk (NeXTmail and MIME welcome)
Tel (44) 1786 467435
Fax (44) 1786 464551
www http://www.cs.stir.ac.uk/~lss/

From graepel2 at cs.tu-berlin.de Thu Nov 5 10:43:24 1998
From: graepel2 at cs.tu-berlin.de (Thore Graepel)
Date: Thu, 5 Nov 1998 16:43:24 +0100 (MET)
Subject: Winterschool announcement
Message-ID: 

Winterschool
Berlin, Germany, December 10-12, 1998
Networks with Spiking Neurons and Synaptic Plasticity
=======================================================

The school focuses on key questions of computational neuroscience. Tutorials for non-experts will provide self-contained introductions to experimental results, models, and theoretical concepts. To prepare for the tutorials and lectures, references to selected reviews can be obtained via the internet.

Organization: Graduiertenkolleg Berlin "Signal Cascades in Living Systems", Sonderforschungsbereich "Mechanismen entwicklungs- und erfahrungsabhaengiger Plastizitaet des Nervensystems" (Sfb 515)

Location: Ernst-Reuter-Haus, Saal ER-A, Strasse des 17. Juni 112, Berlin, Germany

Abstracts, recommended background literature and further information about the winterschool can be found at: http://www.fu-berlin.de/grk120/

For registration, send an e-mail with your name and address to kolleg at zedat.fu-berlin.de before December 1. The registration fee of 25 DM includes the dinner and may be paid upon arrival.
Visitors from outside Berlin can make reservations for accommodation at various rates at the Berlin Tourismus Marketing GmbH, phone: +49 30 25 00 25, e-mail: reservation at btm.de Winterschool Program: ===================== Thursday, December 10: Introductory Tutorials --------------------------------------------- 9:00-10:30 Larry Abbott: Methods of neuronal and network modeling 11:00-12:30 Mike Shadlen: Coding and computing with noisy neurons 14:30-16:00 Ad Aertsen: Temporal Coding and Dynamics of Spiking Neurons 16:30-18:00 Henry Markram: Non-linear synaptic transmission 18:30 Reception and Conference Dinner Friday, December 11: Synaptic Plasticity ---------------------------------------- 8:30- 9:30 Florian Engert: Pairing-induced LTP in hippocampal slice cultures is not strictly input-specific. 9:30-10:30 Lori McMahon: Modulation of hippocampal interneuron excitability through changes in synaptic strength and during rhythmic oscillations. 11:00-12:00 Jeff Magee: Temporal summation of synaptic activity is spatially normalized by a nonuniform dendritic Ih in hippocampal neurons. 12:00-13:00 Alex Thomson: Cortical Circuits: simultaneous translation at each class of synapse. 14:30-15:30 Larry Abbott: Temporally asymmetric Hebbian plasticity: spike synchrony and response variability. 15:30-16:30 Henry Markram: The synaptic organization principle in the neocortex enables maximal diversity of information transmission between neurons. 17:00-18:00 Walter Senn: Depressing synapses, their modification, and receptive field formation Saturday, December 12: Spiking Neurons -------------------------------------- 8:30- 9:30 Tony Zador: Input synchrony and the irregular firing of cortical neurons 9:30-10:30 Mike Shadlen: Modeling ensembles of weakly correlated noisy neurons. 11:00-12:00 Sonja Gruen: Unitary joint-events in cortical activity. 
12:00-13:00 Ad Aertsen: Conditions for stable propagation of synchronous spiking in cortical networks 14:30-15:30 Klaus Pawelzik: Functional roles of subthreshold membrane potential oscillations 15:30-16:30 Wulfram Gerstner: Dynamics in Networks of spiking neurons: Fast transients, synchronisation, and asynchronous firing 17:00-18:00 Gustavo Deco: Spatio-Temporal Coding in the Cortex: Information Flow Based Learning in Spiking Neural Networks. Organizers: Prof. Dr. Andreas V.M. Herz Innovationskolleg Theoretische Biologie Humboldt Universitaet zu Berlin http://itb.biologie.hu-berlin.de/ Prof. Dr. Randolf Menzel Institut für Neurobiologie Freie Universität Berlin http://www.neuro.biologie.fu-berlin.de/neuro.html Prof. Dr. Klaus Obermayer Department of Computer Science Technical University of Berlin http://ni.cs.tu-berlin.de/ From priel at mail.biu.ac.il Thu Nov 5 14:12:11 1998 From: priel at mail.biu.ac.il (Avner Priel) Date: Thu, 5 Nov 1998 21:12:11 +0200 (WET) Subject: paper on time series generation Message-ID: The following preprint on the subject of time series generation by feed-forward networks was submitted for publication in the Physical Review E. The paper is available from my home-page: http://faculty.biu.ac.il/~priel/ comments are welcome. *************** NO HARD COPIES ****************** ---------------------------------------------------------------------- Long-term properties of time series generated by a perceptron with various transfer functions ----------------------------------------------------- A Priel and I Kanter Department of Physics, Bar Ilan University, 52900 Ramat Gan, Israel ABSTRACT: We study the effect of various transfer functions on the properties of a time series generated by a continuous-valued feed-forward network in which the next input vector is determined from past output values. 
The parameter space for monotonic and non-monotonic transfer functions is analyzed in the unstable regions with the following main finding: non-monotonic functions can produce robust chaos, whereas monotonic functions generate fragile chaos only. In the case of non-monotonic functions, the number of positive Lyapunov exponents increases as a function of one of the free parameters in the model; hence, high-dimensional chaotic attractors can be generated. We extend the analysis to a combination of monotonic and non-monotonic functions. -------------------------------------------------- Priel Avner < priel at mail.biu.ac.il > < http://faculty.biu.ac.il/~priel > Department of Physics, Bar-Ilan University. Ramat-Gan, 52900. Israel. From hyson at darwin.psy.fsu.edu Fri Nov 6 13:37:03 1998 From: hyson at darwin.psy.fsu.edu (Richard Hyson) Date: Fri, 6 Nov 1998 13:37:03 -0500 Subject: Position announcement Message-ID: The Department of Psychology at Florida State University seeks to make a TENURE-TRACK appointment at the assistant professor level in COMPUTATIONAL PSYCHOLOGY: Applicants in all areas of Computational Psychology are encouraged to apply, but preference will be given to candidates whose research interests complement those of our faculty in either cognitive psychology or neuroscience. This position is part of a College initiative to develop a program in computational science. The successful candidate will have responsibilities to an interdisciplinary program in Computational Science and Engineering as well as to the Psychology Department. Recruits will join a diverse research faculty with training programs in Clinical, Cognitive and Behavioral Science, and Psychobiology/Neuroscience. We seek candidates with strong evidence of research potential and teaching ability. Applicants who can contribute to more than one of the department's areas of strength will receive special consideration. 
A curriculum vitae, a cover letter describing research and teaching interests, and three letters of reference should be sent by Dec. 1 to: Computational Psychology Search Committee, Department of Psychology, Florida State University, Tallahassee, FL 32306-1270. Florida State University is an Equal opportunity/Affirmative Action Employer. For information about the Department of Psychology, see: http://www.psy.fsu.edu ------------------------------------------------ Rick Hyson, Ph.D. Ph: (850) 644-5824 Psychology Department Fax: (850) 644-7739 FSU email: hyson at psy.fsu.edu Tallahassee, FL 32306-1270 From cas-cns at cns.bu.edu Mon Nov 9 15:00:36 1998 From: cas-cns at cns.bu.edu (cas-cns@cns.bu.edu) Date: Mon, 9 Nov 1998 15:00:36 -0500 (EST) Subject: No subject Message-ID: <199811092000.PAA10937@cns.bu.edu> ******************************************************************* Sender: cas-cns at cns.bu.edu Precedence: bulk Reply-To: cas-cns at cns.bu.edu GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY ******************************************************************* The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. Applications for Fall, 1999, admission and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write, telephone, or fax: DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS Boston University 677 Beacon Street Boston, MA 02215 617/353-9481 (phone) 617/353-7755 (fax) or send via e-mail your full name and mailing address to the attention of Mr. 
Robin Amos at: inquiries at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. Stephen Grossberg, Chairman Gail A. Carpenter, Director of Graduate Studies Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Students are trained in a broad range of areas concerning cognitive and neural systems, including vision and image processing; speech and language understanding; adaptive pattern recognition; cognitive information processing; self-organization; associative learning and long-term memory; cooperative and competitive network dynamics and short-term memory; reinforcement, motivation, and attention; adaptive sensory-motor control and robotics; and biological rhythms; as well as the mathematical and computational methods needed to support modeling research and applications. The CNS Department awards MA, PhD, and BA/MA degrees. The CNS Department embodies a number of unique features. 
It has developed a curriculum that consists of interdisciplinary graduate courses, each of which integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of neural networks to technology. Additional advanced courses, including research seminars, are also offered. Each course is typically taught once a week in the afternoon or evening to make the program available to qualified students, including working professionals, throughout the Boston area. Students develop a coherent area of expertise by designing a program that includes courses in areas such as biology, computer science, engineering, mathematics, and psychology, in addition to courses in the CNS curriculum. The CNS Department prepares students for thesis research with scientists in one of several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The unit most closely linked to the department is the Center for Adaptive Systems. Students interested in neural network hardware work with researchers in CNS, at the College of Engineering, and at MIT Lincoln Laboratory. Other research resources include distinguished research groups in neurophysiology, neuroanatomy, and neuropharmacology at the Medical School and the Charles River Campus; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. In addition to its basic research and training program, the department conducts a seminar series, as well as conferences and symposia, which bring together distinguished scientists from both experimental and theoretical disciplines. 
The department is housed in its own new four-story building which includes ample space for faculty and student offices and laboratories, as well as an auditorium, classroom and seminar rooms, a library, and a faculty-student lounge. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Thomas Anastasio Visiting Scholar, Department of Cognitive and Neural Systems (9/1/98 - 6/30/99) Associate Professor, Molecular & Integrative Physiology, University of Illinois, Urbana/Champaign PhD, McGill University Computational modeling of neurophysiological systems. Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior. Aijaz Baloch Adjunct Assistant Professor of Cognitive and Neural Systems Senior Development Engineer, Nestor, Inc. PhD, Electrical Engineering, Boston University Visual motion perception, computational vision, adaptive control, and financial fraud detection. Helen Barbas Professor of Anatomy and Neurobiology, Boston Univ. School of Medicine PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex. Jacob Beck Research Professor of Cognitive and Neural Systems PhD, Psychology, Cornell University Visual perception, psychophysics, computational models. Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development. Gail A. 
Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems PhD, Mathematics, University of Wisconsin, Madison Learning and memory, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations. Gert Cauwenberghs Visiting Scholar, Department of Cognitive and Neural Systems (6/1/98 - 8/31/99) Associate Professor of Electrical and Computer Engineering, Johns Hopkins Univ. PhD, Electrical Engineering, California Institute of Technology VLSI circuits, systems and algorithms for parallel analog signal processing and adaptive neural computation. Laird Cermak Director, Memory Disorders Research Center, Boston Veterans Affairs Medical Center Professor of Neuropsychology, School of Medicine Professor of Occupational Therapy, Sargent College PhD, Ohio State University Memory disorders. Michael A. Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems. H. Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, signal processing models of hearing. Birgitta Dresp Visiting Scholar, Department of Cognitive and Neural Systems (10/1/98 - 12/31/98) Research Agent of the French Government (CNRS), Universite Louis Pasteur PhD in Cognitive Psychology, Universite Rene Descartes, Paris Visual Psychophysics (Form Perception, Spatial Contrast, Perceptual Learning). Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory. William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology. 
Gil Engel Research Fellow, Department of Cognitive and Neural Systems Chief Engineer, Vision Applications, Inc. Senior Design Engineer, Analog Devices, CTS Division MS, Polytechnic University, New York Space-variant active vision systems for use in human-computer interactive control. Bruce Fischl Research Fellow, Department of Cognitive and Neural Systems Postdoctoral Research Fellow, Massachusetts General Hospital PhD, Cognitive and Neural Systems, Boston University Anisotropic diffusion and nonlinear image filtering, space-variant vision, computational models of early visual processing, and automated analysis of magnetic resonance images. Paolo Gaudiano Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Computational and neural models of robotics, vision, adaptive sensory-motor control, and behavioral neurobiology. Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics. Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition. Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Theoretical biology, theoretical psychology, dynamical systems, and applied mathematics. Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, and biological sensory-motor control. Catherine L. 
Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition. Michael E. Hasselmo Associate Professor of Psychology PhD, Experimental Psychology, Oxford University Electrophysiological studies of neuromodulatory effects in cortical structures, network biophysical simulations of memory function in hippocampus and piriform cortex, behavioral studies of amnestic drugs. Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing. Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Functional analysis, mathematical physics, partial differential equations. Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamical systems, mathematical physiology, pattern formation in biological/physical systems. Gregory Lesher Research Fellow, Department of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Jacqueline A. Liederman Associate Professor of Psychology PhD, Psychology, University of Rochester Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders. Ennio Mingolla Associate Professor of Cognitive and Neural Systems and Psychology PhD, Psychology, University of Connecticut Visual perception, mathematical modeling of visual processes. Joseph Perkell Adjunct Professor of Cognitive and Neural Systems Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology PhD, Massachusetts Institute of Technology Motor control of speech production. 
Alan Peters Professor of Anatomy and Neurobiology, School of Medicine PhD, Zoology, Bristol University, United Kingdom Organization of neurons in the cerebral cortex; effects of aging on the primate brain; fine structure of the nervous system. Andrzej Przybyszewski Research Fellow, Department of Cognitive and Neural Systems Assistant Professor, University of Massachusetts Medical School PhD, Warsaw Medical Academy Electrophysiology of the primate visual system, mathematical and computer modeling of the neuronal networks in the visual system. Adam Reeves Adjunct Professor of Cognitive and Neural Systems Professor of Psychology, Northeastern University PhD, Psychology, City University of New York Psychophysics, cognitive psychology, vision. Mark Reinitz Assistant Professor of Psychology PhD, University of Washington Cognitive psychology, attention, explicit and implicit memory, memory-perception interactions. Mark Rubin Research Assistant Professor of Cognitive and Neural Systems PhD, Physics, University of Chicago Pattern recognition; artificial and biological vision. Elliot Saltzman Associate Professor of Physical Therapy, Sargent College Assistant Professor, Department of Psychology and Center for the Ecological Study of Perception and Action University of Connecticut, Storrs Research Scientist, Haskins Laboratories, New Haven, CT PhD, Developmental Psychology, University of Minnesota Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities. Robert Savoy Adjunct Associate Professor of Cognitive and Neural Systems Scientist, Rowland Institute for Science Experimental Psychologist, Massachusetts General Hospital PhD, Experimental Psychology, Harvard University Computational neuroscience; visual psychophysics of color, form, and motion perception. Teaching about functional MRI and other brain mapping methods. 
Eric Schwartz Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology PhD, High Energy Physics, Columbia University Computational neuroscience, machine vision, neuroanatomy, neural modeling. Robert Sekuler Adjunct Professor of Cognitive and Neural Systems Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center Jesse and Louis Salvage Professor of Psychology, Brandeis University PhD, Psychology, Brown University Visual motion, visual adaptation, relation of visual perception, memory, and movement. Barbara Shinn-Cunningham Assistant Professor of Cognitive and Neural Systems and Biomedical Engineering PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance. Malvin Teich Professor of Electrical and Computer Engineering, Biomedical Engineering, and Physics PhD, Cornell University Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission. Lucia Vaina Professor of Biomedical Engineering Research Professor of Neurology, School of Medicine PhD, Sorbonne (France); Dres Science, National Polytechnique Institute, Toulouse (France) Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging. Takeo Watanabe Associate Professor of Psychology PhD, Behavioral Sciences, University of Tokyo Perception of objects and motion and effects of attention on perception using psychophysics and brain imaging (f-MRI). Allen Waxman Adjunct Associate Professor of Cognitive and Neural Systems Senior Staff Scientist, MIT Lincoln Laboratory PhD, Astrophysics, University of Chicago Visual system modeling, multisensor fusion, image mining, parallel computing, and advanced visualization. 
James Williamson Research Assistant Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Development of cortical receptive fields; perceptual grouping; pattern recognition. Jeremy Wolfe Adjunct Associate Professor of Cognitive and Neural Systems Associate Professor of Ophthalmology, Harvard Medical School Psychophysicist, Brigham & Women's Hospital, Surgery Dept. Director of Psychophysical Studies, Center for Clinical Cataract Research PhD, Massachusetts Institute of Technology Visual attention, preattentive and attentive object representation. Curtis Woodcock Professor of Geography Director, Geographic Applications, Center for Remote Sensing PhD, University of California, Santa Barbara Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing. CNS DEPARTMENT COURSE OFFERINGS CAS CN500 Computational Methods in Cognitive and Neural Systems CAS CN510 Principles and Methods of Cognitive and Neural Modeling I CAS CN520 Principles and Methods of Cognitive and Neural Modeling II CAS CN530 Neural and Computational Models of Vision CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control CAS CN550 Neural and Computational Models of Recognition, Memory and Attention CAS CN560 Neural and Computational Models of Speech Perception and Production CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm CAS CN580 Introduction to Computational Neuroscience GRS CN700 Computational and Mathematical Methods in Neural Modeling GRS CN710 Advanced Topics in Neural Modeling GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior GRS CN730 Models of Visual Perception GRS CN740 Topics in Sensory-Motor Control GRS CN760 Topics in Speech Perception and 
Recognition GRS CN780 Topics in Computational Neuroscience GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition GRS CN915,916 Research in Neural Networks for Vision and Image Processing GRS CN921,922 Research in Neural Networks for Speech and Language Processing GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning GRS CN935,936 Research in Neural Networks for Cognitive Information Processing GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks GRS CN945,946 Research in Technological Applications of Neural Networks GRS CN951,952 Research in Hardware Implementations of Neural Networks CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and a student-run Journal Club, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups. LABORATORY AND COMPUTER FACILITIES The department is funded by grants and contracts from federal agencies that support research in life sciences, mathematics, artificial intelligence, and engineering. Facilities include laboratories for experimental research and computational modeling in visual perception, speech and language processing, and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network comprised of Sun workstations, Silicon Graphics workstations, Macintoshes, and PCs. 
All students have access to X-terminals or UNIX workstation consoles, a selection of color systems and PCs, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software are available for use. These include: Computer Vision/Computational Neuroscience Laboratory The Computer Vision/Computational Neuroscience Lab comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; a light machine shop; an active vision lab including actuators and video hardware; and systems for computer aided neuroanatomy and application of computer graphics and image processing to brain sections and MRI images. Neurobotics Laboratory The Neurobotics Lab utilizes wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The lab currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a moveable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Other platforms may be investigated in the future. Psychoacoustics Laboratory The Psychoacoustics Lab houses a newly installed, 8 ft. x 8 ft. sound-proof booth. The laboratory is extensively equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli. 
The major equipment dedicated to the psychoacoustics laboratory includes two Pentium-based personal computers; two Power-PC-based Macintosh computers; a 50-MHz array processor capable of generating auditory stimuli in real time; programmable attenuators; analog-to-digital and digital-to-analog converters; a real-time head tracking system; a special-purpose, signal-processing hardware system capable of generating "spatialized" stereo auditory signals in real time; a two-channel oscilloscope; a two-channel spectrum analyzer; various cables, headphones, and other miscellaneous electronics equipment; and software for signal generation, experimental control, data analysis, and word processing. Sensory-Motor Control Laboratory The Sensory-Motor Control Lab supports experimental studies of motor kinematics. An infrared WatSmart system allows measurement of large-scale movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. Equipment includes a 40-inch monitor that allows computer display of animations generated by an SGI workstation or a Pentium Pro (Windows NT) workstation. A second major component is a helmet-mounted, video-based, eye-head tracking system (ISCAN Corp, 1997). The latter's camera samples eye position at 240Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. Thus the system affords a wide range of visuo-motor studies. Speech and Language Laboratory The Speech and Language Lab includes facilities for analog-to-digital and digital-to-analog software conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing algorithms of speech recognition. 
For high speed processing, supercomputer facilities speed filtering and data analysis. Visual Psychophysics Laboratory The Visual Psychophysics Lab occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer controlled display platforms, including Silicon Graphics, Inc. (SGI) Onyx RE2, SGI Indigo2 High Impact, SGI Indigo2 Extreme, Power Computing (Macintosh compatible) PowerTower Pro 225, and Macintosh 7100/66 workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing glasses, prisms, a photometer, and a variety of display-generation, data-collection, and data-analysis software. Affiliated Laboratories Affiliated CAS/CNS faculty have additional laboratories ranging from visual and auditory psychophysics and neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations. ******************************************************************* DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT Boston University 677 Beacon Street Boston, MA 02215 Phone: 617/353-9481 Fax: 617/353-7755 Email: inquiries at cns.bu.edu Web: http://cns-web.bu.edu/ ******************************************************************* --- This was an announcement from the Department of Cognitive and Neural Systems at Boston University. If you would like to be removed from this mailing list and discontinue receiving these announcements, you may send an e-mail to "majordomo at cns.bu.edu" with the following line in the body of the message: unsubscribe announcements If you have any trouble in doing so, please do not hesitate to report problems to "owner-majordomo at cns.bu.edu". Thank you. 
From austin at minster.cs.york.ac.uk Tue Nov 10 07:21:47 1998 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Tue, 10 Nov 1998 12:21:47 +0000 Subject: Workshop Message-ID: <9811101221.ZM9279@minster.cs.york.ac.uk>

Announcement / Call for Papers

Weightless Neural Networks Workshop WNNW-99
York, UK, 30th March 1999

WNNW-99 is the third in a series of premier international meetings for the dissemination and discussion of research trends and results in weightless neural networks. The main aim of these meetings is to provide a convenient forum for discussion of recent results and ideas in all areas pertaining to weightless neural networks, RAM-based networks and N-tuple networks, including architectures, algorithms, coding, hardware implementations and novel applications. The workshop will be held at the Department of Computer Science, University of York, UK. Most participants will be staying in the on-campus accommodation, close to the venue. The meeting will start on Tuesday, 30th of March 1999 and will last for two days. This meeting is supported by the Advanced Computer Architecture Group at the Department of Computer Science, University of York, UK. The latest workshop information will be posted to the workshop home page at http://thalamus.cs.york.ac.uk/wnnw/

Important Dates

Paper Submission Deadline: Tuesday, 19th of January 1999
Acceptance Notification: not later than Tuesday, 16th of February 1999
Workshop Date: Tuesday, 30th of March 1999

Inquiries

Inquiries concerning the meeting can be sent to the organisers either by email to wnnw at cs.york.ac.uk or by post to the following address:

WNNW-99
Advanced Computer Architecture Group
Department of Computer Science
University of York
York YO10 5DD
UK

Organising Committee

Prof. Jim Austin, University of York
Mr. Dan Kustrin, University of York
Prof. Nigel M Allinson, UMIST
Prof. Igor Aleksander, Imperial College of Science, Technology and Medicine
Dr.
Simon O'Keefe, University of York -- Professor Jim Austin, Department of Computer Science, University of York, York, YO1 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch/ From Jon.Baxter at keating.anu.edu.au Mon Nov 9 17:29:03 1998 From: Jon.Baxter at keating.anu.edu.au (Jonathan Baxter) Date: Tue, 10 Nov 1998 09:29:03 +1100 (EST) Subject: no subject (file transmission) Message-ID: <199811092229.JAA08025@keating.anu.edu.au> *********************** NIPS*98 FINAL PROGRAM ********************** My apologies if you receive this notice more than once. Jonathan Baxter ******************************************************************** SUN NOV 29 ---------- 18:00-22:00 Registration MON NOV 30 ---------- 08:30-18:00 Registration 09:30-17:30 Tutorials 18:30 Reception and Conference Banquet 20:30 The laws of the WEB (Banquet talk) B. Huberman Xerox PARC TUE DEC 1 --------- Oral Session 1: 08:30 Statistics of visual images: neural representation and synthesis (Invited) E. Simoncelli New York University 09:20 Attentional modulation of human pattern discrimination psychophysics reproduced by a quantitative model (VS1, Oral) L. Itti, J. Braun, D. Lee, C. Koch California Institute of Technology 09:40 Orientation, scale, and discontinuity as emergent properties of illusory contour shape (VS2, Oral) K. Thornber, L. Williams NEC Research Institute, University of New Mexico 10:00 DTs: dynamic trees (AA1, Spotlight) C. Williams, N. Adams Aston University Modeling stationary and integrated time series with autoregressive neural networks (LT1, Spotlight) F. Leisch, A. Trapletti, K. Hornik Technical University of Vienna Analog neural nets with Gaussian or other common noise distributions cannot recognize arbitrary regular languages (LT2, Spotlight) W. Maass, E. Sontag Technical University of Graz, Rutgers University Semiparametric support vector and linear programming machines (AA8, Spotlight) A. Smola, T. Friess, B. 
Schoelkopf GMD FIRST Blind separation of filtered source using state-space approach (AA13, Spotlight) L. Zhang, A. Cichocki RIKEN Brain Science Institute 10:15-11:00 Break Oral Session 2: 11:00 The bias-variance tradeoff and the randomized GACV (AA4, Oral) G. Wahba, X. Lin, F. Gao, D. Xiang, R. Klein, B. Klein University of Wisconsin-Madison, SAS Institute 11:20 Kernel PCA and de-noising in feature spaces (AA7, Oral) S. Mika, B. Schoelkopf, A. Smola, K. Mueller, M Scholz, G. Raetsch GMD FIRST 11:40 Sparse code shrinkage: denoising by maximum likelihood estimation (AA12, Oral) A. Hyvaarinen, P. Hoyer, E. Oja Helsinki University of Technology 12:00-14:00 Lunch Oral Session 3: 14:00 Temporally asymmetric Hebbian learning, spike timing and neuronal response variability (Invited) L. Abbott Brandeis University 14:50 Information maximization in single neurons (NS1, Oral) M. Stemmler, C. Koch California Institute of Technology 15:10 Multi-electrode spike sorting by clustering transfer functions (NS2, Oral) D. Rinberg, H. Davidowitz, N. Tishby NEC Research Institute 15:30 Distributional population codes and multiple motion models (NS3, Spotlight) R. Zemel, P. Dayan University of Arizona, Massachusetts Institute of Technology Population coding with correlated noise (NS4, Spotlight) H. Yoon, H. Sompolinsky Hebrew University Bayesian modeling of human concept learning (CS1, Spotlight) J. Tenenbaum Massachusetts Institute of Technology Mechanisms of generalization in perceptual learning (CS2, Spotlight) Z. Liu, D. Weinshall NEC Research Institute, Hebrew University An entropic estimator for structure discovery (SP1, Spotlight) M. Brand Mitsubishi Electric Research Laboratory 15:45-16:15 Break Oral Session 4: 16:15 The role of lateral cortical competition in ocular dominance development (NS6, Oral) C. Piepenbrock, K. Obermayer Technical University of Berlin 16:35 Evidence for learning of a forward dynamic model in human adaptive control (CS3, Oral) N. Bhushan, R. 
Shadmehr Johns Hopkins University 16:55-18:00 Poster Preview 19:30 Poster Session WED DEC 2 --------- Oral Session 5: 08:30 Computation by Cortical Modules (Invited) H. Sompolinsky Hebrew University 09:20 Learning curves for Gaussian processes (LT14, Oral) P. Sollich University of Edinburgh 9:40 Mean field methods for classification with Gaussian processes (LT15, Oral) M. Opper O. Winther Aston University, Niels Bohr Institute 10:00 Dynamics of supervised learning with restricted training sets (LT16, Spotlight) A. Coolen, D. Saad King's College London, Aston University Finite-dimensional approximation of Gaussian processes (LT18, Spotlight) G. Trecate, C. Williams, M. Opper University of Pavia, Aston University Inference in multilayer networks via large deviation bounds (LT20, Spotlight) M. Kearns, L. Saul AT&T Labs Gradient descent for general reinforcement learning (CN14, Spotlight) L. Baird, A. Moore Carnegie Mellon University Risk sensitive reinforcement learning (CN15, Spotlight) R. Neuneier, O. Mihatsch Siemens AG 10:15-11:00 Break Oral Session 6: 11:00 VLSI implementation of motion centroid localization for autonomous navigation (IM6, Oral) R. Etienne-Cummings, M. Ghani, V. Gruev Southern Illinois University 11:20 Improved switching among temporally abstract actions (CN16, Oral) R. Sutton, S. Singh, D. Precup, B. Ravindran University of Massachusetts, University of Colorado 11:40 Finite-sample convergence rates for Q-learning and indirect algorithms (CN17, Oral) M. Kearns, S. Singh AT&T Labs, University of Colorado 12:00-14:00 Lunch Oral Session 7: 14:00 Statistical natural language processing: better living through floating-point numbers (Invited) E. Charniak Brown University 14:50 Markov processes on curves for automatic speech recognition (SP3, Oral) L. Saul, M. Rahim AT&T Labs 15:10 Approximate learning of dynamic models (AA22, Oral) X. Boyen, D. 
Koller Stanford University 15:30 Learning nonlinear stochastic dynamics using the generalized EM algorithm (AA23, Spotlight) Z. Ghahramani, S. Roweis University of Toronto, California Institute of Technology Reinforcement learning for trading systems (AP9, Spotlight) J. Moody, M. Saffell Oregon Graduate Institute Bayesian modeling of facial similarity (AP13, Spotlight) B. Moghaddam, T. Jebara, A. Pentland Mitsubishi Electric Research Laboratory, Massachusetts Institute of Technology Computation of smooth optical flow in a feedback connected analog network (IM8, Spotlight) A. Stocker, R. Douglas University and ETH Zurich Classification on pairwise proximity data (AA26, spotlight) T. Graepel, R. Herbrich, P. Bollmann-Sdorra, K. Obermayer Technical University of Berlin 15:45-16:15 Break Oral Session 8: 16:15 Learning from dyadic data (AA27, Oral) T. Hofmann, J. Puzicha, M. Jordan Massachusetts Institute of Technology, University of Bonn 16:35 Classification in non-metric spaces (VS7, Oral) D. Weinshall, D. Jacobs, Y. Gdalyahu NEC Research Institute, Hebrew University 16:55-18:00 Poster Preview 19:30 Poster Session THU DEC 3 --------- Oral Session 9: 08:30 Convergence of the wake-sleep algorithm (LT22, Oral) S. Ikeda, S. Amari, H. Nakahara RIKEN Brain Science Institute 08:50 Learning a continuous hidden variable model for binary data (AA32, Oral) D. Lee, H. Sompolinsky Bell Laboratories, Hebrew University 09:10 Direct optimization of margins improves generalization in combined classifiers (LT23, Oral) L. Mason, P. Bartlett, J. Baxter Australian National University 09:30 A polygonal line algorithm for constructing principal curves (AA39, Oral) B. Kegl, A. Krzyzak, T. Linder, K. Zeger Concordia University, Queen's University, UC San Diego 09:50-10:30 Break Oral Session 10: 10:30 Graphical models for recognizing human interactions (AP15, Oral) N. Oliver, B. Rosario, A. 
Pentland Massachusetts Institute of Technology 10:50 Fast neural network emulation of physics-based models for computer animation (AP16, Oral) R. Grzeszczuk, D. Terzopoulos, G. Hinton Intel Corporation, University of Toronto 11:10 Things that think (Invited) N. Gershenfeld Massachusetts Institute of Technology 12:00 End of main conference POSTERS: TUE DEC 1 ------------------ Basis selection for wavelet regression (AA2, Poster) K. Wheeler NASA Ames Research Center Boxlets: a fast convolution algorithm for signal processing and neural networks (AA3, Poster) P. Simard, L. Bottou, P. Haffner, Y. LeCun AT&T Labs Least absolute shrinkage is equivalent to quadratic penalization (AA5, Poster) Y. Grandvalet, S. Canu Universite de Technologie de Compiegne Neural networks for density estimation (AA6, Poster) M. Magdon-Ismail, A. Atiya California Institute of Technology Semi-supervised support vector machines (AA9, Poster) K. Bennett, A. Demiriz Rensselaer Polytechnic Institute Exploiting generative models in discriminative classifiers (AA10, Poster) T. Jaakkola, D. Haussler UC Santa Cruz Using analytic QP and sparseness to speed training of support vector machines (AA11, Poster) J. Platt Microsoft Research Source separation as a by-product of regularization (AA14, Poster) S. Hochreiter, J. Schmidhuber Technical University of Munich, IDSIA Unsupervised classification with non-Gaussian mixture models using ICA (AA15, Poster) T-W. Lee, M. Lewicki, T. Sejnowski The Salk Institute Hierarchical ICA belief networks (AA16, Poster) H. Attias UC San Francisco Efficient Bayesian parameter estimation in large discrete domains (AA17, Poster) N. Friedman, Y. Singer UC Berkeley, AT&T Labs Discovering hidden features with Gaussian processes regression (AA18, Poster) F. Vivarelli, C. Williams Aston University Bayesian PCA (AA19, Poster) C. Bishop Microsoft Research Replicator equations, maximal cliques, and graph isomorphism (AA20, Poster) M. 
Pelillo University of Venice Convergence rates of algorithms for perceptual organization: detecting visual contours (AA21, Poster) A. Yuille, J. Coughlan Smith-Kettlewell Eye Research Institute Independent component analysis of intracellular calcium spike data (AP1, Poster) K. Prank, J.Boerger, A. von zur Muehlen, G. Brabant, C. Schoefl Medical School Hannover Applications of multi-resolution neural networks to mammography (AP2, Poster) P. Sajda, C. Spence Sarnoff Corporation Making templates rotationally invariant: an application to rotated digit recognition (AP3, Poster) S. Baluja Carnegie Mellon University Graph matching for shape retrieval (AP4, Poster) B. Huet, A. Cross, E. Hancock University of York Vertex identification in high energy physics experiments (AP5, Poster) G. Dror, H. Abramowicz, D. Horn The Academic College of Tel-Aviv-Yaffo, Tel-Aviv University Familiarity discrimination of radar pulses (AP6, Poster) E. Granger, S. Grossberg, M. Rubin, W. Streilein Ecole Polytechnique de Montreal, Boston University Robot docking using mixtures of Gaussians (AP7, Poster) M. Williamson, R. Murray-Smith, V. Hansen Massachusetts Institute of Technology, Technical University of Denmark, Daimler-Benz Call-based fraud detection in mobile communication networks using a hierarchical regime-switching model (AP8, Poster) J. Hollmen, V. Tresp Helsinki University of Technology, Siemens AG Multiple paired forward-inverse models for human motor learning and control (CS4, Poster) M. Haruno, D. Wolpert, M. Kawato ATR Human Information Processing Research Laboratories, University College London A neuromorphic monaural sound localizer (IM1, Poster) J. Harris, C-J. Pu, J. Principe University of Florida Active noise canceling using analog neuro-chip with on-chip learning capability (IM2, Poster) J-W. Cho, S-Y. Lee Korea Advanced Institute of Science and Technology Optimizing correlation algorithms for hardware-based transient classification (IM3, Poster) R. Edwards, G. 
Cauwenberghs, F. Pineda Johns Hopkins University A high performance k-NN classifier using a binary correlation matrix memory (IM4, Poster) P. Zhou, J. Austin, J. Kennedy University of York A micropower CMOS adaptive amplitude and shift invariant vector quantizer (IM5, Poster) R. Coggins, R. Wang, M. Jabri University of Sydney Where does the population vector of motor cortical cells point during reaching movements? (NS5, Poster) P. Baraduc, E. Guigon, Y. Burnod Universite Pierre et Marie Curie Heeger's normalization, line attractor network and ideal observers (NS7, Poster) S. Deneve, A. Pouget, P. Latham Georgetown University Image statistics and cortical normalization models (NS8, Poster) E. Simoncelli, O. Schwartz New York University Learning instance-independent value functions to enhance local search (CN1, Poster) R. Moll, A. Barto, T. Perkins, R. Sutton University of Massachusetts Exploring unknown environments with real-time heuristic search (CN2, Poster) S. Koenig Georgia Institute of Technology GLS: a hybrid classifier system based on POMDP research (CN3, Poster) A. Hayashi, N. Suematsu Hiroshima City University Non-linear PI control inspired by biological control systems (CN4, Poster) L. Brown, G. Gonye, J. Schwaber E.I. DuPont de Nemours Coordinate transformation learning of hand position feedback controller by using change of position error norm (CN5, Poster) E. Oyama, S. Tachi University of Tokyo Optimizing admission control while ensuring quality of service in multimedia networks via reinforcement learning (CN6, Poster) T. Brown, H. Tong, S. Singh University of Colorado Barycentric interpolators for continuous space & time reinforcement learning (CN7, Poster) R. Munos, A. Moore Carnegie Mellon University Modifying the parti-game algorithm for increased robustness, higher efficiency and better policies (CN8, Poster) M. Al-Ansari, R. Williams Northeastern University Coding time-varying signals using sparse, shift-invariant representations (SP2, Poster) M.
Lewicki, T. Sejnowski The Salk Institute Phase diagram and storage capacity of sequence storing neural networks (LT3, Poster) A. Duering, A. Coolen, D. Sherrington Oxford University, King's College London Discontinuous recall transitions induced by competition between short- and long-range interactions in recurrent networks (LT4, Poster) N. Skantzos, C. Beckmann, A. Coolen King's College London Computational differences between asymmetrical and symmetrical networks (LT5, Poster) Z. Li, P. Dayan Massachusetts Institute of Technology Shrinking the tube: a new support vector regression algorithm (LT6, Poster) B. Schoelkopf, P. Bartlett, A. Smola, R. Williamson GMD FIRST, Australian National University Dynamically adapting kernels in support vector machines (LT7, Poster) N. Cristianini, C. Campbell, J. Shawe-Taylor University of Bristol, University of London A theory of mean field approximation (LT8, Poster) T. Tanaka Tokyo Metropolitan University The belief in TAP (LT9, Poster) Y. Kabashima, D. Saad Tokyo Institute of Technology, Aston University Unsupervised clustering: the mutual information between parameters and observations (LT10, Poster) D. Herschkowitz, J-P. Nadal Ecole Normale Superieure Classification with linear threshold functions and the linear loss (LT11, Poster) C. Gentile, M. Warmuth University of Milan, UC Santa Cruz Almost linear VC dimension bounds for piecewise polynomial networks (LT12, Poster) P. Bartlett, V. Maiorov, R. Meir Australian National University, Technion Tight bounds for the VC-dimension of piecewise polynomial networks (LT13, Poster) A. Sakurai Japan Advanced Institute of Science and Technology Learning Lie transformation groups for invariant visual perception (VS3, Poster) R. Rao, D. Rudermann The Salk Institute Support vector machines applied to face recognition (VS4, Poster) J. Phillips National Institute of Standards and Technology Learning to find pictures of people (VS5, Poster) S. Ioffe, D. 
Forsyth UC Berkeley Probabilistic sensor fusion (VS6, Poster) R. Sharma, T. Leen, M. Pavel Oregon Graduate Institute POSTERS: WED DEC 2 ------------------ Learning multi-class dynamics (AA24, Poster) A. Blake, B. North, M. Isard Oxford University Fisher scoring and a mixture of modes approach for approximate inference and learning in nonlinear state space models (AA25, Poster) T. Briegel, V. Tresp Siemens AG A randomized algorithm for pairwise clustering (AA28, Poster) Y. Gdalyahu, D. Weinshall, M. Werman Hebrew University Visualizing group structure (AA29, Poster) M. Held, J. Puzicha, J. Buhmann University of Bonn Probabilistic visualization of high-dimensional binary data (AA30, Poster) M. Tipping Microsoft Research Restructuring sparse high dimensional data for effective retrieval (AA31, Poster) C. Isbell, P. Viola Massachusetts Institute of Technology Exploratory data analysis using radial basis function latent variable models (AA33, Poster) A. Marrs, A. Webb DERA SMEM algorithm for mixture models (AA34, Poster) N. Ueda, R. Nakano, Z. Ghahramani, G. Hinton NTT Communication Science Laboratories, University of Toronto Learning mixture hierarchies (AA35, Poster) N. Vasconcelos, A. Lippman Massachusetts Institute of Technology On-line and batch parameter estimation of Gaussian mixtures based on the relative entropy (AA36, Poster) Y. Singer, M. Warmuth AT&T Labs, UC Santa Cruz Very fast EM-based mixture model clustering using multiresolution kd-trees (AA37, Poster) A. Moore Carnegie Mellon University Maximum conditional likelihood via bound maximization and the CEM algorithm (AA38, Poster) T. Jebara, A. Pentland Massachusetts Institute of Technology Lazy learning meets the recursive least squares algorithm (AA40, Poster) M. Birattari, G. Bontempi, H. Bersini Universite Libre de Bruxelles Global optimization of neural network models via sequential sampling (AA41, Poster) J. de Freitas, M. Niranjan, A. Doucet, A. 
Gee Cambridge University Regularizing AdaBoost (AA42, Poster) G. Raetsch, T. Onoda, K. Mueller GMD FIRST Using collective intelligence to route Internet traffic (AP10, Poster) D. Wolpert, K. Tumer, J. Frank NASA Ames Research Center Scheduling straight-line code using reinforcement learning and rollouts (AP11, Poster) A. McGovern, E. Moss University of Massachusetts Probabilistic modeling for face orientation discrimination: learning from labeled and unlabeled data (AP12, Poster) S. Baluja Carnegie Mellon University Adding constrained discontinuities to Gaussian process models of wind fields (AP14, Poster) D. Cornford, I. Nabney, C. Williams Aston University A principle for unsupervised hierarchical decomposition of visual scenes (CS5, Poster) M. Mozer University of Colorado Facial memory is kernel density estimation (almost) (CS6, Poster) M. Dailey, G. Cottrell, T. Busey UC San Diego, Indiana University Perceiving without learning: from spirals to inside/outside relations (CS7, Poster) K. Chen, D. Wang Ohio State University Utilizing time: asynchronous binding (CS8, Poster) B. Love Northwestern University A model for associative multiplication (CS9, Poster) G. Christianson, S. Becker McMaster University Analog VLSI cellular implementation of the boundary contour system (IM7, Poster) G. Cauwenberghs, J. Waskiewicz Johns Hopkins University An integrated vision sensor for the computation of optical flow singular points (IM9, Poster) C. Higgins, C. Koch California Institute of Technology Spike-based compared to rate-based Hebbian learning (NS9, Poster) R. Kempter, W. Gerstner, L. van Hemmen Technical University of Munich, Swiss Federal Institute of Technology Neuronal regulation implements efficient synaptic pruning (NS10, Poster) G. Chechik, I. Meilijson, E. Ruppin Tel-Aviv University Signal detection in noisy weakly-active dendrites (NS11, Poster) A. Manwani, C. 
Koch California Institute of Technology Influence of changing the synaptic transmitter release probability on contrast adaptation of simple cells in the primary visual cortex (NS12, Poster) P. Adorjan, K. Obermayer Technical University of Berlin Complex cells as cortically amplified simple cells (NS13, Poster) F. Chance, S. Nelson, L. Abbott Brandeis University Synergy and redundancy among brain cells of behaving monkeys (NS14, Poster) I. Gat, N. Tishby Hebrew University, NEC Research Institute Visualizing and analyzing single-trial event-related potentials (NS15, Poster) T-P. Jung, S. Makeig, M. Westerfield, J. Townsend, E. Courchesne, T. Sejnowski The Salk Institute, Naval Health Research Center, UC San Diego A reinforcement learning algorithm in partially observable environments using short-term memory (CN9, Poster) N. Suematsu, A. Hayashi Hiroshima City University Experiments with a memoryless algorithm which learns locally optimal stochastic policies for partially observable Markov decision processes (CN10, Poster) J. Williams, S. Singh University of Colorado The effect of eligibility traces on finding optimal memoryless policies in partially observable Markovian decision processes (CN11, Poster) J. Loch University of Colorado Learning macro-actions in reinforcement learning (CN12, Poster) J. Randlov Niels Bohr Institute Reinforcement learning based on on-line EM algorithm (CN13, Poster) M. Sato, S. Ishii ATR Human Information Processing Research Laboratories, Nara Institute of Science and Technology Controlling the complexity of HMM systems by regularization (SP4, Poster) C. Neukirchen, G. Rigoll Gerhard-Mercator-University Maximum-likelihood continuity mapping (MALCOM): an alternative to HMMs (SP5, Poster) D. Nix, J. Hogden Los Alamos National Laboratory On-line learning with restricted training sets: exact solution as benchmark for general theories (LT17, Poster) H. Rae, P. Sollich, A. 
Coolen King's College London, University of Edinburgh General bounds on Bayes errors for regression with Gaussian processes (LT19, Poster) M. Opper, F. Vivarelli Aston University Variational approximations of graphical models using undirected graphs (LT21, Poster) D. Barber, W. Wiegerinck University of Nijmegen Optimizing classifiers for imbalanced training sets (LT24, Poster) G. Karakoulas, J. Shawe-Taylor Canadian Imperial Bank of Commerce, University of London On the optimality of incremental neural network algorithms (LT25, Poster) R. Meir, V. Maiorov Technion General-purpose localization of textured image regions (VS8, Poster) R. Rosenholtz Xerox PARC A V1 model of pop out and asymmetry in visual search (VS9, Poster) Z. Li Massachusetts Institute of Technology Minutemax: a fast approximation for minimax learning (VS10, Poster) J. Coughlan, A. Yuille Smith-Kettlewell Eye Research Institute Using statistical properties of a labelled visual world to estimate scenes (VS11, Poster) W. Freeman, E. Pasztor Mitsubishi Electric Research Laboratory Example based image synthesis of articulated figures (VS12, Poster) T. Darrell Interval Research ******************************************************************************* From Bill_Warren at Brown.edu Wed Nov 11 15:54:12 1998 From: Bill_Warren at Brown.edu (Bill Warren) Date: Wed, 11 Nov 1998 15:54:12 -0500 (EST) Subject: Please post -- thanks! Message-ID: Please circulate to your graduating seniors: GRADUATE TRAINEESHIPS Visual Navigation in Humans and Robots Brown University The Department of Cognitive and Linguistic Sciences and the Department of Computer Science at Brown University are seeking graduate applicants interested in visual navigation in humans and robots. The project investigates the nature of the spatial knowledge that is used in active navigation, and how it interacts with the environmental layout, landmark properties, and navigational task during learning. 
The research is based in a unique virtual reality lab with a 40 x 40 ft wide-area tracker, Kaiser head-mounted display, and SGI Onyx 2 graphics. Human experiments study active navigation and landmark recognition in virtual environments, where 3D structure and properties are easily manipulated. In conjunction, biologically-inspired navigation strategies are tested on a mobile robot platform. Computational modeling pursues (a) reinforcement learning and hidden Markov models for spatial navigation and (b) a neural net model of the hippocampus. The project is under the direction of Leslie Kaelbling (Computer Science, www.cs.brown.edu), Michael Tarr and William Warren (Cognitive & Linguistic Sciences, www.cog.brown.edu). Three to four graduate traineeships are available, beginning in the Fall of 1999. Applicants should apply to either of these home departments. Application materials can be obtained from: The Graduate School, Brown University, Box 1867, Providence, RI 02912, phone (401) 863-2600, www.brown.edu. The application deadline is Jan. 1, 1999. This program is funded by a Learning and Intelligent Systems grant from NSF, and an IGERT training grant from NSF. -- Bill William H. Warren, Professor Dept. of Cognitive & Linguistic Sciences Box 1978 Brown University Providence, RI 02912 (401) 863-3980 ofc, 863-2255 FAX Bill_Warren at brown.edu From tani at csl.sony.co.jp Thu Nov 12 01:51:41 1998 From: tani at csl.sony.co.jp (Jun.Tani (SONY CSL)) Date: Thu, 12 Nov 1998 15:51:41 +0900 Subject: TR: An Interpretation of the `Self` from the Dynamical Systems Perspective: Message-ID: <199811120651.PAA12630@tani.csl.sony.co.jp> Dear connectionists, The following new technical report is available. "An Interpretation of the `Self` from the Dynamical Systems Perspective: A Constructivist Approach" by Jun Tani, Sony CSL. This paper will be published in the Journal of Consciousness Studies, Vol.5 No.5-6, 1998.
The TR can be retrieved from: (1) ftp.csl.sony.co.jp/CSL/CSL-Papers/98/SCSL-TR-98-018.ps.Z (you need to uncompress in Unix.) (2) ftp.csl.sony.co.jp/CSL/CSL-Papers/98/SCSL-TR-98-018.pdf (3) http://www.csl.sony.co.jp/person/tani.html. ABSTRACT This study attempts to describe the notion of the "self" using dynamical systems language based on the results of our robot learning experiments. A neural network model consisting of multiple modules is proposed, in which the interactive dynamics between the bottom-up perception and the top-down prediction are investigated. Our experiments with a real mobile robot showed that the incremental learning of the robot switches spontaneously between steady and unsteady phases. In the steady phase, the top-down prediction for the bottom-up perception works well when coherence is achieved between the internal and the environmental dynamics. In the unsteady phase, conflicts arise between the bottom-up perception and the top-down prediction; the coherence is lost, and a chaotic attractor is observed in the internal neural dynamics. By investigating possible analogies between this result and the phenomenological literature on the "self", we draw the conclusions that (1) the structure of the "self" corresponds to the "open dynamic structure" which is characterized by co-existence of stability in terms of goal-directedness and instability caused by embodiment; (2) the open dynamic structure causes the system's spontaneous transition to the unsteady phase where the "self" becomes aware. Best regards, jun ------------------------------------------------ Jun TANI, Ph.D Senior Researcher Sony Computer Science Laboratory Inc. 
Takanawa Muse Building, 3-14-13 Higashi-gotanda, Shinagawa-ku, Tokyo, 141 JAPAN email: tani at csl.sony.co.jp http://www.csl.sony.co.jp/person/tani.html Fax +81-3-5448-4273 Tel +81-3-5448-4380 ** Joint Appointment Visiting Associate Professor Graduate School of Arts and Sciences University of Tokyo From bengioy at IRO.UMontreal.CA Thu Nov 12 10:27:42 1998 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Thu, 12 Nov 1998 10:27:42 -0500 Subject: post-doc position in Montreal / machine learning + bootstrap for database mining Message-ID: <19981112102742.45902@IRO.UMontreal.CA> POST DOCTORAL RESEARCH STAFF MEMBER IN STATISTICAL DATA ANALYSIS AND MACHINE LEARNING FOR HIGH-DIMENSIONAL DATA SETS MATHEMATICS OF INFORMATION TECHNOLOGY AND COMPLEX SYSTEMS (MITACS: a new Canadian Network of Centers of Excellence) Position to be held jointly at the Department of Computer Science & Operations Research and the Department of Mathematics and Statistics, at the UNIVERSITY OF MONTREAL, Quebec, Canada NATURE AND SCOPE OF THE POSITION: A post-doctoral position is available at the University of Montreal within the MITACS network of centers of excellence. The position is subject to the approval of funding by the MITACS Board of Directors. The main research area will be the statistical data analysis of high-dimensional data sets with machine learning algorithms, also known as "database mining". The main research questions that will be addressed in this research are the following: - how to deal with the "curse of dimensionality": algorithms based on variable selection or based on combining many variables with different importance, while controlling generalization to avoid overfitting; - how to make inferences on the models obtained with such methods, mostly using resampling methods such as the BOOTSTRAP. This research will be performed within the MITACS project entitled "INFERENCE FROM HIGH-DIMENSIONAL DATA".
See http://www.iro.umontreal.ca/~bengioy/mitacs.html for more information on the project and http://www.mitacs.math.ca for more information on the MITACS network. The candidate will be working under the supervision of professors Yoshua Bengio (computer science and operations research) and Christian Leger (mathematics and statistics). See http://www.iro.umontreal.ca/~bengioy and http://www.iro.umontreal.ca/~lisa for more information respectively on Yoshua Bengio and his laboratory. See http://www.dms.umontreal.ca/~leger for more information on Christian Leger. ESSENTIAL SKILLS, KNOWLEDGE, AND ABILITIES: Candidates must possess a recent Ph.D. in computer science, statistics, mathematics, or a related discipline, with a research background in machine learning (in particular neural networks) and/or computational statistical methods such as the bootstrap. Candidates must have excellent programming skills, with demonstrated experience. Experience in the following areas will be particularly valued:

- statistical data analysis in general, and bootstrapping methods in particular;
- machine learning algorithms in general, and artificial neural networks in particular;
- programming skills in general, and object-oriented programming, participation in large-scale, multiple-author software projects, and experience with the C, C++ and S-Plus languages in particular.

LENGTH OF EMPLOYMENT: 1 year (with possible renewal for 4 years total), starting as soon as possible. FOR FURTHER INFORMATION, PLEASE CONTACT: Yoshua Bengio bengioy at iro.umontreal.ca, 514-343-6804, fax 514-343-5834 or Christian Leger leger at dms.umontreal.ca 514-343-7824, fax 514-343-5700 Electronic applications (preferably as a postscript, raw text, or pdf file) are encouraged, in the form of a Curriculum Vitae with information on academic experience, academic standing, research experience, programming experience, and any other relevant information (e.g., pointer to your web site, if any).
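As background for the bootstrap inference emphasized above, here is a minimal sketch of the nonparametric bootstrap (an illustrative example only, not part of the project's software; the sample data and the percentile-interval choice are assumptions for the demo):

```python
import random
import statistics

def bootstrap_ci(data, statistic, n_resamples=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile confidence interval.

    Draws resamples of the data with replacement, recomputes the
    statistic on each resample, and reads off the empirical
    alpha/2 and 1 - alpha/2 quantiles of the replicates.
    """
    rng = random.Random(seed)
    n = len(data)
    replicates = sorted(
        statistic([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_resamples)
    )
    lo = replicates[int((alpha / 2) * n_resamples)]
    hi = replicates[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Example: a 95% interval for the mean of a small (made-up) sample.
sample = [2.1, 2.5, 1.9, 3.0, 2.7, 2.2, 2.8, 2.4]
low, high = bootstrap_ci(sample, statistics.mean)
print(low, high)
```

The same resampling loop applies unchanged to statistics with no closed-form standard error, such as the test-set error of a fitted model, which is what makes it attractive for inference on learned models.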
-- Yoshua Bengio Professeur agrégé Département d'Informatique et Recherche Opérationnelle Université de Montréal, C.P. 6128 Succ. Centre-Ville, 2920 Chemin de la Tour, Montreal, Quebec, Canada H3C 3J7 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/labs/neuro From jbednar at cs.utexas.edu Fri Nov 13 05:25:31 1998 From: jbednar at cs.utexas.edu (James A. Bednar) Date: Fri, 13 Nov 1998 04:25:31 -0600 (CST) Subject: self-organization software, papers, web demos Message-ID: <199811131025.EAA25845@mail.cs.utexas.edu> The following software package for self-organization in laterally connected maps is now available from the UTCS Neural Networks Research Group website http://www.cs.utexas.edu/users/nn. The software has been developed in the RF-LISSOM project on modeling the primary visual cortex, and is intended to serve as a starting point for computational studies of the development and function of perceptual maps in general. Abstracts of two recent papers on the RF-LISSOM project, on segmentation and on internal pattern generators, are also included below. Other papers and demos of the RF-LISSOM software are available at http://www.cs.utexas.edu/users/nn/pages/research/selforg.html. - Jim, Yoonsuck, and Risto Software: ----------------------------------------------------------------------- RF-LISSOM: LATERALLY CONNECTED SELF-ORGANIZING MAPS http://www.cs.utexas.edu/users/nn/pages/software/abstracts.html#lissom James A. Bednar and Joseph Sirosh The LISSOM package contains the ANSI C source code and examples for training and testing RF-LISSOM. This implementation supports almost every modern single-processor workstation, as well as the Cray T3E massively parallel supercomputer. It is designed to have full functionality even when run in batch mode or remote mode, using a simple but powerful command file format.
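RF-LISSOM extends the basic self-organizing map with Hebbian lateral connections. As a rough orientation for readers new to this family of models, here is a minimal plain-SOM training loop in Python (an illustrative sketch only, unrelated to the package's ANSI C sources; the grid size and learning schedules are arbitrary choices):

```python
import math
import random

def train_som(data, grid_w=8, grid_h=8, dim=2, iters=2000, seed=0):
    """Train a plain Kohonen SOM: move the best-matching unit and its
    grid neighbours toward each input, with shrinking rate and radius."""
    rng = random.Random(seed)
    # one weight vector per grid node, initialized randomly in [0, 1]
    w = [[rng.random() for _ in range(dim)] for _ in range(grid_w * grid_h)]
    for t in range(iters):
        x = rng.choice(data)
        # best-matching unit = node with smallest squared distance to x
        bmu = min(range(len(w)),
                  key=lambda i: sum((w[i][d] - x[d]) ** 2 for d in range(dim)))
        bx, by = bmu % grid_w, bmu // grid_w
        rate = 0.5 * (1 - t / iters)              # decaying learning rate
        radius = 1 + (grid_w / 2) * (1 - t / iters)  # decaying neighbourhood
        for i in range(len(w)):
            gx, gy = i % grid_w, i // grid_w
            d2 = (gx - bx) ** 2 + (gy - by) ** 2  # grid distance to the BMU
            h = math.exp(-d2 / (2 * radius ** 2))  # neighbourhood kernel
            for d in range(dim):
                w[i][d] += rate * h * (x[d] - w[i][d])
    return w

# toy input distribution: points drawn uniformly from the unit square
rng = random.Random(1)
data = [(rng.random(), rng.random()) for _ in range(500)]
weights = train_som(data)
```

Each input pulls the best-matching unit and its grid neighbours toward it; shrinking the rate and radius over time is what lets the map settle into a topology-preserving layout of the input space.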
The package also has an interactive command-line prompt accepting the same commands, which makes it easy to test and explore different options in real time. Because of the focus on batch/remote use, it does not have a GUI, but it does create a wide variety of images for analysis and testing. It can display these images immediately or save them for later viewing. Sample command files are provided for running orientation and ocular-dominance map simulations on a variety of network and machine sizes. Extensive documentation is also included, all of which is also available via online help where appropriate. This implementation currently supports the LISSOM algorithm only, but it can serve as a good starting point for writing a batch-mode neural-network or related simulator. In particular, it includes independent and general-purpose routines for interprocessor communication, platform independence, console messaging, error handling, command-file/command-prompt processing, parameter setting/retrieving/bounds-checking, expression parsing, and online help. Papers: ----------------------------------------------------------------------- PATTERN-GENERATOR-DRIVEN DEVELOPMENT IN SELF-ORGANIZING MODELS James A. Bednar and Risto Miikkulainen In James M. Bower, editor, Computational Neuroscience: Trends in Research, 1998, 317-323. New York: Plenum, 1998. (7 pages) http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#bednar.cns97.ps.Z Self-organizing models develop realistic cortical structures when given approximations of the visual environment as input. Recently it has been proposed that internally generated input patterns, such as those found in the developing retina and in PGO waves during REM sleep, may have the same effect. Internal pattern generators would constitute an efficient way to specify, develop, and maintain functionally appropriate perceptual organization.
They may help express complex structures from minimal genetic information, and retain this genetic structure within a highly plastic system. Simulations with the RF-LISSOM orientation map model indicate that such preorganization is possible, providing a computational framework for examining how genetic influences interact with visual experience. The results from this paper can be reproduced with the LISSOM software described above. ----------------------------------------------------------------------- SELF-ORGANIZATION AND SEGMENTATION IN A LATERALLY CONNECTED ORIENTATION MAP OF SPIKING NEURONS Yoonsuck Choe and Risto Miikkulainen Neurocomputing, in press (20 pages) http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#choe.neurocomputing98.ps.Z The RF-SLISSOM model integrates two separate lines of research on computational modeling of the visual cortex. Laterally connected self-organizing maps have been used to model how afferent structures such as orientation columns and patterned lateral connections can simultaneously self-organize through input-driven Hebbian adaptation. Spiking neurons with leaky integrator synapses have been used to model image segmentation and binding by synchronization and desynchronization of neuronal group activity. Although these approaches differ in how they model the neuron and what they explain, they share the same overall layout of a laterally connected two-dimensional network. This paper shows how both self-organization and segmentation can be achieved in such an integrated network, thus presenting a unified model of development and functional dynamics in the primary visual cortex. A demo of RF-SLISSOM can be run remotely over the internet at http://www.cs.utexas.edu/users/nn/pages/research/selforg.html. From niebur at russell.mb.jhu.edu Fri Nov 13 14:32:08 1998 From: niebur at russell.mb.jhu.edu (Prof. 
Ernst Niebur) Date: Fri, 13 Nov 1998 14:32:08 -0500 Subject: Graduate studies in systems and computational neuroscience at Johns Hopkins University Message-ID: <199811131932.OAA07483@russell.mb.jhu.edu> The Johns Hopkins University is a major private research university, and its hospital and medical school have consistently been rated first or second in the nation in recent years. The Department of Neuroscience ranks second in the nation (all according to 'US News and World Report'). The Zanvyl Krieger Mind/Brain Institute at Johns Hopkins encourages students with an interest in systems neuroscience, including computational neuroscience, to apply for the graduate program in the Neuroscience department. The Institute is an interdisciplinary research center devoted to the investigation of the neural mechanisms of mental function and particularly to the mechanisms of perception: How is complex information represented and processed in the brain, how is it stored and retrieved, and which brain centers are critical for these operations? Research opportunities exist in all of the laboratories of the Institute. Interdisciplinary projects, involving the student in more than one laboratory, are particularly encouraged. All students accepted to the PhD program of the Neuroscience department receive full tuition remission plus a stipend at or above the National Institutes of Health predoctoral level. Additional information on the research interests of the faculty in the Mind/Brain Institute and the Department of Neuroscience can be obtained at http://www.mb.jhu.edu/mbi.html and at http://www.med.jhu.edu/neurosci/welcome.html, respectively. Applicants should have a B.S. or B.A. with a major in any of the biological or physical sciences. Applicants are required to take the Graduate Record Examination (GRE; both the aptitude tests and an advanced test) or the Medical College Admission Test (MCAT).
Further information on the admission procedure can be obtained from the Department of Neuroscience: Director of Graduate Studies Neuroscience Training Program Department of Neuroscience The Johns Hopkins University School of Medicine 725 Wolfe Street Baltimore, MD 21205 Completed applications (including three letters of recommendation and either GRE scores or Medical College Admission Test scores) must be received by January 1, 1999 at the above address. -- Ernst Niebur, PhD Krieger Mind/Brain Institute Asst. Prof. of Neuroscience Johns Hopkins University niebur at jhu.edu 3400 N. Charles Street (410)516-8643, -8640 (secr), -8648 (fax), -4357 (lab) Baltimore, MD 21218 From jkh at dcs.rhbnc.ac.uk Fri Nov 13 06:53:28 1998 From: jkh at dcs.rhbnc.ac.uk (Keith Howker) Date: Fri, 13 Nov 1998 11:53:28 -0000 Subject: NeuroCOLT2 - Recent technical Reports Message-ID: <01BE0EFC.3AF90BA0.jkh@dcs.rhbnc.ac.uk> The following NeuroCOLT2 Technical reports are now available on our web site and ftp archive. best wishes, K.
++++++++++++++++++++++++++++ NC-TR-98-011 Matamala & Meer On the computational structure of the connected components of a hard problem NC-TR-98-019 Williamson, Smola & Schoelkopf Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators NC-TR-98-020 Shawe-Taylor & Cristianini Margin distribution bounds on generalization NC-TR-98-021 Raetsch, Onoda & Mueller Soft Margins for AdaBoost NC-TR-98-022 Smola, Williamson & Schoelkopf Generalization Bounds for Convex Combinations of Kernel Functions NC-TR-98-023 Williamson, Smola & Schoelkopf Entropy Numbers, Operators and Support Vector Kernels NC-TR-98-024 Smola, Friess & Schoelkopf Semiparametric support vector and linear programming machines NC-TR-98-025 Auer, Cesa-Bianchi, Freund & Schapire Gambling in the rigged casino: the adversarial multi-armed bandit problem NC-TR-98-026 Cesa-Bianchi & Lugosi On prediction of individual sequences NC-TR-98-027 Smola, Williamson & Schoelkopf Generalization bounds and learning rates for regularized principal manifolds NC-TR-98-028 Smola, Mika & Schoelkopf Quantization functionals and regularized principal manifolds NC-TR-98-029 Shawe-Taylor & Cristianini Robust bounds on generalization from the margin distribution NC-TR-98-030 Smola & Schoelkopf A tutorial on Support Vector regression ++++++++++++++++++++++++++++ GENERAL INFORMATION =================== The European Community ESPRIT Working Group in Neural and Computational Learning Theory (NeuroCOLT) has been funded for a further three years as the Working Group NeuroCOLT2. The overall objective of the Working Group is to demonstrate the effectiveness of technologies that arise from a deep understanding of the performance and implementation of learning systems on real world data. We will continue to maintain the Technical Report Archive of papers produced by members or associates of the group's partners. 
These results, together with other project information, may be accessed on our web site: http://www.neurocolt.com/. ACCESS INSTRUCTIONS =================== The files and abstracts may be accessed via WWW starting from the NeuroCOLT homepage: http://www.neurocolt.com/ or from the archive: ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports Alternatively, it is still possible to use ftp access as follows: % ftp ftp.dcs.rhbnc.ac.uk (134.219.96.1) Name: anonymous password: your full email address ftp> cd pub/neurocolt/tech_reports/1998 ftp> binary ftp> get nc-tr-1998-001.ps.Z ftp> bye % zcat nc-tr-1998-001.ps.Z | lpr Similarly for the other technical reports. In some cases there are two files available, for example, nc-tr-97-002-title.ps.Z nc-tr-97-002-body.ps.Z The first contains the title page while the second contains the body of the report. The single command, ftp> mget nc-tr-97-002* will prompt you for the files you require. --------------------------------------------------------------------- | Keith Howker | e-mail: jkh at dcs.rhbnc.ac.uk | | Dept. of Computer Science | Phone : +44 1784 443696 | | RHUL | Fax : +44 1784 439786 | | EGHAM TW20 0EX, UK | Home: +44 1932 222529 | --------------------------------------------------------------------- From thimm at idiap.ch Mon Nov 16 12:05:20 1998 From: thimm at idiap.ch (Georg Thimm) Date: Mon, 16 Nov 1998 18:05:20 +0100 Subject: Contents of Neurocomputing 21 (1998) Message-ID: <199811161705.SAA09282@rotondo.idiap.ch> Dear reader, Please find below a compilation of the contents for Neurocomputing and Scanning the Issue written by V. David Sánchez A. More information on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301 . The contents of this and other journals published by Elsevier are also distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect). Please feel free to redistribute this message.
My apologies if this message is inappropriate for this mailing list; I would appreciate feedback. With kindest regards, Georg Thimm Dr. Georg Thimm Research scientist & WWW: http://www.idiap.ch/~thimm Current Events Editor of Neurocomputing Tel.: ++41 27 721 77 39 (Fax: 12) IDIAP / C.P. 592 / 1920 Martigny / Suisse E-mail: thimm at idiap.ch ******************************************************************************** Journal : NEUROCOMPUTING ISSN : 0925-2312 Vol./Iss. : 21 / 1-3 The self-organizing map Kohonen, Teuvo pp.: 1-6 TEXSOM: Texture segmentation using self-organizing maps Ruiz-del-Solar, Javier pp.: 7-18 Self-organizing maps of symbol strings Kohonen, Teuvo pp.: 19-30 SOM accelerator system Rüping, S. pp.: 31-50 Sufficient conditions for self-organisation in the one-dimensional SOM with a reduced width neighbourhood Flanagan, John A. pp.: 51-60 Text classification with self-organizing maps: Some lessons learned Merkl, Dieter pp.: 61-77 Local linear correlation analysis with the SOM Piras, Antonio pp.: 79-90 Applications of the growing self-organizing map Villmann, Th. pp.: 91-100 WEBSOM -- Self-organizing maps of document collections Kaski, Samuel pp.: 101-117 Theoretical aspects of the SOM algorithm Cottrell, M. pp.: 119-138 Self-organization and segmentation in a laterally connected orientation map of spiking neurons Choe, Yoonsuck pp.: 139-158 Neural detection of QAM signal with strongly nonlinear receiver Raivio, Kimmo pp.: 159-171 Self-organizing maps: Generalizations and new optimization techniques Graepel, Thore pp.: 173-190 Predicting bankruptcies with the self-organizing map Kiviluoto, Kimmo pp.: 191-201 Developments of the generative topographic mapping Bishop, Christopher M. pp.: 203-224 ******************************************************************************** Neurocomputing 21 (1998) Scanning the issue T.
Kohonen presents in The self-organizing map (SOM) an overview of the SOM algorithm, which realizes a data-compressing mapping of a high-dimensional data distribution onto a regular lower-dimensional grid. The SOM preserves the most important topological and metric relationships of the original data; this leads to its main properties of visualization and abstraction. In TEXSOM: Texture segmentation using self-organizing maps J. Ruiz-del-Solar describes a texture segmentation architecture based on the joint spatial/spatial-frequency paradigm. The Adaptive-Subspace Self-Organizing Map (ASSOM) or the Supervised ASSOM (SASSOM) are used to automatically generate the oriented filters. Defect identification on textured images can be performed using the proposed architecture. T. Kohonen and P. Somervuo present Self-organizing maps of symbol strings. Instead of defining metric vector spaces to be used with the self-organizing map (SOM), as is usually the case, symbol strings are organized on a SOM array. The batch map principle and averaging over strings are applied. The Feature Distance (FD) between the symbol strings and the Redundant Hash Addressing (RHA) are employed for the construction of large SOM arrays. S. Rüping, M. Porrmann, and U. Rückert describe the SOM accelerator system. The system is a massively parallel system based on the NBISOM-25 chips. Each chip contains an array of five times five processing elements. A VMEBus board with sixteen chips on it was built. The model vectors have up to 64 weights with 8-bit accuracy. The system performance is 2.4 GCUPS for learning and 4.1 GCPS for recall. J.A. Flanagan presents Sufficient conditions for self-organisation in the one-dimensional SOM with a reduced width neighbourhood. To achieve self-organization, three intervals of non-zero probability separated by a minimum distance are required. The conditions are sufficient, but appear to be close to necessary. D.
Merkl describes in Text classification with self-organizing maps: Some lessons learned the application of a hierarchical neural network approach to the task of document classification. The different network levels are realized by independent SOMs, allowing the selection of different levels of granularity while exploring the document collection. A. Piras and A. Germond present in Local linear correlation analysis with the SOM an extension to the SOM algorithm for selecting relevant input variables in nonlinear regression. The linear correlation between variables in neighbor spaces is studied using the SOM. A localized correlation coefficient is determined that allows the varying dependencies over the definition manifold to be understood. In Applications of the growing self-organizing map T. Villmann and H.-U. Bauer describe the growing self-organizing map (GSOM) algorithm. This extension of the SOM algorithm adapts the topology of the map output space and allows for unsupervised generation of dimension-reducing projections with optimal neighborhood preservation. In WEBSOM -- Self-organizing maps of document collections S. Kaski, T. Honkela, K. Lagus, and T. Kohonen describe a new method to organize large collections of full-text documents in electronic form. A histogram of word categories is used to encode each document. The documents are ordered in a document map according to their contents. Computationally efficient SOM algorithms were used. In Theoretical aspects of the SOM algorithm M. Cottrell, J.C. Fort, and G. Pagès review the status quo of the efforts to understand the SOM algorithm theoretically. The study of the one-dimensional case is complete with the exception of finding the appropriate decreasing rate to ensure ordering. The study of the multi-dimensional case is difficult and far from complete. Y. Choe and R. Miikkulainen present Self-organization and segmentation in a laterally connected orientation map of spiking neurons.
The model achieves self-organization based on rudimentary visual input and develops orientation-selective receptive fields and patterned lateral connections forming a global orientation map. Crucial principles of the biological visual cortex are incorporated. In Neural detection of QAM signal with strongly nonlinear receiver K. Raivio, J. Henriksson, and O. Simula describe neural receiver structures for adaptive discrete-signal detection. A receiver structure based on the self-organizing map (SOM) is compared with the conventional Decision Feedback Equalizer (DFE) for a Quadrature Amplitude Modulated (QAM) transmitted signal. T. Graepel, M. Burger and K. Obermayer describe in Self-organizing maps: Generalizations and new optimization techniques three algorithms for generating topographic mappings based on cost function minimization using an EM algorithm and deterministic annealing. The algorithms described are the Soft Topographic Vector Quantization (STVQ) algorithm, the Kernel-based Soft Topographic Mapping (STMK) algorithm, and the Soft Topographic Mapping for Proximity data (STMP) algorithm. K. Kiviluoto presents Predicting bankruptcies with the self-organizing map. The qualitative analysis of going bankrupt is performed using the SOM algorithm in a supervised manner. The quantitative analysis is performed using three classifiers: SOM-1, SOM-2, and RBF-SOM. Results are compared to Linear Discriminant Analysis (LDA) and Learning Vector Quantization (LVQ). C.M. Bishop, M. Svensén, and C.K.I. Williams present Developments of the generative topographic mapping (GTM), which is an enhancement of the standard SOM algorithm with a number of advantages. Extensions of the GTM are reported: the use of an incremental EM algorithm, the use of local subspace models, mixing discrete and continuous data, and the use of semi-linear models and high-dimensional manifolds. I appreciate the cooperation of all those who submitted their work for inclusion in this issue. V.
David Sánchez A. Editor-in-Chief From baveja at research.att.com Mon Nov 16 10:17:45 1998 From: baveja at research.att.com (satinder singh baveja) Date: Mon, 16 Nov 1998 10:17:45 -0500 Subject: Call for Papers: Machine Learning Journal special issue on REINFORCEMENT LEARNING Message-ID: <36504219.DEA39F52@research.att.com> Machine Learning Journal Special Issue on REINFORCEMENT LEARNING Edited by Satinder Singh Submission Deadline: March 1, 1999 Reinforcement learning has become an exciting focus for research in machine learning and artificial intelligence. Some of this new excitement comes from a convergence of computational approaches to planning and learning in large-scale environments, some from accumulating empirical evidence that reinforcement learning algorithms can solve significant real-world problems, and some from the ever-increasing mathematical maturity of this field. In this special issue we would like to capture a snapshot of this recent excitement, and invite submission of papers describing new work in reinforcement learning. New algorithms, problem frameworks, theoretical results, and empirical results are all appropriate contributions. Submission Deadline: March 1, 1999 Expected publication date: January, 2000 SUBMISSION INSTRUCTIONS *********************** Submissions will undergo the standard Machine Learning journal review process. Send one hard copy of submissions to: Satinder Singh AT&T Shannon Lab Room A269 180 Park Avenue Florham Park, NJ 07932 USA Fax: (973) 360-8970 E-mail: baveja at research.att.com If possible, e-mail the abstract separately to baveja at research.att.com, even before it is final, so that reviewers can be allocated efficiently.
Also mail five hard copies of submitted papers to: Karen Cullen MACHINE LEARNING Editorial Office Kluwer Academic Publishers 101 Philip Drive Assinippi Park Norwell, MA 02061 USA phone: (617) 871-6300 E-mail: karen at world.std.com **** PLEASE INDICATE that your submission is for the REINFORCEMENT **** LEARNING SPECIAL issue. Submissions substantially longer than 24 pages when formatted according to the journal's style (pointers to which are below) may be rejected without review. Note: Machine Learning is now accepting submission of FINAL copy in electronic form. A LaTeX style file and related files are available via the URL: http://www.wkap.nl/journalhome.htm/0885-6125 Authors are strongly encouraged to use these style files or the formatting instructions stated at the back of the Machine Learning journal for their submissions. From omlin at waterbug.cs.sun.ac.za Mon Nov 16 08:29:55 1998 From: omlin at waterbug.cs.sun.ac.za (Christian Omlin) Date: Mon, 16 Nov 1998 15:29:55 +0200 Subject: preprints available Message-ID: The technical reports below are available from my homepage at http://www.cs.sun.ac.za/people/staff/omlin.html. Comments are welcome. My apologies if you receive multiple copies of this announcement. Best regards, Christian Christian W. Omlin e-mail: omlin at cs.sun.ac.za Department of Computer Science phone (direct): +27-21-808-4932 University of Stellenbosch phone (secretary): +27-21-808-4232 Private Bag X1 fax: +27-21-808-4416 Stellenbosch 7602 http://www.cs.sun.ac.za/people/staff/omlin.html SOUTH AFRICA http://www.neci.nj.nec.com/homepages/omlin ------------------------------------------------------------------------------------ Recurrent Neural Networks Learn Deterministic Representations of Fuzzy Finite-State Automata C.W. Omlin$^a$, C.L.
Giles$^{b,c}$ $^a~$Department of Computer Science University of Stellenbosch 7600 Stellenbosch South Africa $^b~$NEC Research Institute 4 Independence Way Princeton, NJ 08540 USA $^c~$UMIACS University of Maryland College Park, MD 20742 USA ABSTRACT The paradigm of deterministic finite-state automata (DFAs) and their corresponding regular languages have been shown to be very useful for addressing fundamental issues in recurrent neural networks. The issues that have been addressed include knowledge representation, extraction, and refinement, as well as the development of advanced learning algorithms. Recurrent neural networks are also a very promising tool for modeling discrete dynamical systems through learning, particularly when partial prior knowledge is available. The drawback of the DFA paradigm is that it is inappropriate for modeling vague or uncertain dynamics; however, many real-world applications deal with vague or uncertain information. One way to model vague information in a dynamical system is to allow for vague state transitions, i.e. the system may be in several states at the same time with varying degrees of certainty; fuzzy finite-state automata (FFAs) are a formal equivalent of such systems. It is therefore of interest to study how uncertainty in the form of FFAs can be modeled by deterministic recurrent neural networks. We have previously proven that second-order recurrent neural networks are able to represent FFAs, i.e. recurrent networks can be constructed that assign fuzzy memberships to input strings with arbitrary accuracy. In such networks, the classification performance is independent of the string length. In this paper, we are concerned with recurrent neural networks that have been trained to behave like FFAs. In particular, we are interested in the internal representation of fuzzy states and state transitions and in the extraction of knowledge in symbolic form.
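The vague state transitions described in this abstract are often formalized with max-min composition: each state carries a membership degree, and reading an input symbol combines current memberships with fuzzy transition weights. A tiny generic sketch in Python (the states, symbol, and weights below are invented for illustration; this is not the paper's neural-network encoding):

```python
def ffa_step(membership, transitions, symbol):
    """One fuzzy-automaton step: mu'(q2) = max over q of min(mu(q), theta(q, symbol, q2))."""
    states = list(membership)
    return {
        q2: max(min(membership[q], transitions.get((q, symbol, q2), 0.0)) for q in states)
        for q2 in states
    }

# hypothetical two-state FFA over the single input symbol 'a'
theta = {('s0', 'a', 's0'): 0.3, ('s0', 'a', 's1'): 0.9,
         ('s1', 'a', 's0'): 0.2, ('s1', 'a', 's1'): 0.6}
mu = {'s0': 1.0, 's1': 0.0}      # start fully in s0
mu = ffa_step(mu, theta, 'a')    # -> {'s0': 0.3, 's1': 0.9}: both states partly occupied
```

Unlike a stochastic automaton, the memberships need not sum to one; they express degrees of occupancy rather than probabilities, which is exactly the distinction the abstracts draw.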
------------------------------------------------------------------------------------ Equivalence in Knowledge Representation: Automata, Recurrent Neural Networks, and Dynamical Fuzzy Systems C.W. Omlin$^a$, C.L. Giles$^{b,c}$, K.K. Thornber$^b$ $^a~$Department of Computer Science University of Stellenbosch 7600 Stellenbosch South Africa $^b~$NEC Research Institute 4 Independence Way Princeton, NJ 08540 USA $^c~$UMIACS University of Maryland College Park, MD 20742 USA ABSTRACT Neuro-fuzzy systems - the combination of artificial neural networks with fuzzy logic - are becoming increasingly popular. However, neuro-fuzzy systems need to be extended for applications which require context (e.g., speech, handwriting, control). Some of these applications can be modeled in the form of finite-state automata. Previously, it was proved that deterministic finite-state automata (DFAs) can be stably synthesized or mapped into second-order recurrent neural networks with sigmoidal discriminant functions and sparse interconnection topology by programming the networks' weights to $+H$ or $-H$. Based on those results, this paper proposes a synthesis method for mapping fuzzy finite-state automata (FFAs) into recurrent neural networks which is suitable for implementation in VLSI, i.e. the encoding of FFAs is a generalization of the encoding of DFAs. The synthesis method requires FFAs to undergo a transformation prior to being mapped into recurrent networks. Their neurons have a slightly enriched functionality in order to accommodate a fuzzy representation of FFA states, i.e. any state can be occupied with a fuzzy membership that takes on values in the range $[0, 1]$ and several fuzzy states can be occupied at any given time. [This is in contrast to stochastic finite-state automata, where there exists no ambiguity about an automaton's current state.
The automaton can only be in exactly one state at any given time, and the choice of a successor state is determined by some probability distribution. For a discussion of the relation between probability and fuzziness, see for instance.] The enriched neuron functionality allows fuzzy parameters of FFAs to be directly represented as parameters of the neural network. In this paper we prove the stability of fuzzy finite-state dynamics of constructed neural networks for finite values of network weight $H$ and through simulations give empirical validation of the proofs. ------------------------------------------------------------------------------------ Dynamic Adaptation of Recurrent Neural Network Architectures Guided By Symbolic Knowledge C.W. Omlin$^a$, C.L. Giles$^{b,c}$ $^a~$Department of Computer Science University of Stellenbosch 7600 Stellenbosch South Africa $^b~$NEC Research Institute 4 Independence Way Princeton, NJ 08540 USA $^c~$UMIACS University of Maryland College Park, MD 20742 USA ABSTRACT The success and the time needed to train neural networks with a given learning algorithm depend on the learning task, the initial conditions, and the network architecture. In particular, the number of hidden units in feedforward and recurrent neural networks is an important factor. We propose a novel method for dynamically adapting the architecture of recurrent neural networks trained to behave like deterministic finite-state automata (DFAs). It differs from other constructive approaches in that our method relies on algorithms for extracting and inserting symbolic knowledge in the form of DFAs. The network architecture (number of neurons and weight configuration) changes during training based on the symbolic information extracted from undertrained networks. We successfully trained recurrent networks to recognize strings of a regular language accepted by a non-trivial, randomly generated deterministic finite-state automaton.
Our empirical results indicate that symbolically-driven network adaptation results in networks that generalize better than networks trained without dynamic network adaptation or networks trained with standard dynamic growing methods. From steve at cns.bu.edu Wed Nov 18 22:11:09 1998 From: steve at cns.bu.edu (Stephen Grossberg) Date: Wed, 18 Nov 1998 23:11:09 -0400 Subject: possible listing Message-ID: The following preprint on the subject of how the visual cortex is organized to give rise to visual percepts has been accepted for publication in Spatial Vision and is currently in press. The paper is available from my home page: http://cns-web.bu.edu/Profiles/Grossberg/ Paper copies are available through the mail. Contact: amos at cns.bu.edu and request by title: Grossberg, S. (1998). How does the cerebral cortex work? Learning, attention, and grouping by the laminar circuits of visual cortex. Spatial Vision, in press. Abstract: The organization of neocortex into layers is one of its most salient anatomical features. These layers include circuits that form functional columns in cortical maps. A major unsolved problem concerns how bottom-up, top-down, and horizontal interactions are organized within cortical layers to generate adaptive behaviors. This article models how these interactions help visual cortex to realize: (1) the binding process whereby cortex groups distributed data into coherent object representations; (2) the attentional process whereby cortex selectively processes important events; and (3) the developmental and learning processes whereby cortex shapes its circuits to match environmental constraints. New computational ideas about feedback systems suggest how neocortex develops and learns in a stable way, and why top-down attention requires converging bottom-up inputs to fully activate cortical cells, whereas perceptual groupings do not. A related article applies these insights to the processing of Synthetic Aperture Radar images.
This article is in press in Neural Networks. It is: Mingolla, E., Ross, W., and Grossberg, S. (1998). A Neural Network for Enhancing Boundaries and Surfaces in Synthetic Aperture Radar Images. Paper copies can be obtained by writing to amos at cns.bu.edu. From mieko at hip.atr.co.jp Wed Nov 18 04:28:33 1998 From: mieko at hip.atr.co.jp (Mieko Namba) Date: Wed, 18 Nov 1998 18:28:33 +0900 Subject: Announcement of Brain Science of Communications Symposium Message-ID: <199811180928.SAA22769@mailhost.hip.atr.co.jp> Dear members, I am glad to inform you that the "Brain Science of Communications Symposium" will be held as below. The project "Communications of Primates including Humans" is supported by the Science and Technology Agency Special Coordination Fund Projects for Strategic Promotion System for Brain Science. Please join us if you are interested. For details of the project: http://hozu.ctr.atr.co.jp/kbp/index.html For details of the symposium: URL:http://hozu.ctr.atr.co.jp/kbp/symposium.html Mitsuo Kawato ATR Human Information Processing Res. Labs. ****************************************************************** Brain Science of Communications Symposium ****************************************************************** The Science and Technology Agency Special Coordination Fund Project Strategic Promotion System for Brain Science "Communications of Primates including Humans" Date: December 14 (Mon) and 15 (Tue), 1998 Place: The Keihanna Plaza, Kansai Science City Fee: Free (charged for the party, ¥3000) ----------------------------------------------------------------- Dec. 14, 1998 Monday 13:00-18:00 Language: English ----------------------------------------------------------------- 13:00-13:15 Welcome & Introduction Yoh'ichi Tohkura (NTT Basic Res. Labs., ATR-I) 13:15-14:00 Invited Lecture Computational Models of Cortical-Cerebellar and Cortical-Basal Ganglionic Control and Cognition James C.
Houk (Northwestern University Medical School) 14:10-14:50 Role of the basal ganglia in learning sequential motor tasks Minoru Kimura (School of Health and Sport Sciences, Osaka Univ.) 15:00-15:40 The roles of reward prediction in motor control and communication Kenji Doya (Kawato Dynamic Brain Project, ERATO) 16:00-16:40 Cerebellar simple spikes and complex spikes during arm movements Shigeru Kitazawa (Electrotechnical Lab.) 16:50-17:30 Computational theory of communication based on multiple paired forward-inverse models Mitsuo Kawato (ATR Human Information Processing Res. Labs.) 17:30-18:00 Discussion 18:00-19:30 Party ----------------------------------------------------------------- Dec. 15, 1998 Tuesday 10:00-17:30 Language: Japanese ----------------------------------------------------------------- 10:00-10:40 Universal Properties of Language: Embedding and Long-Distance Dependency Yukio Otsu (Inst. of Cultural and Linguistic Studies, Keio Univ.) 10:50-11:30 Brain mechanisms of verbal and non-verbal communication Toshio Inui (Dept. of Intelligence Science and Technology, Graduate School of Informatics, Kyoto Univ.) 11:30-12:00 Discussion 12:00-13:20 Lunch Break 13:20-14:00 FMRI and MEG study of neural activity during judgments of visually presented characters and character strings Norio Fujimaki (Fujitsu Labs. Ltd., Communications Res. Lab.) 14:10-14:50 Neuronal representation of facial information for communication: Single neurons in the temporal cortex encode finer facial information with a categorized header Shigeru Yamane (Electrotechnical Lab.) 15:10-15:50 Development of Neuromagnetic virtual sensor system Yoshida Yoshikazu (Technology Res. Lab., Shimadzu Corporation) 16:00-16:40 The study of visual grouping in human brain using high spatial resolution MEG Keisuke Toyama (Dept. of Physiology, Kyoto Prefectural Univ.
of Medicine) 16:40-17:20 Discussion 17:20-17:30 Closing Remarks Yoh'ichi Tohkura (NTT Basic Research Laboratories, ATR-I) Registration Form: Kansai Brain Project Symposium (e-mail to the secretariat: ikeda at ctr.atr.co.jp) ------------------------------------------------------------- Name: ------------------------------------------------------------- Institute: ------------------------------------------------------------- TEL: ------------------------------------------------------------- e-mail: ------------------------------------------------------------- Day 1 (12/14) Attending/Not attending ------------------------------------------------------------- Party (We will collect ¥3000 per person, ¥1000 for students at the front) Evening of Day 1 (12/14) Attending/Not attending ------------------------------------------------------------- Day 2 (12/15) Attending/Not attending ------------------------------------------------------------- ****************************************************************** end. ========================================================= Mieko Namba Secretary to Dr. Mitsuo Kawato Editorial Administrator of NEURAL NETWORKS ATR Human Information Processing Research Laboratories 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-1008 E-MAIL mieko at hip.atr.co.jp ========================================================= From Dave_Touretzky at cs.cmu.edu Wed Nov 18 04:28:36 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Wed, 18 Nov 1998 04:28:36 -0500 Subject: Graduate training in the Neural Basis of Cognition Message-ID: <12550.911381316@skinner.boltz.cs.cmu.edu> Graduate Training with the Center for the Neural Basis of Cognition The Center for the Neural Basis of Cognition offers an interdisciplinary doctoral training program operated jointly with nine affiliated PhD programs at Carnegie Mellon University and the University of Pittsburgh.
Detailed information about this program is available on our web site at http://www.cnbc.cmu.edu. The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine. CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies. Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, or robotics. For a brochure describing the program and application materials, contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu Application materials are also available online. 
The affiliated PhD programs at the two universities are: at Carnegie Mellon, Biological Sciences, Computer Science, Psychology, Robotics, and Statistics; at the University of Pittsburgh, Mathematics, Neurobiology, Neuroscience, and Psychology. The CNBC training faculty includes: German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving B.J. Casey (Pitt Psychology): attention; developmental cognitive neuroscience Carson Chow (Pitt Mathematics): spatiotemporal dynamics in neural networks Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease William Eddy (CMU Statistics): analysis of fMRI data Bard Ermentrout (Pitt Mathematics): oscillations in neural systems Julie Fiez (Pitt Psychology): fMRI studies of language Chris Genovese (CMU Statistics): making inferences from scientific data John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex Marcel Just (CMU Psychology): visual thinking, language comprehension Robert Kass (CMU Statistics): transmission of info. by collections of neurons Eric Klann (Pitt Neuroscience): hippocampal LTP and LTD Roberta Klatzky (CMU Psychology): human perception and cognition Richard Koerber (Pitt Neurobiology): devel. and plasticity of spinal networks Tai Sing Lee (CMU Comp. Sci.): primate visual cortex; computer vision Pat Levitt (Pitt Neurobiology): molecular basis of cortical development Michael Lewicki (CMU Comp.
Sci.): learning and representation David Lewis (Pitt Neuroscience): anatomy of frontal cortex Brian MacWhinney (CMU Psychology): models of language acquisition James McClelland (CMU Psychology): connectionist models of cognition Paula Monaghan Nichols (Pitt Neurobiology): vertebrate CNS development Carl Olson (CNBC): spatial representations in primate frontal cortex David Plaut (CMU Psychology): connectionist models of reading Michael Pogue-Geile (Pitt Psychology): development of schizophrenia John Pollock (CMU Biological Sci.): neurodevelopment of the fly visual system Walter Schneider (Pitt Psych.): fMRI, models of attention & skill acquisition Charles Scudder (Pitt Neurobiology): motor learning in cerebellum Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex William Skaggs (Pitt Neuroscience): representations in rodent hippocampus David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning See http://www.cnbc.cmu.edu for further details. From omlin at waterbug.cs.sun.ac.za Thu Nov 19 04:23:36 1998 From: omlin at waterbug.cs.sun.ac.za (Christian Omlin) Date: Thu, 19 Nov 1998 11:23:36 +0200 Subject: preprints of technical reports - URL correction Message-ID: Dear List Members Through an oversight on my part, I gave the wrong URL when I announced preprints of technical reports a few days ago; my apologies. The correct URL is http://www.cs.sun.ac.za/people/staff/omlin The link to the technical reports is all the way at the bottom of the page. Alternatively, the papers may now also be downloaded from the U.S. mirror site at http://www.neci.nj.nec.com/homepages/omlin Again, my apologies for the confusion. We welcome your comments.
Best regards, Christian From S.Singh at exeter.ac.uk Thu Nov 19 05:53:33 1998 From: S.Singh at exeter.ac.uk (Sameer Singh) Date: Thu, 19 Nov 1998 10:53:33 +0000 (GMT Standard Time) Subject: PhD Studentship in Financial Forecasting Using Neural Networks Message-ID: PhD STUDENTSHIP IN FINANCIAL FORECASTING SCHOOL OF ENGINEERING AND COMPUTER SCIENCE Department of Computer Science Applications are now invited for a PhD studentship in the area of "Financial Forecasting using Neural Networks". The project will develop student skills in areas including neural networks, financial markets, forecasting, and pattern recognition. Candidates for this studentship should have a degree in computer science, engineering or a related subject. They should have programming skills in C/C++/Java and knowledge of the Unix operating system. The studentship is only open to UK and EU applicants, the latter being on a fees-only basis. The successful candidate should expect to take up the studentship no later than 1 February 1999. Applicants should send a CV, including the names and addresses of two referees, to Dr Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK. Applications can also be emailed to s.singh at exeter.ac.uk, and informal enquiries can be made at +44-1392-264053.
-------------------------------------------- Sameer Singh Department of Computer Science University of Exeter Exeter EX4 4PT UK tel: +44-1392-264053 fax: +44-1392-264067 email: s.singh at exeter.ac.uk web: http://www.dcs.exeter.ac.uk/academics/sameer -------------------------------------------- From steve at cns.bu.edu Fri Nov 20 20:44:14 1998 From: steve at cns.bu.edu (Stephen Grossberg) Date: Fri, 20 Nov 1998 21:44:14 -0400 Subject: a special issue of interest to Connectionists Message-ID: Below is a compilation of the contents for the 1998 Special Issue of Neural Networks on "Neural Control and Robotics: Biology and Technology", edited by Rodney Brooks, Stephen Grossberg, and Lance Optican. The contents of this and other journals published by Elsevier are distributed by the ContentsDirect service (http://www.elsevier.nl/locate/ContentsDirect). Journal: Neural Networks, Volume 11(7/8), 1998 VISION SENSORS: Vladimir M. Brajovic, Ryohei Miyagawa, and Takeo Kanade: Temporal photoreception for adaptive dynamic range image sensing and encoding pp. 1149-1158 CONTROL OF EYE AND HEAD MOVEMENTS: Gregory Gancarz and Stephen Grossberg: A neural model of the saccade generator in the reticular formation pp. 1159-1174 Philippe Lefevre, Christian Quaia, and Lance M. Optican: Distributed model of control of saccades by superior colliculus and cerebellum pp. 1175-1190 Francesco Panerai and Giulio Sandini: Oculo-motor stabilization reflexes: Integration of inertial and visual information pp. 1191-1204 Paul Dean and John Porrill: Pseudo-inverse control in biological systems: A learning mechanism for fixation stability pp. 1205-1218 Michael G. Paulin: A method for analysing neural computation using receptive fields in state space pp. 1219-1228 Christian Quaia, Lance M. Optican, and Michael E. Goldberg: The maintenance of spatial accuracy by the perisaccadic remapping of visual receptive fields pp. 1229-1240 Jeffrey D. Schall and Doug P. 
Hanes: Neural mechanisms of selection and control of visually guided eye movements pp. 1241-1251 H. Sebastian Seung: Continuous attractors and oculomotor control pp. 1253-1258 Yasuo Kuniyoshi and Luc Berthouze: Neural learning of embodied interaction dynamics pp. 1259-1276 CONTROL OF ARM AND OTHER BODY MOVEMENTS: Andrew H. Fagg and Michael A. Arbib: Modeling parietal-premotor interactions in primate control of grasping pp. 1277-1303 J.J. van Heijst, J.E. Vos, and D. Bullock: Development in a biologically inspired spinal neural network for movement control pp. 1305-1316 Daniel M. Wolpert and Mitsuo Kawato: Multiple paired forward and inverse models for motor control pp. 1317-1329 Hiroyuki Miyamoto and Mitsuo Kawato: A tennis serve and upswing learning robot based on bi-directional theory pp. 1331-1344 Joseph A. Doeringer and Neville Hogan: Serial processing in human movement production pp. 1345-1356 H.A. Tabeli, K. Khorasani, and R.V. Patel: Neural network based control schemes for flexible-link manipulators: Simulations and experiments pp. 1357-1377 Matthew M. Williamson: Neural control of rhythmic arm movements pp. 1379-1394 Jean-Luc Buessler and Jean-Phillipe Urban: Visually guided movements: Learning with modular neural maps in robotics pp. 1385-1415 Pietro G. Morasso, Vittorio Sanguineti, Francesco Frisone, and Luca Perico: Coordinate-free sensorimotor processing: Computing with population codes pp. 1417-1428 Bjorn O. Peters, Gert Pfurtscheller, and Henrik Flyvbjerg: Mining multi-channel EEG for its information content: An ANN-based method for a brain-computer interface pp. 1429-1433 CONTROL OF LOCOMOTION AND NAVIGATION: Holk Cruse, Thomas Kindermann, Michael Schumm, Jeffrey Dean, and Josef Schmitz: Walknet--A biologically inspired network to control six-legged walking pp. 1435-1447 Gennady S. Cymbalyuk, Roman M. Borisyuk, Uwe Mueller-Wilm, and Holk Cruse: Oscillatory network controlling six-legged locomotion: Optimization of model parameters pp. 
1449-1460 Dario Floreano and Francesco Mondada: Evolutionary neurocontrollers for autonomous mobile robots pp. 1461-1478 Barbara Webb: Robots, crickets and ants: Models of neural control of chemotaxis and phonotaxis pp. 1479-1496 Terry Huntsberger and John Rose: BISMARC: A biologically inspired system for map-based autonomous rover control pp. 1497-1510 John J. Weng and Shaoyun Chen: Vision-guided navigation using SHOSLIF pp. 1511-1529 Paul F.M.J. Verschure and Thomas Voegtlin: A bottom up approach towards the acquisition and expression of sequential representations applied to a behaving real-world device: Distributed adaptive control III pp. 1531-1549 Christian Scheier, Rolf Pfeifer, and Yasuo Kuniyoshi: Embedded neural networks: Exploiting constraints pp. 1551-1569 From jose at tractatus.rutgers.edu Sun Nov 8 12:54:36 1998 From: jose at tractatus.rutgers.edu (Stephen Jose Hanson) Date: Sun, 08 Nov 1998 12:54:36 -0500 Subject: RUTGERS DIGITAL LIBRARIES--GRADUATE AWARDS Message-ID: <3645DADB.5A1191EC@tractatus.rutgers.edu> [ Moderator's note: Steve Hanson informs me that people interested in doing neural network or connectionist modeling research are among those being sought for these graduate fellowships. The steering committee includes several people who are well-known in the machine learning community. -- Dave Touretzky, CONNECTIONISTS moderator ] GRADUATE AWARDS FOR INTERDISCIPLINARY STUDY IN DIGITAL LIBRARIES The Rutgers University Distributed Laboratory for Digital Libraries (RDLDL) is pleased to announce the availability of competitive graduate awards for interdisciplinary doctoral studies in digital libraries, including contributing technologies and relevant basic research. Digital libraries are an exciting new domain for research, and for society at large, and the RDLDL is taking a leading role in establishing an explicitly interdisciplinary approach to the variety of problems in this burgeoning area.
Current faculty members of the RDLDL come from the disciplines of cognitive science, computer science, library and information science, and psychology, and students in the program are expected to participate in research and to take courses in two or more of the disciplines represented in the RDLDL. Successful candidates will participate in digital library-oriented research projects with several faculty members of the RDLDL, as well as being enrolled in a Ph.D. program in one of the disciplines associated with the RDLDL. They will also take part in an interdisciplinary seminar involving the faculty and all of the other graduate students associated with the RDLDL. The RDLDL Awards offer tuition remission, an annual stipend of $13,000, and the opportunity to achieve an interdisciplinary education for research in the field of digital libraries. These awards are tenable for one year in the first instance, beginning in either January or September 1999, and are renewable. For further information, including the specific interests of the RDLDL faculty, and application materials, please see the RDLDL WWW site, http://diglib.rutgers.edu/RDLDL or contact any one of the members of the RDLDL Steering Committee, listed below. Steering Committee of the Rutgers Distributed Laboratory for Digital Libraries Nicholas J. Belkin, Department of Library and Information Science, New Brunswick Campus nick at belkin.rutgers.edu Benjamin M.
Bly, Department of Psychology, Newark Campus ben at psychology.rutgers.edu Sven Dickinson, Department of Computer Science, New Brunswick Campus sven at ruccs.rutgers.edu Stephen Hanson, Department of Psychology, Newark Campus jose at tractatus.rutgers.edu Haym Hirsh, Department of Computer Science, New Brunswick Campus hirsh at cs.rutgers.edu Paul Kantor, Department of Library and Information Science, New Brunswick Campus kantor at scils.rutgers.edu Zenon Pylyshyn, Rutgers Center for Cognitive Science, New Brunswick Campus zenon at ruccs.rutgers.edu To contact us by mail, please write to: Rutgers Distributed Laboratory for Digital Libraries c/o School of Communication, Information and Library Studies Rutgers University 4 Huntington Street New Brunswick, NJ 08901-1071, USA From thimm at idiap.ch Fri Nov 20 08:27:54 1998 From: thimm at idiap.ch (Georg Thimm) Date: Fri, 20 Nov 1998 14:27:54 +0100 Subject: Contents of Neurocomputing 22 (1998) Message-ID: <199811201327.OAA17337@rotondo.idiap.ch> Dear reader, Please find below a compilation of the contents for Neurocomputing and Scanning the Issue written by V. David Sánchez A. More information on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301 . The contents of this and other journals published by Elsevier are also distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect). Please feel free to redistribute this message. My apologies if this message is inappropriate for this mailing list; I would appreciate feedback. With kindest regards, Georg Thimm Dr. Georg Thimm Research scientist & WWW: http://www.idiap.ch/~thimm Current Events Editor of Neurocomputing Tel.: ++41 27 721 77 39 (Fax: 12) IDIAP / C.P. 592 / 1920 Martigny / Suisse E-mail: thimm at idiap.ch ******************************************************************************** Journal : NEUROCOMPUTING ISSN : 0925-2312 Vol./Iss.
: 22 / 1-3 The nonlinear PCA criterion in blind source separation: Relations with other approaches Karhunen, Juha pp.: 5-20 Blind separation of convolved mixtures in the frequency domain Smaragdis, Paris pp.: 21-34 Blind source separation using algorithmic information theory Pajunen, Petteri pp.: 35-48 Independent component analysis in the presence of Gaussian noise by maximizing joint likelihood Hyvärinen, Aapo pp.: 49-67 Learned parametric mixture based ICA algorithm Xu, Lei pp.: 69-80 Bayesian Kullback Ying-Yang dependence reduction theory Xu, Lei pp.: 81-111 Robust techniques for independent component analysis (ICA) with noisy data Cichocki, A. pp.: 113-129 Searching for a binary factorial code using the ICA framework Palmieri, Francesco pp.: 131-144 Constrained PCA techniques for the identification of common factors in data Charles, Darryl pp.: 145-156 A method of blind separation for convolved non-stationary signals Kawamoto, Mitsuru pp.: 157-171 Removing artifacts from electrocardiographic signals using independent components analysis Barros, Allan Kardec pp.: 173-186 From neural learning to independent components Oja, Erkki pp.: 187-199 A nonlinear model of the binaural cocktail party effect Girolami, Mark pp.: 201-215 ******************************************************************************** Neurocomputing 22 (1998) Scanning the issue In The nonlinear PCA criterion in blind source separation: Relations with other approaches J. Karhunen, P. Pajunen and E. Oja derive the nonlinear Principal Component Analysis (PCA) criterion in Blind Source Separation (BSS) in a form appropriate for comparison with other BSS and Independent Component Analysis (ICA) approaches. The choice of the optimal nonlinearity is explained. P. Smaragdis presents Blind separation of convolved mixtures in the frequency domain using information theoretic algorithms.
Improved efficiency and better convergence are accomplished by the proposed approach, which shows clear advantages when compared to its time-domain counterparts. The filter parameters in the frequency domain are orthogonal to each other, therefore one can update one parameter without any influence on the rest of the parameters. P. Pajunen describes Blind source separation using algorithmic information theory. The algorithmic complexity of the sources and the mixing mapping is measured. Natural signals often meet the requirement of low complexity. An experiment consisting of separating correlated signals is discussed. A. Hyvärinen presents Independent component analysis in the presence of Gaussian noise by maximizing joint likelihood. In the presence of noise the relationship between observed data and the estimates of the independent components is non-linear. For supergaussian (sparse) data the nonlinearity can be approximated by a shrinkage operation and realized using competitive learning. For subgaussian components anti-competitive learning can be used. L. Xu, C.C. Cheung and S. Amari describe a Learned parametric mixture based ICA algorithm. It is based on a linear mixture and its separation capability is shown to be superior to the original model with prefixed nonlinearity. Experiments with subgaussian, supergaussian and combinations of these types of sources confirm the applicability of the algorithm. L. Xu presents the Bayesian Kullback Ying-Yang Dependence Reduction (BKYY-DR) theory. In particular the solution of the Blind Source Separation (BSS) problem is addressed. Algorithms and criteria for parameter learning and model selection are based on stochastic approximation. They are given for a general BKYY-DR system and its three typical architectures. Experiments with binary sources are reported. A. Cichocki, S.C. Douglas and S. Amari propose Robust techniques for independent component analysis (ICA) with noisy data.
A recurrent dynamic neural network architecture is introduced for simultaneous unbiased estimation of the unknown mixing matrix, blind source separation, and noise reduction in the extracted output signals. The shape parameters of the nonlinearities are adjusted using gradient-based rules. F. Palmieri, A. Budillon, M. Calabrese and D. Mattera present Searching for a binary factorial code using the ICA framework. Independent Component Analysis (ICA) is formulated as density search and used for finding a mapping of the input space into a binary string with independent bits, i.e., a binary factorial code. Experiments with a real image show the feasibility of the approach, whose results are compared with those of a Principal Component Analyzer (PCA). D. Charles describes Constrained PCA techniques for the identification of common factors in data. An unsupervised learning network is presented that operates similarly to Principal Factor Analysis. The network responds to the covariance of the input data. Extensions include preprocessing and a function that supports sparseness at the network output. M. Kawamoto, K. Matsuoka and N. Ohnishi present A method of blind separation for convolved non-stationary signals. The method extracts non-stationary signals from their convolutive mixtures. The minimization of Matsuoka's cost function is performed. Simulations confirm the feasibility of the method. The cases where the number of observed signals is greater than or equal to the number of source signals are analyzed. A.K. Barros, A. Mansour and N. Ohnishi describe Removing artifacts from ECG signals using independent component analysis. An architecture is proposed consisting of a high-pass filter, a two-layer network based on the Independent Component Analysis (ICA) algorithm, and a self-adaptive step-size. Simulations using a standard ECG database are performed. E. Oja presents From neural learning to independent components.
Emphasis is placed on the connection between Independent Component Analysis (ICA) and neural learning, especially constrained Hebbian learning. The latter is a non-linear extension of Principal Component Analysis (PCA). Results are given on stationary points and their asymptotic stability. M. Girolami describes A nonlinear model of the binaural cocktail party effect. The network is based on a temporal anti-Hebbian adaptive maximum likelihood estimator and its operation is similar to many approaches for blind source separation of convolutive mixtures. Impressive results are obtained in simulations. Favorable results are obtained under realistic unconstrained acoustic conditions when compared with current adaptive filters. I appreciate the cooperation of all those who submitted their work for inclusion in this issue. V. David Sánchez A. Editor-in-Chief From cf99 at stern.nyu.edu Mon Nov 23 01:40:03 1998 From: cf99 at stern.nyu.edu (Andreas Weigend - Computational Finance 99) Date: Mon, 23 Nov 1998 01:40:03 -0500 (EST) Subject: Computational Finance CF99 Program and Registration Message-ID: <199811230640.BAA01787@kuaile.stern.nyu.edu> A non-text attachment was scrubbed... Name: not available Type: text Size: 26639 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/51e48358/attachment.ksh From harnad at coglit.soton.ac.uk Mon Nov 23 14:59:16 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Mon, 23 Nov 1998 19:59:16 +0000 (GMT) Subject: Color & Consciousness: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article (see also 4 important announcements about new BBS policies at the end of this message) COLOR, CONSCIOUSNESS, AND THE ISOMORPHISM CONSTRAINT by Stephen E.
Palmer This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. _____________________________________________________________ COLOR, CONSCIOUSNESS, AND THE ISOMORPHISM CONSTRAINT Stephen E. Palmer Psychology Department University of California Berkeley, CA 94720-1650 palmer at cogsci.berkeley.edu http://socrates.berkeley.edu/~plab ABSTRACT: The relations among consciousness, brain, behavior, and scientific explanation are explored within the domain of color perception.
Current scientific knowledge about color similarity, color composition, dimensional structure, unique colors, and color categories is used to assess Locke's "inverted spectrum argument" about the undetectability of color transformations. A symmetry analysis of color space shows that the literal interpretation of this argument -- reversing the experience of a rainbow -- would not work. Three other color-to-color transformations might, however, depending on the relevance of certain color categories. The approach is then generalized to examine behavioral detection of arbitrary differences in color experiences, leading to the formulation of a principled distinction, called the isomorphism constraint, between what can and cannot be determined about the nature of color experience by objective behavioral means. Finally, the prospects for achieving a biologically based explanation of color experience below the level of isomorphism are considered in light of the limitations of behavioral methods. Within-subject designs using biological interventions hold the greatest promise for scientific progress on consciousness, but objective knowledge of another person's experience appears impossible. The implications of these arguments for functionalism are discussed. In this article I discuss the relations among mind, brain, behavior, and science in the particular domain of color perception. My reasons for approaching these difficult issues from the perspective of color experience are two-fold. First, there is a long philosophical tradition of debating the nature of internal experiences of color, dating from John Locke's (1690) discussion of the so-called "inverted spectrum argument". This intuitively compelling argument constitutes an important historical backdrop for much of the article. Second, color is perhaps the most tractable, best understood aspect of mental life from a scientific standpoint.
It demonstrates better than any other topic how a mental phenomenon can be more fully understood by integrating knowledge from many different disciplines (Kay & McDaniel, 1978; Thompson, 1995; Palmer, in press). In this article I turn once more to color for new insights into how conscious experience can be studied and understood scientifically. I begin with a brief description of the inverted spectrum problem as posed in classical philosophical terms. I then discuss how empirical constraints on the answer can be brought to bear in terms of the structure of human color experience as it is currently understood scientifically. This discussion ultimately leads to a principled distinction, called the isomorphism constraint, between what can and what cannot be determined about the nature of experience by objective behavioral means. Finally, I consider the prospects for achieving a biologically based explanation of color experience, ending with some speculations about limitations on what science can achieve with respect to understanding color experience and other forms of consciousness. ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.palmer.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.palmer ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.palmer To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.palmer When you have the file(s) you want, type: quit ____________________________________________________________ FOUR IMPORTANT ANNOUNCEMENTS ------------------------------------------------------------------ (1) There have been some extremely important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html http://www.cogsci.soton.ac.uk/~harnad/nature2.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is exceedingly simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically.
Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ Paper/Disk: Behavioral and Brain Sciences Department of Electronics and Computer Science New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html Nominations of books for BBS Multiple book review are also invited. From black at signal.dera.gov.uk Tue Nov 24 05:57:21 1998 From: black at signal.dera.gov.uk (John V. Black) Date: Tue, 24 Nov 98 10:57:21 +0000 Subject: IEE Colloquium on Applied Statistical Pattern Recognition Message-ID: IEE COLLOQUIUM ON APPLIED STATISTICAL PATTERN RECOGNITION Tuesday, 20 April 1999 to be held at The Midlands Engineering Centre, Birmingham, UNITED KINGDOM Professional Group E5 (Signal processing) and the Pattern & Information Processing Group, DERA Malvern, United Kingdom The innovative application of advanced pattern recognition techniques is increasingly required to achieve the high levels of performance demanded by modern systems.
It is therefore vital for engineers and scientists to develop their awareness and understanding of emerging and established techniques, and of their successful application. This colloquium will provide a focus for applied research in this area, drawing contributions from both industry and academia. It will increase awareness of developing techniques across a broad range of application areas, and will be of interest to people working in all areas of statistical pattern recognition, addressing applications drawn (for example) from signal and image processing. A number of invited papers will be presented by leading researchers: some of these will take the form of illustrative tutorial sessions. Additional contributions are welcome, and a one-page synopsis of proposed papers should be sent by Friday, 15 January 1999 to: Dr Amanda Goode Senior Research Scientist, Pattern & Information Processing DERA Malvern St Andrews Road Great Malvern Worcestershire WR14 3PS United Kingdom Tel: (+44) (0)1684 894366 Fax: (+44) (0)1684 894384 Email: goode at signal.dera.gov.uk From terry at salk.edu Wed Nov 25 02:54:38 1998 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 24 Nov 1998 23:54:38 -0800 (PST) Subject: NEURAL COMPUTATION 11:1 Message-ID: <199811250754.XAA21965@helmholtz.salk.edu> Neural Computation - Contents - Volume 11, Number 1 - January 1, 1999 *** A list of all reviewers for Vols 5-10 will be included in this 10th anniversary issue. REVIEW Evolution of Time Coding Systems C. E. Carr and M. A. Friedman ARTICLE Computing With Self-Excitatory Cliques: A Model and an Application to Hyperacuity-Scale Computation in Visual Cortex Steven Zucker and Douglas Miller NOTES Complex Response to Periodic Inhibition in Simple and Detailed Neuronal Models Corrado Bernasconi, Kaspar Schindler, Ruedi Stoop and Rodney Douglas Neuronal Tuning: To Sharpen or Broaden? Kechen Zhang and Terrence J. Sejnowski Narrow vs Wide Tuning Curves: What's Best for a Population Code?
Alexandre Pouget, Sophie Deneve, Jean-Christophe Ducom, and Peter Latham LETTERS The Effect of Correlated Variability on the Accuracy of a Population Code L. F. Abbott and Peter Dayan A Neural Network Model of Temporal Code Generation and Position Invariant Pattern Recognition Dean V. Buonomano and Michael Merzenich Probabilistic Synaptic Transmission in the Associative Net Bruce Graham and David Willshaw A Stochastic Self-Organizing Map for Proximity Data Thore Graepel and Klaus Obermayer High-Order Contrasts for Independent Component Analysis Jean-Francois Cardoso Variational Learning in Nonlinear Gaussian Belief Networks Brendan J. Frey and Geoffrey E. Hinton Propagating Distributions Up Directed Acyclic Graphs Eric B. Baum and Warren D. Smith Modeling and Prediction of Human Behavior Alex Pentland and Andrew Liu Analog VLSI-Based Modeling of the Primate Oculomotor System Timothy K. Horiuchi and Christof Koch JPEG Quality Transcoding Using Neural Networks Trained With a Perceptual Error Measure John Lazzaro and John Wawrzynek ----- ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 1999 - VOLUME 11 - 8 ISSUES

                 USA    Canada*   Other Countries
Student/Retired  $50    $53.50    $84
Individual       $82    $87.74    $116
Institution      $302   $323.14   $336
* includes 7% GST

(Back issues from Volumes 1-10 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From horn at neuron.tau.ac.il Wed Nov 25 11:11:03 1998 From: horn at neuron.tau.ac.il (David Horn) Date: Wed, 25 Nov 1998 18:11:03 +0200 (IST) Subject: NCST-99 Message-ID: Conference Announcement ------------------------- Neural Computation in Science and Technology --------------------------------------------- Israel, October 10-13, 1999 This conference, sponsored by the Israel Ministry of Science, aims to bring together scientists and practitioners of neural computation who are interested in scientific as well as technological applications. It will focus on two main general topics: A. The use of neural network techniques in various applications in science and technology. B. Modeling in computational neuroscience. The conference's dual focus on neuroscience and technological themes follows one of the fundamental ideas of the field of Computational Neuroscience, i.e., the belief that such cross-disciplinary interactions will be beneficial to researchers of both schools. Such cross-fertilization of ideas has already manifested itself in both directions over the last decade: On one hand, thinking about the brain in computational concepts has given rise to the development of artificial neural networks as leading computing devices for performing a multitude of technological tasks, such as classification, function approximation, prediction and control. On the other hand, basic engineering and technological methods such as principal component analysis, blind separation, learning vector quantization and reinforcement learning have all led to the development of neurally-based realizations. The latter, in turn, have given rise to many interesting ideas about the possible central role that such networks may play in the brain in a multitude of information processing tasks, varying from odor processing to reward-related action selection to self-organization of cortical maps, just to name a few. 
Moreover, new frontiers of neural computation, such as neuromorphic engineering and adaptive autonomous control, further demonstrate the growing need for technological-scientific interaction if the potential of neural computation is to be fully exploited. The conference will consist of a four-day meeting that will include tutorials, a series of invited and contributed talks, a poster session, and a discussion panel. An informal atmosphere will be maintained, encouraging questions and discussions. It will take place in a hotel that also serves as a conference center in Kibbutz Maale Hachamisha, located between Tel Aviv and Jerusalem, amid the pleasant scenery and weather of the Judean hills. Currently Confirmed Invited Speakers: -------------------------------------- D. Amit, Y. Baram, W. Bialek, E. Domany, R. Douglas, G. Dreyfus, D. Golomb, M. Hasselmo, J. Hertz, D. Horn, N. Intrator, I. Kanter, W. Kinzel, H. Oja, E. Ruppin, I. Segev, T. Sejnowski, H. Sompolinsky, N. Tishby, M. Tsodyks and V. Vapnik. Call for Papers ------------------ We hereby call for the submission of papers for oral or poster presentation at the conference. Papers should fit into one of the two general categories of the conference. Submitted papers should be limited to seven pages in length and indicate physical and e-mail addresses of all authors and the author to whom correspondence should be addressed. Four copies of the submitted papers should reach the conference scientific committee by May 15th, 1999. Mail submissions to Prof. David Horn, NCST-99, School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978, Israel. Scientific Organizing Committee --------------------------------- D. Horn (chairman), H. Abramowicz, A. Agranat, Y. Baram, I. Kanter, E. Ruppin, N. Tishby, M. Tsodyks, G. Ariely and B. Shimony.
Information and Registration: ----------------------------- Registration forms and information about the conference are available at the website http://neuron.tau.ac.il/NCST-99 ---------------------------------------------------------------------------- Prof. David Horn horn at neuron.tau.ac.il School of Physics and Astronomy http://neuron.tau.ac.il/~horn Tel Aviv University Tel: ++972-3-642-9305, 640-7377 Tel Aviv 69978, Israel. Fax: ++972-3-640-7932 From gorr at willamette.edu Wed Nov 25 13:21:42 1998 From: gorr at willamette.edu (Jenny Orr) Date: Wed, 25 Nov 1998 10:21:42 -0800 (PST) Subject: Book Announcement: Tricks of the Trade Message-ID: <199811251821.KAA16692@mirror.willamette.edu> Dear colleagues, The following book is now available: Neural Networks: Tricks of the Trade, edited by Genevieve B. Orr and Klaus-Robert Müller LNCS 1524, Springer Heidelberg (1998) http://www.springer.de/comp/lncs/neuralnw/index.html (contains ordering and other information) Some background information: The idea for this book dates back to our 1996 NIPS workshop on tricks of the trade (http://www.first.gmd.de/persons/Mueller.Klaus-Robert/nipsws.htm). People who work with neural networks acquire, through experience and word-of-mouth, techniques and heuristics that help them successfully apply neural networks to difficult real-world problems. Often these tricks are theoretically well motivated. Sometimes they are the result of trial and error. However, their most common link is that they are usually hidden in people's heads or in the back pages of space-constrained conference papers. As a result, newcomers to the field waste much time wondering why their networks train so slowly and perform so poorly. The tricks book tries to collect all this interesting and useful information about how to train neural networks.
For a glimpse of the table of contents see: http://www.springer.de/comp/lncs/neuralnw/contents.pdf Contributors are: Anderson, Back, Bottou, Burns, Caruana, Denker, Finke, Flake, Fritsch, Giles, Hansen, Hirzinger, Horn, Intrator, Larsen, Lawrence, LeCun, Lyon, Müller, Moody, Naftaly, Neuneier, Orr, Plate, Prechelt, Rögnvaldsson, Schraudolph, Simard, van der Smagt, Svarer, Tsoi, Victorri, Webb, Yaeger, Zimmermann. Best wishes, Klaus-Robert Müller & Genevieve B. Orr From shultz at psych.mcgill.ca Wed Nov 25 13:57:18 1998 From: shultz at psych.mcgill.ca (Tom Shultz) Date: Wed, 25 Nov 1998 13:57:18 -0500 Subject: Academic position at McGill Message-ID: <3.0.2.32.19981125135718.007a2430@psych.mcgill.ca> Dear Connectionists, Here is a notice for an academic position at McGill University that may interest you or your colleagues or students. Please feel free to circulate this notice. MCGILL UNIVERSITY DEPARTMENT OF PSYCHOLOGY The Department of Psychology of McGill University seeks applicants for a tenure-track position at the Assistant Professor level in Cognitive Psychology, broadly construed. We seek applicants with a strong program of research and teaching in areas such as cognitive psychology, computational modeling, decision-making, and cognitive neuroscience. The Department has excellent facilities for interdisciplinary research through its links with McGill Cognitive Science, the Montreal Neurological Institute, and related departments. The deadline for receipt of completed applications is February 1, 1999, with an anticipated starting date of September 1, 1999. Applicants should arrange for three confidential letters of recommendation to be sent to the address below. Statements of current and proposed areas of research and teaching, curriculum vitae, selected reprints, and other relevant material should also be sent to: Thomas Shultz, Chair, Cognitive Psychology Search Committee, Department of Psychology, McGill University, 1205 Dr.
Penfield Avenue, Montreal, Quebec, Canada H3A 1B1. In accordance with Canadian immigration requirements, priority will be given to Canadian citizens and permanent residents of Canada. McGill University is committed to equity in employment. Although priority is given to Canadian citizens and permanent residents of Canada, final decisions will be based on the relative excellence of candidates. Therefore, non-Canadians should seriously consider applying if they are interested in the position. Neural network modelers with an interest in cognition and neuroscience are particularly encouraged to apply. Cheers, Tom -------------------------------------------------------- Thomas R. Shultz, Professor, Department of Psychology, McGill University, 1205 Penfield Ave., Montreal, Quebec, Canada H3A 1B1. E-mail: shultz at psych.mcgill.ca http://www.psych.mcgill.ca/labs/lnsc/html/Lab-Home.html Phone: 514 398-6139 Fax: 514 398-4896 -------------------------------------------------------- From sd136 at umail.umd.edu Wed Nov 25 11:45:46 1998 From: sd136 at umail.umd.edu (Sandy Davis) Date: Wed, 25 Nov 1998 11:45:46 -0500 (Eastern Standard Time) Subject: Computational neuroscientist faculty position Message-ID: Computational Neuroscientist. The Neuroscience and Cognitive Science Program at the Univ. of Maryland (www.life.umd.edu/nacs) seeks tenure-track faculty. Areas of research include sensory and motor physiology, analysis of control systems or cognitive neuroscience. A wide range of techniques can be supported, from neural network modeling to brain mapping/imaging/EEG. Candidates who integrate theoretical with experimental research are preferred. Tenure will be held in the Depts. of Psychology or Kinesiology. Teaching duties include a graduate course in computational neuroscience and undergraduate course(s). For appointment in Psychology, the rank will be Assistant Professor. For Kinesiology, rank is open. For best consideration send, before Feb. 
1, 1999, a CV, addresses (including emails) of three references, copies of publications, and statements of both research (documenting any previous extramural funding) and teaching interests to: Dr. Richard Payne, NACS Search, 2239 Zoo/Psych Building, University of Maryland, College Park, MD 20742. Women and minorities are strongly encouraged to apply. The University of Maryland is an Equal Opportunity/Affirmative Action Employer. Sandy Davis, Graduate Secretary Neuroscience and Cognitive Science Program 2239 Zoo/Psych Building Univ. of Maryland, College Park, MD 20742-4415 phone: (301) 405-8910 fax: (301) 314-9358 email: sd136 at umail.umd.edu From Dave_Touretzky at cs.cmu.edu Thu Nov 26 02:07:22 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Thu, 26 Nov 1998 02:07:22 -0500 Subject: CNS*99 CALL FOR PAPERS and CNS*99 REGISTRATION FORM Message-ID: <23440.912064042@skinner.boltz.cs.cmu.edu> CALL FOR PAPERS Eighth Annual Computational Neuroscience Meeting CNS*99 July 18 - 22, 1999, Pittsburgh, Pennsylvania Co-sponsored by the Center for the Neural Basis of Cognition at Carnegie Mellon University and the University of Pittsburgh DEADLINE FOR SUMMARIES AND ABSTRACTS: **>> 11:59pm January 26, 1999 <<** This is the eighth annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. Peer-reviewed papers are presented, all related to understanding how nervous systems compute. As in previous years, CNS*99 will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation.
The meeting in 1999 will take place at the Pittsburgh Hilton and Towers Hotel in downtown Pittsburgh, Pennsylvania. The first session starts at 9 am, Sunday, July 18th and ends with the annual banquet (and river cruise) on Thursday evening, July 22nd. There will be no parallel sessions. The meeting will include time for informal workshops focused on current issues in computational neuroscience. Pittsburgh International Airport is a prime hub for US Airways and enjoys direct flights from most major US cities. The conference hotel is located at Point State Park downtown, where the Allegheny and Monongahela rivers meet to form the Ohio. Having undergone a major renaissance in the 1980s, Pittsburgh is a high-tech town that still honors its blue-collar roots. You'll find clubs, restaurants, and cultural facilities comparable to most east coast US cities, but also some attractions unique to Pittsburgh's heritage as the "Steel City". SUBMISSION INSTRUCTIONS With this announcement we solicit the submission of papers, all of which will be refereed. Peer review will be conducted based on a 1000-word (or less) summary describing the methods, nature, and importance of your results. Authors will be notified of acceptance by the second week of May, 1999. As with last year's meeting, all submission of papers will be performed electronically using a custom-designed Java/HTML interface. Full instructions for submission can be found at the meeting web site: http://numedeon.com/cns-meetings/CNS99/. In brief, authors should cut and paste text from their own word processors into the forms available on the web site. It is important that all requested information be provided, including a 100-word abstract for publication in the conference program, all author information, and selection of the appropriate category and theme from the list provided. Authors should especially note the mechanisms used for figures and mathematical equations.
All submissions will be acknowledged immediately by email. Program committee decisions will be sent to the designated correspondence author only. Submissions will not be considered if they lack category information, abstracts, or author addresses, or if they arrive late. FURTHER MEETING CORRESPONDENCE We would like to strongly encourage authors to make their submissions electronically. However, if you do not have ready access to the internet or a web server, we will send you instructions for paper submission if you contact us either by email or at the following address: CNS*99 Division of Biology 216-76 Caltech Pasadena, CA 91125 ADDITIONAL INFORMATION concerning the meeting, including hotel and travel arrangements or information on past meetings can be obtained by: o Using our on-line WWW information and registration server, URL of: http://numedeon.com/cns-meetings/CNS99/ o Sending Email to: cns99 at bbb.caltech.edu CNS*99 ORGANIZING COMMITTEE: Co-meeting Chair / Logistics - David Touretzky, Carnegie Mellon University Co-meeting Chair / Finances and Program - Jim Bower, Caltech Governmental Liaison - Dennis Glanzman, NIMH/NIH Workshop Organizer - To be Announced 1999 Program Committee: Erik De Schutter, University of Antwerp, Belgium Anders Lansner, Royal Institute of Technology, Sweden Chip Levy, University of Virginia Ray Glanz, Rice University David Touretzky, Carnegie Mellon University Gina Turrigiano, Brandeis University Ranu Jung, University of Kentucky Simon Thorpe, CNRS, Toulouse, France 1999 Regional Organizers: Europe- Erik DeSchutter (Belgium) Middle East - Idan Segev (Jerusalem) Down Under - Mike Paulin (New Zealand) South America - Marcelo and Marilene Mazza (Brazil) Asia - Zhaoping Li (London) India - Upinder Bhalla (Bangalore) XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX CNS*99 REGISTRATION FORM Last Name: First Name: Position: Student___ Graduate Student___ Post Doc___ Professor___ Committee Member___ Other___ Organization: Address: City: 
State: Zip: Country: Telephone: Email Address: REGISTRATION FEES: Technical Program --July 18 - July 22, 1999 Regular $275 ($300 after July 3rd) Price Includes One Banquet/Cruise Ticket Student $125 ($150 after July 3rd) Price Includes One Banquet/Cruise Ticket Each Additional Banquet/Cruise Ticket $35 Total Payment: $ Please Indicate Method of Payment: * Check or Money Order payable in U. S. Dollars to CNS*99 - Caltech (Please make sure to indicate CNS*99 and YOUR name on all money transfers.) * Charge my credit card. Cards the Caltech Ticket Office accepts are: Visa, Mastercard, or American Express Card number: Expiration Date: Name of Cardholder: Signature as appears on card (for mailed in applications): Date: ADDITIONAL QUESTIONS: Previously Attended: CNS*92___ CNS*93___ CNS*94___ CNS*95___ CNS*96___ CNS*97___ CNS*98___ Did you submit an abstract and summary? ( ) Yes ( ) No Title: Do you have special dietary preferences or restrictions (e.g., diabetic, low sodium, kosher, vegetarian)? If so, please note: Some grants to cover partial travel expenses may become available for students and postdoctoral fellows who present papers at the meeting. Do you wish further information? ( ) Yes ( ) No PLEASE CALL PITTSBURGH HILTON AND TOWERS TO MAKE HOTEL RESERVATIONS AT (412) 391-4600 Fax (412) 594-5144 PLEASE FAX OR MAIL REGISTRATION FORM TO: Caltech, Division of Biology 216-76, Pasadena, CA 91125 Attn: Judy Macias Fax Number: (626) 795-2088 (Refund Policy: 50% refund for cancellations on or before July 9; no refund after July 10th) From srx014 at coventry.ac.uk Fri Nov 27 11:24:53 1998 From: srx014 at coventry.ac.uk (Colin Reeves) Date: Fri, 27 Nov 1998 16:24:53 +0000 (GMT) Subject: ICANNGA99 Message-ID: Reminder: 4th International Conference on Artificial Neural Networks and Genetic Algorithms, to be held on 6th-9th April, 1999 at Portoroz, Slovenia, hosted by the University of Ljubljana. 
Plenary talks by Kevin Warwick, Hugo De Garis with Marco Tomassini and Michael Korkin, and Ingo Rechenberg. Tutorial sessions are planned on 6th April on approximations of multi-variable functions by neural networks (Vera Kurkova), Multi-objective optimisation using GAs (Kalyan Deb), Alternative viewpoints on GAs (Colin Reeves), Neural networks and Fuzzy Logic (Nigel Steele). Full programme of accepted talks plus a visit to the world-famous Postojna cave and Lipica stud...also an optional post-conference visit to Venice by sea. For latest details and registration information see http://cherry.fri.uni-lj.si/icannga99.html or send an email to icannga at fri.uni-lj.si From Annette_Burton at Brown.edu Mon Nov 30 08:17:36 1998 From: Annette_Burton at Brown.edu (Annette Burton) Date: Mon, 30 Nov 1998 09:17:36 -0400 Subject: Program and Job Announcement Message-ID: Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY GRADUATE TRAINING PROGRAM in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES Deadline for Applications: January 1, 1999 Brown University is actively recruiting graduate students for a new NSF-supported Interdisciplinary Graduate Education, Research and Training (IGERT) program in "Learning and Action in the Face of Uncertainty: Cognitive, Computational and Statistical Approaches". The use of probabilistic models and statistical methods has had a major impact on our understanding of language, vision, action, and reasoning. This training program provides students with the opportunity to integrate a detailed study of human or artificial systems for language acquisition and use, visual processing, action, and reasoning with appropriate mathematical and computational models. Students will be enrolled in one of the three participating departments (Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science) and will study an interdisciplinary program of courses in topics such as statistical estimation, cognitive processes, linguistics, and computational models. The aim of this program is to provide promising students with a mix of mathematical, computational and experimental expertise to carry out multidisciplinary collaborative research across the disciplines of Applied Mathematics, Computer Science, and Cognitive Science. Interested students should apply to the participating department closest to their area of interest and expertise, and should indicate their interest in the IGERT training program in their application.
Brown University is an Equal Opportunity/Affirmative Action Employer. For additional information about the program, application procedures, and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT or contact Dr. Julie Sedivy Department of Cognitive & Linguistic Sciences Brown University, Box 1978 Providence, RI 02912 USA Julie_Sedivy at brown.edu ___________________________________________________________________ Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY POSTDOCTORAL OPPORTUNITY in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES As part of an NSF award to Brown University through the IGERT program, the Departments of Cognitive and Linguistic Sciences, Computer Science, and Applied Mathematics will be hiring two Postdoctoral Research Associates. Fellows will be scholars who have displayed significant interest and ability in conducting collaborative interdisciplinary research in one or more of the research areas of the program: computational and empirical approaches to uncertainty in language, vision, action, or human reasoning. As well as participating in collaborative research, responsibilities will include helping to coordinate cross-departmental graduate teaching and research as well as some teaching of interdisciplinary graduate courses. We expect that these fellows will play an important role in creating a highly visible presence for the IGERT program at Brown, and their interdisciplinary activities will help unify the interdepartmental activities of the IGERT program. Applicants must hold a PhD in Cognitive Science, Linguistics, Computer Science, Mathematics, Applied Mathematics, or a related discipline, or show evidence that the PhD will be completed before the start of the position. 
Applicants should send a vita and three letters of reference to Steven Sloman, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, RI 02912. Special consideration will be given to those applicants whose research is relevant to at least two of the participating departments. The positions will begin September 1, 1999 for one year, renewable upon satisfactory completion of duties in the first year. Salaries will be between $35,000 and $45,000 per year. All materials must be received by Jan. 1, 1999, for full consideration. Brown University is an Equal Opportunity/Affirmative Action Employer. For additional information about the program and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT From lawrence at research.nj.nec.com Sun Nov 1 17:22:08 1998 From: lawrence at research.nj.nec.com (Steve Lawrence) Date: Sun, 1 Nov 1998 17:22:08 -0500 Subject: ------ Scientist/RA position in learning/information retrieval ------- Message-ID: <19981101172208.B19809@research.nj.nec.com> ------ Scientist/RA position in learning/information retrieval ------- The NEC Research Institute has an immediate opening for a Scientist/Research Associate in the area of learning and/or information retrieval. NEC Research Institute is a basic research laboratory located in Princeton, NJ, a short drive from Princeton University. For more information, see http://www.neci.nj.nec.com/ This research focuses on basic issues in learning and information retrieval and dissemination, with an emphasis on the World Wide Web. A sample of recent research publications includes: S. Lawrence, C.L. Giles, Searching the World Wide Web, Science, 280, p. 98. 1998. S. Lawrence, C.L. Giles, Context and Page Analysis for Improved Web Search, IEEE Internet Computing, Volume 2, Number 4, pp. 38-46, 1998. C.L. Giles, K. Bollacker, S. Lawrence, CiteSeer: An Automatic Citation Indexing System, The 3rd ACM Conference on Digital Libraries, pp. 
89-98, 1998 [shortlisted for best paper award]. S. Lawrence, C.L. Giles, A.C. Tsoi, A.D. Back, Face Recognition: A Convolutional Neural Network Approach, IEEE Transactions on Neural Networks, 8, 1, pp. 98-113, 1997. S. Lawrence, A.D. Back, A.C. Tsoi, C.L. Giles, On the Distribution of Performance from Multiple Neural Network Trials, IEEE Transactions on Neural Networks, 8, 6, pp. 1507-1517, 1997. These and other papers are available from: http://www.neci.nj.nec.com/homepages/lawrence/ See also: http://www.neci.nj.nec.com/homepages/lawrence/citeseer.html http://www.neci.nj.nec.com/homepages/lawrence/websize.html Candidates must have experience in research and be able to effectively communicate research results through publications. Successful candidates will have experience with information retrieval and/or machine learning (e.g. neural networks). Proficiency with Unix/Linux and the software implementation of algorithms is a must. Tasks will also involve code maintenance, modification and enhancement as required by the research program. The Institute provides an outstanding research environment with many recognized experts and excellent resources plus a competitive salary and a strong emphasis on open publication of results. The successful candidate will combine a genuine desire to excel in research with the above attributes. Interested applicants should send their resumes with names of references by email, mail or fax to: Dr. Steve Lawrence Computer Science NEC Research Institute 4 Independence Way Princeton NJ 08540 Phone: (609) 951 2676 Fax: (609) 951 2482 lawrence at research.nj.nec.com http://www.neci.nj.nec.com/homepages/lawrence/ The position will remain open until filled. Applicants must show documentation of eligibility for employment. NEC is an equal opportunity employer. 
EOE -- Steve Lawrence - http://www.neci.nj.nec.com/homepages/lawrence/ From ucganlb at ucl.ac.uk Mon Nov 2 11:00:53 1998 From: ucganlb at ucl.ac.uk (Neil Burgess - Anatomy UCL London) Date: Mon, 02 Nov 1998 16:00:53 +0000 Subject: New Book Message-ID: <19536.199811021600@socrates-a.ucl.ac.uk> THE HIPPOCAMPAL AND PARIETAL FOUNDATIONS OF SPATIAL COGNITION Neil Burgess, Kate Jeffery, John O'Keefe (eds.) Oxford University Press, 1998. Hardback ISBN: 0-19-852453-6 UK Price: 60.00 pounds Paperback ISBN: 0-19-852452-8 UK Price: 29.50 pounds For ordering see: http://www.oup.co.uk/ or the OUP stand at the Society for Neuroscience meeting Preface: Striking recent progress has been made towards an understanding of the neural basis of spatial cognition, centred on two areas of the brain: the hippocampal formation and the parietal cortex. This book includes a comprehensive sample of recent research into how these two areas work, either alone or in cooperation with each other, to support spatial cognition. The research presented here is necessarily interdisciplinary, including consideration of the effects of brain damage in humans, functional imaging of the human brain, electrophysiological recording of single neurones, and computer simulation of the action of the networks of neurons in these brain areas. The first chapter of the book provides an overall introduction to the field and to the substance of each of the remaining chapters. In this introductory chapter we also present a framework in which to consider the seemingly diverse spatial and mnemonic functions of the hippocampal formation and parietal cortex. This book should provide a useful starting point and reference for researchers and students of neuroscience, psychology or cognitive science who have an interest in spatial cognition. Contents: INTRODUCTION 1. Integrating hippocampal and parietal functions: a spatial point of view N Burgess, KJ Jeffery, J O'Keefe. pp. 3-31 PARIETAL CORTEX 2. 
Spatial frames of reference and somatosensory processing: a neuropsychological perspective GI Vallar 33-49 3. Spatial orientation and the representation of space with parietal lobe lesions HO Karnath 50-66 4. Egocentric and object-based visual neglect J Driver 67-89 5. Multimodal integration for the representation of space in the posterior parietal cortex RA Andersen 90-103 6. Parietal cortex constructs action-oriented spatial representations CL Colby 104-126 7. A new view of hemineglect based on the response properties of parietal neurones A Pouget, TJ Sejnowski 127-147 THE HIPPOCAMPAL FORMATION 8. Robotic and neuronal simulation of the hippocampus and rat navigation N Burgess, JG Donnett, KJ Jeffery, J O'Keefe 149-166 9. Dissociation of exteroceptive and idiothetic orientation cues: effect on hippocampal place cells and place navigation J Bures, AA Fenton, Y Kaminski, J Rossier, B Sacchetti, L Zinyuk 167-185 10. Variable place-cell coupling to a continuously viewed stimulus: evidence that the hippocampus acts as a perceptual system A Rotenberg, RU Muller 186-202 11. Separating hippocampal maps AD Redish, DS Touretzky 203-219 12. Hippocampal synaptic plasticity: role in spatial learning or the automatic encoding of attended experience? RGM Morris, U Frey 220-246 13. Right medial temporal-lobe contribution to object-location memory B Milner, I Johnsrude, J Crane 247-258 14. The hippocampus and spatial memory in humans RG Morris, JA Nunn, S Abrahams, JD Feigenbaum, M Recce 259-289 15. Hierarchical organisation of cognitive memory M Mishkin, WA Suzuki, DG Gadian, F Vargha-Khadem 290-303 INTERACTIONS BETWEEN PARIETAL AND HIPPOCAMPAL SYSTEMS IN SPACE AND MEMORY 16. Memory reprocessing in corticocortical and hippocampocortical neuronal ensembles YL Qin, BL McNaughton, WE Skaggs, CA Barnes 305-319 17. The representation of space in the primate hippocampus, and its role in memory ET Rolls 320-344 18. 
Amnesia and neglect: beyond the Delay-Brion system and the Hebb synapse D Gaffan, J Hornak 345-358 19. Representation of allocentric space in the monkey frontal lobe CR Olson, SN Gettner, L Tremblay 359-380 20. Parietal and hippocampal contribution to topokinetic and topographic memory A Berthoz 381-403 21. Hippocampal involvement in human topographical memory: evidence from functional imaging EA Maguire 404-415 22. Parietal cortex and hippocampus: from visual affordances to the world graph MA Arbib 416-442 23. Visuospatial processing in a pure case of visual-form agnosia AD Milner, HC Dijkerman, DP Carey 443-466 INDEX From l.s.smith at cs.stir.ac.uk Mon Nov 2 04:38:19 1998 From: l.s.smith at cs.stir.ac.uk (Dr L S Smith (Staff)) Date: Mon, 2 Nov 98 09:38:19 GMT Subject: ICANN 99 - Ninth International Conference on Artificial Neural Networks Message-ID: <199811020938.JAA04776@tinker.cs.stir.ac.uk> ICANN 99 - Ninth International Conference on Artificial Neural Networks incorporating the IEE Conference on Artificial Neural Networks University of Edinburgh, Scotland, UK: 7 - 10 September 1999 First Call for Papers The conference aims to bring together researchers from academia, industry and commerce in the broad field of neural computation, spanning disciplines from computational neurobiology to engineering in what is the largest European event ever to be held in this field. It is intended to create a focus for European research and to foster dialogue between academic researchers and industrial/commercial users of a still developing technology. 
Scope Theory and Algorithms Neurobiology and Computational Neuroscience Cognitive Modelling Industrial, Commercial and Medical Applications Hardware and Neuromorphic Engineering Control, Robotics and Adaptive Behaviour Important dates: 1 February 1999 Deadline for the receipt of papers for assessment (6 pages) 29 March 1999 Notification of acceptance to authors 14 May 1999 Final camera-ready papers must be received Papers can be submitted electronically: see the WWW page for details. Scientific Committee Co-Chairs: Professor David Willshaw, University of Edinburgh Professor Alan Murray, University of Edinburgh For more information see http://www.iee.org.uk/Conf/ICANN or email icann99 at iee.org.uk (Leslie Smith, Department of Computing Science, University of Stirling, Scotland) From mousset at sedal.usyd.edu.au Tue Nov 3 00:48:08 1998 From: mousset at sedal.usyd.edu.au (Eric Mousset) Date: Tue, 03 Nov 1998 16:48:08 +1100 Subject: Workshop 'Understanding the Brain and Engineering Models' Message-ID: <363E9918.678CBF50@sedal.usyd.edu.au> WORKSHOP ANNOUNCEMENT UNDERSTANDING THE BRAIN AND ENGINEERING MODELS University of Sydney, Australia January 18th and 19th, 1999 OUTLINE ======= Over the last decade, research activities in the domain of neuromorphic engineering and brain function modelling have significantly increased. This workshop seeks to bring together a diverse group of researchers to critically examine the progress made so far in this challenging research area. We intend to address the following issues: 1. Sensorimotor integration in a broad sense and, more particularly, 2. Computational models of brain centres such as the cerebellum, basal ganglia and the superior colliculus, 3. Computational models of the visual and auditory pathways, 4. Microelectronic and other real-time implementations of such models and their incorporation in roving systems. 
KEYNOTE SPEAKER ================ Terrence Sejnowski Howard Hughes Medical Institute, Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, CA, USA TENTATIVE LIST OF SPEAKERS ============================= Abdesselam Bouzerdoum Edith Cowan University, Perth, WA, Australia Simon Carlile Department of Physiology, Sydney University, NSW, Australia Olivier Coenen The Salk Institute for Biological Studies, CA, USA & ANRL Children's Hospital Ralph Etienne-Cummings Department of Electrical and Computer Engineering, Johns Hopkins University, MD, USA Evian Gordon Westmead Hospital, The University of Sydney, NSW, Australia Marwan Jabri School of Electrical and Information Engineering, Sydney University, NSW, Australia Craig Jin School of Electrical and Information Engineering, Sydney University, NSW, Australia Eric Mousset School of Electrical and Information Engineering, Sydney University, NSW, Australia Andre Van Schaik Department of Physiology, Sydney University, NSW, Australia WHO SHOULD ATTEND? ================== The following groups of participants are targeted: (a) Visual/auditory psychophysicists and neurobiologists; (b) Computational modellers and engineers, including those interested in real-time software and/or VLSI implementation. REGISTRATION FEE & FORM ======================= Registration fee is AU$ 100 (normal) / AU$ 50 for students. It includes coffee, workshop barbecue and workshop handouts. A registration form is available at http://www.ee.usyd.edu.au/events_news/ubem99reg.html FURTHER INFORMATION ==================== Further information (registration form, detailed program, accommodation arrangements, transport, etc.) 
is available at http://www.sedal.usyd.edu.au/~mousset/UBEM99/ Queries can be directed to Eric Mousset: mousset at ee.usyd.edu.au or +62-1-93517208 ORGANISERS ========== Eric Mousset (mousset at ee.usyd.edu.au), Sydney University Marwan Jabri (marwan at ee.usyd.edu.au), Sydney University and Ralph Etienne-Cummings (etienne at ece.jhu.edu), Johns Hopkins University From mieko at hip.atr.co.jp Thu Nov 5 02:03:32 1998 From: mieko at hip.atr.co.jp (Mieko Namba) Date: Thu, 5 Nov 1998 16:03:32 +0900 Subject: Submission Deadline is SOON [Neural Networks 1999 Special Issue] Message-ID: <199811050703.QAA11163@mailhost.hip.atr.co.jp> Dear members, DEADLINE for submission: December 1st, 1998 We would like to draw your attention to the quickly approaching submission deadline for the "Neural Networks 1999 Special Issue". This year, the publication will be edited by the Japanese Neural Networks Society. We plan to publish many originally contributed articles in addition to invited articles. We are looking forward to receiving your contributions. Mitsuo Kawato Co-Editor-in-Chief Neural Networks (ATR Human Information Proc. Res. Labs.) ****************************************************************** CALL FOR PAPERS ****************************************************************** Neural Networks 1999 Special Issue "Organisation of Computation in Brain-like Systems" ****************************************************************** Submission: Deadline for submission: December 1st, 1998 Notification of acceptance: March 1st, 1999 Format: as for normal papers in the journal (APA format) and no longer than 10,000 words Co-Editors: Professor Gen Matsumoto, BSI, RIKEN, Japan Professor Edgar Koerner, HONDA R&D, Europe Dr. Mitsuo Kawato, ATR Human Information Processing Res. Labs., Japan Address for Papers: Dr. Mitsuo Kawato ATR Human Information Processing Research Laboratories 2-2 Hikaridai, Seika-cho Soraku-gun, Kyoto 619-0288, Japan. 
****************************************************************** In recent years, neuroscience has made a big leap forward regarding both investigation methodology and insights into local mechanisms for processing sensory information in the brain. The fact that we still do not know much better than before what happens in the brain when one recognises a familiar person, or moves around navigating seemingly effortlessly through a busy street, points to the fact that our models still do not describe essential aspects of how the brain organises computation. The investigation of the behaviour of fairly homogeneous ANS (artificial neural systems) composed of simple elementary nodes fostered the awareness that architecture matters: Algorithms implemented by the respective neural system are expressed by its architecture. Consequently, the focus is shifting to better understanding of the architecture of the brain and of its subsystems, since the structure of those highly modularised systems represents the way the brain organises computation. Approaching the algorithms expressed by those architectures may offer us the capability not only to understand the representation of knowledge in a neural system made under well defined constraints, but to understand the control that forces the neural system to make representations of behaviourally relevant knowledge by generating dynamic constraints. This special issue will bring together invited papers and contributed articles that illustrate the shifting emphasis in neural systems modelling to more neuroarchitecture-motivated systems that include this type of control architecture. Local and global control algorithms for organisation of computation in brain-like systems cover a wide field of topics. Abduction of control principles inherent in the architectures that mediate interaction within the cortex, between cortex-thalamus, cortex-hippocampus and other parts of the limbic system is one of the targets. 
Of particular importance are the rapid access to stored knowledge and the management of conflicts in response to sensory input, the coding and representation in a basically asynchronous mode of processing, the decomposition of problems into a reasonable number of simpler sub-problems, and the control of learning -- including the control which specifies what should be learned, and how to integrate the new knowledge into the relational architecture of the already acquired knowledge representation. Another target of that approach is the attempt to understand how these controls and the respective architectures emerged in the process of self-organisation during phylogenetic and ontogenetic development. Placing the cognitive behaviour of neural systems at the focus of investigation is a prerequisite for the described approach, which will promote both creating computational hypotheses for neurobiology and implementing robust and flexible computation in ANS. ****************************************************************** end. ========================================================= Mieko Namba Secretary to Dr. Mitsuo Kawato Editorial Administrator of NEURAL NETWORKS ATR Human Information Processing Research Laboratories 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-1008 E-MAIL mieko at hip.atr.co.jp ========================================================= From austin at minster.cs.york.ac.uk Thu Nov 5 09:17:53 1998 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Thu, 5 Nov 1998 14:17:53 +0000 Subject: Lectureship in Neural networks Message-ID: <9811051417.ZM6169@minster.cs.york.ac.uk> Lectureship in Neural Networks and/or Computer Vision Applications are invited for a post of Lecturer in the Department of Computer Science, University of York, UK. The Department is one of the premier Computer Science Departments in the UK, with the highest research and teaching gradings. 
The Lecturer will be expected to be an active researcher in neural networks and/or computer vision with a strong interest in teaching the design and construction of micro computer systems or in networks and distributed systems. The successful applicant will join one of the UK's strongest groups undertaking research in neural networks, computer vision and advanced architectures that support them. Housed in a new building, the work of the 30-strong research and lecturing team can be found at the web page http://www.cs.york.ac.uk/arch/ The post is available immediately on the Lecturer scale Grade A (16,655 to 21,815 pounds Sterling per annum) or Grade B (22,726 to 29,048 per annum), depending on age and experience. Informal enquiries may be made to Prof. Jim Austin (Head of the Advanced Computer Architectures Group, austin at cs.york.ac.uk, 01904 432734) or Dr. Keith Mander (Head of Department, mander at cs.york.ac.uk, 01904 432734). Six copies of applications with full curriculum vitae and the names of three referees should be sent by Monday 23 November 1998 to the Personnel Officer, University of York, Heslington, York YO10 5DD. Applicants should consult the further particulars which are available from http://www.cs.york.ac.uk/~mander/partics.html. Applicants should quote the reference number X/3074. -- Professor Jim Austin, Department of Computer Science, University of York, York, YO1 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch/ From ogawa at cs.titech.ac.jp Wed Nov 4 06:04:50 1998 From: ogawa at cs.titech.ac.jp (Hidemitsu Ogawa) Date: Wed, 04 Nov 1998 20:04:50 +0900 Subject: Ninth T.I.T. Brain Research Symposium Message-ID: <199811041104.UAA01071@hilbert.cs.titech.ac.jp> Appended below is the program for the Ninth T.I.T. Brain Research Symposium. Hidemitsu Ogawa The Ninth T.I.T. 
Brain Research Symposium General Chair -------------------------------------------------------------------- ************************************************************* The Ninth T.I.T. Brain Research Symposium ************************************************************* December 11, 1998 Tokyo Institute of Technology Ferrite Room of Centennial Memorial Hall, Oookayama Campus -------------------------------------------------------------------- 10:00-10:10 Opening Hidemitsu Ogawa (T.I.T. Graduate School of Information Science and Engineering) 10:10-11:10 Synapse mechanism of pheromonal memory (Invited) Masumi Ichikawa (Tokyo Metropolitan Institute for Neuroscience) 11:10-11:50 Manipulation technology of individual molecules for brain research: A challenge in the nanoworld Atsushi Ikai (T.I.T. Faculty of Biosciences and Biotechnology) --- Lunch --- 13:00-13:40 Study of learning circuit for odor sensor using 1-bit-stream data processing circuit Takamichi Nakamoto, Satoshi Kawamura, and Toyosaka Moriizumi (T.I.T. Faculty of Engineering) 13:40-14:20 Holographic neural computation with weighted sum performed by wave interference: Simulation by software models and hardware implementation Itsuo Kumazawa (T.I.T. Graduate School of Information Science and Engineering) 14:20-15:00 Neural networks and motivation: What will make it do? Yukio Kosugi (T.I.T. Interdisciplinary Graduate School of Science and Engineering) --- Break --- 15:20-16:00 Recent advances in probabilistic analysis for complex learning machines Sumio Watanabe (T.I.T. Precision and Intelligence Laboratory) 16:00-17:00 Dynamic independent component analysis in application to biomedical signal processing: Recent results and open problems (Invited) Andrzej Cichocki (RIKEN Brain Science Institute) (admission: free) ------------------------------------------------------------------------- General Chair: Prof. 
Hidemitsu Ogawa (Tokyo Institute of Technology, Graduate School of Information Science and Engineering) Secretariat: Dr. Yukio Kosugi / Email: kosugi at pms.titech.ac.jp ------------------------------------------------------------------------- From seung at mit.edu Wed Nov 4 01:36:52 1998 From: seung at mit.edu (H. Sebastian Seung) Date: Wed, 4 Nov 1998 01:36:52 -0500 (EST) Subject: Postdoctoral research positions available Message-ID: <199811040636.BAA22159@life.ai.mit.edu> Postdoctoral research positions: The Seung Lab for Theoretical Neurobiology Department of Brain and Cognitive Sciences The Massachusetts Institute of Technology The Seung Lab invites applications for postdoctoral research positions, to be filled between January and September 1999. Applicants should ideally have experience in theoretical neurobiology or machine learning, and strong training in the physical sciences or engineering. The Seung Lab specializes in the theoretical study of learning and memory in neural networks, both biological and artificial. More information about our activities can be found at Applications should include a CV, research statement, copies of relevant publications, and three letters of recommendation. Send applications to: Prof. Sebastian Seung Dept. of Brain & Cognitive Sciences MIT, E25-210 Cambridge, MA 02139 seung at mit.edu From a_browne at europa.nene.ac.uk Tue Nov 3 03:37:53 1998 From: a_browne at europa.nene.ac.uk (Tony Browne) Date: Tue, 03 Nov 1998 08:37:53 +0000 Subject: CFP: Symbol Processing Message-ID: <74415643E3@europa.mmb.nene.ac.uk> Call for Papers: Special Issue on Connectionist Symbol Processing For the journal: Expert Systems: The International Journal of Knowledge Engineering and Neural Networks The processing of symbols and symbolic structures has long been a challenge to connectionists. This special issue will bring together a broad range of contributed articles that explore the areas of representation, variable binding and inference. 
Papers are sought which present recent results in this field, or discuss fundamental theoretical concepts related to the performance of symbolic processing with connectionist networks. All papers will be peer-reviewed. Special Issue Editor: Antony Browne Submission Details: Deadline for Submission: 30th April 1999 Notification of Acceptance: 31st July 1999 Format: As for normal papers to the journal (Please see attachment to this message for instructions to authors) Length: No longer than 10,000 words Address for Papers: Dr. Antony Browne School of Information Systems University College Northampton Northampton NN2 7AL UK Further questions should be addressed to Tony Browne at: antony.browne at nene.ac.uk NOTES FOR CONTRIBUTORS The International Journal of Knowledge Engineering and Neural Networks is a quarterly technical journal devoted to all aspects of the development and use of advanced computing. Papers are published on the condition that authors are prepared to assign the copyright to Blackwell Publishers Ltd. The journal is not equipped to deal with LaTeX, so please write papers using a commonly used Microsoft Windows-based word-processing package such as 'Word' or 'WordPerfect'. Three hard copies of the paper and figures should be supplied. The paper should be laser printed, double spaced, on white A4 or white US equivalent paper. If your submission is accepted, you will be asked to submit a disk with the soft version. Each page should be numbered. The first page of the manuscript should bear only the names, titles and full addresses of the authors. Where there is more than one author, please indicate to whom correspondence should be sent, as well as contact address, phone and fax numbers, and e-mail address. Illustrations should be on separate pages, attached to the end of the manuscript. Figures should be neatly printed in black on a good white base on separate pages. Screen dumps do not always reproduce well; please make screen dumps as clear as possible. 
Each figure should be clearly identified for the printer (e.g. Figure 1). The position of each figure should be clearly marked and referred to in the text (e.g. Figure 1 about here). Each diagram or table should be clearly captioned (e.g. Figure 1: the sy Lettering and numbering should be large enough to be legible if reduced, which may be by up to 50%. Figures should not be marked in any way as we may reproduce them directly. Authors are responsible for obtaining permission to use figures borrowed from other works. Papers should always begin with an abstract and an introduction, and end with a conclusion and bibliography. Appendices may be used if appropriate. Papers and articles should be written in gender-free language. Heading structure: usually three levels: 1. Main heading, 1.1. Subheading, 1.1.1. Sub-subheading. Every reference in the text should be given in full in the bibliography. References should be complete and correct - this is the author's responsibility. References in the text should give, in parentheses, the author's name and year of publication: for example: (Browne et al., 1996; Niklasson & Boden, 1997; Browne, 1998). References in the bibliography at the end of the text should use the appropriate format, described below (the Harvard system). Books/Book Chapters: author name, author initials, date of publication, title of chapter (where applicable), full name of editor (where applicable), full title of book, page numbers, publisher, publisher's location. For example: NIKLASSON, L. and BODEN, M. (1997). Representing structure and structured representations in connectionist networks, in A. Browne (Ed.), Neural Network Perspectives on Cognition and Adaptive Robotics, pp. 20-50. Institute of Physics Press, Bristol, UK. Journal Papers: author name, author initials, date of publication, full title of paper, full title of journal, journal volume number, journal issue number, page numbers of paper. For example: BROWNE, A. (1998). 
Detecting systematic structure in distributed representations. Neural Networks 11(5), 815-824. Conference proceedings, similar to the format above. If the proceedings have been published externally, please give name of publisher and publisher's location. BROWNE, A., PASCALIS, R. and AMIN, S. (1996). Signal and image processing with neural networks. Proceedings of Circuits, Systems and Computers 96, Vol. 1, 335-339. References not mentioned in the text should be listed separately as 'Further reading.' From l.s.smith at cs.stir.ac.uk Wed Nov 4 08:29:51 1998 From: l.s.smith at cs.stir.ac.uk (Dr L S Smith (Staff)) Date: Wed, 4 Nov 98 13:29:51 GMT Subject: 2nd European Workshop on Neuromorphic Systems: Call for Papers Message-ID: <199811041329.NAA05529@tinker.cs.stir.ac.uk> First Call For Papers EWNS2 2nd European Workshop on Neuromorphic Systems, 3-5 September 1999. University of Stirling, Stirling, Scotland. Neuromorphic systems are implementations in silicon of systems whose architecture and design are based on neurobiological systems. This growing area proffers exciting possibilities such as sensory systems which can compete with human senses, pattern recognition systems that can run in real time and neuron models that can truly emulate living neurons. Neuromorphic systems are at the intersection of neurophysiology, computer science and electrical engineering. The meeting builds on the success of EWNS1, held in Stirling in August 1997. The meeting is intended both for the reporting of results, and for discussion about the way forward in neuromorphic systems: What should the role of neuromorphic systems be? Learning about neurobiological systems by rebuilding them, or engineering new solutions to problems in sensory perception using what we know about animal sensory perception? Can biologically-inspired techniques provide the basis for real improvements in auditory/visual/olfactory/sensorimotor systems or prostheses? How should neuromorphic systems be implemented? 
Purely in hardware, or as a mixture of software and hardware? As dedicated VLSI devices? In analogue or digital hardware? Should particular methodologies or specific transistor characteristics be used? Can what neuromorphic systems have to tell us about sensory perception and coding inform cognitive science? Papers are requested in the following areas: design issues in sensorineural neuromorphic systems (auditory, visual, olfactory, proprioceptory, sensorimotor systems); designs for silicon implementations of neurons or neural systems; theoretical aspects of the above areas. The meeting is being held just before ICANN'99, which will be in Edinburgh, 7-10 September 1999. Submission of papers: Papers not exceeding 8 A4 pages are requested: these should be sent to Dr. Leslie Smith, Department of Computing Science, University of Stirling, Stirling FK9 4LA, Scotland email: lss at cs.stir.ac.uk FAX (44) 1786 464551 Tel (44) 1786 467435 We also propose to hold a number of discussion sessions on some of the questions above. Short position papers (up to 4 pages) are also requested. We hope to publish the proceedings in book form after the meeting. We are particularly keen to encourage submissions by research students. Key Dates Submission Deadline April 6th 1999 Notification of Acceptance June 4th 1999 Organising Committee: Leslie S. Smith, Department of Computing Science and Mathematics, University of Stirling. Alister Hamilton, Department of Electrical Engineering, University of Edinburgh. Catherine Breslin, Department of Computing Science and Mathematics, University of Stirling. WWW page for conference: http://www.cs.stir.ac.uk/EWNS2/ Dr Leslie S. 
Smith Dept of Computing Science and Mathematics, Univ of Stirling Stirling FK9 4LA, Scotland l.s.smith at cs.stir.ac.uk (NeXTmail and MIME welcome) Tel (44) 1786 467435 Fax (44) 1786 464551 www http://www.cs.stir.ac.uk/~lss/ From graepel2 at cs.tu-berlin.de Thu Nov 5 10:43:24 1998 From: graepel2 at cs.tu-berlin.de (Thore Graepel) Date: Thu, 5 Nov 1998 16:43:24 +0100 (MET) Subject: Winterschool announcement Message-ID: Winterschool Berlin, Germany, December 10-12, 1998 Networks with Spiking Neurons and Synaptic Plasticity ======================================================= The school focuses on key questions of computational neuroscience. Tutorials for non-experts will provide self-contained introductions to experimental results, models, and theoretical concepts. To prepare for the tutorials and lectures, references to selected reviews can be obtained via internet. Organization: Graduiertenkolleg Berlin "Signal Cascades in Living Systems", Sonderforschungsbereich "Mechanismen entwicklungs- und erfahrungsabhaengiger Plastizitaet des Nervensystems" (Sfb 515) Location: Ernst-Reuter-Haus, Saal ER-A, Strasse des 17. Juni 112, Berlin, Germany Abstracts, recommended background literature and further information about the winterschool can be found on: http://www.fu-berlin.de/grk120/ For registration, send an e-mail with your name and address to kolleg at zedat.fu-berlin.de before December 1. The registration fee of 25 DM includes the dinner and may be paid upon arrival. 
Visitors from outside Berlin can make reservations for accommodation at various rates at the Berlin Tourismus Marketing GmbH, phone: +49 30 25 00 25, e-mail: reservation at btm.de Winterschool Program: ===================== Thursday, December 10: Introductory Tutorials --------------------------------------------- 9:00-10:30 Larry Abbott: Methods of neuronal and network modeling 11:00-12:30 Mike Shadlen: Coding and computing with noisy neurons 14:30-16:00 Ad Aertsen: Temporal Coding and Dynamics of Spiking Neurons 16:30-18:00 Henry Markram: Non-linear synaptic transmission 18:30 Reception and Conference Dinner Friday, December 11: Synaptic Plasticity ---------------------------------------- 8:30- 9:30 Florian Engert: Pairing-induced LTP in hippocampal slice cultures is not strictly input-specific. 9:30-10:30 Lori McMahon: Modulation of hippocampal interneuron excitability through changes in synaptic strength and during rhythmic oscillations. 11:00-12:00 Jeff Magee: Temporal summation of synaptic activity is spatially normalized by a nonuniform dendritic Ih in hippocampal neurons. 12:00-13:00 Alex Thomson: Cortical Circuits: simultaneous translation at each class of synapse. 14:30-15:30 Larry Abbott: Temporally asymmetric Hebbian plasticity: spike synchrony and response variability. 15:30-16:30 Henry Markram: The synaptic organization principle in the neocortex enables maximal diversity of information transmission between neurons. 17:00-18:00 Walter Senn: Depressing synapses, their modification, and receptive field formation Saturday, December 12: Spiking Neurons -------------------------------------- 8:30- 9:30 Tony Zador: Input synchrony and the irregular firing of cortical neurons 9:30-10:30 Mike Shadlen: Modeling ensembles of weakly correlated noisy neurons. 11:00-12:00 Sonja Gruen: Unitary joint-events in cortical activity. 
12:00-13:00 Ad Aertsen: Conditions for stable propagation of synchronous spiking in cortical networks 14:30-15:30 Klaus Pawelzik: Functional roles of subthreshold membrane potential oscillations 15:30-16:30 Wulfram Gerstner: Dynamics in Networks of spiking neurons: Fast transients, synchronisation, and asynchronous firing 17:00-18:00 Gustavo Deco: Spatio-Temporal Coding in the Cortex: Information Flow Based Learning in Spiking Neural Networks. Organizers: Prof. Dr. Andreas V.M. Herz Innovationskolleg Theoretische Biologie Humboldt Universitaet zu Berlin http://itb.biologie.hu-berlin.de/ Prof. Dr. Randolf Menzel Institut für Neurobiologie, Freie Universität Berlin http://www.neuro.biologie.fu-berlin.de/neuro.html Prof. Dr. Klaus Obermayer Department of Computer Science Technical University of Berlin http://ni.cs.tu-berlin.de/ From priel at mail.biu.ac.il Thu Nov 5 14:12:11 1998 From: priel at mail.biu.ac.il (Avner Priel) Date: Thu, 5 Nov 1998 21:12:11 +0200 (WET) Subject: paper on time series generation Message-ID: The following preprint on the subject of time series generation by feed-forward networks was submitted for publication in the Physical Review E. The paper is available from my home-page: http://faculty.biu.ac.il/~priel/ comments are welcome. *************** NO HARD COPIES ****************** ---------------------------------------------------------------------- Long-term properties of time series generated by a perceptron with various transfer functions ----------------------------------------------------- A Priel and I Kanter Department of Physics, Bar Ilan University, 52900 Ramat Gan, Israel ABSTRACT: We study the effect of various transfer functions on the properties of a time series generated by a continuous-valued feed-forward network in which the next input vector is determined from past output values. 
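For readers unfamiliar with this setup, the generation scheme just described (the network's past outputs forming a sliding window that becomes its next input) can be sketched as follows; the window size, weights, and transfer functions here are purely illustrative choices, not the parameters analyzed in the paper:

```python
import numpy as np

def generate_series(w, x0, f, steps):
    """Iterate a single-layer perceptron whose next input vector is a
    sliding window of its own past outputs. All names and parameter
    choices are illustrative, not taken from the paper."""
    x = np.asarray(x0, dtype=float).copy()
    out = []
    for _ in range(steps):
        s = f(np.dot(w, x) / len(w))       # output from the scaled local field
        out.append(s)
        x = np.concatenate(([s], x[:-1]))  # feed output back, drop oldest input
    return np.array(out)

rng = np.random.default_rng(0)
w = rng.normal(size=8)    # random perceptron weights (illustrative)
x0 = rng.normal(size=8)   # random initial input window

# A monotonic transfer function vs. a non-monotonic one (example choices only)
series_mono = generate_series(w, x0, np.tanh, 200)
series_nonmono = generate_series(w, x0, lambda h: np.sin(3.0 * h), 200)
```

The abstract's distinction between monotonic and non-monotonic transfer functions corresponds to swapping `f` in this loop; the long-term behavior of the resulting sequences is what the paper characterizes.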
The parameter space for monotonic and non-monotonic transfer functions is analyzed in the unstable regions with the following main finding: non-monotonic functions can produce robust chaos whereas monotonic functions generate fragile chaos only. In the case of non-monotonic functions, the number of positive Lyapunov exponents increases as a function of one of the free parameters in the model, hence, high dimensional chaotic attractors can be generated. We extend the analysis to a combination of monotonic and non-monotonic functions. -------------------------------------------------- Priel Avner < priel at mail.biu.ac.il > < http://faculty.biu.ac.il/~priel > Department of Physics, Bar-Ilan University. Ramat-Gan, 52900. Israel. From hyson at darwin.psy.fsu.edu Fri Nov 6 13:37:03 1998 From: hyson at darwin.psy.fsu.edu (Richard Hyson) Date: Fri, 6 Nov 1998 13:37:03 -0500 Subject: Position announcement Message-ID: The Department of Psychology at the Florida State University seeks to make a TENURE-TRACK appointment at the assistant professor level in COMPUTATIONAL PSYCHOLOGY: Applicants in all areas of Computational Psychology are encouraged to apply, but preference will be given to candidates whose research interests complement those of our faculty in either cognitive psychology or neuroscience. This position is part of a College initiative to develop a program in computational science. The successful candidate will have responsibilities to an interdisciplinary program in Computational Science and Engineering as well as to the Psychology Department. Recruits will join a diverse research faculty with training programs in Clinical, Cognitive and Behavioral Science, and Psychobiology/Neuroscience. We seek candidates with strong evidence of research potential and teaching ability. Applicants who can contribute to more than one of the department's areas of strength will receive special consideration. 
A curriculum vitae, a cover letter describing research and teaching interests, and three letters of reference should be sent by Dec. 1 to: Computational Psychology Search Committee, Department of Psychology, Florida State University, Tallahassee, FL 32306-1270. Florida State University is an Equal Opportunity/Affirmative Action Employer. For information about the Department of Psychology, see: http://www.psy.fsu.edu ------------------------------------------------ Rick Hyson, Ph.D. Ph: (850) 644-5824 Psychology Department Fax: (850) 644-7739 FSU email: hyson at psy.fsu.edu Tallahassee, FL 32306-1270 
Robin Amos at: inquiries at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. Stephen Grossberg, Chairman Gail A. Carpenter, Director of Graduate Studies Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Students are trained in a broad range of areas concerning cognitive and neural systems, including vision and image processing; speech and language understanding; adaptive pattern recognition; cognitive information processing; self-organization; associative learning and long-term memory; cooperative and competitive network dynamics and short-term memory; reinforcement, motivation, and attention; adaptive sensory-motor control and robotics; and biological rhythms; as well as the mathematical and computational methods needed to support modeling research and applications. The CNS Department awards MA, PhD, and BA/MA degrees. The CNS Department embodies a number of unique features. 
It has developed a curriculum that consists of interdisciplinary graduate courses, each of which integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of neural networks to technology. Additional advanced courses, including research seminars, are also offered. Each course is typically taught once a week in the afternoon or evening to make the program available to qualified students, including working professionals, throughout the Boston area. Students develop a coherent area of expertise by designing a program that includes courses in areas such as biology, computer science, engineering, mathematics, and psychology, in addition to courses in the CNS curriculum. The CNS Department prepares students for thesis research with scientists in one of several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The unit most closely linked to the department is the Center for Adaptive Systems. Students interested in neural network hardware work with researchers in CNS, at the College of Engineering, and at MIT Lincoln Laboratory. Other research resources include distinguished research groups in neurophysiology, neuroanatomy, and neuropharmacology at the Medical School and the Charles River Campus; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. In addition to its basic research and training program, the department conducts a seminar series, as well as conferences and symposia, which bring together distinguished scientists from both experimental and theoretical disciplines. 
The department is housed in its own new four-story building which includes ample space for faculty and student offices and laboratories, as well as an auditorium, classroom and seminar rooms, a library, and a faculty-student lounge. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Thomas Anastasio Visiting Scholar, Department of Cognitive and Neural Systems (9/1/98 - 6/30/99) Associate Professor, Molecular & Integrative Physiology, University of Illinois, Urbana/Champaign PhD, McGill University Computational modeling of neurophysiological systems. Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior. Aijaz Baloch Adjunct Assistant Professor of Cognitive and Neural Systems Senior Development Engineer, Nestor, Inc. PhD, Electrical Engineering, Boston University Visual motion perception, computational vision, adaptive control, and financial fraud detection. Helen Barbas Professor of Anatomy and Neurobiology, Boston Univ. School of Medicine PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex. Jacob Beck Research Professor of Cognitive and Neural Systems PhD, Psychology, Cornell University Visual perception, psychophysics, computational models. Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development. Gail A. 
Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems PhD, Mathematics, University of Wisconsin, Madison Learning and memory, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations. Gert Cauwenberghs Visiting Scholar, Department of Cognitive and Neural Systems (6/1/98 - 8/31/99) Associate Professor of Electrical and Computer Engineering, Johns Hopkins Univ. PhD, Electrical Engineering, California Institute of Technology VLSI circuits, systems and algorithms for parallel analog signal processing and adaptive neural computation. Laird Cermak Director, Memory Disorders Research Center, Boston Veterans Affairs Medical Center Professor of Neuropsychology, School of Medicine Professor of Occupational Therapy, Sargent College PhD, Ohio State University Memory disorders. Michael A. Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems. H. Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, signal processing models of hearing. Birgitta Dresp Visiting Scholar, Department of Cognitive and Neural Systems (10/1/98 - 12/31/98) Research Agent of the French Government (CNRS), Universite Louis Pasteur PhD in Cognitive Psychology, Universite Rene Descartes, Paris Visual Psychophysics (Form Perception, Spatial Contrast, Perceptual Learning). Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory. William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology. 
Gil Engel Research Fellow, Department of Cognitive and Neural Systems Chief Engineer, Vision Applications, Inc. Senior Design Engineer, Analog Devices, CTS Division MS, Polytechnic University, New York Space-variant active vision systems for use in human-computer interactive control. Bruce Fischl Research Fellow, Department of Cognitive and Neural Systems Postdoctoral Research Fellow, Massachusetts General Hospital PhD, Cognitive and Neural Systems, Boston University Anisotropic diffusion and nonlinear image filtering, space-variant vision, computational models of early visual processing, and automated analysis of magnetic resonance images. Paolo Gaudiano Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Computational and neural models of robotics, vision, adaptive sensory-motor control, and behavioral neurobiology. Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics. Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition. Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Theoretical biology, theoretical psychology, dynamical systems, and applied mathematics. Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, and biological sensory-motor control. Catherine L. 
Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition. Michael E. Hasselmo Associate Professor of Psychology PhD, Experimental Psychology, Oxford University Electrophysiological studies of neuromodulatory effects in cortical structures, network biophysical simulations of memory function in hippocampus and piriform cortex, behavioral studies of amnestic drugs. Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing. Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Functional analysis, mathematical physics, partial differential equations. Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamical systems, mathematical physiology, pattern formation in biological/physical systems. Gregory Lesher Research Fellow, Department of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Jacqueline A. Liederman Associate Professor of Psychology PhD, Psychology, University of Rochester Dynamics of interhemispheric cooperation; prenatal correlates of neuro- developmental disorders. Ennio Mingolla Associate Professor of Cognitive and Neural Systems and Psychology PhD, Psychology, University of Connecticut Visual perception, mathematical modeling of visual processes. Joseph Perkell Adjunct Professor of Cognitive and Neural Systems Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology PhD, Massachusetts Institute of Technology Motor control of speech production. 
Alan Peters Professor of Anatomy and Neurobiology, School of Medicine PhD, Zoology, Bristol University, United Kingdom Organization of neurons in the cerebral cortex; effects of aging on the primate brain; fine structure of the nervous system. Andrzej Przybyszewski Research Fellow, Department of Cognitive and Neural Systems Assistant Professor, University of Massachusetts Medical School PhD, Warsaw Medical Academy Electrophysiology of the primate visual system, mathematical and computer modeling of the neuronal networks in the visual system. Adam Reeves Adjunct Professor of Cognitive and Neural Systems Professor of Psychology, Northeastern University PhD, Psychology, City University of New York Psychophysics, cognitive psychology, vision. Mark Reinitz Assistant Professor of Psychology PhD, University of Washington Cognitive psychology, attention, explicit and implicit memory, memory-perception interactions. Mark Rubin Research Assistant Professor of Cognitive and Neural Systems PhD, Physics, University of Chicago Pattern recognition; artificial and biological vision. Elliot Saltzman Associate Professor of Physical Therapy, Sargent College Assistant Professor, Department of Psychology and Center for the Ecological Study of Perception and Action University of Connecticut, Storrs Research Scientist, Haskins Laboratories, New Haven, CT PhD, Developmental Psychology, University of Minnesota Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities. Robert Savoy Adjunct Associate Professor of Cognitive and Neural Systems Scientist, Rowland Institute for Science Experimental Psychologist, Massachusetts General Hospital PhD, Experimental Psychology, Harvard University Computational neuroscience; visual psychophysics of color, form, and motion perception. Teaching about functional MRI and other brain mapping methods. 
Eric Schwartz Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology PhD, High Energy Physics, Columbia University Computational neuroscience, machine vision, neuroanatomy, neural modeling. Robert Sekuler Adjunct Professor of Cognitive and Neural Systems Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center Jesse and Louis Salvage Professor of Psychology, Brandeis University PhD, Psychology, Brown University Visual motion, visual adaptation, relation of visual perception, memory, and movement. Barbara Shinn-Cunningham Assistant Professor of Cognitive and Neural Systems and Biomedical Engineering PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance. Malvin Teich Professor of Electrical and Computer Engineering, Biomedical Engineering, and Physics PhD, Cornell University Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission. Lucia Vaina Professor of Biomedical Engineering Research Professor of Neurology, School of Medicine PhD, Sorbonne (France); Dres Science, National Polytechnique Institute, Toulouse (France) Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging. Takeo Watanabe Associate Professor of Psychology PhD, Behavioral Sciences, University of Tokyo Perception of objects and motion and effects of attention on perception using psychophysics and brain imaging (f-MRI). Allen Waxman Adjunct Associate Professor of Cognitive and Neural Systems Senior Staff Scientist, MIT Lincoln Laboratory PhD, Astrophysics, University of Chicago Visual system modeling, multisensor fusion, image mining, parallel computing, and advanced visualization. 
James Williamson Research Assistant Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Development of cortical receptive fields; perceptual grouping; pattern recognition. Jeremy Wolfe Adjunct Associate Professor of Cognitive and Neural Systems Associate Professor of Ophthalmology, Harvard Medical School Psychophysicist, Brigham & Women's Hospital, Surgery Dept. Director of Psychophysical Studies, Center for Clinical Cataract Research PhD, Massachusetts Institute of Technology Visual attention, preattentive and attentive object representation. Curtis Woodcock Professor of Geography Director, Geographic Applications, Center for Remote Sensing PhD, University of California, Santa Barbara Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing. CNS DEPARTMENT COURSE OFFERINGS CAS CN500 Computational Methods in Cognitive and Neural Systems CAS CN510 Principles and Methods of Cognitive and Neural Modeling I CAS CN520 Principles and Methods of Cognitive and Neural Modeling II CAS CN530 Neural and Computational Models of Vision CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control CAS CN550 Neural and Computational Models of Recognition, Memory and Attention CAS CN560 Neural and Computational Models of Speech Perception and Production CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm CAS CN580 Introduction to Computational Neuroscience GRS CN700 Computational and Mathematical Methods in Neural Modeling GRS CN710 Advanced Topics in Neural Modeling GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior GRS CN730 Models of Visual Perception GRS CN740 Topics in Sensory-Motor Control GRS CN760 Topics in Speech Perception and 
Recognition GRS CN780 Topics in Computational Neuroscience GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition GRS CN915,916 Research in Neural Networks for Vision and Image Processing GRS CN921,922 Research in Neural Networks for Speech and Language Processing GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning GRS CN935,936 Research in Neural Networks for Cognitive Information Processing GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks GRS CN945,946 Research in Technological Applications of Neural Networks GRS CN951,952 Research in Hardware Implementations of Neural Networks CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and a student-run Journal Club, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups. LABORATORY AND COMPUTER FACILITIES The department is funded by grants and contracts from federal agencies that support research in life sciences, mathematics, artificial intelligence, and engineering. Facilities include laboratories for experimental research and computational modeling in visual perception, speech and language processing, and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network comprised of Sun workstations, Silicon Graphics workstations, Macintoshes, and PCs. 
All students have access to X-terminals or UNIX workstation consoles, a selection of color systems and PCs, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software are available for use. These include: Computer Vision/Computational Neuroscience Laboratory The Computer Vision/Computational Neuroscience Lab comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; a light machine shop; an active vision lab including actuators and video hardware; and systems for computer aided neuroanatomy and application of computer graphics and image processing to brain sections and MRI images. Neurobotics Laboratory The Neurobotics Lab utilizes wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The lab currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a moveable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Other platforms may be investigated in the future. Psychoacoustics Laboratory The Psychoacoustics Lab houses a newly installed, 8 ft. x 8 ft. sound-proof booth. The laboratory is extensively equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli. 
The major equipment dedicated to the psychoacoustics laboratory includes two Pentium-based personal computers; two Power-PC-based Macintosh computers; a 50-MHz array processor capable of generating auditory stimuli in real time; programmable attenuators; analog-to-digital and digital-to-analog converters; a real-time head tracking system; a special-purpose, signal-processing hardware system capable of generating "spatialized" stereo auditory signals in real time; a two-channel oscilloscope; a two-channel spectrum analyzer; various cables, headphones, and other miscellaneous electronics equipment; and software for signal generation, experimental control, data analysis, and word processing. Sensory-Motor Control Laboratory The Sensory-Motor Control Lab supports experimental studies of motor kinematics. An infrared WatSmart system allows measurement of large-scale movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. Equipment includes a 40-inch monitor that allows computer display of animations generated by an SGI workstation or a Pentium Pro (Windows NT) workstation. A second major component is a helmet-mounted, video-based, eye-head tracking system (ISCAN Corp, 1997). The latter's camera samples eye position at 240Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. Thus the system affords a wide range of visuo-motor studies. Speech and Language Laboratory The Speech and Language Lab includes facilities for analog-to-digital and digital-to-analog software conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing algorithms of speech recognition. 
For high speed processing, supercomputer facilities speed filtering and data analysis. Visual Psychophysics Laboratory The Visual Psychophysics Lab occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer controlled display platforms, including Silicon Graphics, Inc. (SGI) Onyx RE2, SGI Indigo2 High Impact, SGI Indigo2 Extreme, Power Computing (Macintosh compatible) PowerTower Pro 225, and Macintosh 7100/66 workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing glasses, prisms, a photometer, and a variety of display-generation, data-collection, and data-analysis software. Affiliated Laboratories Affiliated CAS/CNS faculty have additional laboratories ranging from visual and auditory psychophysics and neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations. ******************************************************************* DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT Boston University 677 Beacon Street Boston, MA 02215 Phone: 617/353-9481 Fax: 617/353-7755 Email: inquiries at cns.bu.edu Web: http://cns-web.bu.edu/ ******************************************************************* --- This was an announcement from the Department of Cognitive and Neural Systems at Boston University. If you would like to be removed from this mailing list and discontinue receiving these announcements, you may send an e-mail to "majordomo at cns.bu.edu" with the following line in the body of the message: unsubscribe announcements If you have any trouble in doing so, please do not hesitate to report problems to "owner-majordomo at cns.bu.edu". Thank you. 
From austin at minster.cs.york.ac.uk Tue Nov 10 07:21:47 1998 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Tue, 10 Nov 1998 12:21:47 +0000 Subject: Workshop Message-ID: <9811101221.ZM9279@minster.cs.york.ac.uk> Announcement / Call for Papers Weightless Neural Networks Workshop WNNW-99 York, UK, 30th March 1999 WNNW-99 is the third in a series of premier international meetings for dissemination and discussion of research trends and results in weightless neural networks. The main aim of these conferences is to provide a convenient forum for discussion of recent results and ideas in all areas pertaining to weightless neural networks, RAM-based networks and N-tuple networks, including architectures, algorithms, coding, hardware implementations and novel applications. The workshop will be held at the Department of Computer Science, University of York, UK. Most participants will be staying in on-campus accommodation, close to the venue. The meeting will start on Tuesday, 30th of March, 1999 and will last for two days. This meeting is supported by the Advanced Computer Architecture Group at the Department of Computer Science, University of York, UK. The latest workshop information will be posted to the workshop home page at http://thalamus.cs.york.ac.uk/wnnw/ Important Dates Paper Submission Deadline Tuesday, 19th of January 1999 Acceptance Notification Date not later than Tuesday, 16th of February 1999 Workshop Date Tuesday, 30th of March 1999 Inquiries Inquiries concerning the meeting can be sent to the organisers either by email to wnnw at cs.york.ac.uk or by post to the following address: WNNW-99 Advanced Computer Architecture Group Department of Computer Science University of York York YO10 5DD UK Organising Committee Prof. Jim Austin, University of York Mr. Dan Kustrin, University of York Prof. Nigel M Allinson, UMIST Prof. Igor Aleksander, Imperial College of Science, Technology and Medicine Dr. 
Simon O'Keefe, University of York -- Professor Jim Austin, Department of Computer Science, University of York, York, YO1 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch/ From Jon.Baxter at keating.anu.edu.au Mon Nov 9 17:29:03 1998 From: Jon.Baxter at keating.anu.edu.au (Jonathan Baxter) Date: Tue, 10 Nov 1998 09:29:03 +1100 (EST) Subject: no subject (file transmission) Message-ID: <199811092229.JAA08025@keating.anu.edu.au> *********************** NIPS*98 FINAL PROGRAM ********************** My apologies if you receive this notice more than once. Jonathan Baxter ******************************************************************** SUN NOV 29 ---------- 18:00-22:00 Registration MON NOV 30 ---------- 08:30-18:00 Registration 09:30-17:30 Tutorials 18:30 Reception and Conference Banquet 20:30 The laws of the WEB (Banquet talk) B. Huberman Xerox PARC TUE DEC 1 --------- Oral Session 1: 08:30 Statistics of visual images: neural representation and synthesis (Invited) E. Simoncelli New York University 09:20 Attentional modulation of human pattern discrimination psychophysics reproduced by a quantitative model (VS1, Oral) L. Itti, J. Braun, D. Lee, C. Koch California Institute of Technology 09:40 Orientation, scale, and discontinuity as emergent properties of illusory contour shape (VS2, Oral) K. Thornber, L. Williams NEC Research Institute, University of New Mexico 10:00 DTs: dynamic trees (AA1, Spotlight) C. Williams, N. Adams Aston University Modeling stationary and integrated time series with autoregressive neural networks (LT1, Spotlight) F. Leisch, A. Trapletti, K. Hornik Technical University of Vienna Analog neural nets with Gaussian or other common noise distributions cannot recognize arbitrary regular languages (LT2, Spotlight) W. Maass, E. Sontag Technical University of Graz, Rutgers University Semiparametric support vector and linear programming machines (AA8, Spotlight) A. Smola, T. Friess, B. 
Schoelkopf GMD FIRST Blind separation of filtered source using state-space approach (AA13, Spotlight) L. Zhang, A. Cichocki RIKEN Brain Science Institute 10:15-11:00 Break Oral Session 2: 11:00 The bias-variance tradeoff and the randomized GACV (AA4, Oral) G. Wahba, X. Lin, F. Gao, D. Xiang, R. Klein, B. Klein University of Wisconsin-Madison, SAS Institute 11:20 Kernel PCA and de-noising in feature spaces (AA7, Oral) S. Mika, B. Schoelkopf, A. Smola, K. Mueller, M Scholz, G. Raetsch GMD FIRST 11:40 Sparse code shrinkage: denoising by maximum likelihood estimation (AA12, Oral) A. Hyvaarinen, P. Hoyer, E. Oja Helsinki University of Technology 12:00-14:00 Lunch Oral Session 3: 14:00 Temporally asymmetric Hebbian learning, spike timing and neuronal response variability (Invited) L. Abbott Brandeis University 14:50 Information maximization in single neurons (NS1, Oral) M. Stemmler, C. Koch California Institute of Technology 15:10 Multi-electrode spike sorting by clustering transfer functions (NS2, Oral) D. Rinberg, H. Davidowitz, N. Tishby NEC Research Institute 15:30 Distributional population codes and multiple motion models (NS3, Spotlight) R. Zemel, P. Dayan University of Arizona, Massachusetts Institute of Technology Population coding with correlated noise (NS4, Spotlight) H. Yoon, H. Sompolinsky Hebrew University Bayesian modeling of human concept learning (CS1, Spotlight) J. Tenenbaum Massachusetts Institute of Technology Mechanisms of generalization in perceptual learning (CS2, Spotlight) Z. Liu, D. Weinshall NEC Research Institute, Hebrew University An entropic estimator for structure discovery (SP1, Spotlight) M. Brand Mitsubishi Electric Research Laboratory 15:45-16:15 Break Oral Session 4: 16:15 The role of lateral cortical competition in ocular dominance development (NS6, Oral) C. Piepenbrock, K. Obermayer Technical University of Berlin 16:35 Evidence for learning of a forward dynamic model in human adaptive control (CS3, Oral) N. Bhushan, R. 
Shadmehr Johns Hopkins University 16:55-18:00 Poster Preview 19:30 Poster Session WED DEC 2 --------- Oral Session 5: 08:30 Computation by Cortical Modules (Invited) H. Sompolinsky Hebrew University 09:20 Learning curves for Gaussian processes (LT14, Oral) P. Sollich University of Edinburgh 9:40 Mean field methods for classification with Gaussian processes (LT15, Oral) M. Opper O. Winther Aston University, Niels Bohr Institute 10:00 Dynamics of supervised learning with restricted training sets (LT16, Spotlight) A. Coolen, D. Saad King's College London, Aston University Finite-dimensional approximation of Gaussian processes (LT18, Spotlight) G. Trecate, C. Williams, M. Opper University of Pavia, Aston University Inference in multilayer networks via large deviation bounds (LT20, Spotlight) M. Kearns, L. Saul AT&T Labs Gradient descent for general reinforcement learning (CN14, Spotlight) L. Baird, A. Moore Carnegie Mellon University Risk sensitive reinforcement learning (CN15, Spotlight) R. Neuneier, O. Mihatsch Siemens AG 10:15-11:00 Break Oral Session 6: 11:00 VLSI implementation of motion centroid localization for autonomous navigation (IM6, Oral) R. Etienne-Cummings, M. Ghani, V. Gruev Southern Illinois University 11:20 Improved switching among temporally abstract actions (CN16, Oral) R. Sutton, S. Singh, D. Precup, B. Ravindran University of Massachusetts, University of Colorado 11:40 Finite-sample convergence rates for Q-learning and indirect algorithms (CN17, Oral) M. Kearns, S. Singh AT&T Labs, University of Colorado 12:00-14:00 Lunch Oral Session 7: 14:00 Statistical natural language processing: better living through floating-point numbers (Invited) E. Charniak Brown University 14:50 Markov processes on curves for automatic speech recognition (SP3, Oral) L. Saul, M. Rahim AT&T Labs 15:10 Approximate learning of dynamic models (AA22, Oral) X. Boyen, D. 
Koller Stanford University 15:30 Learning nonlinear stochastic dynamics using the generalized EM algorithm (AA23, Spotlight) Z. Ghahramani, S. Roweis University of Toronto, California Institute of Technology Reinforcement learning for trading systems (AP9, Spotlight) J. Moody, M. Saffell Oregon Graduate Institute Bayesian modeling of facial similarity (AP13, Spotlight) B. Moghaddam, T. Jebara, A. Pentland Mitsubishi Electric Research Laboratory, Massachusetts Institute of Technology Computation of smooth optical flow in a feedback connected analog network (IM8, Spotlight) A. Stocker, R. Douglas University and ETH Zurich Classification on pairwise proximity data (AA26, spotlight) T. Graepel, R. Herbrich, P. Bollmann-Sdorra, K. Obermayer Technical University of Berlin 15:45-16:15 Break Oral Session 8: 16:15 Learning from dyadic data (AA27, Oral) T. Hofmann, J. Puzicha, M. Jordan Massachusetts Institute of Technology, University of Bonn 16:35 Classification in non-metric spaces (VS7, Oral) D. Weinshall, D. Jacobs, Y. Gdalyahu NEC Research Institute, Hebrew University 16:55-18:00 Poster Preview 19:30 Poster Session THU DEC 3 --------- Oral Session 9: 08:30 Convergence of the wake-sleep algorithm (LT22, Oral) S. Ikeda, S. Amari, H. Nakahara RIKEN Brain Science Institute 08:50 Learning a continuous hidden variable model for binary data (AA32, Oral) D. Lee, H. Sompolinsky Bell Laboratories, Hebrew University 09:10 Direct optimization of margins improves generalization in combined classifiers (LT23, Oral) L. Mason, P. Bartlett, J. Baxter Australian National University 09:30 A polygonal line algorithm for constructing principal curves (AA39, Oral) B. Kegl, A. Krzyzak, T. Linder, K. Zeger Concordia University, Queen's University, UC San Diego 09:50-10:30 Break Oral Session 10: 10:30 Graphical models for recognizing human interactions (AP15, Oral) N. Oliver, B. Rosario, A. 
Pentland Massachusetts Institute of Technology 10:50 Fast neural network emulation of physics-based models for computer animation (AP16, Oral) R. Grzeszczuk, D. Terzopoulos, G. Hinton Intel Corporation, University of Toronto 11:10 Things that think (Invited) N. Gershenfeld Massachusetts Institute of Technology 12:00 End of main conference POSTERS: TUE DEC 1 ------------------ Basis selection for wavelet regression (AA2, Poster) K. Wheeler NASA Ames Research Center Boxlets: a fast convolution algorithm for signal processing and neural networks (AA3, Poster) P. Simard, L. Bottou, P. Haffner, Y. LeCun AT&T Labs Least absolute shrinkage is equivalent to quadratic penalization (AA5, Poster) Y. Grandvalet, S. Canu Universite de Technologie de Compiegne Neural networks for density estimation (AA6, Poster) M. Magdon-Ismail, A. Atiya California Institute of Technology Semi-supervised support vector machines (AA9, Poster) K. Bennett, A. Demiriz Rensselaer Polytechnic Institute Exploiting generative models in discriminative classifiers (AA10, Poster) T. Jaakkola, D. Haussler UC Santa Cruz Using analytic QP and sparseness to speed training of support vector machines (AA11, Poster) J. Platt Microsoft Research Source separation as a by-product of regularization (AA14, Poster) S. Hochreiter, J. Schmidhuber Technical University of Munich, IDSIA Unsupervised classification with non-Gaussian mixture models using ICA (AA15, Poster) T-W. Lee, M. Lewicki, T. Sejnowski The Salk Institute Hierarchical ICA belief networks (AA16, Poster) H. Attias UC San Francisco Efficient Bayesian parameter estimation in large discrete domains (AA17, Poster) N. Friedman, Y. Singer UC Berkeley, AT&T Labs Discovering hidden features with Gaussian processes regression (AA18, Poster) F. Vivarelli, C. Williams Aston University Bayesian PCA (AA19, Poster) C. Bishop Microsoft Research Replicator equations, maximal cliques, and graph isomorphism (AA20, Poster) M. 
Pelillo University of Venice Convergence rates of algorithms for perceptual organization: detecting visual contours (AA21, Poster) A. Yuille, J. Coughlan Smith-Kettlewell Eye Research Institute Independent component analysis of intracellular calcium spike data (AP1, Poster) K. Prank, J.Boerger, A. von zur Muehlen, G. Brabant, C. Schoefl Medical School Hannover Applications of multi-resolution neural networks to mammography (AP2, Poster) P. Sajda, C. Spence Sarnoff Corporation Making templates rotationally invariant: an application to rotated digit recognition (AP3, Poster) S. Baluja Carnegie Mellon University Graph matching for shape retrieval (AP4, Poster) B. Huet, A. Cross, E. Hancock University of York Vertex identification in high energy physics experiments (AP5, Poster) G. Dror, H. Abramowicz, D. Horn The Academic College of Tel-Aviv-Yaffo, Tel-Aviv University Familiarity discrimination of radar pulses (AP6, Poster) E. Granger, S. Grossberg, M. Rubin, W. Streilein Ecole Polytechnique de Montreal, Boston University Robot docking using mixtures of Gaussians (AP7, Poster) M. Williamson, R. Murray-Smith, V. Hansen Massachusetts Institute of Technology, Technical University of Denmark, Daimler-Benz Call-based fraud detection in mobile communication networks using a hierarchical regime-switching model (AP8, Poster) J. Hollmen, V. Tresp Helsinki University of Technology, Siemens AG Multiple paired forward-inverse models for human motor learning and control (CS4, Poster) M. Haruno, D. Wolpert, M. Kawato ATR Human Information Processing Research Laboratories, University College London A neuromorphic monaural sound localizer (IM1, Poster) J. Harris, C-J. Pu, J. Principe University of Florida Active noise canceling using analog neuro-chip with on-chip learning capability (IM2, Poster) J-W. Cho, S-Y. Lee Korea Advanced Institute of Science and Technology Optimizing correlation algorithms for hardware-based transient classification (IM3, Poster) R. Edwards, G. 
Cauwenberghs, F. Pineda Johns Hopkins University A high performance k-NN classifier using a binary correlation matrix memory (IM4, Poster) P. Zhou, J. Austin, J. Kennedy University of York A micropower CMOS adaptive amplitude and shift invariant vector quantizer (IM5, Poster) R. Coggins, R. Wang, M. Jabri University of Sydney Where does the population vector of motor cortical cells point during reaching movements? (NS5, Poster) P. Baraduc, E. Guigon, Y. Burnod Universite Pierre et Marie Curie Heeger's normalization, line attractor network and ideal observers (NS7, Poster) S. Deneve, A. Pouget, P. Latham Georgetown University Image statistics and cortical normalization models (NS8, Poster) E. Simoncelli, O. Schwartz New York University Learning instance-independent value functions to enhance local search (CN1, Poster) R. Moll, A. Barto, T. Perkins, R. Sutton University of Massachusetts Exploring unknown environments with real-time heuristic search (CN2, Poster) S. Koenig Georgia Institute of Technology GLS: a hybrid classifier system based on POMDP research (CN3, Poster) A. Hayashi, N. Suematsu Hiroshima City University Non-linear PI control inspired by biological control systems (CN4, Poster) L. Brown, G. Gonye, J. Schwaber E.I. DuPont deNemours Coordinate transformation learning of hand position feedback controller by using change of position error norm (CN5, Poster) E. Oyama, S. Tachi University of Tokyo Optimizing admission control while ensuring quality of service in multimedia networks via reinforcement learning (CN6, Poster) T. Brown, H. Tong, S. Singh University of Colorado Barycentric interpolators for continuous space & time reinforcement learning (CN7, Poster) R. Munos, A. Moore Carnegie Mellon University Modifying the parti-game algorithm for increased robustness, higher efficiency and better policies (CN8, Poster) M. Al-Ansari, R. Williams Northeastern University Coding time-varying signals using sparse, shift-invariant representations (SP2, Poster) M. 
Lewicki, T. Sejnowski The Salk Institute Phase diagram and storage capacity of sequence storing neural networks (LT3, Poster) A. Duering, A. Coolen, D. Sherrington Oxford University, King's College London Discontinuous recall transitions induced by competition between short- and long-range interactions in recurrent networks (LT4, Poster) N. Skantzos, C. Beckmann, A. Coolen King's College London Computational differences between asymmetrical and symmetrical networks (LT5, Poster) Z. Li, P. Dayan Massachusetts Institute of Technology Shrinking the tube: a new support vector regression algorithm (LT6, Poster) B. Schoelkopf, P. Bartlett, A. Smola, R. Williamson GMD FIRST, Australian National University Dynamically adapting kernels in support vector machines (LT7, Poster) N. Cristianini, C. Campbell, J. Shawe-Taylor University of Bristol, University of London A theory of mean field approximation (LT8, Poster) T. Tanaka Tokyo Metropolitan University The belief in TAP (LT9, Poster) Y. Kabashima, D. Saad Tokyo Institute of Technology, Aston University Unsupervised clustering: the mutual information between parameters and observations (LT10, Poster) D. Herschkowitz, J-P. Nadal Ecole Normale Superieure Classification with linear threshold functions and the linear loss (LT11, Poster) C. Gentile, M. Warmuth University of Milan, UC Santa Cruz Almost linear VC dimension bounds for piecewise polynomial networks (LT12, Poster) P. Bartlett, V. Maiorov, R. Meir Australian National University, Technion Tight bounds for the VC-dimension of piecewise polynomial networks (LT13, Poster) A. Sakurai Japan Advanced Institute of Science and Technology Learning Lie transformation groups for invariant visual perception (VS3, Poster) R. Rao, D. Rudermann The Salk Institute Support vector machines applied to face recognition (VS4, Poster) J. Phillips National Institute of Standards and Technology Learning to find pictures of people (VS5, Poster) S. Ioffe, D. 
Forsyth UC Berkeley Probabilistic sensor fusion (VS6, Poster) R. Sharma, T. Leen, M. Pavel Oregon Graduate Institute POSTERS: WED DEC 2 ------------------ Learning multi-class dynamics (AA24, Poster) A. Blake, B. North, M. Isard Oxford University Fisher scoring and a mixture of modes approach for approximate inference and learning in nonlinear state space models (AA25, Poster) T. Briegel, V. Tresp Siemens AG A randomized algorithm for pairwise clustering (AA28, Poster) Y. Gdalyahu, D. Weinshall, M. Werman Hebrew University Visualizing group structure (AA29, Poster) M. Held, J. Puzicha, J. Buhmann University of Bonn Probabilistic visualization of high-dimensional binary data (AA30, Poster) M. Tipping Microsoft Research Restructuring sparse high dimensional data for effective retrieval (AA31, Poster) C. Isbell, P. Viola Massachusetts Institute of Technology Exploratory data analysis using radial basis function latent variable models (AA33, Poster) A. Marrs, A. Webb DERA SMEM algorithm for mixture models (AA34, Poster) N. Ueda, R. Nakano, Z. Ghahramani, G. Hinton NTT Communication Science Laboratories, University of Toronto Learning mixture hierarchies (AA35, Poster) N. Vasconcelos, A. Lippman Massachusetts Institute of Technology On-line and batch parameter estimation of Gaussian mixtures based on the relative entropy (AA36, Poster) Y. Singer, M. Warmuth AT&T Labs, UC Santa Cruz Very fast EM-based mixture model clustering using multiresolution kd-trees (AA37, Poster) A. Moore Carnegie Mellon University Maximum conditional likelihood via bound maximization and the CEM algorithm (AA38, Poster) T. Jebara, A. Pentland Massachusetts Institute of Technology Lazy learning meets the recursive least squares algorithm (AA40, Poster) M. Birattari, G. Bontempi, H. Bersini Universite Libre de Bruxelles Global optimization of neural network models via sequential sampling (AA41, Poster) J. de Freitas, M. Niranjan, A. Doucet, A. 
Gee Cambridge University Regularizing AdaBoost (AA42, Poster) G. Raetsch, T. Onoda, K. Mueller GMD FIRST Using collective intelligence to route Internet traffic (AP10, Poster) D. Wolpert, K. Tumer, J. Frank NASA Ames Research Center Scheduling straight-line code using reinforcement learning and rollouts (AP11, Poster) A. McGovern, E. Moss University of Massachusetts Probabilistic modeling for face orientation discrimination: learning from labeled and unlabeled data (AP12, Poster) S. Baluja Carnegie Mellon University Adding constrained discontinuities to Gaussian process models of wind fields (AP14, Poster) D. Cornford, I. Nabney, C. Williams Aston University A principle for unsupervised hierarchical decomposition of visual scenes (CS5, Poster) M. Mozer University of Colorado Facial memory is kernel density estimation (almost) (CS6, Poster) M. Dailey, G. Cottrell, T. Busey UC San Diego, Indiana University Perceiving without learning: from spirals to inside/outside relations (CS7, Poster) K. Chen, D. Wang Ohio State University Utilizing time: asynchronous binding (CS8, Poster) B. Love Northwestern University A model for associative multiplication (CS9, Poster) G. Christianson, S. Becker McMaster University Analog VLSI cellular implementation of the boundary contour system (IM7, Poster) G. Cauwenberghs, J. Waskiewicz Johns Hopkins University An integrated vision sensor for the computation of optical flow singular points (IM9, Poster) C. Higgins, C. Koch California Institute of Technology Spike-based compared to rate-based Hebbian learning (NS9, Poster) R. Kempter, W. Gerstner, L. van Hemmen Technical University of Munich, Swiss Federal Institute of Technology Neuronal regulation implements efficient synaptic pruning (NS10, Poster) G. Chechik, I. Meilijson, E. Ruppin Tel-Aviv University Signal detection in noisy weakly-active dendrites (NS11, Poster) A. Manwani, C. 
Koch California Institute of Technology Influence of changing the synaptic transmitter release probability on contrast adaptation of simple cells in the primary visual cortex (NS12, Poster) P. Adorjan, K. Obermayer Technical University of Berlin Complex cells as cortically amplified simple cells (NS13, Poster) F. Chance, S. Nelson, L. Abbott Brandeis University Synergy and redundancy among brain cells of behaving monkeys (NS14, Poster) I. Gat, N. Tishby Hebrew University, NEC Research Institute Visualizing and analyzing single-trial event-related potentials (NS15, Poster) T-P. Jung, S. Makeig, M. Westerfield, J. Townsend, E. Courchesne, T. Sejnowski The Salk Institute, Naval Health Research Center, UC San Diego A reinforcement learning algorithm in partially observable environments using short-term memory (CN9, Poster) N. Suematsu, A. Hayashi Hiroshima City University Experiments with a memoryless algorithm which learns locally optimal stochastic policies for partially observable Markov decision processes (CN10, Poster) J. Williams, S. Singh University of Colorado The effect of eligibility traces on finding optimal memoryless policies in partially observable Markovian decision processes (CN11, Poster) J. Loch University of Colorado Learning macro-actions in reinforcement learning (CN12, Poster) J. Randlov Niels Bohr Institute Reinforcement learning based on on-line EM algorithm (CN13, Poster) M. Sato, S. Ishii ATR Human Information Processing Research Laboratories, Nara Institute of Science and Technology Controlling the complexity of HMM systems by regularization (SP4, Poster) C. Neukirchen, G. Rigoll Gerhard-Mercator-University Maximum-likelihood continuity mapping (MALCOM): an alternative to HMMs (SP5, Poster) D. Nix, J. Hogden Los Alamos National Laboratory On-line learning with restricted training sets: exact solution as benchmark for general theories (LT17, Poster) H. Rae, P. Sollich, A. 
Coolen King's College London, University of Edinburgh General bounds on Bayes errors for regression with Gaussian processes (LT19, Poster) M. Opper, F. Vivarelli Aston University Variational approximations of graphical models using undirected graphs (LT21, Poster) D. Barber, W. Wiegerinck University of Nijmegen Optimizing classifiers for imbalanced training sets (LT24, Poster) G. Karakoulas, J. Shawe-Taylor Canadian Imperial Bank of Commerce, University of London On the optimality of incremental neural network algorithms (LT25, Poster) R. Meir, V. Maiorov Technion General-purpose localization of textured image regions (VS8, Poster) R. Rosenholtz Xerox PARC A V1 model of pop out and asymmetry in visual search (VS9, Poster) Z. Li Massachusetts Institute of Technology Minutemax: a fast approximation for minimax learning (VS10, Poster) J. Coughlan, A. Yuille Smith-Kettlewell Eye Research Institute Using statistical properties of a labelled visual world to estimate scenes (VS11, Poster) W. Freeman, E. Pasztor Mitsubishi Electric Research Laboratory Example based image synthesis of articulated figures (VS12, Poster) T. Darrell Interval Research ******************************************************************************* From Bill_Warren at Brown.edu Wed Nov 11 15:54:12 1998 From: Bill_Warren at Brown.edu (Bill Warren) Date: Wed, 11 Nov 1998 15:54:12 -0500 (EST) Subject: Please post -- thanks! Message-ID: Please circulate to your graduating seniors: GRADUATE TRAINEESHIPS Visual Navigation in Humans and Robots Brown University The Department of Cognitive and Linguistic Sciences and the Department of Computer Science at Brown University are seeking graduate applicants interested in visual navigation in humans and robots. The project investigates the nature of the spatial knowledge that is used in active navigation, and how it interacts with the environmental layout, landmark properties, and navigational task during learning. 
The research is based in a unique virtual reality lab with a 40 x 40 ft wide-area tracker, Kaiser head-mounted display, and SGI Onyx 2 graphics. Human experiments study active navigation and landmark recognition in virtual environments, where 3D structure and properties are easily manipulated. In conjunction, biologically-inspired navigation strategies are tested on a mobile robot platform. Computational modeling pursues (a) reinforcement learning and hidden Markov models for spatial navigation and (b) a neural net model of the hippocampus. The project is under the direction of Leslie Kaelbling (Computer Science, www.cs.brown.edu), Michael Tarr and William Warren (Cognitive & Linguistic Sciences, www.cog.brown.edu). Three to four graduate traineeships are available, beginning in the Fall of 1999. Applicants should apply to either of these home departments. Application materials can be obtained from: The Graduate School, Brown University, Box 1867, Providence, RI 02912, phone (401) 863-2600, www.brown.edu. The application deadline is Jan. 1, 1999. This program is funded by a Learning and Intelligent Systems grant from NSF, and an IGERT training grant from NSF. -- Bill William H. Warren, Professor Dept. of Cognitive & Linguistic Sciences Box 1978 Brown University Providence, RI 02912 (401) 863-3980 ofc, 863-2255 FAX Bill_Warren at brown.edu From tani at csl.sony.co.jp Thu Nov 12 01:51:41 1998 From: tani at csl.sony.co.jp (Jun.Tani (SONY CSL)) Date: Thu, 12 Nov 1998 15:51:41 +0900 Subject: TR: An Interpretation of the `Self` from the Dynamical Systems Perspective: Message-ID: <199811120651.PAA12630@tani.csl.sony.co.jp> Dear connectionists, The following new technical report is available. "An Interpretation of the `Self` from the Dynamical Systems Perspective: A Constructivist Approach" by Jun Tani, Sony CSL. This paper will be published in the Journal of Consciousness Studies, Vol.5 No.5-6, 1998. 
The TR can be retrieved from: (1) ftp.csl.sony.co.jp/CSL/CSL-Papers/98/SCSL-TR-98-018.ps.Z (you need to uncompress in Unix.) (2) ftp.csl.sony.co.jp/CSL/CSL-Papers/98/SCSL-TR-98-018.pdf (3) http://www.csl.sony.co.jp/person/tani.html. ABSTRACT This study attempts to describe the notion of the "self" using dynamical systems language based on the results of our robot learning experiments. A neural network model consisting of multiple modules is proposed, in which the interactive dynamics between the bottom-up perception and the top-down prediction are investigated. Our experiments with a real mobile robot showed that the incremental learning of the robot switches spontaneously between steady and unsteady phases. In the steady phase, the top-down prediction for the bottom-up perception works well when coherence is achieved between the internal and the environmental dynamics. In the unsteady phase, conflicts arise between the bottom-up perception and the top-down prediction; the coherence is lost, and a chaotic attractor is observed in the internal neural dynamics. By investigating possible analogies between this result and the phenomenological literature on the "self", we draw the conclusions that (1) the structure of the "self" corresponds to the "open dynamic structure" which is characterized by co-existence of stability in terms of goal-directedness and instability caused by embodiment; (2) the open dynamic structure causes the system's spontaneous transition to the unsteady phase where the "self" becomes aware. Best regards, jun ------------------------------------------------ Jun TANI, Ph.D Senior Researcher Sony Computer Science Laboratory Inc. 
Takanawa Muse Building, 3-14-13 Higashi-gotanda, Shinagawa-ku, Tokyo, 141 JAPAN email: tani at csl.sony.co.jp http://www.csl.sony.co.jp/person/tani.html Fax +81-3-5448-4273 Tel +81-3-5448-4380 ** Joint Appointment Visiting Associate Professor Graduate School of Arts and Sciences University of Tokyo From bengioy at IRO.UMontreal.CA Thu Nov 12 10:27:42 1998 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Thu, 12 Nov 1998 10:27:42 -0500 Subject: post-doc position in Montreal / machine learning + bootstrap for database mining Message-ID: <19981112102742.45902@IRO.UMontreal.CA> POST DOCTORAL RESEARCH STAFF MEMBER IN STATISTICAL DATA ANALYSIS AND MACHINE LEARNING FOR HIGH-DIMENSIONAL DATA SETS MATHEMATICS OF INFORMATION TECHNOLOGY AND COMPLEX SYSTEMS (MITACS: a new Canadian Network of Centers of Excellence) Position to be held jointly at the Department of Computer Science & Operations Research and the Department of Mathematics and Statistics, at the UNIVERSITY OF MONTREAL, Quebec, Canada NATURE AND SCOPE OF THE POSITION: A post-doctoral position is available at the University of Montreal within the MITACS network of centers of excellence. The position is subject to the approval of funding by the MITACS Board of Directors. The main research area will be the statistical data analysis of high-dimensional data sets with machine learning algorithms, also known as "database mining". The main research questions to be addressed are the following: - how to deal with the "curse of dimensionality": algorithms based on variable selection or based on combining many variables with different importance, while controlling generalization to avoid overfitting; - how to make inferences on the models obtained with such methods, mostly using resampling methods such as the BOOTSTRAP. This research will be performed within the MITACS project entitled "INFERENCE FROM HIGH-DIMENSIONAL DATA". 
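As a rough, hypothetical illustration of the resampling idea mentioned above (not part of the project description), a percentile bootstrap for a simple statistic can be sketched in a few lines of Python; the data set and the choice of the mean as the statistic are made up for the example:

```python
# Hypothetical sketch of the percentile bootstrap: resample the data with
# replacement many times, recompute the statistic each time, and take
# percentiles of the resampled estimates as an approximate confidence interval.
import random

def bootstrap_ci(data, stat, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    n = len(data)
    estimates = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_resamples)
    )
    low = estimates[int((alpha / 2) * n_resamples)]
    high = estimates[int((1 - alpha / 2) * n_resamples) - 1]
    return low, high

if __name__ == "__main__":
    sample = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.1, 2.0, 2.2]
    low, high = bootstrap_ci(sample, lambda xs: sum(xs) / len(xs))
    print("95%% CI for the mean: (%.2f, %.2f)" % (low, high))
```

In the setting described in the posting, the statistic would be something far more expensive, e.g. the prediction error of a model refit on each resample, but the resampling loop has the same shape.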
See http://www.iro.umontreal.ca/~bengioy/mitacs.html for more information on the project and http://www.mitacs.math.ca for more information on the MITACS network. The candidate will be working under the supervision of professors Yoshua Bengio (computer science and operations research) and Christian Leger (mathematics and statistics). See http://www.iro.umontreal.ca/~bengioy and http://www.iro.umontreal.ca/~lisa for more information respectively on Yoshua Bengio and his laboratory. See http://www.dms.umontreal.ca/~leger for more information on Christian Leger. ESSENTIAL SKILLS, KNOWLEDGE, AND ABILITIES: Candidates must possess a recent Ph.D. in computer science, statistics, mathematics, or a related discipline, with a research background in machine learning (in particular neural networks) and/or computational statistical methods such as the bootstrap. Candidates must have excellent programming skills, with demonstrated experience. Experience in the following areas will be particularly valued: - Statistical data analysis in general, - bootstrapping methods in particular. - Machine learning algorithms in general, - artificial neural networks in particular. - Programming skills in general, - object-oriented programming, - participation in large-scale, multiple-author software projects, - experience with C, C++ and S-Plus languages, in particular. LENGTH OF EMPLOYMENT: 1 year (with possible renewal for 4 years total), starting as soon as possible. FOR FURTHER INFORMATION, PLEASE CONTACT: Yoshua Bengio bengioy at iro.umontreal.ca, 514-343-6804, fax 514-343-5834 or Christian Leger leger at dms.umontreal.ca 514-343-7824, fax 514-343-5700 Electronic applications (preferably as a postscript, raw text, or pdf file) are encouraged, in the form of a Curriculum Vitae with information on academic experience, academic standing, research experience, programming experience, and any other relevant information (e.g., pointer to your web site, if any). 
-- Yoshua Bengio Professeur agrégé Département d'Informatique et Recherche Opérationnelle Université de Montréal, C.P. 6128 Succ. Centre-Ville, 2920 Chemin de la Tour, Montreal, Quebec, Canada H3C 3J7 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/labs/neuro From jbednar at cs.utexas.edu Fri Nov 13 05:25:31 1998 From: jbednar at cs.utexas.edu (James A. Bednar) Date: Fri, 13 Nov 1998 04:25:31 -0600 (CST) Subject: self-organization software, papers, web demos Message-ID: <199811131025.EAA25845@mail.cs.utexas.edu> The following software package for self-organization in laterally connected maps is now available from the UTCS Neural Networks Research Group website http://www.cs.utexas.edu/users/nn. The software has been developed in the RF-LISSOM project of modeling the primary visual cortex, and is intended to serve as a starting point for computational studies of development and function of perceptual maps in general. Abstracts of two recent papers on the RF-LISSOM project, on segmentation and on internal pattern generators, are also included below. Other papers and demos of the RF-LISSOM software are available at http://www.cs.utexas.edu/users/nn/pages/research/selforg.html. - Jim, Yoonsuck, and Risto Software: ----------------------------------------------------------------------- RF-LISSOM: LATERALLY CONNECTED SELF-ORGANIZING MAPS http://www.cs.utexas.edu/users/nn/pages/software/abstracts.html#lissom James A. Bednar and Joseph Sirosh The LISSOM package contains the ANSI C source code and examples for training and testing RF-LISSOM. This implementation supports almost every modern single-processor workstation, as well as the Cray T3E massively parallel supercomputer. It is designed to have full functionality even when run in batch mode or remote mode, using a simple but powerful command file format.
It also has an interactive command-line prompt accepting the same commands, which makes it easy to test and explore different options in real time. Because of the focus on batch/remote use, it does not have a GUI, but it does create a wide variety of images for analysis and testing. It can display these images immediately or save them for later viewing. Sample command files are provided for running orientation and ocular-dominance map simulations on a variety of network and machine sizes. Extensive documentation is also included, all of which is also available via online help where appropriate. This implementation currently supports the LISSOM algorithm only, but it can serve as a good starting point for writing a batch-mode neural-network or related simulator. In particular, it includes independent and general-purpose routines for interprocessor communication, platform independence, console messaging, error handling, command-file/command-prompt processing, parameter setting/retrieving/bounds-checking, expression parsing, and online help. Papers: ----------------------------------------------------------------------- PATTERN-GENERATOR-DRIVEN DEVELOPMENT IN SELF-ORGANIZING MODELS James A. Bednar and Risto Miikkulainen In James M. Bower, editor, Computational Neuroscience: Trends in Research, 317-323. New York: Plenum, 1998. (7 pages) http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#bednar.cns97.ps.Z Self-organizing models develop realistic cortical structures when given approximations of the visual environment as input. Recently it has been proposed that internally generated input patterns, such as those found in the developing retina and in PGO waves during REM sleep, may have the same effect. Internal pattern generators would constitute an efficient way to specify, develop, and maintain functionally appropriate perceptual organization.
They may help express complex structures from minimal genetic information, and retain this genetic structure within a highly plastic system. Simulations with the RF-LISSOM orientation map model indicate that such preorganization is possible, providing a computational framework for examining how genetic influences interact with visual experience. The results from this paper can be reproduced with the LISSOM software described above. ----------------------------------------------------------------------- SELF-ORGANIZATION AND SEGMENTATION IN A LATERALLY CONNECTED ORIENTATION MAP OF SPIKING NEURONS Yoonsuck Choe and Risto Miikkulainen Neurocomputing, in press (20 pages) http://www.cs.utexas.edu/users/nn/pages/publications/abstracts.html#choe.neurocomputing98.ps.Z The RF-SLISSOM model integrates two separate lines of research on computational modeling of the visual cortex. Laterally connected self-organizing maps have been used to model how afferent structures such as orientation columns and patterned lateral connections can simultaneously self-organize through input-driven Hebbian adaptation. Spiking neurons with leaky integrator synapses have been used to model image segmentation and binding by synchronization and desynchronization of neuronal group activity. Although these approaches differ in how they model the neuron and what they explain, they share the same overall layout of a laterally connected two-dimensional network. This paper shows how both self-organization and segmentation can be achieved in such an integrated network, thus presenting a unified model of development and functional dynamics in the primary visual cortex. A demo of RF-SLISSOM can be run remotely over the internet at http://www.cs.utexas.edu/users/nn/pages/research/selforg.html. From niebur at russell.mb.jhu.edu Fri Nov 13 14:32:08 1998 From: niebur at russell.mb.jhu.edu (Prof. 
Ernst Niebur) Date: Fri, 13 Nov 1998 14:32:08 -0500 Subject: Graduate studies in systems and computational neuroscience at Johns Hopkins University Message-ID: <199811131932.OAA07483@russell.mb.jhu.edu> The Johns Hopkins University is a major private research university, and its hospital and medical school have consistently been rated first or second in the nation in recent years. The Department of Neuroscience ranks second in the nation (all according to 'US News and World Report'). The Zanvyl Krieger Mind/Brain Institute at Johns Hopkins encourages students with interest in systems neuroscience, including computational neuroscience, to apply for the graduate program in the Neuroscience department. The Institute is an interdisciplinary research center devoted to the investigation of the neural mechanisms of mental function and particularly to the mechanisms of perception: How is complex information represented and processed in the brain, how is it stored and retrieved, and which brain centers are critical for these operations? Research opportunities exist in all of the laboratories of the Institute. Interdisciplinary projects, involving the student in more than one laboratory, are particularly encouraged. All students accepted to the PhD program of the Neuroscience department receive full tuition remission plus a stipend at or above the National Institutes of Health predoctoral level. Additional information on the research interests of the faculty in the Mind/Brain Institute and the Department of Neuroscience can be obtained at http://www.mb.jhu.edu/mbi.html and at http://www.med.jhu.edu/neurosci/welcome.html, respectively. Applicants should have a B.S. or B.A. with a major in any of the biological or physical sciences. Applicants are required to take the Graduate Record Examination (GRE; both the aptitude tests and an advanced test), or the Medical College Admission Test (MCAT).
Further information on the admission procedure can be obtained from the Department of Neuroscience: Director of Graduate Studies Neuroscience Training Program Department of Neuroscience The Johns Hopkins University School of Medicine 725 Wolfe Street Baltimore, MD 21205 Completed applications (including three letters of recommendation and either GRE scores or Medical College Admission Test scores) must be received by January 1, 1999 at the above address. -- Ernst Niebur, PhD Krieger Mind/Brain Institute Asst. Prof. of Neuroscience Johns Hopkins University niebur at jhu.edu 3400 N. Charles Street (410)516-8643, -8640 (secr), -8648 (fax), -4357 (lab) Baltimore, MD 21218 From jkh at dcs.rhbnc.ac.uk Fri Nov 13 06:53:28 1998 From: jkh at dcs.rhbnc.ac.uk (Keith Howker) Date: Fri, 13 Nov 1998 11:53:28 -0000 Subject: NeuroCOLT2 - Recent technical Reports Message-ID: <01BE0EFC.3AF90BA0.jkh@dcs.rhbnc.ac.uk> The following NeuroCOLT2 Technical Reports are now available on our web site and ftp archive. best wishes, K.
++++++++++++++++++++++++++++
NC-TR-98-011 Matamala & Meer: On the computational structure of the connected components of a hard problem
NC-TR-98-019 Williamson, Smola & Schoelkopf: Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
NC-TR-98-020 Shawe-Taylor & Cristianini: Margin distribution bounds on generalization
NC-TR-98-021 Raetsch, Onoda & Mueller: Soft Margins for AdaBoost
NC-TR-98-022 Smola, Williamson & Schoelkopf: Generalization Bounds for Convex Combinations of Kernel Functions
NC-TR-98-023 Williamson, Smola & Schoelkopf: Entropy Numbers, Operators and Support Vector Kernels
NC-TR-98-024 Smola, Friess & Schoelkopf: Semiparametric support vector and linear programming machines
NC-TR-98-025 Auer, Cesa-Bianchi, Freund & Schapire: Gambling in the rigged casino: the adversarial multi-armed bandit problem
NC-TR-98-026 Cesa-Bianchi & Lugosi: On prediction of individual sequences
NC-TR-98-027 Smola, Williamson & Schoelkopf: Generalization bounds and learning rates for regularized principal manifolds
NC-TR-98-028 Smola, Mika & Schoelkopf: Quantization functionals and regularized principal manifolds
NC-TR-98-029 Shawe-Taylor & Cristianini: Robust bounds on generalization from the margin distribution
NC-TR-98-030 Smola & Schoelkopf: A tutorial on Support Vector regression
++++++++++++++++++++++++++++
GENERAL INFORMATION =================== The European Community ESPRIT Working Group in Neural and Computational Learning Theory (NeuroCOLT) has been funded for a further three years as the Working Group NeuroCOLT2. The overall objective of the Working Group is to demonstrate the effectiveness of technologies that arise from a deep understanding of the performance and implementation of learning systems on real-world data. We will continue to maintain the Technical Report Archive of papers produced by members or associates of the group's partners.
These results, together with other project information, may be accessed on our web site: http://www.neurocolt.com/. ACCESS INSTRUCTIONS =================== The files and abstracts may be accessed via WWW starting from the NeuroCOLT homepage: http://www.neurocolt.com/ or from the archive: ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports Alternatively, it is still possible to use ftp access as follows:

% ftp ftp.dcs.rhbnc.ac.uk (134.219.96.1)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports/1998
ftp> binary
ftp> get nc-tr-1998-001.ps.Z
ftp> bye
% zcat nc-tr-1998-001.ps.Z | lpr

Similarly for the other technical reports. In some cases there are two files available, for example, nc-tr-97-002-title.ps.Z and nc-tr-97-002-body.ps.Z. The first contains the title page while the second contains the body of the report. The single command, ftp> mget nc-tr-97-002*, will prompt you for the files you require. --------------------------------------------------------------------- | Keith Howker | e-mail: jkh at dcs.rhbnc.ac.uk | | Dept. of Computer Science | Phone : +44 1784 443696 | | RHUL | Fax : +44 1784 439786 | | EGHAM TW20 0EX, UK | Home: +44 1932 222529 | --------------------------------------------------------------------- From thimm at idiap.ch Mon Nov 16 12:05:20 1998 From: thimm at idiap.ch (Georg Thimm) Date: Mon, 16 Nov 1998 18:05:20 +0100 Subject: Contents of Neurocomputing 21 (1998) Message-ID: <199811161705.SAA09282@rotondo.idiap.ch> Dear reader, Please find below a compilation of the contents for Neurocomputing and Scanning the Issue written by V. David Sánchez A. More information on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301. The contents of this and other journals published by Elsevier are also distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect). Please feel free to redistribute this message.
My apologies if this message is inappropriate for this mailing list; I would appreciate any feedback. With kindest regards, Georg Thimm Dr. Georg Thimm Research scientist & WWW: http://www.idiap.ch/~thimm Current Events Editor of Neurocomputing Tel.: ++41 27 721 77 39 (Fax: 12) IDIAP / C.P. 592 / 1920 Martigny / Suisse E-mail: thimm at idiap.ch
********************************************************************************
Journal : NEUROCOMPUTING ISSN : 0925-2312 Vol./Iss. : 21 / 1-3
The self-organizing map Kohonen, Teuvo pp.: 1-6
TEXSOM: Texture segmentation using self-organizing maps Ruiz-del-Solar, Javier pp.: 7-18
Self-organizing maps of symbol strings Kohonen, Teuvo pp.: 19-30
SOM accelerator system Rüping, S. pp.: 31-50
Sufficient conditions for self-organisation in the one-dimensional SOM with a reduced width neighbourhood Flanagan, John A. pp.: 51-60
Text classification with self-organizing maps: Some lessons learned Merkl, Dieter pp.: 61-77
Local linear correlation analysis with the SOM Piras, Antonio pp.: 79-90
Applications of the growing self-organizing map Villmann, Th. pp.: 91-100
WEBSOM -- Self-organizing maps of document collections Kaski, Samuel pp.: 101-117
Theoretical aspects of the SOM algorithm Cottrell, M. pp.: 119-138
Self-organization and segmentation in a laterally connected orientation map of spiking neurons Choe, Yoonsuck pp.: 139-158
Neural detection of QAM signal with strongly nonlinear receiver Raivio, Kimmo pp.: 159-171
Self-organizing maps: Generalizations and new optimization techniques Graepel, Thore pp.: 173-190
Predicting bankruptcies with the self-organizing map Kiviluoto, Kimmo pp.: 191-201
Developments of the generative topographic mapping Bishop, Christopher M. pp.: 203-224
********************************************************************************
Neurocomputing 21 (1998) Scanning the issue T.
Kohonen presents in The self-organizing map (SOM) an overview of the SOM algorithm, which realizes a data-compressing mapping of a high-dimensional data distribution onto a regular lower-dimensional grid. The SOM preserves the most important topological and metric relationships of the original data; this leads to its main properties of visualization and abstraction. In TEXSOM: Texture segmentation using self-organizing maps J. Ruiz-del-Solar describes a texture segmentation architecture based on the joint spatial/spatial-frequency paradigm. The Adaptive-Subspace Self-Organizing Map (ASSOM) or the Supervised ASSOM (SASSOM) are used to automatically generate the oriented filters. Defect identification on textured images can be performed using the proposed architecture. T. Kohonen and P. Somervuo present Self-organizing maps of symbol strings. Instead of defining metric vector spaces to be used with the self-organizing map (SOM), as is usually the case, symbol strings are organized on a SOM array. The batch map principle and averaging over strings are applied. The Feature Distance (FD) between the symbol strings and Redundant Hash Addressing (RHA) are employed for the construction of large SOM arrays. S. Rüping, M. Porrmann, and U. Rückert describe the SOM accelerator system. The system is a massively parallel system based on the NBISOM-25 chips. Each chip contains an array of five-by-five processing elements. A VMEbus board with sixteen chips on it was built. The model vectors have up to 64 weights with 8-bit accuracy. The system performance is 2.4 GCUPS for learning and 4.1 GCPS for recall. J.A. Flanagan presents Sufficient conditions for self-organisation in the one-dimensional SOM with a reduced width neighbourhood. To achieve self-organization, three intervals of non-zero probability separated by a minimum distance are required. The conditions are sufficient, but appear to be close to necessary. D.
Merkl describes in Text classification with self-organizing maps: Some lessons learned the application of a hierarchical neural network approach to the task of document classification. The different network levels are realized by independent SOMs, allowing the selection of different levels of granularity while exploring the document collection. A. Piras and A. Germond present in Local linear correlation analysis with the SOM an extension to the SOM algorithm for selecting relevant input variables in nonlinear regression. The linear correlation between variables in neighbor spaces is studied using the SOM. A localized correlation coefficient is determined that allows the understanding of the varying dependencies over the definition manifold. In Applications of the growing self-organizing map T. Villmann and H.-U. Bauer describe the growing self-organizing map (GSOM) algorithm. This extension of the SOM algorithm adapts the topology of the map output space and allows for unsupervised generation of dimension-reducing projections with optimal neighborhood preservation. In WEBSOM - Self-organizing maps of document collections S. Kaski, T. Honkela, K. Lagus, and T. Kohonen describe a new method to organize large collections of full-text documents in electronic form. A histogram of word categories is used to encode each document. The documents are ordered in a document map according to their contents. Computationally efficient SOM algorithms were used. In Theoretical aspects of the SOM algorithm M. Cottrell, J.C. Fort, and G. Pagès review the status quo of the efforts to understand the SOM algorithm theoretically. The study of the one-dimensional case is complete with the exception of finding the appropriate decreasing rate to ensure ordering. The study of the multi-dimensional case is difficult and far from being complete. Y. Choe and R. Miikkulainen present Self-organization and segmentation in a laterally connected orientation map of spiking neurons.
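As background for the papers reviewed in this issue, the basic incremental SOM update can be sketched in a few lines (a toy illustration of mine; the 1-D grid, decay schedules, and Gaussian neighbourhood are arbitrary choices, not taken from any of the papers above):

```python
import math
import random

def train_som(data, n_units=10, n_steps=500, seed=0):
    """Toy 1-D self-organizing map: high-dimensional inputs are mapped
    onto a regular 1-D grid of units while preserving topology."""
    rng = random.Random(seed)
    dim = len(data[0])
    # Initialize the model (weight) vectors randomly.
    w = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(n_steps):
        x = data[rng.randrange(len(data))]
        # Best-matching unit: smallest squared Euclidean distance.
        bmu = min(range(n_units),
                  key=lambda i: sum((w[i][d] - x[d]) ** 2 for d in range(dim)))
        # Learning rate and neighbourhood width decay over time.
        lr = 0.5 * (1 - t / n_steps)
        sigma = 1 + (n_units / 2) * (1 - t / n_steps)
        for i in range(n_units):
            # Gaussian neighbourhood on the grid, centred on the BMU.
            h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
            for d in range(dim):
                w[i][d] += lr * h * (x[d] - w[i][d])
    return w

# Two well-separated 2-D clusters as input data.
rng = random.Random(1)
data = ([[rng.gauss(0.1, 0.02), rng.gauss(0.1, 0.02)] for _ in range(50)] +
        [[rng.gauss(0.9, 0.02), rng.gauss(0.9, 0.02)] for _ in range(50)])
weights = train_som(data)
```

After training, neighbouring grid units hold similar model vectors; this is the topology preservation whose theory the review above discusses.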
The model achieves self-organization based on rudimentary visual input and develops orientation-selective receptive fields and patterned lateral connections forming a global orientation map. Crucial principles of the biological visual cortex are incorporated. In Neural detection of QAM signal with strongly nonlinear receiver K. Raivio, J. Henriksson, and O. Simula describe neural receiver structures for adaptive discrete-signal detection. A receiver structure based on the self-organizing map (SOM) is compared with the conventional Decision Feedback Equalizer (DFE) for a Quadrature Amplitude Modulated (QAM) transmitted signal. T. Graepel, M. Burger and K. Obermayer describe in Self-organizing maps: Generalizations and new optimization techniques three algorithms for generating topographic mappings based on cost function minimization using an EM algorithm and deterministic annealing. The algorithms described are the Soft Topographic Vector Quantization (STVQ) algorithm, the Kernel-based Soft Topographic Mapping (STMK) algorithm, and the Soft Topographic Mapping for Proximity data (STMP) algorithm. K. Kiviluoto presents Predicting bankruptcies with the self-organizing map. The qualitative analysis of going bankrupt is performed using the SOM algorithm in a supervised manner. The quantitative analysis is performed using three classifiers: SOM-1, SOM-2, and RBF-SOM. Results are compared to Linear Discriminant Analysis (LDA) and Learning Vector Quantization (LVQ). C.M. Bishop, M. Svensén, and C.K.I. Williams present Developments of the generative topographic mapping (GTM), which is an enhancement of the standard SOM algorithm with a number of advantages. Extensions of the GTM are reported: the use of an incremental EM algorithm, the use of local subspace models, mixing discrete and continuous data, and the use of semi-linear models and high-dimensional manifolds. I appreciate the cooperation of all those who submitted their work for inclusion in this issue. V.
David Sánchez A. Editor-in-Chief From baveja at research.att.com Mon Nov 16 10:17:45 1998 From: baveja at research.att.com (satinder singh baveja) Date: Mon, 16 Nov 1998 10:17:45 -0500 Subject: Call for Papers: Machine Learning Journal special issue on REINFORCEMENT LEARNING Message-ID: <36504219.DEA39F52@research.att.com> Machine Learning Journal Special Issue on REINFORCEMENT LEARNING Edited by Satinder Singh Submission Deadline: March 1, 1999 Reinforcement learning has become an exciting focus for research in machine learning and artificial intelligence. Some of this new excitement comes from a convergence of computational approaches to planning and learning in large-scale environments, some from accumulating empirical evidence that reinforcement learning algorithms can solve significant real-world problems, and some from the ever-increasing mathematical maturity of this field. In this special issue we would like to capture a snapshot of this recent excitement, and invite submission of papers describing new work in reinforcement learning. New algorithms, problem frameworks, theoretical results, and empirical results are all appropriate contributions. Submission Deadline: March 1, 1999 Expected publication date: January 2000 SUBMISSION INSTRUCTIONS *********************** Submissions will undergo the standard Machine Learning journal review process. Send one hard copy of submissions to: Satinder Singh AT&T Shannon Lab Room A269 180 Park Avenue Florham Park, NJ 07932 USA Fax: (973) 360-8970 E-mail: baveja at research.att.com If possible, e-mail the abstract separately to baveja at research.att.com, even before it is final, so that reviewers can be allocated efficiently.
Also mail five hard copies of submitted papers to: Karen Cullen MACHINE LEARNING Editorial Office Kluwer Academic Publishers 101 Philip Drive Assinippi Park Norwell, MA 02061 USA phone: (617) 871-6300 E-mail: karen at world.std.com **** PLEASE INDICATE that your submission is for the REINFORCEMENT LEARNING SPECIAL ISSUE. **** Submissions substantially longer than 24 pages when formatted according to the journal's style (pointers to which are below) may be rejected without review. Note: Machine Learning is now accepting submissions of FINAL copy in electronic form. A LaTeX style file and related files are available via the URL: http://www.wkap.nl/journalhome.htm/0885-6125 Authors are strongly encouraged to use these style files or the formatting instructions stated at the back of the Machine Learning journal for their submissions. From omlin at waterbug.cs.sun.ac.za Mon Nov 16 08:29:55 1998 From: omlin at waterbug.cs.sun.ac.za (Christian Omlin) Date: Mon, 16 Nov 1998 15:29:55 +0200 Subject: preprints available Message-ID: The technical reports below are available from my homepage at http://www.cs.sun.ac.za/people/staff/omlin.html. Comments are welcome. My apologies if you receive multiple copies of this announcement. Best regards, Christian Christian W. Omlin e-mail: omlin at cs.sun.ac.za Department of Computer Science phone (direct): +27-21-808-4932 University of Stellenbosch phone (secretary): +27-21-808-4232 Private Bag X1 fax: +27-21-808-4416 Stellenbosch 7602 http://www.cs.sun.ac.za/people/staff/omlin.html SOUTH AFRICA http://www.neci.nj.nec.com/homepages/omlin ------------------------------------------------------------------------------------ Recurrent Neural Networks Learn Deterministic Representations of Fuzzy Finite-State Automata C.W. Omlin$^a$, C.L.
Giles$^{b,c}$ $^a~$Department of Computer Science University of Stellenbosch 7600 Stellenbosch South Africa $^b~$NEC Research Institute 4 Independence Way Princeton, NJ 08540 USA $^c~$UMIACS University of Maryland College Park, MD 20742 USA ABSTRACT The paradigm of deterministic finite-state automata (DFAs) and their corresponding regular languages has been shown to be very useful for addressing fundamental issues in recurrent neural networks. The issues that have been addressed include knowledge representation, extraction, and refinement, as well as the development of advanced learning algorithms. Recurrent neural networks are also a very promising tool for modeling discrete dynamical systems through learning, particularly when partial prior knowledge is available. The drawback of the DFA paradigm is that it is inappropriate for modeling vague or uncertain dynamics; however, many real-world applications deal with vague or uncertain information. One way to model vague information in a dynamical system is to allow for vague state transitions, i.e. the system may be in several states at the same time with varying degrees of certainty; fuzzy finite-state automata (FFAs) are a formal equivalent of such systems. It is therefore of interest to study how uncertainty in the form of FFAs can be modeled by deterministic recurrent neural networks. We have previously proven that second-order recurrent neural networks are able to represent FFAs, i.e. recurrent networks can be constructed that assign fuzzy memberships to input strings with arbitrary accuracy. In such networks, the classification performance is independent of the string length. In this paper, we are concerned with recurrent neural networks that have been trained to behave like FFAs. In particular, we are interested in the internal representation of fuzzy states and state transitions and in the extraction of knowledge in symbolic form.
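For readers unfamiliar with the architecture, a second-order recurrent network of the kind referred to in this abstract computes its next state from products of current state and input activations. A minimal sketch follows (sizes and random weights are illustrative only; the papers construct specific weight values to realize given automata):

```python
import math
import random

def second_order_step(state, inp, W):
    """One second-order recurrent update:
    S_i(t+1) = g( sum_{j,k} W[i][j][k] * S_j(t) * I_k(t) ),
    with a sigmoid discriminant function g."""
    n, m = len(state), len(inp)
    return [
        1.0 / (1.0 + math.exp(-sum(W[i][j][k] * state[j] * inp[k]
                                   for j in range(n) for k in range(m))))
        for i in range(n)
    ]

def run(string, W, n_states, n_symbols):
    """Process a string of symbol indices, starting in state unit 0
    with one-hot input encoding."""
    state = [1.0] + [0.0] * (n_states - 1)
    for sym in string:
        inp = [1.0 if k == sym else 0.0 for k in range(n_symbols)]
        state = second_order_step(state, inp, W)
    return state

rng = random.Random(0)
n_states, n_symbols = 4, 2
W = [[[rng.uniform(-1, 1) for _ in range(n_symbols)]
      for _ in range(n_states)] for _ in range(n_states)]
final = run([0, 1, 1, 0], W, n_states, n_symbols)
```

With weights programmed to suitable values rather than drawn at random, the same update form can realize the fuzzy state transitions of an FFA; the random weights here merely show the shape of the dynamics.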
------------------------------------------------------------------------------------ Equivalence in Knowledge Representation: Automata, Recurrent Neural Networks, and Dynamical Fuzzy Systems C.W. Omlin$^a$, C.L. Giles$^{b,c}$, K.K. Thornber$^b$ $^a~$Department of Computer Science University of Stellenbosch 7600 Stellenbosch South Africa $^b~$NEC Research Institute 4 Independence Way Princeton, NJ 08540 USA $^c~$UMIACS University of Maryland College Park, MD 20742 USA ABSTRACT Neuro-fuzzy systems - the combination of artificial neural networks with fuzzy logic - are becoming increasingly popular. However, neuro-fuzzy systems need to be extended for applications which require context (e.g., speech, handwriting, control). Some of these applications can be modeled in the form of finite-state automata. Previously, it was proved that deterministic finite-state automata (DFAs) can be stably synthesized or mapped into second-order recurrent neural networks with sigmoidal discriminant functions and sparse interconnection topology by programming the networks' weights to $+H$ or $-H$. Based on those results, this paper proposes a synthesis method for mapping fuzzy finite-state automata (FFAs) into recurrent neural networks which is suitable for implementation in VLSI, i.e. the encoding of FFAs is a generalization of the encoding of DFAs. The synthesis method requires FFAs to undergo a transformation prior to being mapped into recurrent networks. Their neurons have a slightly enriched functionality in order to accommodate a fuzzy representation of FFA states, i.e. any state can be occupied with a fuzzy membership that takes on values in the range $[0, 1]$ and several fuzzy states can be occupied at any given time. [This is in contrast to stochastic finite-state automata where there exists no ambiguity about which is an automaton's current state.
The automaton can only be in exactly one state at any given time and the choice of a successor state is determined by some probability distribution. For a discussion of the relation between probability and fuzziness, see for instance.] The enriched neuron functionality allows fuzzy parameters of FFAs to be directly represented as parameters of the neural network. In this paper we prove the stability of fuzzy finite-state dynamics of constructed neural networks for finite values of network weight $H$ and through simulations give empirical validation of the proofs. ------------------------------------------------------------------------------------ Dynamic Adaptation of Recurrent Neural Network Architectures Guided By Symbolic Knowledge C.W. Omlin$^a$, C.L. Giles$^{b,c}$ $^a~$Department of Computer Science University of Stellenbosch 7600 Stellenbosch South Africa $^b~$NEC Research Institute 4 Independence Way Princeton, NJ 08540 USA $^c~$UMIACS University of Maryland College Park, MD 20742 USA ABSTRACT The success and the time needed to train neural networks with a given learning algorithm depend on the learning task, the initial conditions, and the network architecture. In particular, the number of hidden units in feedforward and recurrent neural networks is an important factor. We propose a novel method for dynamically adapting the architecture of recurrent neural networks trained to behave like deterministic finite-state automata (DFAs). It differs from other constructive approaches in so far as our method relies on algorithms for extracting and inserting symbolic knowledge in the form of DFAs. The network architecture (number of neurons and weight configuration) changes during training based on the symbolic information extracted from undertrained networks. We successfully trained recurrent networks to recognize strings of a regular language accepted by a non-trivial, randomly generated deterministic finite-state automaton.
Our empirical results indicate that symbolically driven network adaptation results in networks that generalize better than networks trained without dynamic network adaptation or networks trained with standard dynamic growing methods. From steve at cns.bu.edu Wed Nov 18 22:11:09 1998 From: steve at cns.bu.edu (Stephen Grossberg) Date: Wed, 18 Nov 1998 23:11:09 -0400 Subject: possible listing Message-ID: The following preprint on the subject of how the visual cortex is organized to give rise to visual percepts has been accepted for publication in Spatial Vision and is currently in press. The paper is available from my home page: http://cns-web.bu.edu/Profiles/Grossberg/ Paper copies are available through the mail. Contact: amos at cns.bu.edu and request by title: Grossberg, S. (1998). How does the cerebral cortex work? Learning, attention, and grouping by the laminar circuits of visual cortex. Spatial Vision, in press. Abstract: The organization of neocortex into layers is one of its most salient anatomical features. These layers include circuits that form functional columns in cortical maps. A major unsolved problem concerns how bottom-up, top-down, and horizontal interactions are organized within cortical layers to generate adaptive behaviors. This article models how these interactions help visual cortex to realize: (1) the binding process whereby cortex groups distributed data into coherent object representations; (2) the attentional process whereby cortex selectively processes important events; and (3) the developmental and learning processes whereby cortex shapes its circuits to match environmental constraints. New computational ideas about feedback systems suggest how neocortex develops and learns in a stable way, and why top-down attention requires converging bottom-up inputs to fully activate cortical cells, whereas perceptual groupings do not. A related article applies these insights to the processing of Synthetic Aperture Radar images.
This article is in press in Neural Networks. It is: Mingolla, E., Ross, W., and Grossberg, S. (1998). A Neural Network for Enhancing Boundaries and Surfaces in Synthetic Aperture Radar Images. Paper copies can be obtained by writing amos at cns.bu.edu. From mieko at hip.atr.co.jp Wed Nov 18 04:28:33 1998 From: mieko at hip.atr.co.jp (Mieko Namba) Date: Wed, 18 Nov 1998 18:28:33 +0900 Subject: Announcement of Brain Science of Communications Symposium Message-ID: <199811180928.SAA22769@mailhost.hip.atr.co.jp> Dear members, I am glad to inform you that the "Brain Science of Communications Symposium" will be held as below. The project "Communications of Primates including Humans" is supported by the Science and Technology Agency Special Coordination Fund Projects for Strategic Promotion System for Brain Science. Please join us if you have an interest. For details of the project: http://hozu.ctr.atr.co.jp/kbp/index.html For details of the symposium: URL:http://hozu.ctr.atr.co.jp/kbp/symposium.html Mitsuo Kawato ATR Human Information Processing Res. Labs. ****************************************************************** Brain Science of Communications Symposium ****************************************************************** The Science and Technology Agency Special Coordination Fund Project Strategic Promotion System for Brain Science "Communications of Primates including Humans" Date: December 14 (Mon) and 15 (Tue), 1998 Place: The Keihanna Plaza, Kansai Science City Fee: Free (¥3000 charge for the party) ----------------------------------------------------------------- Dec. 14, 1998 Monday 13:00-18:00 Language: English ----------------------------------------------------------------- 13:00-13:15 Welcome & Introduction Yoh'ichi Tohkura (NTT Basic Res. Labs., ATR-I) 13:15-14:00 Invited Lecture Computational Models of Cortical-Cerebellar and Cortical-Basal Ganglionic Control and Cognition James C.
Houk (Northwestern University Medical School)
14:10-14:50 Role of the basal ganglia in learning sequential motor tasks Minoru Kimura (School of Health and Sport Sciences, Osaka Univ.)
15:00-15:40 The roles of reward prediction in motor control and communication Kenji Doya (Kawato Dynamic Brain Project, ERATO)
16:00-16:40 Cerebellar simple spikes and complex spikes during arm movements Shigeru Kitazawa (Electrotechnical Lab.)
16:50-17:30 Computational theory of communication based on multiple paired forward-inverse models Mitsuo Kawato (ATR Human Information Processing Res. Labs.)
17:30-18:00 Discussion
18:00-19:30 Party
-----------------------------------------------------------------
Dec. 15, 1998 Tuesday 10:00-17:30 Language: Japanese
-----------------------------------------------------------------
10:00-10:40 Universal Properties of Language: Embedding and Long-Distance Dependency Yukio Otsu (Inst. of Cultural and Linguistic Studies, Keio Univ.)
10:50-11:30 Brain mechanisms of verbal and non-verbal communication Toshio Inui (Dept. of Intelligence Science and Technology, Graduate School of Informatics, Kyoto Univ.)
11:30-12:00 Discussion
12:00-13:20 Lunch Break
13:20-14:00 FMRI and MEG study of neural activity during judgments of visually presented characters and character strings Norio Fujimaki (Fujitsu Labs. Ltd., Communications Res. Lab.)
14:10-14:50 Neuronal representation of facial information for communication: Single neurons in the temporal cortex encode finer facial information with a categorized header Shigeru Yamane (Electrotechnical Lab.)
15:10-15:50 Development of Neuromagnetic virtual sensor system Yoshida Yoshikazu (Technology Res. Lab., Shimadzu Corporation)
16:00-16:40 The study of visual grouping in human brain using high spatial resolution MEG Keisuke Toyama (Dept. of Physiology, Kyoto Prefectural Univ.
of Medicine)
16:40-17:20 Discussion
17:20-17:30 Closing Remarks Yoh'ichi Tohkura (NTT Basic Research Laboratories, ATR-I)
Registration Form: Kansai Brain Project Symposium (e-mail to the secretariat: ikeda at ctr.atr.co.jp)
-------------------------------------------------------------
Name:
-------------------------------------------------------------
Institute:
-------------------------------------------------------------
TEL:
-------------------------------------------------------------
e-mail:
-------------------------------------------------------------
Day 1 (12/14) Attending/Not attending
-------------------------------------------------------------
Party (We will collect ¥3000 per person, ¥1000 for students at the front) Evening of Day 1 (12/14) Attending/Not attending
-------------------------------------------------------------
Day 2 (12/15) Attending/Not attending
-------------------------------------------------------------
****************************************************************** end. =========================================================
Mieko Namba
Secretary to Dr. Mitsuo Kawato
Editorial Administrator of NEURAL NETWORKS
ATR Human Information Processing Research Laboratories
2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
TEL +81-774-95-1058 FAX +81-774-95-1008
E-MAIL mieko at hip.atr.co.jp
=========================================================
From Dave_Touretzky at cs.cmu.edu Wed Nov 18 04:28:36 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Wed, 18 Nov 1998 04:28:36 -0500 Subject: Graduate training in the Neural Basis of Cognition Message-ID: <12550.911381316@skinner.boltz.cs.cmu.edu> Graduate Training with the Center for the Neural Basis of Cognition The Center for the Neural Basis of Cognition offers an interdisciplinary doctoral training program operated jointly with nine affiliated PhD programs at Carnegie Mellon University and the University of Pittsburgh.
Detailed information about this program is available on our web site at http://www.cnbc.cmu.edu. The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine. CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies. Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, or robotics. For a brochure describing the program and application materials, contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu Application materials are also available online. 
The affiliated PhD programs at the two universities are:

Carnegie Mellon        University of Pittsburgh
Biological Sciences    Mathematics
Computer Science       Neurobiology
Psychology             Neuroscience
Robotics               Psychology
Statistics

The CNBC training faculty includes:
German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice
Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex
Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving
B.J. Casey (Pitt Psychology): attention; developmental cognitive neuroscience
Carson Chow (Pitt Mathematics): spatiotemporal dynamics in neural networks
Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex
Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease
William Eddy (CMU Statistics): analysis of fMRI data
Bard Ermentrout (Pitt Mathematics): oscillations in neural systems
Julie Fiez (Pitt Psychology): fMRI studies of language
Chris Genovese (CMU Statistics): making inferences from scientific data
John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia
Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex
Marcel Just (CMU Psychology): visual thinking, language comprehension
Robert Kass (CMU Statistics): transmission of info. by collections of neurons
Eric Klann (Pitt Neuroscience): hippocampal LTP and LTD
Roberta Klatzky (CMU Psychology): human perception and cognition
Richard Koerber (Pitt Neurobiology): devel. and plasticity of spinal networks
Tai Sing Lee (CMU Comp. Sci.): primate visual cortex; computer vision
Pat Levitt (Pitt Neurobiology): molecular basis of cortical development
Michael Lewicki (CMU Comp.
Sci.): learning and representation
David Lewis (Pitt Neuroscience): anatomy of frontal cortex
Brian MacWhinney (CMU Psychology): models of language acquisition
James McClelland (CMU Psychology): connectionist models of cognition
Paula Monaghan Nichols (Pitt Neurobiology): vertebrate CNS development
Carl Olson (CNBC): spatial representations in primate frontal cortex
David Plaut (CMU Psychology): connectionist models of reading
Michael Pogue-Geile (Pitt Psychology): development of schizophrenia
John Pollock (CMU Biological Sci.): neurodevelopment of the fly visual system
Walter Schneider (Pitt Psych.): fMRI, models of attention & skill acquisition
Charles Scudder (Pitt Neurobiology): motor learning in cerebellum
Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system
Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex
William Skaggs (Pitt Neuroscience): representations in rodent hippocampus
David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning
See http://www.cnbc.cmu.edu for further details. From omlin at waterbug.cs.sun.ac.za Thu Nov 19 04:23:36 1998 From: omlin at waterbug.cs.sun.ac.za (Christian Omlin) Date: Thu, 19 Nov 1998 11:23:36 +0200 Subject: preprints of technical reports - URL correction Message-ID: Dear List Members Through an oversight on my part, I gave the wrong URL when I announced preprints of technical reports a few days ago; my apologies. The correct URL is http://www.cs.sun.ac.za/people/staff/omlin The link to the technical reports is all the way at the bottom of the page. Alternatively, the papers may now also be downloaded from the U.S. mirror site at http://www.neci.nj.nec.com/homepages/omlin Again, my apologies for the confusion. We welcome your comments.
Best regards, Christian From S.Singh at exeter.ac.uk Thu Nov 19 05:53:33 1998 From: S.Singh at exeter.ac.uk (Sameer Singh) Date: Thu, 19 Nov 1998 10:53:33 +0000 (GMT Standard Time) Subject: PhD Studentship in Financial Forecasting Using Neural Networks Message-ID: PhD STUDENTSHIP IN FINANCIAL FORECASTING SCHOOL OF ENGINEERING AND COMPUTER SCIENCE Department of Computer Science Applications are now invited for a PhD studentship in the area of "Financial Forecasting using Neural Networks". The project will develop student skills in areas including neural networks, financial markets, forecasting, and pattern recognition. Candidates for this studentship should have a degree in computer science, engineering or a related subject. They should have programming skills in C/C++/Java and knowledge of the Unix operating system. The studentship is only open to UK and EU applicants, the latter being on a fees-only basis. The successful candidate should expect to take up the studentship no later than 1 February, 1999. Applicants should send a CV, including the names and addresses of two referees, to Dr Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK. Applications can also be emailed to s.singh at exeter.ac.uk and informal enquiries can be made at +44-1392-264053.
-------------------------------------------- Sameer Singh Department of Computer Science University of Exeter Exeter EX4 4PT UK tel: +44-1392-264053 fax: +44-1392-264067 email: s.singh at exeter.ac.uk web: http://www.dcs.exeter.ac.uk/academics/sameer -------------------------------------------- From steve at cns.bu.edu Fri Nov 20 20:44:14 1998 From: steve at cns.bu.edu (Stephen Grossberg) Date: Fri, 20 Nov 1998 21:44:14 -0400 Subject: a special issue of interest to Connectionists Message-ID: Below is a compilation of the contents for the 1998 Special Issue of Neural Networks on "Neural Control and Robotics: Biology and Technology", edited by Rodney Brooks, Stephen Grossberg, and Lance Optican. The contents of this and other journals published by Elsevier are distributed by the ContentsDirect service (http://www.elsevier.nl/locate/ContentsDirect). Journal: Neural Networks, Volume 11(7/8), 1998 VISION SENSORS: Vladimir M. Brajovic, Ryohei Miyagawa, and Takeo Kanade: Temporal photoreception for adaptive dynamic range image sensing and encoding pp. 1149-1158 CONTROL OF EYE AND HEAD MOVEMENTS: Gregory Gancarz and Stephen Grossberg: A neural model of the saccade generator in the reticular formation pp. 1159-1174 Philippe Lefevre, Christian Quaia, and Lance M. Optican: Distributed model of control of saccades by superior colliculus and cerebellum pp. 1175-1190 Francesco Panerai and Giulio Sandini: Oculo-motor stabilization reflexes: Integration of inertial and visual information pp. 1191-1204 Paul Dean and John Porrill: Pseudo-inverse control in biological systems: A learning mechanism for fixation stability pp. 1205-1218 Michael G. Paulin: A method for analysing neural computation using receptive fields in state space pp. 1219-1228 Christian Quaia, Lance M. Optican, and Michael E. Goldberg: The maintenance of spatial accuracy by the perisaccadic remapping of visual receptive fields pp. 1229-1240 Jeffrey D. Schall and Doug P. 
Hanes: Neural mechanisms of selection and control of visually guided eye movements pp. 1241-1251 H. Sebastian Seung: Continuous attractors and oculomotor control pp. 1253-1258 Yasuo Kuniyoshi and Luc Berthouze: Neural learning of embodied interaction dynamics pp. 1259-1276 CONTROL OF ARM AND OTHER BODY MOVEMENTS: Andrew H. Fagg and Michael A. Arbib: Modeling parietal-premotor interactions in primate control of grasping pp. 1277-1303 J.J. van Heijst, J.E. Vos, and D. Bullock: Development in a biologically inspired spinal neural network for movement control pp. 1305-1316 Daniel M. Wolpert and Mitsuo Kawato: Multiple paired forward and inverse models for motor control pp. 1317-1329 Hiroyuki Miyamoto and Mitsuo Kawato: A tennis serve and upswing learning robot based on bi-directional theory pp. 1331-1344 Joseph A. Doeringer and Neville Hogan: Serial processing in human movement production pp. 1345-1356 H.A. Talebi, K. Khorasani, and R.V. Patel: Neural network based control schemes for flexible-link manipulators: Simulations and experiments pp. 1357-1377 Matthew M. Williamson: Neural control of rhythmic arm movements pp. 1379-1394 Jean-Luc Buessler and Jean-Philippe Urban: Visually guided movements: Learning with modular neural maps in robotics pp. 1385-1415 Pietro G. Morasso, Vittorio Sanguineti, Francesco Frisone, and Luca Perico: Coordinate-free sensorimotor processing: Computing with population codes pp. 1417-1428 Bjorn O. Peters, Gert Pfurtscheller, and Henrik Flyvbjerg: Mining multi-channel EEG for its information content: An ANN-based method for a brain-computer interface pp. 1429-1433 CONTROL OF LOCOMOTION AND NAVIGATION: Holk Cruse, Thomas Kindermann, Michael Schumm, Jeffrey Dean, and Josef Schmitz: Walknet--A biologically inspired network to control six-legged walking pp. 1435-1447 Gennady S. Cymbalyuk, Roman M. Borisyuk, Uwe Mueller-Wilm, and Holk Cruse: Oscillatory network controlling six-legged locomotion: Optimization of model parameters pp.
1449-1460 Dario Floreano and Francesco Mondada: Evolutionary neurocontrollers for autonomous mobile robots pp. 1461-1478 Barbara Webb: Robots, crickets and ants: Models of neural control of chemotaxis and phonotaxis pp. 1479-1496 Terry Huntsberger and John Rose: BISMARC: A biologically inspired system for map-based autonomous rover control pp. 1497-1510 John J. Weng and Shaoyun Chen: Vision-guided navigation using SHOSLIF pp. 1511-1529 Paul F.M.J. Verschure and Thomas Voegtlin: A bottom up approach towards the acquisition and expression of sequential representations applied to a behaving real-world device: Distributed adaptive control III pp. 1531-1549 Christian Scheier, Rolf Pfeifer, and Yasuo Kuniyoshi: Embedded neural networks: Exploiting constraints pp. 1551-1569 From jose at tractatus.rutgers.edu Sun Nov 8 12:54:36 1998 From: jose at tractatus.rutgers.edu (Stephen Jose Hanson) Date: Sun, 08 Nov 1998 12:54:36 -0500 Subject: RUTGERS DIGITAL LIBRARIES--GRADUATE AWARDS Message-ID: <3645DADB.5A1191EC@tractatus.rutgers.edu> [ Moderator's note: Steve Hanson informs me that people interested in doing neural network or connectionist modeling research are among those being sought for these graduate fellowships. The steering committee includes several people who are well-known in the machine learning community. -- Dave Touretzky, CONNECTIONISTS moderator ] GRADUATE AWARDS FOR INTERDISCIPLINARY STUDY IN DIGITAL LIBRARIES The Rutgers University Distributed Laboratory for Digital Libraries (RDLDL) is pleased to announce the availability of competitive graduate awards for interdisciplinary doctoral studies in digital libraries, including contributing technologies and relevant basic research. Digital libraries is an exciting new domain for research, and for society at large, and the RDLDL is taking a leading role in establishing an explicitly interdisciplinary approach to the variety of problems in this burgeoning area.
Current faculty members of the RDLDL come from the disciplines of cognitive science, computer science, library and information science and psychology, and students in the program are expected to participate in research and to take courses in two or more of the disciplines represented in the RDLDL. Successful candidates will participate in digital library-oriented research projects with several faculty members of the RDLDL, as well as being enrolled in a Ph.D. program in one of the disciplines associated with the RDLDL. They will also take part in an interdisciplinary seminar involving the faculty and all of the other graduate students associated with the RDLDL. The RDLDL Awards offer tuition remission and an annual stipend of $13,000, and the opportunity to achieve an interdisciplinary education for research in the interdisciplinary field of digital libraries. These awards are tenable for one year in the first instance, beginning in either January or September 1999, and are renewable. For further information, including the specific interests of the RDLDL faculty, and application materials, please see the RDLDL WWW site, http://diglib.rutgers.edu/RDLDL or contact any one of the members of the RDLDL Steering Committee, listed below. Steering Committee of the Rutgers Distributed Laboratory for Digital Libraries Nicholas J. Belkin, Department of Library and Information Science, New Brunswick Campus nick at belkin.rutgers.edu Benjamin M.
Bly, Department of Psychology, Newark Campus ben at psychology.rutgers.edu Sven Dickinson, Department of Computer Science, New Brunswick Campus sven at ruccs.rutgers.edu Stephen Hanson, Department of Psychology, Newark Campus jose at tractatus.rutgers.edu Haym Hirsh, Department of Computer Science, New Brunswick Campus hirsh at cs.rutgers.edu Paul Kantor, Department of Library and Information Science, New Brunswick Campus kantor at scils.rutgers.edu Zenon Pylyshyn, Rutgers Center for Cognitive Science, New Brunswick Campus zenon at ruccs.rutgers.edu To contact us by mail, please write to: Rutgers Distributed Laboratory for Digital Libraries c/o School of Communication, Information and Library Studies Rutgers University 4 Huntington Street New Brunswick, NJ 08901-1071, USA From thimm at idiap.ch Fri Nov 20 08:27:54 1998 From: thimm at idiap.ch (Georg Thimm) Date: Fri, 20 Nov 1998 14:27:54 +0100 Subject: Contents of Neurocomputing 22 (1998) Message-ID: <199811201327.OAA17337@rotondo.idiap.ch> Dear reader, Please find below a compilation of the contents for Neurocomputing and Scanning the Issue written by V. David Sánchez A. More information on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301 . The contents of this and other journals published by Elsevier are also distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect). Please feel free to redistribute this message. My apologies if this message is inappropriate for this mailing list; I would appreciate feedback. With kindest regards, Georg Thimm Dr. Georg Thimm Research scientist & WWW: http://www.idiap.ch/~thimm Current Events Editor of Neurocomputing Tel.: ++41 27 721 77 39 (Fax: 12) IDIAP / C.P. 592 / 1920 Martigny / Suisse E-mail: thimm at idiap.ch ******************************************************************************** Journal : NEUROCOMPUTING ISSN : 0925-2312 Vol./Iss.
: 22 / 1-3 The nonlinear PCA criterion in blind source separation: Relations with other approaches Karhunen, Juha pp.: 5-20 Blind separation of convolved mixtures in the frequency domain Smaragdis, Paris pp.: 21-34 Blind source separation using algorithmic information theory Pajunen, Petteri pp.: 35-48 Independent component analysis in the presence of Gaussian noise by maximizing joint likelihood Hyvärinen, Aapo pp.: 49-67 Learned parametric mixture based ICA algorithm Xu, Lei pp.: 69-80 Bayesian Kullback Ying--Yang dependence reduction theory Xu, Lei pp.: 81-111 Robust techniques for independent component analysis (ICA) with noisy data Cichocki, A. pp.: 113-129 Searching for a binary factorial code using the ICA framework Palmieri, Francesco pp.: 131-144 Constrained PCA techniques for the identification of common factors in data Charles, Darryl pp.: 145-156 A method of blind separation for convolved non-stationary signals Kawamoto, Mitsuru pp.: 157-171 Removing artifacts from electrocardiographic signals using independent components analysis Barros, Allan Kardec pp.: 173-186 From neural learning to independent components Oja, Erkki pp.: 187-199 A nonlinear model of the binaural cocktail party effect Girolami, Mark pp.: 201-215 ******************************************************************************** Neurocomputing 22 (1998) Scanning the issue In The nonlinear PCA criterion in blind source separation: Relations with other approaches J. Karhunen, P. Pajunen and E. Oja derive the nonlinear Principal Component Analysis (PCA) criterion in Blind Source Separation (BSS) in a form appropriate for comparison with other BSS and Independent Component Analysis (ICA) approaches. The choice of the optimal nonlinearity is explained. P. Smaragdis presents Blind separation of convolved mixtures in the frequency domain using information theoretic algorithms.
Improved efficiency and better convergence are accomplished by the proposed approach, which shows clear advantages when compared to its time-domain counterparts. The filter parameters in the frequency domain are orthogonal to each other, so one parameter can be updated without influencing the rest. P. Pajunen describes Blind source separation using algorithmic information theory. The algorithmic complexity of the sources and the mixing mapping is measured. Natural signals often meet the requirement of low complexity. An experiment consisting of separating correlated signals is discussed. A. Hyvärinen presents Independent component analysis in the presence of Gaussian noise by maximizing joint likelihood. In the presence of noise the relationship between observed data and the estimates of the independent components is non-linear. For supergaussian (sparse) data the nonlinearity can be approximated by a shrinkage operation and realized using competitive learning. For subgaussian components anti-competitive learning can be used. L. Xu, C.C. Cheung and S. Amari describe a Learned parametric mixture based ICA algorithm. It is based on a linear mixture and its separation capability is shown to be superior to the original model with prefixed nonlinearity. Experiments with subgaussian, supergaussian and combinations of these types of sources confirm the applicability of the algorithm. L. Xu presents the Bayesian Kullback Ying-Yang Dependence Reduction (BKYY-DR) theory. In particular the solution of the Blind Source Separation (BSS) problem is addressed. Algorithms and criteria for parameter learning and model selection are based on stochastic approximation. They are given for a general BKYY-DR system and its three typical architectures. Experiments with binary sources are reported. A. Cichocki, S.C. Douglas and S. Amari propose Robust techniques for independent component analysis (ICA) with noisy data.
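[Editor's illustration: the shrinkage operation mentioned in the Hyvärinen summary above amounts to soft-thresholding each estimated component. The sketch below is not code from the paper; the function name, the Laplacian-prior threshold formula, and all parameter values are illustrative assumptions.]

```python
import numpy as np

def shrink(y, noise_var, prior_scale):
    # Soft-thresholding: an approximate MAP estimate of a sparse
    # (Laplacian-distributed) component observed in additive Gaussian
    # noise. Values smaller in magnitude than the threshold are set to
    # zero; larger values are pulled toward zero by the threshold amount.
    threshold = np.sqrt(2.0) * noise_var / prior_scale
    return np.sign(y) * np.maximum(np.abs(y) - threshold, 0.0)

# Denoise a sparse signal contaminated by Gaussian noise.
rng = np.random.default_rng(0)
clean = np.where(rng.random(1000) < 0.1, rng.laplace(0.0, 1.0, 1000), 0.0)
noisy = clean + rng.normal(0.0, 0.3, 1000)
denoised = shrink(noisy, noise_var=0.3 ** 2, prior_scale=1.0)
```

Because most samples of a sparse component are near zero, zeroing sub-threshold values removes noise there at the cost of a small bias on the large values, so the mean-squared error of `denoised` is lower than that of `noisy`.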
A recurrent dynamic neural network architecture is introduced for simultaneous unbiased estimation of the unknown mixing matrix, blind source separation and noise reduction in the extracted output signals. The shape parameters of the nonlinearities are adjusted using gradient-based rules. F. Palmieri, A. Budillon, M. Calabrese and D. Mattera present Searching for a binary factorial code using the ICA framework. Independent Component Analysis (ICA) is formulated as density search and used for finding a mapping of the input space into a binary string with independent bits, i.e. a binary factorial code. Experiments with a real image show the feasibility of the approach, whose results are compared with those of a Principal Component Analyzer (PCA). D. Charles describes Constrained PCA techniques for the identification of common factors in data. An unsupervised learning network is presented that operates similarly to Principal Factor Analysis. The network responds to the covariance of the input data. Extensions include preprocessing and a function that supports sparseness at the network output. M. Kawamoto, K. Matsuoka and N. Ohnishi present A method of blind separation for convolved non-stationary signals. The method extracts non-stationary signals from their convolutive mixtures. The minimization of Matsuoka's cost function is performed. Simulations confirm the feasibility of the method. The cases where the number of observed signals is greater than or equal to the number of source signals are analyzed. A.K. Barros, A. Mansour and N. Ohnishi describe Removing artifacts from ECG signals using independent component analysis. An architecture is proposed consisting of a high-pass filter, a two-layer network based on the Independent Component Analysis (ICA) algorithm, and a self-adaptive step-size. Simulations using a standard ECG database are performed. E. Oja presents From neural learning to independent components.
Emphasis is placed on the connection between Independent Component Analysis (ICA) and neural learning, especially constrained Hebbian learning. The latter is a non-linear extension of Principal Component Analysis (PCA). Results are given on stationary points and their asymptotic stability. M. Girolami describes A nonlinear model of the binaural cocktail party effect. The network is based on a temporal anti-Hebbian adaptive maximum likelihood estimator and its operation is similar to many approaches for blind source separation of convolutive mixtures. Impressive results are obtained in simulations. Favorable results are obtained under realistic unconstrained acoustic conditions when compared with current adaptive filters. I appreciate the cooperation of all those who submitted their work for inclusion in this issue. V. David Sánchez A. Editor-in-Chief From cf99 at stern.nyu.edu Mon Nov 23 01:40:03 1998 From: cf99 at stern.nyu.edu (Andreas Weigend - Computational Finance 99) Date: Mon, 23 Nov 1998 01:40:03 -0500 (EST) Subject: Computational Finance CF99 Program and Registration Message-ID: <199811230640.BAA01787@kuaile.stern.nyu.edu> A non-text attachment was scrubbed... Name: not available Type: text Size: 26639 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/51e48358/attachment-0001.ksh From harnad at coglit.soton.ac.uk Mon Nov 23 14:59:16 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Mon, 23 Nov 1998 19:59:16 +0000 (GMT) Subject: Color & Consciousness: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article (see also 4 important announcements about new BBS policies at the end of this message) COLOR, CONSCIOUSNESS, AND THE ISOMORPHISM CONSTRAINT by Stephen E.
Palmer This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. _____________________________________________________________ COLOR, CONSCIOUSNESS, AND THE ISOMORPHISM CONSTRAINT Stephen E. Palmer Psychology Department University of California Berkeley, CA 94720-1650 palmer at cogsci.berkeley.edu http://socrates.berkeley.edu/~plab ABSTRACT: The relations among consciousness, brain, behavior, and scientific explanation are explored within the domain of color perception.
Current scientific knowledge about color similarity, color composition, dimensional structure, unique colors, and color categories is used to assess Locke's "inverted spectrum argument" about the undetectability of color transformations. A symmetry analysis of color space shows that the literal interpretation of this argument -- reversing the experience of a rainbow -- would not work. Three other color-to-color transformations might, however, depending on the relevance of certain color categories. The approach is then generalized to examine behavioral detection of arbitrary differences in color experiences, leading to the formulation of a principled distinction, called the isomorphism constraint, between what can and cannot be determined about the nature of color experience by objective behavioral means. Finally, the prospects for achieving a biologically based explanation of color experience below the level of isomorphism are considered in light of the limitations of behavioral methods. Within-subject designs using biological interventions hold the greatest promise for scientific progress on consciousness, but objective knowledge of another person's experience appears impossible. The implications of these arguments for functionalism are discussed. In this article I discuss the relations among mind, brain, behavior, and science in the particular domain of color perception. My reasons for approaching these difficult issues from the perspective of color experience are two-fold. First, there is a long philosophical tradition of debating the nature of internal experiences of color, dating from John Locke's (1690) discussion of the so-called "inverted spectrum argument". This intuitively compelling argument constitutes an important historical backdrop for much of the article. Second, color is perhaps the most tractable, best understood aspect of mental life from a scientific standpoint.
It demonstrates better than any other topic how a mental phenomenon can be more fully understood by integrating knowledge from many different disciplines (Kay & McDaniel, 1978; Thompson, 1995; Palmer, in press). In this article I turn once more to color for new insights into how conscious experience can be studied and understood scientifically. I begin with a brief description of the inverted spectrum problem as posed in classical philosophical terms. I then discuss how empirical constraints on the answer can be brought to bear in terms of the structure of human color experience as it is currently understood scientifically. This discussion ultimately leads to a principled distinction, called the isomorphism constraint, between what can and what cannot be determined about the nature of experience by objective behavioral means. Finally, I consider the prospects for achieving a biologically based explanation of color experience, ending with some speculations about limitations on what science can achieve with respect to understanding color experience and other forms of consciousness. ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
The URLs you can use to get to the BBS Archive:

http://www.princeton.edu/~harnad/bbs/
http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.palmer.html
ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.palmer
ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.palmer

To retrieve a file by ftp from an Internet site, type either:
ftp ftp.princeton.edu
or
ftp 128.112.128.1

When you are asked for your login, type:
anonymous

Enter password as queried (your password is your actual userid:
yourlogin at yourhost.whatever.whatever - be sure to include the "@")

cd /pub/harnad/BBS

To show the available files, type:
ls

Next, retrieve the file you want with (for example):
get bbs.palmer

When you have the file(s) you want, type:
quit
____________________________________________________________

FOUR IMPORTANT ANNOUNCEMENTS

------------------------------------------------------------------

(1) There have been some extremely important developments in the area of Web archiving of scientific papers very recently. Please see:

Science: http://www.cogsci.soton.ac.uk/~harnad/science.html
Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html
http://www.cogsci.soton.ac.uk/~harnad/nature2.html
American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html
Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm

---------------------------------------------------------------------

(2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is exceedingly simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone.

---------------------------------------------------------------------

(3) BBS has a new policy of accepting submissions electronically.
Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ Paper/Disk: Behavioral and Brain Sciences Department of Electronics and Computer Science New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html Nominations of books for BBS Multiple book review are also invited. From black at signal.dera.gov.uk Tue Nov 24 05:57:21 1998 From: black at signal.dera.gov.uk (John V. Black) Date: Tue, 24 Nov 98 10:57:21 +0000 Subject: IEE Colloquium on Applied Statistical Pattern Recognition Message-ID: IEE COLLOQUIUM ON APPLIED STATISTICAL PATTERN RECOGNITION Tuesday, 20 April 1999 to be held at The Midlands Engineering Centre, Birmingham, UNITED KINGDOM Professional Group E5 (Signal processing) and the Pattern & Information Processing Group, DERA Malvern, United Kingdom The innovative application of advanced pattern recognition techniques is increasingly required to achieve the high levels of performance demanded by modern systems.
It is therefore vital for engineers and scientists to develop their awareness and understanding of emerging and established techniques, and of their successful application. This colloquium will provide a focus for applied research in this area, drawing contributions both from industry and academia. It will increase awareness of developing techniques across a broad range of application areas, and will be of interest to people working in all areas of statistical pattern recognition, addressing applications drawn (for example) from signal and image processing. A number of invited papers will be presented by leading researchers: some of these will take the form of illustrative tutorial sessions. Additional contributions are welcome, and a one-page synopsis of proposed papers should be sent by Friday, 15 January 1999 to: Dr Amanda Goode Senior Research Scientist, Pattern & Information Processing DERA Malvern St Andrews Road Great Malvern Worcestershire WR14 3PS United Kingdom Tel: (+44) (0)1684 894366 Fax: (+44) (0)1684 894384 Email: goode at signal.dera.gov.uk From terry at salk.edu Wed Nov 25 02:54:38 1998 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 24 Nov 1998 23:54:38 -0800 (PST) Subject: NEURAL COMPUTATION 11:1 Message-ID: <199811250754.XAA21965@helmholtz.salk.edu> Neural Computation - Contents - Volume 11, Number 1 - January 1, 1999 *** A list of all reviewers for Vols 5-10 will be included in this 10th anniversary issue. REVIEW Evolution of Time Coding Systems C. E. Carr and M. A. Friedman ARTICLE Computing With Self-Excitatory Cliques: A Model and an Application to Hyperacuity-Scale Computation in Visual Cortex Steven Zucker and Douglas Miller NOTES Complex Response to Periodic Inhibition In Simple and Detailed Neuronal Models Corrado Bernasconi, Kaspar Schindler, Ruedi Stoop and Rodney Douglas Neuronal Tuning: To Sharpen Or Broaden? Kechen Zhang and Terrence J. Sejnowski Narrow vs Wide Tuning Curves: What's Best for a Population Code?
Alexandre Pouget, Sophie Deneve, Jean-Christophe Ducom, and Peter Latham LETTERS The Effect of Correlated Variability on the Accuracy of a Population Code L. F. Abbott and Peter Dayan A Neural Network Model of Temporal Code Generation and Position Invariant Pattern Recognition Dean V. Buonomano and Michael Merzenich Probabilistic Synaptic Transmission in the Associative Net Bruce Graham and David Willshaw A Stochastic Self-Organizing Map for Proximity Data Thore Graepel and Klaus Obermayer High-Order Contrasts for Independent Component Analysis Jean-Francois Cardoso Variational Learning in Nonlinear Gaussian Belief Networks Brendan J. Frey and Geoffrey E. Hinton Propagating Distributions Up Directed Acyclic Graphs Eric B. Baum and Warren D. Smith Modeling and Prediction of Human Behavior Alex Pentland and Andrew Liu Analog VLSI-Based Modeling of the Primate Oculomotor System Timothy K. Horiuchi and Christof Koch JPEG Quality Transcoding Using Neural Networks Trained With a Perceptual Error Measure John Lazzaro and John Wawrzynek ----- ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 1999 - VOLUME 11 - 8 ISSUES

                  USA     Canada*   Other Countries
Student/Retired   $50     $53.50    $84
Individual        $82     $87.74    $116
Institution       $302    $323.14   $336

* includes 7% GST

(Back issues from Volumes 1-10 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From horn at neuron.tau.ac.il Wed Nov 25 11:11:03 1998 From: horn at neuron.tau.ac.il (David Horn) Date: Wed, 25 Nov 1998 18:11:03 +0200 (IST) Subject: NCST-99 Message-ID: Conference Announcement ------------------------- Neural Computation in Science and Technology --------------------------------------------- Israel, October 10-13, 1999 This conference, sponsored by the Israel Ministry of Science, aims to bring together scientists and practitioners of neural computation who are interested in scientific as well as technological applications. It will focus on two main general topics: A. The use of neural network techniques in various applications in science and technology. B. Modeling in computational neuroscience. The conference's dual focus on neuroscience and technological themes follows one of the fundamental ideas of the field of Computational Neuroscience, i.e., the belief that such cross-disciplinary interactions will be beneficial to researchers of both schools. Such cross-fertilization of ideas has already manifested itself in both directions over the last decade: On one hand, thinking about the brain in computational concepts has given rise to the development of artificial neural networks as leading computing devices for performing a multitude of technological tasks, such as classification, function approximation, prediction and control. On the other hand, basic engineering and technological methods such as principal component analysis, blind separation, learning vector quantization and reinforcement learning have all led to the development of neurally-based realizations. The latter, in turn, have given rise to many interesting ideas about the possible central role that such networks may play in the brain in a multitude of information processing tasks, varying from odor processing to reward-related action selection to self-organization of cortical maps, just to name a few. 
Moreover, new frontiers of neural computation such as neuromorphic engineering and adaptive autonomous control manifest even further the growing need for technological-scientific interaction, to fully exploit the potential promise of neural computation. The conference will consist of a four-day meeting that will include tutorials, a series of invited and contributed talks, a poster session, and a discussion panel. An informal atmosphere will be maintained, encouraging questions and discussions. It will take place in a hotel that also serves as a conference center in Kibbutz Maale Hachamisha, located between Tel Aviv and Jerusalem, which offers the pleasant scenery and weather of the Judean hills. Currently Confirmed Invited Speakers: -------------------------------------- D. Amit, Y. Baram, W. Bialek, E. Domany, R. Douglas, G. Dreyfus, D. Golomb, M. Hasselmo, J. Hertz, D. Horn, N. Intrator, I. Kanter, W. Kinzel, H. Oja, E. Ruppin, I. Segev, T. Sejnowski, H. Sompolinsky, N. Tishby, M. Tsodyks and V. Vapnik. Call for Papers ------------------ We call herewith for the submission of papers for oral or poster presentation at the conference. The papers should fit into the two general categories of the conference. Submitted papers should be limited to seven pages in length and should indicate the physical and e-mail addresses of all authors and the author to whom correspondence should be addressed. Four copies of the submitted papers should reach the conference scientific committee by May 15th, 1999. Mail submissions to Prof. David Horn, NCST-99, School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978, Israel. Scientific Organizing Committee --------------------------------- D. Horn (chairman), H. Abramowicz, A. Agranat, Y. Baram, I. Kanter, E. Ruppin, N. Tishby, M. Tsodyks, G. Ariely and B. Shimony.
Information and Registration: ----------------------------- Registration forms and information about the conference are available at the website http://neuron.tau.ac.il/NCST-99 ---------------------------------------------------------------------------- Prof. David Horn horn at neuron.tau.ac.il School of Physics and Astronomy http://neuron.tau.ac.il/~horn Tel Aviv University Tel: ++972-3-642-9305, 640-7377 Tel Aviv 69978, Israel. Fax: ++972-3-640-7932 From gorr at willamette.edu Wed Nov 25 13:21:42 1998 From: gorr at willamette.edu (Jenny Orr) Date: Wed, 25 Nov 1998 10:21:42 -0800 (PST) Subject: Book Announcement: Tricks of the Trade Message-ID: <199811251821.KAA16692@mirror.willamette.edu> Dear colleagues, The following book is now available: Neural Networks: Tricks of the Trade, edited by Genevieve B. Orr and Klaus-Robert Müller, LNCS 1524, Springer Heidelberg (1998) http://www.springer.de/comp/lncs/neuralnw/index.html (contains ordering and other information) Some background information: The idea for this book dates back to our 1996 NIPS workshop on tricks of the trade (http://www.first.gmd.de/persons/Mueller.Klaus-Robert/nipsws.htm). People who work with neural networks acquire, through experience and word-of-mouth, techniques and heuristics that help them successfully apply neural networks to difficult real-world problems. Often these tricks are theoretically well motivated. Sometimes they are the result of trial and error. However, their most common link is that they are usually hidden in people's heads or in the back pages of space-constrained conference papers. As a result, newcomers to the field waste much time wondering why their networks train so slowly and perform so poorly. The tricks book collects this interesting and useful information about how to train neural networks.
For a glimpse of the table of contents see: http://www.springer.de/comp/lncs/neuralnw/contents.pdf Contributors are: Anderson, Back, Bottou, Burns, Caruana, Denker, Finke, Flake, Fritsch, Giles, Hansen, Hirzinger, Horn, Intrator, Larsen, Lawrence, LeCun, Lyon, Müller, Moody, Naftaly, Neuneier, Orr, Plate, Prechelt, Rögnvaldsson, Schraudolph, Simard, van der Smagt, Svarer, Tsoi, Victorri, Webb, Yaeger, Zimmermann. Best wishes, Klaus-Robert Müller & Genevieve B. Orr From shultz at psych.mcgill.ca Wed Nov 25 13:57:18 1998 From: shultz at psych.mcgill.ca (Tom Shultz) Date: Wed, 25 Nov 1998 13:57:18 -0500 Subject: Academic position at McGill Message-ID: <3.0.2.32.19981125135718.007a2430@psych.mcgill.ca> Dear Connectionists, Here is a notice for an academic position at McGill University that may interest you or your colleagues or students. Please feel free to circulate this notice. MCGILL UNIVERSITY DEPARTMENT OF PSYCHOLOGY The Department of Psychology of McGill University seeks applicants for a tenure-track position at the Assistant Professor level in Cognitive Psychology, broadly construed. We seek applicants with a strong program of research and teaching in areas such as cognitive psychology, computational modeling, decision-making, and cognitive neuroscience. The Department has excellent facilities for interdisciplinary research through its links with McGill Cognitive Science, the Montreal Neurological Institute, and related departments. The deadline for receipt of completed applications is February 1, 1999, with an anticipated starting date of September 1, 1999. Applicants should arrange for three confidential letters of recommendation to be sent to the address below. Statements of current and proposed areas of research and teaching, curriculum vitae, selected reprints, and other relevant material should also be sent to: Thomas Shultz, Chair, Cognitive Psychology Search Committee, Department of Psychology, McGill University, 1205 Dr.
Penfield Avenue, Montreal, Quebec, Canada H3A 1B1. In accordance with Canadian immigration requirements, priority will be given to Canadian citizens and permanent residents of Canada. McGill University is committed to equity in employment. Although priority is given to Canadian citizens and permanent residents of Canada, final decisions will be based on the relative excellence of candidates. Therefore, non-Canadians should seriously consider applying if they are interested in the position. Neural network modelers with an interest in cognition and neuroscience are particularly encouraged to apply. Cheers, Tom -------------------------------------------------------- Thomas R. Shultz, Professor, Department of Psychology, McGill University, 1205 Penfield Ave., Montreal, Quebec, Canada H3A 1B1. E-mail: shultz at psych.mcgill.ca http://www.psych.mcgill.ca/labs/lnsc/html/Lab-Home.html Phone: 514 398-6139 Fax: 514 398-4896 -------------------------------------------------------- From sd136 at umail.umd.edu Wed Nov 25 11:45:46 1998 From: sd136 at umail.umd.edu (Sandy Davis) Date: Wed, 25 Nov 1998 11:45:46 -0500 (Eastern Standard Time) Subject: Computational neuroscientist faculty position Message-ID: Computational Neuroscientist. The Neuroscience and Cognitive Science Program at the Univ. of Maryland (www.life.umd.edu/nacs) seeks tenure-track faculty. Areas of research include sensory and motor physiology, analysis of control systems or cognitive neuroscience. A wide range of techniques can be supported, from neural network modeling to brain mapping/imaging/EEG. Candidates who integrate theoretical with experimental research are preferred. Tenure will be held in the Depts. of Psychology or Kinesiology. Teaching duties include a graduate course in computational neuroscience and undergraduate course(s). For appointment in Psychology, the rank will be Assistant Professor. For Kinesiology, rank is open. For best consideration send, before Feb. 
1, 1999, a CV, addresses (including emails) of three references, copies of publications, and statements of both research (documenting any previous extramural funding) and teaching interests to: Dr. Richard Payne, NACS Search, 2239 Zoo/Psych Building, University of Maryland, College Park, MD 20742. Women and minorities are strongly encouraged to apply. The University of Maryland is an Equal Opportunity/Affirmative Action Employer. Sandy Davis, Graduate Secretary Neuroscience and Cognitive Science Program 2239 Zoo/Psych Building Univ. of Maryland, College Park, MD 20742-4415 phone: (301) 405-8910 fax: (301) 314-9358 email: sd136 at umail.umd.edu From Dave_Touretzky at cs.cmu.edu Thu Nov 26 02:07:22 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Thu, 26 Nov 1998 02:07:22 -0500 Subject: CNS*99 CALL FOR PAPERS and CNS*99 REGISTRATION FORM Message-ID: <23440.912064042@skinner.boltz.cs.cmu.edu> CALL FOR PAPERS Eighth Annual Computational Neuroscience Meeting CNS*99 July 18 - 22, 1999, Pittsburgh, Pennsylvania Co-sponsored by the Center for the Neural Basis of Cognition at Carnegie Mellon University and the University of Pittsburgh DEADLINE FOR SUMMARIES AND ABSTRACTS: **>> 11:59pm January 26, 1999 <<** This is the eighth annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. Peer-reviewed papers, all related to understanding how nervous systems compute, are presented. As in previous years, CNS*99 will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation.
The meeting in 1999 will take place at the Pittsburgh Hilton and Towers Hotel in downtown Pittsburgh, Pennsylvania. The first session starts at 9 am, Sunday, July 18th and ends with the annual banquet (and river cruise) on Thursday evening, July 22nd. There will be no parallel sessions. The meeting will include time for informal workshops focused on current issues in computational neuroscience. Pittsburgh International Airport is a prime hub for US Airways and enjoys direct flights from most major US cities. The conference hotel is located at Point State Park downtown, where the Allegheny and Monongahela rivers meet to form the Ohio. Having undergone a major renaissance in the 1980s, Pittsburgh is a high-tech town that still honors its blue-collar roots. You'll find clubs, restaurants, and cultural facilities comparable to most east coast US cities, but also some attractions unique to Pittsburgh's heritage as the "Steel City". SUBMISSION INSTRUCTIONS With this announcement we solicit the submission of papers for presentation, all of which will be refereed. Peer review will be conducted based on a 1000-word (or less) summary describing the methods, nature, and importance of your results. Authors will be notified of acceptance by the second week of May, 1999. As with last year's meeting, all submission of papers will be performed electronically using a custom-designed JAVA/HTML interface. Full instructions for submission can be found at the meeting web site: http://numedeon.com/cns-meetings/CNS99/. In brief, authors should cut and paste text from their own word processors into the forms available on the web site. It is important that all requested information be provided, including a 100-word abstract for publication in the conference program, all author information, and selection of the appropriate category and theme from the list provided. Authors should especially note the mechanisms used for figures and mathematical equations.
All submissions will be acknowledged immediately by email. Program committee decisions will be sent to the designated correspondence author only. Submissions will not be considered if they lack category information, abstracts, or author addresses, or if they arrive late. FURTHER MEETING CORRESPONDENCE We would like to strongly encourage authors to make their submissions electronically. However, if you do not have ready access to the internet or a web server, we will send you instructions for paper submission if you contact us either by email or at the following address: CNS*99 Division of Biology 216-76 Caltech Pasadena, CA 91125 ADDITIONAL INFORMATION concerning the meeting, including hotel and travel arrangements or information on past meetings can be obtained by: o Using our on-line WWW information and registration server, URL of: http://numedeon.com/cns-meetings/CNS99/ o Sending Email to: cns99 at bbb.caltech.edu CNS*99 ORGANIZING COMMITTEE: Co-meeting Chair / Logistics - David Touretzky, Carnegie Mellon University Co-meeting Chair / Finances and Program - Jim Bower, Caltech Governmental Liaison - Dennis Glanzman, NIMH/NIH Workshop Organizer - To be Announced 1999 Program Committee: Erik De Schutter, University of Antwerp, Belgium Anders Lansner, Royal Institute of Technology, Sweden Chip Levy, University of Virginia Ray Glanz, Rice University David Touretzky, Carnegie Mellon University Gina Turrigiano, Brandeis University Ranu Jung, University of Kentucky Simon Thorpe, CNRS, Toulouse, France 1999 Regional Organizers: Europe- Erik DeSchutter (Belgium) Middle East - Idan Segev (Jerusalem) Down Under - Mike Paulin (New Zealand) South America - Marcelo and Marilene Mazza (Brazil) Asia - Zhaoping Li (London) India - Upinder Bhalla (Bangalore) XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX CNS*99 REGISTRATION FORM Last Name: First Name: Position: Student___ Graduate Student___ Post Doc___ Professor___ Committee Member___ Other___ Organization: Address: City: 
State: Zip: Country: Telephone: Email Address: REGISTRATION FEES: Technical Program --July 18 - July 22, 1999 Regular $275 ($300 after July 3rd) Price Includes One Banquet/Cruise Ticket Student $125 ($150 after July 3rd) Price Includes One Banquet/Cruise Ticket Each Additional Banquet/Cruise Ticket $35 Total Payment: $ Please Indicate Method of Payment: * Check or Money Order payable in U. S. Dollars to CNS*99 - Caltech (Please make sure to indicate CNS*99 and YOUR name on all money transfers.) * Charge my credit card. Cards the Caltech Ticket Office accepts are: Visa, Mastercard, or American Express Card number: Expiration Date: Name of Cardholder: Signature as appears on card (for mailed in applications): Date: ADDITIONAL QUESTIONS: Previously Attended: CNS*92___ CNS*93___ CNS*94___ CNS*95___ CNS*96___ CNS*97___ CNS*98___ Did you submit an abstract and summary? ( ) Yes ( ) No Title: Do you have special dietary preferences or restrictions (e.g., diabetic, low sodium, kosher, vegetarian)? If so, please note: Some grants to cover partial travel expenses may become available for students and postdoctoral fellows who present papers at the meeting. Do you wish further information? ( ) Yes ( ) No PLEASE CALL PITTSBURGH HILTON AND TOWERS TO MAKE HOTEL RESERVATIONS AT (412) 391-4600 Fax (412) 594-5144 PLEASE FAX OR MAIL REGISTRATION FORM TO: Caltech, Division of Biology 216-76, Pasadena, CA 91125 Attn: Judy Macias Fax Number: (626) 795-2088 (Refund Policy: 50% refund for cancellations on or before July 9; no refund after July 10th) From srx014 at coventry.ac.uk Fri Nov 27 11:24:53 1998 From: srx014 at coventry.ac.uk (Colin Reeves) Date: Fri, 27 Nov 1998 16:24:53 +0000 (GMT) Subject: ICANNGA99 Message-ID: Reminder: 4th International Conference on Artificial Neural Networks and Genetic Algorithms, to be held on 6th-9th April, 1999 at Portoroz, Slovenia, hosted by the University of Ljubljana. 
Plenary talks by Kevin Warwick, Hugo De Garis with Marco Tomassini and Michael Korkin, and Ingo Rechenberg. Tutorial sessions are planned on 6th April on approximations of multi-variable functions by neural networks (Vera Kurkova), Multi-objective optimisation using GAs (Kalyan Deb), Alternative viewpoints on GAs (Colin Reeves), Neural networks and Fuzzy Logic (Nigel Steele). Full programme of accepted talks plus a visit to the world-famous Postojna cave and Lipica stud...also an optional post-conference visit to Venice by sea. For latest details and registration information see http://cherry.fri.uni-lj.si/icannga99.html or send an email to icannga at fri.uni-lj.si
From Annette_Burton at Brown.edu Mon Nov 30 08:17:36 1998 From: Annette_Burton at Brown.edu (Annette Burton) Date: Mon, 30 Nov 1998 09:17:36 -0400 Subject: Program and Job Announcement Message-ID: Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY GRADUATE TRAINING PROGRAM in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES Deadline for Applications: January 1, 1999 Brown University is actively recruiting graduate students for a new NSF-supported Interdisciplinary Graduate Education, Research and Training (IGERT) program in "Learning and Action in the Face of Uncertainty: Cognitive, Computational and Statistical Approaches". The use of probabilistic models and statistical methods has had a major impact on our understanding of language, vision, action, and reasoning. This training program provides students with the opportunity to integrate a detailed study of human or artificial systems for language acquisition and use, visual processing, action, and reasoning with appropriate mathematical and computational models. Students will be enrolled in one of the three participating departments (Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science) and will study an interdisciplinary program of courses in topics such as statistical estimation, cognitive processes, linguistics, and computational models. The aim of this program is to provide promising students with a mix of mathematical, computational and experimental expertise to carry out multidisciplinary collaborative research across the disciplines of Applied Mathematics, Computer Science, and Cognitive Science. Interested students should apply to the participating department closest to their area of interest and expertise, and should indicate their interest in the IGERT training program in their application.
Brown University is an Equal Opportunity/Affirmative Action Employer. For additional information about the program, application procedures, and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT or contact Dr. Julie Sedivy Department of Cognitive & Linguistic Sciences Brown University, Box 1978 Providence, RI 02912 USA Julie_Sedivy at brown.edu ___________________________________________________________________ Brown University's Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY POSTDOCTORAL OPPORTUNITY in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES As part of an NSF award to Brown University through the IGERT program, the Departments of Cognitive and Linguistic Sciences, Computer Science, and Applied Mathematics will be hiring two Postdoctoral Research Associates. Fellows will be scholars who have displayed significant interest and ability in conducting collaborative interdisciplinary research in one or more of the research areas of the program: computational and empirical approaches to uncertainty in language, vision, action, or human reasoning. In addition to participating in collaborative research, responsibilities will include helping to coordinate cross-departmental graduate teaching and research, as well as some teaching of interdisciplinary graduate courses. We expect that these fellows will play an important role in creating a highly visible presence for the IGERT program at Brown, and their interdisciplinary activities will help unify the interdepartmental activities of the IGERT program. Applicants must hold a PhD in Cognitive Science, Linguistics, Computer Science, Mathematics, Applied Mathematics, or a related discipline, or show evidence that the PhD will be completed before the start of the position.
Applicants should send a vita and three letters of reference to Steven Sloman, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, RI 02912. Special consideration will be given to those applicants whose research is relevant to at least two of the participating departments. The positions will begin September 1, 1999 for one year, renewable upon satisfactory completion of duties in the first year. Salaries will be between $35,000 and $45,000 per year. All materials must be received by Jan. 1, 1999, for full consideration. Brown University is an Equal Opportunity/Affirmative Action Employer. For additional information about the program and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT