From wiskott at itb.biologie.hu-berlin.de Tue Aug 1 03:59:21 2000 From: wiskott at itb.biologie.hu-berlin.de (Laurenz Wiskott) Date: Tue, 1 Aug 2000 09:59:21 +0200 Subject: Jobs for PhD-Stud. and Postdocs in Berlin Message-ID: <200008010759.JAA26017@huxley.biologie.hu-berlin.de> Dear Connectionists, could you please forward this job advertisement to students and postdocs who might be interested. Thanks, Laurenz Wiskott. ___________________________________________________________________________ Open Positions for PhD-Students and Postdocs in Neuroinformatics at the Innovationskolleg Theoretische Biologie, Berlin ___________________________________________________________________________ Institute: Innovationskolleg Theoretische Biologie Humboldt-Universität zu Berlin Invalidenstraße 43 D-10115 Berlin, Germany The Innovationskolleg Theoretische Biologie is a young and dynamic lab with three full professors and about 30 students and researchers doing interdisciplinary and innovative research in different areas of theoretical biology, such as Molecular and Cellular Evolution (Prof. Hanspeter Herzel), Evolution of Organismic Systems (Prof. Peter Hammerstein), Computational Neurobiology (Prof. Andreas V.M. Herz), and Neuroinformatics (Dr. Laurenz Wiskott). Research group: The positions are available in the junior research group led by Dr. Laurenz Wiskott and funded by the Volkswagen Foundation. Research topics: Learning Invariances from Sensory Data: Biological Principles and Technical Applications. The central issue addressed by the junior research group is the question of how invariant object perception can be achieved and learned in sensory systems, especially in the primate visual system. Extensive computer simulations and analytical considerations are intended to demonstrate that invariances can be learned based on visual experience. On the biological side, the existing learning rules have to be transferred to models of spiking neurons. On the technical side, the existing system is to be extended to the processing of video sequences and technical data. Time: The positions will be available beginning November 2000. The initial appointment will be for 1 year, with a possible extension of an additional 2 to 4 years. Postdocs may also apply for part-time employment. Requirements: Candidates should have a degree in physics, electrical engineering, computer science, biology, or a related field. Required are strong mathematical and programming skills, as well as the ability to communicate and work well in a team. Salary: Salary will be up to BAT IIa and will depend on qualification, experience, age, and family status. BAT is the regular salary scale for public employees in Germany. Inquiries: Informal inquiries can be addressed to Dr. Laurenz Wiskott (wiskott at itb.biologie.hu-berlin.de). Application: Applications, including a detailed CV, a statement of research interests, and the names and addresses of two professional referees, should be sent to Dr. Laurenz Wiskott at the address given above. Please send only copies and not original documents, since the applications will not be sent back. Applications via e-mail, possibly including a reference to a well-maintained web page, are equally acceptable. Deadline: None. Applications will be accepted until the positions have been filled, as will be indicated on the web page of this job advertisement; see below. PS: Could you please indicate in your application where you first read about this job advertisement. WWW: http://itb.biologie.hu-berlin.de/~wiskott/jobs.html.
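To make "learning invariances from sensory data" concrete for readers outside the field, below is a minimal Python sketch of one such principle: temporal slowness, i.e. preferring output features that vary slowly over time while the raw input varies quickly. This is an illustration of the general idea only, not the group's actual learning rules; all signals, dimensions, and names are invented for the example.

import numpy as np

rng = np.random.default_rng(0)

# Toy sensory stream: one slowly varying latent cause mixed with two
# fast-varying nuisance signals plus noise (all invented for this sketch).
t = np.linspace(0, 8 * np.pi, 4000)
slow = np.sin(t)
fast = np.column_stack([np.sin(23 * t), np.cos(31 * t)])
sources = np.column_stack([slow, fast])
X = sources @ rng.normal(size=(3, 5)) + 0.05 * rng.normal(size=(4000, 5))

# Whiten the observations (zero mean, identity covariance).
Xc = X - X.mean(axis=0)
d, E = np.linalg.eigh(Xc.T @ Xc / len(Xc))
Z = Xc @ E / np.sqrt(d)

# Slowness principle: among unit-variance projections of the input,
# pick the one whose temporal derivative has the least power.
dZ = np.diff(Z, axis=0)
_, W = np.linalg.eigh(dZ.T @ dZ / len(dZ))
y = Z @ W[:, 0]  # slowest extracted feature

# The extracted feature should track the slow latent cause (up to sign).
print("correlation with the slow cause:", abs(np.corrcoef(y, slow)[0, 1]))

Run as-is, the recovered feature correlates almost perfectly with the slow latent cause even though no single input channel does: the slowness objective alone picks out the stable underlying variable, which is the flavor of invariance learning the advertisement describes.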
From skremer at q.cis.uoguelph.ca Tue Aug 1 23:41:33 2000 From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer) Date: Tue, 1 Aug 2000 23:41:33 -0400 (EDT) Subject: No subject Message-ID: Dear Connectionists: We are pleased to announce the official start of the NIPS 2000 Unlabeled Data Supervised Learning Competition! This competition is designed to compare algorithms and architectures that use unlabeled data to help supervised learning. The first four datasets, and more details, are now available at the competition web-site: http://q.cis.uoguelph.ca/~skremer/NIPS2000/ May the best algorithm win! Stefan, Deb, and Kristin -- -- Dr. Stefan C. Kremer, Assistant Prof., Dept. of Computing and Information Science University of Guelph, Guelph, Ontario N1G 2W1 WWW: http://hebb.cis.uoguelph.ca/~skremer Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323 E-mail: skremer at snowhite.cis.uoguelph.ca From herbert.jaeger at gmd.de Thu Aug 3 13:33:47 2000 From: herbert.jaeger at gmd.de (Herbert Jaeger) Date: Thu, 03 Aug 2000 19:33:47 +0200 Subject: PhD and PostDoc positions Message-ID: <3989ACFB.61243CE2@gmd.de> DOCTORATE AND POSTDOCTORATE POSITIONS FOR RESEARCH ON BIOLOGICALLY ORIENTED INFORMATION PROCESSING New doctorate and postdoc research positions are open immediately at the German National Research Center for Information Technology (GMD, http://www.gmd.de/) in the Institute for Autonomous Intelligent Systems (AiS, http://ais.gmd.de/). In the interdisciplinary arena of neural information processing, robotics, cognitive science and neurobiology, AiS will intensify its research on biologically inspired neural information processing systems. Research will focus on the aspects of stochasticity, parallelism, asynchrony, self-organization and high-bandwidth communication between subsystems. The methodological focus within AiS is on mathematical modeling and robot implementations. The biological side is grounded in collaborations with leading neuroscience and ethological research institutions. Within this wide field, we solicit applications from young researchers who combine a broad scientific curiosity with the desire and the training to achieve "hard" results (mathematical models, algorithms, robot implementations). The candidate should have an education in computational neuroscience, (neuro-)biology, mathematics, control engineering, computer science, physics, cognitive science or related areas. A characteristic of both PhD and postdoc positions at GMD is intellectual freedom. The positions are not connected to pre-defined projects but allow (indeed, require) candidates to define their own research objectives within the general research framework of AiS. For the PhD positions we expect a university degree (master, diploma). Payment will be in accordance with the German BAT 2a salary scale (corresponding to university research assistants) in a part-time (19 hrs/week) contract. The position will be granted initially for one year, with a possible extension of two further years by mutual agreement. For the postdoc positions, we expect a PhD and a significant research record. Contracts will be full-time BAT 2a (equivalent to university research assistants) for 2 years initially, with a possible extension of one further year. GMD is an equal opportunity employer. Full applications should comprise a CV, copies of university degrees (including marks), letters of reference, a publication list, and - importantly - a research plan.
We encourage sending a preliminary application by email, consisting of a CV, a publication list, and a statement of research interests. Please direct applications and inquiries to Dr. Herbert Jaeger Phone +49-2241-14-2253 GMD, AiS.BE Fax 2241-14-2384 Schloss Birlinghoven D-53754 Sankt Augustin, Germany email: herbert.jaeger at gmd.de web: http://www.gmd.de/People/Herbert.Jaeger/ From oby at cs.tu-berlin.de Thu Aug 3 10:01:51 2000 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Thu, 3 Aug 2000 16:01:51 +0200 (MET DST) Subject: postdoctoral position Message-ID: <200008031401.QAA03258@pollux.cs.tu-berlin.de> Postdoctoral Position (salary level C1) Neural Information Processing Group, Department of Computer Science Technical University of Berlin, Berlin, Germany A postdoctoral position is available within the NI group at the Technical University of Berlin, Germany, beginning Oct. 1st 2000. The position is initially for a duration of three years, but an extension up to six years is possible. The successful candidate may choose between the following research areas: - computational models of visual cortex - theory of unsupervised learning - analysis of functional imaging data Teaching duties include 4 hours tutoring per week (in German) during the winter and summer terms, i.e. the successful candidate must be fluent in the German language. Interested candidates should send their CV, transcripts of their certificates, a short statement of their research interests and a list of publications to: Prof. Klaus Obermayer FR2-1, NI, Informatik, Technische Universitaet Berlin Franklinstrasse 28/29, 10587 Berlin, Germany phone: ++49-30-314-73120, fax: -73121, email: oby at cs.tu-berlin.de preferably by email. For a list of relevant publications and an overview of current research projects please refer to our web-page at: http://ni.cs.tu-berlin.de/ Cheers, Klaus --------------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ From cjlin at csie.ntu.edu.tw Mon Aug 7 00:52:06 2000 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Mon, 7 Aug 2000 12:52:06 +0800 (CST) Subject: new version of two software for SVM Message-ID: <200008070452.MAA02961@ntucsa.csie.ntu.edu.tw> Dear Colleagues: We announce new versions of two software packages for support vector machines. 1. LIBSVM 2.0 http://www.csie.ntu.edu.tw/~cjlin/libsvm No longer restricted to large-scale classification, LIBSVM now also solves regression and two variants: nu-SVM and one-class SVM. Like version 1.0, the new release is still a simple and easy-to-use tool. All of the different SVMs are implemented in one short file (about 1,000 lines of C++ code). Graphical interfaces for the different SVMs are also provided. 2. BSVM 1.1 http://www.csie.ntu.edu.tw/~cjlin/bsvm The main improvement is in the solution of the optimization sub-problem of the decomposition method. This makes the use of larger working sets possible. Unlike the previous version, which combined C and Fortran programs, BSVM 1.1 uses only C, and the code is simpler. Any comments are very welcome.
Sincerely, Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan cjlin at csie.ntu.edu.tw From gary at nd.com Mon Aug 7 11:54:22 2000 From: gary at nd.com (Gary Lynn) Date: Mon, 7 Aug 2000 11:54:22 -0400 Subject: JOB: Engineer needed in Florida Message-ID: NeuroDimension will be starting a research project in August 2000 to design, develop, and prototype a neural network controlled ventilator. This project involves cutting-edge neural network and bioengineering technology and is completely funded for a minimum of 2 years. NeuroDimension is looking for full-time and part-time engineers or computer scientists with a bachelor's degree or better. We can also provide research assistantships for masters or Ph.D. students. Good candidates for these positions should have skills in ONE OR MORE of the following areas and be willing to learn many of the others: - Neural networks - Digital control - Digital signal processing - Bioengineering (especially knowledge of the respiratory system) - Electrical circuit design and microcontroller software - Software design and development NeuroDimension is located in Gainesville (http://www.state.fl.us/gvl) in north-central Florida. The pay for a full-time engineering position ranges from $37K to $45K per year. Benefits include health, dental, and retirement plans. If you are interested, please email a resume to neil at nd.com From lpk at ai.mit.edu Mon Aug 7 16:38:00 2000 From: lpk at ai.mit.edu (Leslie Pack Kaelbling) Date: Mon, 7 Aug 2000 16:38:00 -0400 (EDT) Subject: Announcing: The Journal of Machine Learning Research Message-ID: <200008072038.QAA14274@soggy-fibers.ai.mit.edu> Announcing the JOURNAL of MACHINE LEARNING RESEARCH The Journal of Machine Learning Research (JMLR) provides an international forum for the electronic and paper publication of high-quality scholarly articles in all areas of machine learning. JMLR seeks previously unpublished papers that contain: - new algorithms with empirical, theoretical, psychological, or biological justification; - experimental and/or theoretical studies yielding new insight into the design and behavior of learning in intelligent systems; - accounts of applications of existing techniques that shed light on the strengths and weaknesses of the methods; - formalization of new learning tasks (e.g., in the context of new applications) and of methods for assessing performance on those tasks; - development of new analytical frameworks that advance theoretical studies of practical learning methods; - computational models of data from natural learning systems at the behavioral or neural level; or - extremely well-written surveys of existing work. JMLR has a commitment to rigorous yet rapid reviewing; reviews are returned within six weeks of paper submission. Final versions are published electronically immediately upon receipt, and an annual paper volume is published by MIT Press and sold to libraries and individuals. JMLR is accepting new submissions. Please see http://www.jmlr.org for submission information. Accepted papers, when published on the web, will be announced via an email list. To subscribe, send an email message to majordomo at ai.mit.edu with body text "subscribe jmlr-announce" (without quotation marks).
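For readers who prefer to script the subscription, here is a small, hypothetical Python sketch that sends the Majordomo command programmatically using the standard smtplib module. The sender address is a placeholder, and a mail relay on localhost is assumed; sending the message manually from any mail client works just as well.

import smtplib

sender = "you@example.org"        # replace with your own address (placeholder)
recipient = "majordomo@ai.mit.edu"

# Majordomo reads commands from the message body, so the body is the
# single line "subscribe jmlr-announce" (without quotation marks).
message = (
    f"From: {sender}\r\n"
    f"To: {recipient}\r\n"
    "Subject: subscribe\r\n"
    "\r\n"
    "subscribe jmlr-announce\r\n"
)

with smtplib.SMTP("localhost") as server:  # assumes a local SMTP relay
    server.sendmail(sender, recipient, message)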
Editor: Leslie Pack Kaelbling Managing Editor: David Cohn Action Editors: Peter Bartlett, Australian National University, Australia Craig Boutilier, University of Toronto, Canada Claire Cardie, Cornell University, US Peter Dayan, University College, London, UK Thomas Dietterich, Oregon State University, US Donald Geman, University of Massachusetts at Amherst, US Michael Jordan, University of California at Berkeley, US Michael Kearns, AT&T Research, US John Lafferty, Carnegie Mellon University, US Heikki Mannila, Helsinki University of Technology, Finland Fernando Pereira, Whizbang! Laboratories, US Pietro Perona, California Institute of Technology, US Stuart Russell, University of California at Berkeley, US Claude Sammut, University of New South Wales, Australia Bernhard Schoelkopf, Microsoft Research, Cambridge, UK Larry Wasserman, Carnegie Mellon University, US Stefan Wrobel, Otto-von-Guericke-Universitat Magdeburg, Germany Editorial Board: Naoki Abe, NEC Corporation, Japan Christopher M. Bishop, Microsoft Research, UK Andrew G. Barto, University of Massachusetts, Amherst, USA Henrik Bostrom, Stockholm University/KTH, Sweden Carla Brodley, Purdue University, USA Nello Cristianini, Royal Holloway, University of London, UK William W. Cohen, Whizbang! Laboratories, USA David Cohn, Burning Glass Technologies, USA Luc De Raedt, University of Freiburg, Germany Saso Dzeroski, Jozef Stefan Institute, Slovenia Nir Friedman, Hebrew University, Israel Dan Geiger, The Technion, Israel Zoubin Ghahramani, University College London, UK Sally Goldman, Washington University, St. Louis, USA Russ Greiner, University of Alberta, Canada David Heckerman, Microsoft Research, USA Thomas Hofmann, Brown University, USA Tommi Jaakkola, Massachusetts Institute of Technology, USA Daphne Koller, Stanford University, USA Michael Littman, AT&T Research, USA Sridhar Mahadevan, Michigan State University, USA Yishay Mansour, Tel-Aviv University, Israel Andrew McCallum, Whizbang! Laboratories, USA Raymond J. Mooney, University of Texas at Austin, USA Stephen Muggleton, York University, UK Foster Provost, New York University, USA Dana Ron, Tel-Aviv University, Israel Lawrence Saul, AT&T Labs, USA John Shawe-Taylor, Royal Holloway, University of London, UK Dale Schuurmans, University of Waterloo, Canada Yoram Singer, The Hebrew University, Israel Alex Smola, Australian National University, Australia Padhraic Smyth, University of California at Irvine, USA Moshe Tennenholtz, The Technion, Israel Sebastian Thrun, Carnegie Mellon University, USA Naftali Tishby, Hebrew University, Israel David Touretzky, Carnegie Mellon University, USA Chris Watkins, Royal Holloway, University of London, UK Robert C. 
Williamson, Australian National University, Australia Advisory Board: Shun-Ichi Amari, RIKEN Brain Science Institute, Japan Andrew Barto, University of Massachusetts at Amherst, USA Thomas Dietterich, Oregon State University, USA Jerome Friedman, Stanford University, USA Stuart Geman, Brown University, USA Geoffrey Hinton, University College London, UK Michael Jordan, University of California at Berkeley, USA Michael Kearns, AT&T Research, USA Steven Minton, University of Southern California, USA Thomas Mitchell, Carnegie Mellon University, USA Stephen Muggleton, University of York, UK Nils Nilsson, Stanford University, USA Tomaso Poggio, Massachusetts Institute of Technology, USA Ross Quinlan, University of New South Wales, Australia Stuart Russell, University of California at Berkeley, USA Terrence Sejnowski, Salk Institute for Biological Studies, USA Richard Sutton, AT&T Research, USA Leslie Valiant, Harvard University, USA Stefan Wrobel, Otto-von-Guericke-Universitaet, Germany From degaris at starlab.net Tue Aug 8 08:19:55 2000 From: degaris at starlab.net (Hugo de Garis) Date: Tue, 08 Aug 2000 14:19:55 +0200 Subject: CFP,Special Issue,Neurocomputing Journal, "Evolutionary Neural Systems" References: <397ED7DD.4E575005@starlab.net> Message-ID: <398FFAEB.6405B464@starlab.net> CALL FOR PAPERS NEUROCOMPUTING Journal (Elsevier) Special Issue "Evolutionary Neural Systems" Guest Editor Prof. Dr. Hugo de Garis (Starlab, Belgium) Editorial Committee Dr. David Fogel (Natural Selection Inc., USA) Prof. Dr. Michael Conrad (Wayne State University, USA) Prof. Dr. Xin Yao (Birmingham University, UK) Submission deadline: October 1st, 2000 Evolutionary Neural Systems - The application of evolutionary algorithms to the creation of neural-network-based systems, in software and/or hardware. The Neurocomputing Journal (http://www.elsevier.nl/locate/neucom) invites original contributions for a forthcoming special issue on "Evolutionary Neural Systems". Examples of topics relevant to this special issue include: -- Evolved Neural Software Systems -- Evolved Neural Hardware Systems -- Discussion/Analysis of Neural System Evolvability -- Scaling Issues of Evolved Multicomponent Neural Systems -- Evolved Artificial Nervous Systems/Artificial Brains -- Large-Scale, Real-World Applications of Evolved Neural Systems -- Novel Evolutionary Computation Techniques for Neural Systems -- Related Issues Manuscripts (in English) should not normally exceed 10,000 words in length and should be formatted and submitted according to the requirements found at the journal web site (above). Please provide a cover page containing (i) the title of the paper, (ii) family names with initial(s) of the personal name(s), (iii) address and email of each author, and (iv) an abstract. Three to five keywords should be supplied after the abstract for indexing purposes. Figures should be cited in the text and marked in the left margin of the manuscript. Four hard copies of the manuscript should be submitted to: Prof. Dr. Hugo de Garis Starlab, Blvd. St. Michel 47, B-1040, Brussels, Belgium, Europe. E-mails: degaris at starlab.net dfogel at natural-selection.com conrad at cs.wayne.edu x.yao at cs.bham.ac.uk This special issue will consist of a mix of invited papers and openly solicited papers.
=================== From kagan at ptolemy.arc.nasa.gov Wed Aug 9 18:10:26 2000 From: kagan at ptolemy.arc.nasa.gov (Kagan Tumer) Date: Wed, 9 Aug 2000 15:10:26 -0700 (PDT) Subject: ML position at NASA Ames Research Center Message-ID: <200008092210.PAA17929@avogadro.arc.nasa.gov> *** JOB ANNOUNCEMENT *** Machine Learning Research Position at NASA Ames Research Center. The Automated Learning Group at NASA Ames Research Center is seeking applicants for the position of Principal Investigator. The successful applicant for this position would initiate new research in the general field of machine learning and statistical inference. Although not restricted to such problems, the main thrust of the investigator's work is expected to involve projects to be carried out by NASA. Applicants must have a doctoral degree in a relevant field and an outstanding research record. Current members of the Automated Learning Group include David Wolpert, Peter Cheeseman, Kagan Tumer, David Maluf, Vadim Smelyanskiy, Robin Morris and Esfandiar Bandari. The Automated Learning Group is part of the Automated Software Engineering Research Area led by Michael Lowry. Applicants should send a curriculum vitae (including a list of publications), their citizenship status, and the names of at least three references to: Peter Cheeseman (cheesem at ptolemy.arc.nasa.gov), Kagan Tumer (kagan at ptolemy.arc.nasa.gov), and Mike Lowry (lowry at ptolemy.arc.nasa.gov) Postal mail can be sent to all three at: Mail Stop 269-2 (For Kagan Tumer, use Mail Stop 269-3) NASA Ames Research Center Moffett Field, CA 94035 The current position is restricted to US citizens only. Future positions may be open to non-citizens. Women and minorities are strongly encouraged to apply. -- Kagan Tumer, PhD email: kagan at ptolemy.arc.nasa.gov NASA Ames Research Center http://ic.arc.nasa.gov/people/kagan/ Mail Stop 269-4 phone: (650) 604-4940 Moffett Field, CA 94035-1000 fax: (650) 604-3594 From stefan.wermter at sunderland.ac.uk Thu Aug 10 14:17:28 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Thu, 10 Aug 2000 19:17:28 +0100 Subject: 8 new positions at University of Sunderland, Computing and Engineering. Message-ID: <3992F1B8.D1720E29@sunderland.ac.uk> 8 new positions have been created at the University of Sunderland in Computing and Engineering. With regard to this list, we would like to hear from qualified researchers in intelligent systems, neural networks, hybrid systems or language processing and would like to encourage them to apply for a position in intelligent systems. If you have specific questions or interests within intelligent systems, neural networks/neuroscience, hybrid systems or language processing, please let me know. The brief general text for all positions is given below. Stefan Wermter stefan.wermter at sunderland.ac.uk --------------------------------------------- The School of Computing, Engineering and Technology draws together the disciplines of Computing, Information Systems, Engineering and Technology. It enjoys a strong student base and a growing research profile, and provides first-class computing facilities. The School has almost £2M per annum income from research and over 100 registered PhD research students. The School is looking to strengthen its staffing complement with a number of key appointments. All posts require an honours degree, and a Higher Degree in a relevant area. A relevant PhD and Membership/Fellowship of the British Computer Society would be a strong advantage.
Excellent communication skills are essential. Professor of Software Engineering With an international research profile and a good publication record, you will provide strong leadership in Software Engineering. You will conduct teaching and research, and will make a major contribution to our research culture, demonstrating experience in leading research; developing external links; managing research funding; and supervising research students. Reference: CET Principal Lecturer in Computing £28,978 - £36,436 A proven team-worker with experience of leadership and module/programme design, you will lead, motivate and develop a team of staff on our MSc Programme. You must be currently research active, and able to deliver high quality, innovative teaching and research. You will create and develop external links in the UK and Europe, and some travel will be required. Reference: CET Lecturer / Senior Lecturer in Computing (6 posts) £14,902 - £30,636 Experienced in research and with proven teamworking, organisational and administrative skills, you will undertake high quality teaching, research and reach-out activities. We require staff with specialised knowledge in: Computer Systems; Software Engineering; Multimedia/ Media Systems/ Human-Computer Interaction; Intelligent Systems; Information Systems; Systems Design. Reference: CET Informal enquiries are welcomed by Professor Peter Smith on tel (0191) 5152761 or email peter.smith at sunderland.ac.uk Application is by CV with a covering letter detailing current salary and full contact details of two referees, to the Personnel Department, University of Sunderland, Langham Tower, Ryhope Road, Sunderland, SR2 7EE or email employee.recruitment at sunderland.ac.uk quoting appropriate reference number. A relocation package is available in approved cases. Working Towards Equal Opportunities *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ http://www.his.sunderland.ac.uk/intelligent/ **************************************** From krichmar at nsi.edu Fri Aug 11 19:30:57 2000 From: krichmar at nsi.edu (Jeff Krichmar) Date: Fri, 11 Aug 2000 16:30:57 -0700 Subject: Postdoctoral position Message-ID: <000601c003ec$337a6b80$0db985c6@nsi.edu> Please post and circulate as you see fit. Thank you. POSTDOCTORAL FELLOWSHIP W.M. KECK MACHINE PSYCHOLOGY LABORATORY The Neurosciences Institute, located in San Diego, California, invites applications for a POSTDOCTORAL FELLOWSHIP to study biologically based models of behaving real world devices (robots) as part of the W.M. Keck Machine Psychology Laboratory. Continuing previous research conducted at the Institute, this Laboratory will be focusing on the construction of autonomous robots, the design of simulated models of large-scale neuronal networks that are capable of guiding behavior in the real world, and on developing methods for the simultaneous analysis of neural and behavioral states. Applicants should have a background in computational neuroscience, robotics, computer science, behavioral science or cognitive science. Fellows will receive stipends appropriate to their qualifications and experience. Submit a curriculum vitae, statement of research interests, and names of three references to: Dr. Jeffrey L.
Krichmar The Neurosciences Institute 10640 John Jay Hopkins Drive San Diego, California 92121 Email: krichmar at nsi.edu Fax: 858-626-2099 For a description of the project, refer to http://www.nsi.edu/nomad/. For a description of The Neurosciences Institute, refer to http://www.nsi.edu. From oreilly at grey.colorado.edu Sat Aug 12 15:27:28 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Sat, 12 Aug 2000 13:27:28 -0600 Subject: PDP++ version 2.0 Message-ID: <200008121927.NAA19000@grey.colorado.edu> ANNOUNCING: The PDP++ Software, version 2.0 Authors: Randall C. O'Reilly, Chadley K. Dawson, and James L. McClelland The PDP++ software is a neural-network simulation system written in C++. It represents the next generation of the PDP software released with the McClelland and Rumelhart "Explorations in Parallel Distributed Processing Handbook", MIT Press, 1987. It is easy enough for novice users, but very powerful and flexible for research use. The current version is 2.0, released August, 2000, which is a major upgrade from previous versions, as detailed below. The software can be obtained by anonymous ftp from: Anonymous FTP Site: ftp://grey.colorado.edu/pub/oreilly/pdp++ *or* ftp://cnbc.cmu.edu/pub/pdp++/ *or* unix.hensa.ac.uk/mirrors/pdp++/ For more information, see our web page: WWW Page: http://www.cnbc.cmu.edu/PDP++/PDP++.html There is a 250-page (printed) manual and an HTML version available on-line at the above address. The new features in 2.0 include: --------------------------------- o MS Windows platform fully supported (using CYGWIN environment) o Project View window for GUI onto project/processes & specs o Enviro View rewritten, GUI for event/pattern layout, etc. o Grid View rewritten, interactively configurable grid layout o Easy viewing of entire network weights in grid log o Easy cluster plot interface, displayed in graph log o GUI for interactive construction improved o Context-sensitive help via "Help" menu on all objects (via HTML) o Lots and lots of bug fixes, minor improvements: every known way to crash the software has been fixed! Software Features: ================== o Full Graphical User Interface (GUI) based on the InterViews toolkit. Allows user-selected "look and feel". o Network Viewer shows network architecture and processing in real- time, allows network to be constructed with simple point-and-click actions. o Training and testing data can be graphed on-line and network state can be displayed over time numerically or using a wide range of color or size-based graphical representations. o Environment Viewer shows training patterns using color or size-based graphical representations; interactive configuration. o Flexible object-oriented design allows mix-and-match simulation construction and easy extension by deriving new object types from existing ones. o Built-in 'CSS' scripting language uses C++ syntax, allows full access to simulation object data and functions. Transition between script code and compiled code is simplified since both are C++. Script has command-line completion, source-level debugger, and provides standard C/C++ library functions and objects. o Scripts can control processing, generate training and testing patterns, automate routine tasks, etc. o Scripts can be generated from GUI actions, and the user can create GUI interfaces from script objects to extend and customize the simulation environment. Supported Algorithms: ===================== o Feedforward and recurrent error backpropagation.
Recurrent BP includes continuous, real-time models, and Almeida-Pineda. o Constraint satisfaction algorithms and associated learning algorithms including Boltzmann Machine, Hopfield models, mean-field networks (DBM), Interactive Activation and Competition (IAC), and continuous stochastic networks. o Self-organizing learning including Competitive Learning, Soft Competitive Learning, simple Hebbian, and Self-organizing Maps ("Kohonen Nets"). o Leabra algorithm that combines error-driven and Hebbian learning with k-Winners-Take-All inhibitory competition. Over 40 research-grade simulations available for this algorithm in association with new book: "Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain", O'Reilly & Munakata, 2000, MIT Press. From jrw at cns.bu.edu Wed Aug 16 10:54:19 2000 From: jrw at cns.bu.edu (Jim Williamson) Date: Wed, 16 Aug 2000 10:54:19 -0400 Subject: Paper on self-organizing topographic networks for classification Message-ID: <399AAB1B.31DFF4F5@cns.bu.edu> I would like to announce the availability of a new paper, accepted for publication in Neural Computation. Available at: http://cns-web.bu.edu/pub/jrw/www/pubs.html ------------------------------------------------------------ SELF-ORGANIZATION OF TOPOGRAPHIC MIXTURE NETWORKS USING ATTENTIONAL FEEDBACK James R. Williamson Department of Cognitive and Neural Systems Boston University, Boston, MA 02215 Neural Computation, in press. ABSTRACT This paper proposes a neural network model of supervised learning which employs biologically-motivated constraints of using local, on-line, constructive learning. The model possesses two novel learning mechanisms. The first is a network for learning topographic mixtures. The network's internal category nodes are the mixture components, which learn to encode smooth distributions in the input space by taking advantage of topography in the input feature maps. The second mechanism is an attentional biasing feedback circuit. When the network makes an incorrect output prediction, this feedback circuit modulates the learning rates of the category nodes, by amounts based on the sharpness of their tuning, in order to improve the network's prediction accuracy. The network is evaluated on several standard classification benchmarks and shown to perform well in comparison to other classifiers. From bgiven at gmu.edu Wed Aug 16 15:14:06 2000 From: bgiven at gmu.edu (Barbara K. Given) Date: Wed, 16 Aug 2000 15:14:06 -0400 Subject: post-doc position Message-ID: <399AE7FE.7414BCDE@gmu.edu> Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA. Post-doctoral Research Fellow in Neuropsychology / Event Related Potential / EEG research for a 3-year or longer federally funded project on adolescent learning to: * Identify neural characteristics of chronically disruptive youth with receptive language deficits, * Compare these characteristics with academically and socially successful age mates, * Determine if electrophysiological markers can be identified that clearly differentiate the two groups, * Investigate frontal and temporal brain activity pre and post educational intervention. Candidates should have a doctorate in a field relevant to the cognitive neuroscience, psychological testing, quantitative analysis of EEG, ERP, and signal processing required to carry out this project. FTE position to $34,000. Available Sept 1, 2000 or asap thereafter. 
Research team includes: Steven Schiff (Co-PI), Paul Rapp, Martha Farah, Steven Weinstein, William Gaillard, and Kristen Jerger. Address inquiries and send résumé with two references to: bgiven at gmu.edu -- Barbara K. Given, Ph.D. Director, Adolescent Learning Research Center, Krasnow Institute for Advanced Study, and Associate Professor, Graduate School of Education George Mason University Fairfax, VA 22030-4444 Phone: 703-993-4406 Fax: 703-993-4325 From vera at cs.cas.cz Fri Aug 18 14:26:56 2000 From: vera at cs.cas.cz (Vera Kurkova) Date: Fri, 18 Aug 00 14:26:56 CET Subject: ICANNGA 2001 (Prague) Message-ID: <52016.vera@uivt1.uivt.cas.cz> **************************************************************** * * * >>>>> ICANNGA 2001 <<<<< * * * * 5th International Conference * * on Artificial Neural Networks and Genetic Algorithms * * * * Prague, April 22-25, 2001 * * * * 2nd call for papers (due September 20, 2000) * * * * for details see * * http://www.cs.cas.cz/icannga * * * * Proposals for special sessions, tutorials * * and software demonstrations are welcome * * * **************************************************************** From p.tino at aston.ac.uk Fri Aug 18 12:56:46 2000 From: p.tino at aston.ac.uk (Peter Tino) Date: Fri, 18 Aug 2000 17:56:46 +0100 Subject: fractal representations of symbolic sequences Message-ID: <399D6ACE.33B0B27E@aston.ac.uk> Dear Connectionists, I would like to announce the availability of papers dealing with theoretical and practical aspects of fractal representations of (possibly very long and complex) symbolic sequences via iterative function systems. 1. P. Tino: Spatial Representation of Symbolic Sequences through Iterative Function Systems. IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans. 1999. - A theoretical study connecting multifractal properties of such representations with entropic measures on symbolic sequences 2. P. Tino, G. Dorffner: Predicting the future of discrete sequences from fractal representations of the past. Machine Learning, accepted. - Mostly empirical study of predictive models constructed on fractal representations. Predictive models are closely related to variable memory length Markov models. 3. P. Tino, G. Dorffner, Ch. Schittenkopf: Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics. In Hybrid Neural Symbolic Integration, 2000. - A connection between recurrent nets and fractal representations. 4. P. Tino, M. Koteles: Extracting finite state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE Transactions on Neural Networks, 1999. - Contains an example of using fractal representations to monitor the processes of training recurrent nets and extracting knowledge from trained nets. The papers can be downloaded from http://www.ncrg.aston.ac.uk/~tinop/my.publ.html Also available are some minor applications of this methodology in finance and natural language modeling. Best regards, Peter T. -- Peter Tino - Neural Computing Research Group Aston University, Aston Triangle, Birmingham, B4 7ET, UK (+44 (0)121) 359 3611 ext. 4285, fax: 333 6215 http://www.ncrg.aston.ac.uk/~tinop/ From steve at cns.bu.edu Fri Aug 18 18:19:22 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Fri, 18 Aug 2000 18:19:22 -0400 Subject: a neural model of learning to write Message-ID: The following article can be accessed at http://www.cns.bu.edu/Profiles/Grossberg Paper copies can also be obtained by writing Mr.
Robin Amos, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215 or amos at cns.bu.edu. Grossberg S. and Paine R. W. (2000). A neural model of corticocerebellar interactions during attentive imitation and predictive learning of sequential handwriting movement. Special Issue of Neural Networks on "The Global Brain: Imaging and Neural Modeling", in press. A preliminary version is available as Boston University Technical Report, CAS/CNS TR-2000-009. The paper is available in PDF format GrossbergPaine2000.pdf, or in Gzipped postscript format GrossbergPaine2000.ps.gz. ABSTRACT Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. How are such complex movements learned through attentive imitation? Novel movements may be made as a series of distinct segments, but a practiced movement can be made smoothly, with a continuous, often bell-shaped, velocity profile. How does learning of complex movements transform reactive imitation into predictive, automatic performance? A neural model is developed which suggests how parietal and motor cortical mechanisms, such as difference vector encoding, interact with adaptively-timed, predictive cerebellar learning during movement imitation and predictive performance. To initiate movement, visual attention shifts along the shape to be imitated and generates vector movement using motor cortical cells. During such an imitative movement, cerebellar Purkinje cells with a spectrum of delayed response profiles sample and learn the changing directional information and, in turn, send that learned information back to the cortex and eventually to the muscle synergies involved. If the imitative movement deviates from an attentional focus around a shape to be imitated, the visual system shifts attention, and may make an eye movement, back to the shape, thereby providing corrective directional information to the arm movement system. This imitative movement cycle repeats until the corticocerebellar system can accurately drive the movement based on memory alone. A cortical working memory buffer transiently stores the cerebellar output and releases it at a variable rate, allowing speed scaling of learned movements which is limited by the rate of cerebellar memory readout. Movements can be learned at variable speeds if the density of the spectrum of delayed cellular responses in the cerebellum varies with speed. Learning at slower speeds facilitates learning at faster speeds. Size can be varied after learning while keeping the movement duration constant (isochrony). Context effects arise from the overlap of cerebellar memory outputs. The model is used to simulate key psychophysical and neural data about learning to make curved movements, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a Two-Thirds Power Law relation between angular velocity and curvature.
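The Two-Thirds Power Law mentioned at the end of the abstract states that angular velocity A(t) is proportional to curvature C(t) raised to the power 2/3 (equivalently, tangential velocity is proportional to the radius of curvature raised to the power 1/3). The short, self-contained Python sketch below illustrates the empirical law itself, not the Grossberg-Paine model: it generates an elliptical trajectory traced with harmonic timing, a motion for which the law holds exactly, and recovers the 2/3 exponent numerically.

import numpy as np

# Ellipse traced with harmonic timing, x = a*cos(wt), y = b*sin(wt):
# a movement for which the Two-Thirds Power Law holds exactly.
a, b, w = 2.0, 1.0, 1.0
t = np.linspace(0, 2 * np.pi, 2000, endpoint=False)
x, y = a * np.cos(w * t), b * np.sin(w * t)

# Numerical derivatives along the trajectory.
dt = t[1] - t[0]
xd, yd = np.gradient(x, dt), np.gradient(y, dt)
xdd, ydd = np.gradient(xd, dt), np.gradient(yd, dt)

speed = np.sqrt(xd**2 + yd**2)                      # tangential velocity v(t)
curvature = np.abs(xd * ydd - yd * xdd) / speed**3  # C(t)
angular_velocity = speed * curvature                # A(t) = v(t) * C(t)

# Fit log A = log k + beta * log C; beta should be close to 2/3.
beta, log_k = np.polyfit(np.log(curvature), np.log(angular_velocity), 1)
print(f"fitted exponent: {beta:.4f} (Two-Thirds Power Law predicts 0.6667)")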
From steve at cns.bu.edu Fri Aug 18 18:53:33 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Fri, 18 Aug 2000 18:53:33 -0400 Subject: neural model of horizontal and interlaminar cortical development and adult perceptual grouping Message-ID: The following article can be accessed at http://www.cns.bu.edu/Profiles/Grossberg Paper copies can also be obtained by writing Mr. Robin Amos, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215 or amos at cns.bu.edu. Grossberg S. and Williamson J. R. (2000). A neural model of how horizontal and interlaminar connections of visual cortex develop into adult circuits that carry out perceptual grouping and learning. Cerebral Cortex, in press. The paper is available in PDF format GroWil00.pdf, or in Gzipped postscript format GroWil00.ps.gz. ABSTRACT: A neural model suggests how horizontal and interlaminar connections in visual cortical areas V1 and V2 develop within a laminar cortical architecture and give rise to adult visual percepts. The model suggests how mechanisms that control cortical development in the infant lead to properties of adult cortical anatomy, neurophysiology, and visual perception. The model clarifies how excitatory and inhibitory connections can develop stably by maintaining a balance between excitation and inhibition. The growth of long-range excitatory horizontal connections between layer 2/3 pyramidal cells is balanced against that of short-range disynaptic interneuronal connections. The growth of excitatory on-center connections from layer 6-to-4 is balanced against that of inhibitory interneuronal off-surround connections. These balanced connections interact via intracortical and intercortical feedback to realize properties of perceptual grouping, attention, and perceptual learning in the adult, and help to explain the observed variability in the number and temporal distribution of spikes emitted by cortical neurons. The model replicates cortical point spread functions and psychophysical data on the strength of real and illusory contours. The on-center off-surround layer 6-to-4 circuit enables top-down attentional signals from area V2 to modulate, or attentionally prime, layer 4 cells in area V1 without fully activating them. This modulatory circuit also enables adult perceptual learning within cortical area V1 and V2 to proceed in a stable way. From max at maccs.mq.edu.au Fri Aug 18 20:25:25 2000 From: max at maccs.mq.edu.au (max) Date: Sat, 19 Aug 2000 10:25:25 +1000 Subject: Computational modeling of reading Message-ID: Dear Connectionists, I would like to announce the availability of: Coltheart, M., Rastle, K., Perry, C., Langdon, R. & Ziegler, J. DRC: A Dual Route Cascaded model of visual word recognition and reading aloud. Psychological Review, in press. This can be obtained (.pdf 340K, 183 pp.) from the DRC Home Page at http://www.maccs.mq.edu.au/~max/DRC/ where other papers relevant to this model are also listed. Here's the abstract: In this paper we describe a computational model of visual word recognition and reading aloud, the Dual Route Cascaded (DRC) model, a computational realisation of the dual-route theory of reading. The two tasks most commonly used to study reading are lexical decision and reading aloud; the DRC model is the only computational model of reading which can perform both of these tasks.
The model is evaluated by comparing its lexical decision latency data to lexical decision latency data from various published studies of human subjects, and by comparing its reading aloud latency data to reading aloud latency data from various published studies of reading aloud by human subjects. For both tasks, it is found that a wide variety of variables which affect human latencies affect the DRC model's latencies in exactly the same way. We note a number of such effects which the DRC model simulates but which other computational models of reading do not, whereas as far as we are aware there are no effects which any other current computational model of reading can simulate but which the DRC model cannot. We conclude that the DRC model is the most successful of existing computational models of reading. M.C. Max Coltheart, Macquarie Centre for Cognitive Science, Macquarie University Sydney NSW 2109 Australia tel +61 2 9850 8086 (work) +61 2 9418 7269 (home) fax +61 2 9850 6059 (work) +61 2 9418 7101 (home) my diary http://calendar.yahoo.com/public/maxcoltheart my home page http://www.maccs.mq.edu.au/~max/ DRC's home page http://www.maccs.mq.edu.au/~max/DRC From Connectionists-Request at cs.cmu.edu Thu Aug 17 16:00:11 2000 From: Connectionists-Request at cs.cmu.edu (Connectionists-Request@cs.cmu.edu) Date: Thu, 17 Aug 2000 16:00:11 -0400 (EDT) Subject: CONNECTIONISTS Bi-Monthly reminder Message-ID: *** DO NOT FORWARD TO ANY OTHER LISTS *** This note was last updated August 15, 2000. This is an automatically posted bi-monthly reminder about how the CONNECTIONISTS list works and how to access various online resources. CONNECTIONISTS is a moderated forum for enlightened technical discussions and professional announcements. It is not a random free-for-all like comp.ai.neural-nets. The following posting guidelines are designed to reduce the amount of irrelevant messages sent to the list. Before you post, please remember that this list is distributed to thousands of busy people who don't want their time wasted on trivia. Also, some subscribers pay cash for each kbyte; they shouldn't be forced to pay for junk mail. -- Dave Touretzky & Mark C. Fuhs --------------------------------------------------------------------- To send mail to everyone on the list, address it to Connectionists at CS.CMU.EDU Note that in many mail programs a reply to a message is automatically "CC"-ed to all the addresses on the "To" and "CC" lines of the original message. If the mailer you use has this property, please make sure your personal response (request for a Tech Report etc.) is NOT broadcast over the net. Requests for address changes, deletions, etc., should be sent to: Connectionists-Request at CS.CMU.EDU The address list is now managed by Majordomo. If you are not familiar with Majordomo, send the word "help" by itself in the body of an email message to Connectionists-Request and you will receive a detailed explanation of its use. If you mention our mailing list to someone who may apply to be added to it, please make sure they use the "-requests" address and NOT "Connectionists at cs.cmu.edu". --------------------------------------------------------------------- What to post to CONNECTIONISTS ------------------------------ - The list is primarily intended to support the discussion of technical issues relating to neural computation. 
- We encourage people to post the abstracts of their latest papers and tech reports, provided that the report itself is available on-line (please give the URL) or the author is accepting requests for hardcopies. - Conferences and workshops should be announced on this list at most twice: once to send out a call for papers, and once to remind non-authors about the registration deadline. A flood of repetitive announcements about the same conference is not welcome here. For major neural net conferences (e.g., NIPS, IJCNN, INNS) we'll allow a second call for papers close (but not unreasonably close) to the deadline. - Announcements of job openings related to neural computation. - Announcements of new books related to neural computation. - Requests for ADDITIONAL references. This has been a particularly sensitive subject. Please try to demonstrate that you have already pursued the quick, obvious routes to finding the information you desire. You should also give people something back in return for bothering them. The easiest way to do both these things is to FIRST do the library work to find the basic references, then POST these as part of your query. Here's an example: WRONG WAY: "Can someone please mail me all references to cascade correlation?" RIGHT WAY: Enclosed is a bibliography I've compiled of papers referencing cascade correlation. If you are aware of additional papers not listed here, please send me the citations and I'll include them in the next version. What NOT to post to CONNECTIONISTS: ----------------------------------- * Requests for free software or databases. Try comp.ai.neural-nets. * Requests for reprints of papers, or for persons' email addresses. * Announcements of conferences not directly relevant to this list. Example: generic AI or computer vision conferences have their own newsgroups and mailing lists, and won't be advertised here. * Job postings, unless the posting makes specific mention of neural nets or a closely related topic (e.g., computational neuroscience.) * Postings not properly formatted. 80 columns is the maximum line width. Do not post HTML, LaTeX, Microsoft Word, or Postscript files. Do not include any attachments. ------------------------------------------------------------------------------- The CONNECTIONISTS Archive: --------------------------- All e-mail messages sent to "Connectionists at cs.cmu.edu" starting 27-Feb-88 are available for public perusal. A separate file exists for each month. The files' names are: arch.yymm where yymm stand for the obvious thing. Thus the earliest available data are in the file: arch.8802 Files ending with .Z are compressed using the standard unix compress program. The files ending with .gz are compressed using the GNU gzip program. In the event that you do not already have gzip, it is available via ftp from "prep.ai.mit.edu" in the "/pub/gnu" directory. To browse through these files (as well as through other files, see below) you must FTP them to your local machine. The file "current" in the same directory contains the archives for the current month. ------------------------------------------------------------------------------- How to Access Files from the CONNECTIONISTS Archive --------------------------------------------------- There are two ways to access the CONNECTIONISTS archive: 1. Using your World Wide Web browser. Enter the following location: http://www.cs.cmu.edu/afs/cs/project/connect/connect-archives/ 2. Using an FTP client. 
a) Open an FTP connection to host FTP.CS.CMU.EDU b) Log in as user anonymous, using your username as the password. c) 'cd' directly to the following directory: /afs/cs/project/connect/connect-archives The archive directory is the ONLY one you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into this directory. Problems? - contact us at "Connectionists-Owner at cs.cmu.edu". From sml at essex.ac.uk Wed Aug 23 12:55:14 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Wed, 23 Aug 2000 17:55:14 +0100 Subject: OCR competition (first announcement) Message-ID: <7051D4D8783DD411A7EF00A0C9DD67E1664127@sernt14.essex.ac.uk> Dear All, I would like to draw your attention to an OCR contest that we intend to run in the near future. This will be sponsored by the Post Office (UK) who intend to contribute a modest prize for the winner - but the intention is also to give developers useful feedback on how their algorithms compare with others on a range of OCR datasets and a range of different criteria. Currently, all submissions must be in Java, either in the form of an archive of class files (jar, zip or gzip etc) or a single body of source code. The problem setup is described on Algoval: http://algoval.essex.ac.uk Follow the link to Problems, then to Trainable OCR. We have yet to decide the exact criteria for winning, but they will likely be based on a combination of accuracy, speed and memory considerations - this will be some function of the performance criteria that are already mentioned on the web page. The intention is to run the competition in October of this year. This announcement is to give researchers a chance to comply with the interface for the problem. If you wish to have solutions in languages other than Java considered, please send me a brief email to this effect. I don't think this will be possible for this competition, but if I get a strong enough response I'll increase the priority of allowing other-language submissions in the future. Best regards, Simon Lucas ps. After Thursday 24th August I'm away for two weeks, so I may be a bit slow responding to your emails. ------------------------------------------------ Dr. Simon Lucas Senior Lecturer Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom http://algoval.essex.ac.uk Email: sml at essex.ac.uk ------------------------------------------------- From schultz at cns.nyu.edu Wed Aug 23 13:26:58 2000 From: schultz at cns.nyu.edu (Simon Schultz) Date: Wed, 23 Aug 2000 13:26:58 -0400 Subject: Preprints on neural coding Message-ID: <39A40962.D16E8C0F@cns.nyu.edu> Dear Connectionists, The following two preprints are available for downloading: 1. ................................................................... "A UNIFIED APPROACH TO THE STUDY OF TEMPORAL, CORRELATIONAL AND RATE CODING", Stefano Panzeri* and Simon R. Schultz+. In press, Neural Computation. * Neural Systems Group, Department of Psychology, University of Newcastle upon Tyne. + Howard Hughes Medical Institute & Center for Neural Science, New York University. ------ Abstract: ------ We demonstrate that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each of which reflects something about potential coding mechanisms. This is possible in the coding régime in which few spikes are emitted in the relevant time window.
This approach allows us to study the additional information contributed by spike timing beyond that present in the spike counts; to examine the contributions to the whole information of different statistical properties of spike trains, such as firing rates and correlation functions; and forms the basis for a new quantitative procedure for the analysis of simultaneous multiple neuron recordings. It also provides theoretical constraints upon neural coding strategies. We find a transition between two coding régimes, depending upon the size of the relevant observation timescale. For time windows shorter than the timescale of the stimulus-induced response fluctuations, there exists a spike count coding phase, where the purely temporal information is of third order in time. For time windows much longer than the characteristic timescale, there can be additional timing information of first order, leading to a temporal coding phase in which timing information may affect the instantaneous information rate. In this new framework we study the relative contributions of the dynamic firing rate and correlation variables to the full temporal information; the interaction of signal and noise correlations in temporal coding; synergy between spikes and between cells; and the effect of refractoriness. We illustrate the utility of the technique by analysis of a few cells from the rat barrel cortex. Download alternatives: http://www.cns.nyu.edu/~schultz/tcode.ps.gz http://www.cns.nyu.edu/~schultz/tcode.pdf 2. ................................................................... "SYNCHRONISATION, BINDING AND THE ROLE OF CORRELATED FIRING IN FAST INFORMATION TRANSMISSION", Simon R. Schultz+, Huw D. R. Golledge* and Stefano Panzeri*. To be published in S. Wermter, J. Austin and D. Willshaw (Eds.), Emergent Neural Computational Architectures based on Neuroscience, Springer-Verlag, Heidelberg. + Howard Hughes Medical Institute & Center for Neural Science, New York University. * Neural Systems Group, Department of Psychology, University of Newcastle upon Tyne. ------ Abstract: ------ Does synchronisation between action potentials from different neurons in the visual system play a substantial role in solving the binding problem? The binding problem can be studied quantitatively in the broader framework of the information contained in neural spike trains about some external correlate, which in this case is object configurations in the visual field. We approach this problem by using a mathematical formalism that quantifies the impact of correlated firing in short time scales. Using a power series expansion, the mutual information an ensemble of neurons conveys about external stimuli is broken down into firing rate and correlation components. This leads to a new quantification procedure directly applicable to simultaneous multiple neuron recordings. It theoretically constrains the neural code, showing that correlations contribute less significantly than firing rates to rapid information processing. By using this approach to study the limits upon the amount of information that an ideal observer is able to extract from a synchrony code, it may be possible to determine whether the available amount of information is sufficient to support computational processes such as feature binding. Download alternatives: http://www.cns.nyu.edu/~schultz/emernet.ps.gz http://www.cns.nyu.edu/~schultz/emernet.pdf -- Dr. Simon R.
Schultz Phone: +1-212 998 3775 Howard Hughes Medical Institute & Fax: +1-212 995 4011 Center for Neural Science, Email: schultz at cns.nyu.edu New York University, 4 Washington Place, New York NY 10003, U.S.A. http://www.cns.nyu.edu/~schultz/ From ckiw at dai.ed.ac.uk Wed Aug 23 09:39:53 2000 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Wed, 23 Aug 2000 14:39:53 +0100 (BST) Subject: Faculty positions at University of Edinburgh, UK Message-ID: The Division of Informatics at the University of Edinburgh is seeking to hire two lecturers. I am keen to encourage people from the machine learning/probabilistic modelling fields to apply. Note that this area is highlighted in the advert (see below). Apologies if you receive this message multiple times. Informal questions and requests for information can be sent to c.k.i.williams at ed.ac.uk Chris Williams Dr Chris Williams c.k.i.williams at ed.ac.uk Institute for Adaptive and Neural Computation Division of Informatics, University of Edinburgh 5 Forrest Hill, Edinburgh EH1 2QL, Scotland, UK fax: +44 131 650 6899 tel: (direct) +44 131 651 1212 (department switchboard) +44 131 650 3090 http://anc.ed.ac.uk/ -------------------------------------------------------------------- LECTURESHIP IN INFORMATICS The Division of Informatics (http://www.informatics.ed.ac.uk) has inherited a very strong tradition of research in computer systems, theoretical computer science, cognitive science, artificial intelligence, robotics and neural networks. You will add to our existing strengths in research and teaching, encourage the integration of your own research with that of others and contribute to the development of Informatics. You can work in any area of Informatics, but we would particularly welcome applications where algorithms and complexity or machine learning is your primary interest. We would also especially welcome your application if you are working in bioinformatics, computer systems, virtual or augmented reality, knowledge representation or cognitive modelling. The appointment will be at the Lecturer (18,731 - 34,601 pounds) scale (salary scales under review). Please quote ref: 306630 Further particulars can be found at http://www.informatics.ed.ac.uk/events/vacancies/lectureship_306630.html and application packs can be obtained from the PERSONNEL DEPARTMENT, The University of Edinburgh, 9-16 Chambers Street, Edinburgh EH1 1HT, UK Closing date: 22 September 2000. From Daniel.Memmi at imag.fr Thu Aug 24 06:10:34 2000 From: Daniel.Memmi at imag.fr (Daniel Memmi) Date: Thu, 24 Aug 2000 12:10:34 +0200 Subject: Report on Flexible Word Order Message-ID: The following technical report might be of interest to connectionists in natural language processing and to computational linguists: "Processing Flexible Word Order Languages with Neural Networks" D. Memmi (LEIBNIZ-IMAG-CNRS, Grenoble, France) Abstract: Most research in computational linguistics is strongly inspired by the particular structure and rigid word order of the English language, although many common languages in fact exhibit a much freer syntax. These languages should be interpreted by using various other cues (morphological or semantic) besides word order. We will show how recurrent neural networks can learn to use a variety of cues as well as word order so as to interpret simple sentences. This connectionist approach will be applied to several different languages: Japanese, French and Spanish. In this way we will demonstrate the diversity of linguistic cues available for natural language processing.
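To make the flavor of such a model concrete, here is a minimal sketch (Python/NumPy; the sizes, cue encoding and role inventory are this digest editor's illustrative assumptions, not the architecture used in the report) of an Elman-style simple recurrent network whose input at each step is a one-hot word identity concatenated with a small morphological cue vector, and whose output is a distribution over grammatical roles:

  import numpy as np

  rng = np.random.default_rng(0)

  # Illustrative sizes: 20-word lexicon, 5 morphological cue features,
  # 16 hidden (context) units, 3 grammatical roles (e.g. agent/patient/action).
  n_word, n_cue, n_hidden, n_role = 20, 5, 16, 3
  n_in = n_word + n_cue
  W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))       # input -> hidden
  W_rec = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # context -> hidden
  W_out = rng.normal(0.0, 0.1, (n_role, n_hidden))    # hidden -> roles

  def srn_step(h, word_id, cues):
      # Word order reaches the net implicitly through the recurrent state h;
      # morphology reaches it explicitly through the cue features.
      x = np.zeros(n_in)
      x[word_id] = 1.0          # one-hot word identity
      x[n_word:] = cues         # morphological cues (e.g. case marking)
      h = np.tanh(W_in @ x + W_rec @ h)
      z = W_out @ h
      p = np.exp(z - z.max())
      return h, p / p.sum()     # softmax over role assignments

  # A toy two-word "sentence": (word index, cue feature vector).
  sentence = [(3, [1, 0, 0, 0, 0]), (7, [0, 1, 0, 0, 0])]
  h = np.zeros(n_hidden)
  for word_id, cues in sentence:
      h, p = srn_step(h, word_id, np.asarray(cues, dtype=float))
      print(p)                  # current estimate of the role distribution

Training such a network (e.g. by backpropagation through time) on sentences from languages with different cue inventories then amounts to supplying the appropriate cue features; consult the report itself for the actual architecture and data.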
The report is available on-line at the following address. - for a PDF version: http://www-leibniz.imag.fr/LesCahiers/CLLeib03.pdf - for a Postscript version: http://www-leibniz.imag.fr/LesCahiers/CLLeib03.ps _________________________________________________________ Daniel Memmi Neural Networks Group LEIBNIZ-IMAG Tel: (33) 4 76 57 46 64 46, avenue Felix Viallet Fax: (33) 4 76 57 46 02 38000 Grenoble (France) Email: memmi at imag.fr www-leibniz.imag.fr www-leibniz.imag.fr/RESEAUX/ _________________________________________________________ From bengio at idiap.ch Fri Aug 25 05:10:20 2000 From: bengio at idiap.ch (Samy Bengio) Date: Fri, 25 Aug 2000 11:10:20 +0200 (MET DST) Subject: new tech report and new version of SVMTorch Message-ID: We would like to announce the following regarding SVMTorch, our implementation of Support Vector Machines for Large-Scale Regression and Classification Problems, which was previously announced on this list: (a) A new version of SVMTorch is now available on the web at the usual place: http://www.idiap.ch/learning/SVMTorch.html The main additions compared with the previous version are the following: + A Multiclass mode (one class against the others). + An input/output sparse mode (it runs faster on sparse data). (b) A new technical report is also available, reporting a convergence proof for our regression method. It is available at ftp://ftp.idiap.ch/pub/reports/2000/rr00-17.ps.gz The abstract is as follows: Recently, many researchers have proposed decomposition algorithms for SVM regression problems (see for instance [11, 3, 6, 10]). In a previous paper [1], we also proposed such an algorithm, named SVMTorch. In this paper, we show that while there is actually no convergence proof for any other decomposition algorithm for SVM regression problems to our knowledge, such a proof does exist for SVMTorch for the particular case where no shrinking is used and the size of the working set is equal to 2, which is the size that gave the fastest results on most experiments we have done. This convergence proof is in fact mainly based on the convergence proof given by Keerthi and Gilbert [4] for their SVM classification algorithm. ----- Samy Bengio Research Director. Machine Learning Group Leader. IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland. tel: +41 27 721 77 39, fax: +41 27 721 77 12. mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio From bengio at idiap.ch Fri Aug 25 05:21:25 2000 From: bengio at idiap.ch (Samy Bengio) Date: Fri, 25 Aug 2000 11:21:25 +0200 (MET DST) Subject: correction for new tech rep regarding SVMTorch: 17 --> 24 Message-ID: Sorry, the new tech report available regarding the convergence proof of our regression algorithm for SVMTorch is ftp://ftp.idiap.ch/pub/reports/2000/rr00-24.ps.gz and not ftp://ftp.idiap.ch/pub/reports/2000/rr00-17.ps.gz as given in the previous mail. Again, sorry, ----- Samy Bengio Research Director. Machine Learning Group Leader. IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland. tel: +41 27 721 77 39, fax: +41 27 721 77 12.
mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio From wichert at neuro.informatik.uni-ulm.de Tue Aug 29 12:07:26 2000 From: wichert at neuro.informatik.uni-ulm.de (Andreas Wichert) Date: Tue, 29 Aug 2000 18:07:26 +0200 (MET DST) Subject: PhD thesis ``Associative Computation'' Message-ID: <10008291607.AA13728@neuro.informatik.uni-ulm.de> Dear Connectionists, Remember, in August 1998 Dave Touretzky asked on the Connectionists mailing list: ``..I concluded that connectionist symbol processing had reached a plateau... No one is trying to build distributed connectionist reasoning systems any more, like the connectionist production system I built with Geoff Hinton...'' The PhD thesis ``Associative Computation'' tries to fill the explicatory gap... Abstract Currently neural networks are used in many different domains. But are neural networks also suitable for modeling problem solving, a domain which is traditionally reserved for the symbolic approach? This central question of cognitive science is answered in this work. It is affirmed by corresponding neural network models. The models have the same behavior as the symbolic models. However, additional properties resulting from the distributed representation also emerge. A comparison of these additional abilities with the basic behavior of the model shows that the additional properties lead to a significant algorithmic improvement. This is verified by statistical hypothesis testing. The associative computer, a neural model for a reaction system based on the assembly theory, is introduced. It is shown that planning can be realized by a neural architecture that does not use symbolic representation. A crucial point is the description of states by pictures. The human ability to process images and understand what they mean in order to solve a problem holds an important clue to how the human thought process works. This clue is examined by empirical experiments with the associative computer. One general conclusion from the experiments is the claim that it is possible to systematically use associative structures to perform reasoning by forming chains of associations. In addition, besides symbolic problem solving, pictorial problem solving is possible. Available at: 1) web site, in gzipped postscript format at: http://www.informatik.uni-ulm.de/ni/mitarbeiter/AWichert.html 2) anonymous ftp, in gzipped postscript format or gzipped pdf format: ftp.neuro.informatik.uni-ulm.de, directory /ni/wichert 3) web site of the University of Ulm library, in pdf format: http://vts.uni-ulm.de/query/longview.meta.asp?document_id=533 4) cdrom (pdf + ps) upon request Andrzej ---------------------------------------------------------------------------- Andrzej Wichert Computer Science Department of Neural Information Processing University of Ulm Oberer Eselsberg Tel.: +49 731 502 4257 89069 Ulm Germany Fax : +49 731 502 4156 http://www.informatik.uni-ulm.de/ni/mitarbeiter/AWichert.html ---------------------------------------------------------------------------- From iiass.alfredo at tin.it Mon Aug 28 14:26:46 2000 From: iiass.alfredo at tin.it (Alfredo Petrosino) Date: Mon, 28 Aug 2000 20:26:46 +0200 Subject: Neural Nets School 2000 Message-ID: <39AAAEE6.3D14CCB9@tin.it> SECOND ANNOUNCEMENT 5th Course of International Summer School "NeuralNets E. R.
Caianiello" on Visual Attention Mechanisms 23-29 October 2000 International Institute for Advanced Scientific Studies (IIASS) Vietri sul Mare, Salerno (Italy) DIRECTOR OF THE 5TH COURSE Virginio CANTONI (Pavia University, Italy) DIRECTORS OF THE SCHOOL Michael JORDAN (University of California, Berkely, USA) Maria MARINARO (Salerno University, Italy) The preliminary program of the School is avalilable at http://www.iiass.it/nnschool. Due to holidays the NEW DEADLINE for submitting applications and proposals of poster is SEPTEMBER 24 2000 The school, open to all suitably qualified scientists from around the world, is organized in lectures, panel discussions and poster presentations and will cover a number of broad themes relevant to Visual Attention, among them: - Foundation: Early vision, Visual streams, Perception and action, Log-map analysis - Attentional mechanisms: Pop-out theory, Texton theory, Contour integration and closure, Fuzzy engagement mechanisms - Visual search : Attentional control, Selective attention, Spatial attention, Detection versus discrimination - Multiresolution and planning : Complexity of search tasks, Hierarchical perceptual loops, Multiresolution and associative memory systems, Attention and action planning - Attentional Visual Architectures: Neural models of visual attention, Hierarchical and associative networks, Attentional pyramidal neural mechanisms - Experiences: Eyeputer and scanpath recorders, etc. INVITED SPEAKERS : Virginio CANTONI, Pavia University Leonardo CHELAZZI, Verona University, Italy Vito DI GESU`, Palermo University, Italy Hezy YESHURUN, Haifa University, Israel Zhaoping LI, Gatsby, University College, London, UK Luca LOMBARDI, Pavia University, Italy Carlo Alberto MARZI, Verona University, Italy Alain MERIGOT, University of Paris Sud, France Eliano PESSA, Roma University, Italy Alfredo PETROSINO, INFM-Salerno University, Italy Marco PIASTRA, Pavia University, Italy Vito ROBERTO, Udine University, Italy Dov SAGI, Weizmann University, Israel John TSOTSOS, Center for Computer Vision, Canada Daniela ZAMBARBIERI, Pavia University, Italy Harry WECHSLER, George Mason University, USA Steven YANTIS, Johns Hopkins University, USA SITE Vietri sul Mare is located within walking distance from Salerno and marks the beginning of the Amalfi coast. Visit http://www.comune.vietri-sul-mare.sa.it/ for more information. FEE The shool fee is 700 USD including the cofee breaks and a copy of the proceedings. The full school fee, including accommodation in twin room, meals, one day of excursion, and a copy of the proceedings of the school is 1200 USD, reduced to 1000 USD for students For further information please contact : Dr. 
A.Petrosino Fax: + 39 89 761189 Email: iiass.alfredo at tin.it ============================CUT====================================== APPLICATION FORM Title:_______ Family Name: ________________________________________________________ Other Names:_________________________________________________________ Name to appear on badge: ____________________________________________ MAILING ADDRESS: Institution _________________________________________________________ Department __________________________________________________________ Address _____________________________________________________________ State ____________________________ Country __________________________ Phone:____________________________ Fax: _____________________________ E-mail: _____________________________________________________________ Arrival date: __________________ Departure date: ____________________ Will you be applying for a scholarship? yes/no (Please include in your application the amount of bursary support and a justification for the request) Will you submit a poster? yes/no (Please include a one page abstract for review by the organizers). ============================CUT====================================== Please send the application form by electronic mail to: iiass.alfredo at tin.it, subject: Neural Nets school; or by fax to: Neural Nets School, +39 89 761 189 or by ordinary mail to the address: Neural Nets School IIASS, Via Pellegrino 19 I-84019 Vietri sul Mare (Sa) Italy From nnsp00 at neuro.kuleuven.ac.be Tue Aug 29 03:58:57 2000 From: nnsp00 at neuro.kuleuven.ac.be (NNSP2000, Sydney) Date: Tue, 29 Aug 2000 09:58:57 +0200 Subject: IEEE NNSP'00, Sydney Message-ID: <39AB6D41.B6B4C78F@neuro.kuleuven.ac.be> ***************************************************************** CALL FOR PARTICIPATION 2000 IEEE Workshop on Neural Networks for Signal Processing December 11-13, 2000, Sydney, Australia (Early Registration: September 15, 2000) Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council ***************************************************************** Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the tenth of a series of IEEE workshops on Neural Networks for Signal Processing will be held at the University of Sydney, Australia, December 11-13, 2000. The workshop will feature a high quality technical program, and three invited plenary speeches presented by experts in the field: - Professor Tamás Roska (Hungarian Academy of Sciences) Signal Processing via Neural Networks becomes practical - via Analogic TeraOPS Visual Microprocessor Chips. - Dr. David Fogel (Natural Selection, Inc.) Evolving Models for Signal Processing. - Dr. Brian Ferguson (Defence Science and Technology Organisation) Signal Processing Applications for Submarines, Surveillance and Survival. In addition, Professor Sun-Yuan Kung of Princeton University will present a joint keynote speech, "Adaptive Techniques for Intelligent Internet Multimedia Communication", to the workshop and the First IEEE Pacific-Rim Conference on Multimedia. The workshop is being held in conjunction with the First IEEE Pacific-Rim Conference on Multimedia (2000 International Symposium on Multimedia Information Processing). The public are cordially invited to participate in this important event.
Early registration can be made before September 15, 2000 through our homepage: http://eivind.imm.dtu.dk/nnsp2000/ For further information, please contact the NNSP 2000 Organizing Committee at: TEL: +61 2 9351 5642 Fax: +61 2 9351 3847 or Email: nnsp2000org at eivind.imm.dtu.dk ORGANIZATION Honorary Chair Bernard WIDROW Stanford University General Chairs Ling GUAN University of Sydney email: ling at ee.usyd.edu.au Kuldip PALIWAL Griffith University email: kkp at shiva2.me.gu.edu.au Program Chairs Tülay ADALI University of Maryland, Baltimore County email: adali at umbc.edu Jan LARSEN Technical University of Denmark email: jl at imm.dtu.dk Finance Chair Raymond Hau-San WONG University of Sydney email: hswong at ee.usyd.edu.au Proceedings Chairs Elizabeth J. WILSON Raytheon Co. email: bwilson at ed.ray.com Scott C. DOUGLAS Southern Methodist University email: douglas at seas.smu.edu Publicity Chair Marc van HULLE Katholieke Universiteit, Leuven email: marc at neuro.kuleuven.ac.be Registration and Local Arrangements Stuart PERRY Defence Science and Technology Organisation email: Stuart.Perry at dsto.defence.gov.au Europe Liaison Jean-Francois CARDOSO ENST email: cardoso at sig.enst.fr America Liaison Amir ASSADI University of Wisconsin at Madison email: ahassadi at facstaff.wisc.edu Asia Liaison Andrew BACK Katestone Scientific email: andrew.back at usa.net PROGRAM COMMITTEE: Amir Assadi Yianni Attikiouzel John Asenstorfer Andrew Back Geoff Barton Hervé Bourlard Andy Chalmers Zheru Chi Andrzej Cichocki Tharam Dillon Tom Downs Hsin Chia Fu Suresh Hangenahally Marwan Jabri Haosong Kong Shigeru Katagiri Anthony Kuh Yi Liu Fa-Long Luo David Miller Christophe Molina M Mohammadian Erkki Oja Soo-Chang Pei Jose Principe Ponnuthurai Suganthan Ah Chung Tsoi Marc Van Hulle A.N. Venetsanopoulos Yue Wang Wilson Wen From paolo at dsi.unifi.it Wed Aug 30 04:48:14 2000 From: paolo at dsi.unifi.it (Paolo Frasconi) Date: Wed, 30 Aug 2000 10:48:14 +0200 (ora legale Europa occ.) Subject: Special issue on integration of symbolic and connectionist systems Message-ID: Integration of symbolic and connectionist systems Special issue of the journal Cognitive Systems Research CALL FOR PAPERS BACKGROUND A successful integration of connectionist and other statistical learning systems with symbol based techniques could bridge reasoning and knowledge representation with empirical learning, significantly advancing our ability to model cognitive processes under a unified perspective. This area has attracted many researchers, both in computer and cognitive sciences, but is still replete with serious difficulties and challenges. While existing models can address particular aspects of this integration (often by making use of different assumptions and techniques), only a few unified approaches have been proposed, and they are still very limited, showing both the lack of a full understanding of the relevant aspects of this discipline and the broad complexity in scope and tasks. One of the main difficulties is that symbolic techniques can easily deal with rich and expressive representation languages, whereas connectionist/statistical learners are mostly effective in the case of simple propositional languages. As a result, it is customary to exploit the learning capabilities of these models by operating on "flat" (vector-based, or attribute-value) representations, but this often requires additional machinery for interfacing the learner with the actual domain of interest.
For example, if data are available in a structured or semi-structured way, a conversion into a lower level (propositional) representation is often performed. Similarly, since internal representations associated with the learner are not easily interpretable, a conversion into a higher level representation language is also necessary. Different aspects of integration have been investigated and probed independently from each other, and thus a higher level of cross-interaction among these issues is necessary, making use of all the computational tools we have available, such as deterministic and probabilistic approaches, event-based modeling, computational logic, computational learning theory, and so on. TOPICS In this special issue we aim to collect high quality papers that show novel methods, ideas, and positions, advancing the state of the art in this area. We encourage submissions of papers addressing, in addition to other relevant issues, the following topics: - Algorithms for extraction, injection and refinement of symbolic knowledge from, into and by neural networks. - Inductive discovery/formation of structured knowledge. - Classification, recognition, prediction, matching and manipulation of structured information. - Relational learning using connectionist and belief network techniques. - Comparisons between connectionist/statistical and symbolic learning systems that can exploit super-propositional representation languages. - Applications of hybrid symbolic-connectionist models to real-world problems. - Taxonomies that may be useful for selecting the best suited integration paradigms and techniques to be used in particular applications. - Investigations of fundamental aspects of and approaches to the integration of symbolic and connectionist methods. SUBMISSION GUIDELINES Papers should be submitted electronically (PostScript and PDF are the only acceptable formats) by using the anonymous ftp site ftp.dsi.unifi.it. Use the /csr directory to deposit submissions. Please choose a unique, clearly identifying filename. For convenience, common compression utilities (gzip, winzip, compress) can be used. We also require a follow-up email message to paolo at dsi.unifi.it to let us know that the file has been posted. In the email message please include the title, keywords, abstract, and full address of the contacting author. For details about the journal Cognitive Systems Research, and to download the journal template files (LaTeX, Word, etc.), please visit http://www.elsevier.nl/locate/cogsys. For updates about the special issue, please visit http://www.dsi.unifi.it/~paolo/csr. IMPORTANT DATES Submission Deadline: December 15, 2000 Notification to authors: March 15, 2001 GUEST EDITORS Paolo Frasconi, University of Florence, Italy (paolo at dsi.unifi.it) Marco Gori, University of Siena, Italy (marco at ing.unisi.it) Franz Kurfess, California Polytechnic State University (fkurfess at csc.calpoly.edu) Alessandro Sperduti, University of Pisa, Italy (perso at di.unipi.it) From andre at snowhite.cis.uoguelph.ca Wed Aug 30 15:05:45 2000 From: andre at snowhite.cis.uoguelph.ca (andre de carvalho) Date: Wed, 30 Aug 2000 15:05:45 -0400 Subject: IJNS special issue: call for abstracts Message-ID: <39AD5B09.5BADD8F9@snowhite.cis.uoguelph.ca> **** CALL FOR PAPERS **** Special issue: "Non-Gradient Learning Techniques" International Journal of Neural Systems, IJNS http://www.wspc.com/journals/ijns/ijns.html Guest Editors: Andre de Carvalho and Stefan C. Kremer Submission deadline: Abstract: Sept.
29th, 2000 Final version: Dec. 1st, 2000 BACKGROUND Many of the learning techniques currently employed to train Artificial Neural Networks are based on gradient descent on an error function. Although they provide solutions for a large number of applications, they have a number of limitations, including ending up in local minima, spending large amounts of time traversing flat regions of the error-space, and requiring differentiable activation functions. While modifications to gradient-based algorithms have been proposed to deal with these issues, there have also recently been a number of initiatives to develop non-gradient-based techniques for learning. The latter are the focus of this special issue. TOPICS The aim of this special issue is to solicit and publish valuable papers that provide a clear picture of the state of the art in this area. We encourage submissions of articles addressing, in addition to other relevant issues, the following topics: - Analysis of limitations to gradient learning approaches - Non-gradient based learning algorithms - Evolutionary training - Unsupervised learning - Auto-associative memories - Analysis and solutions for applications where non-gradient approaches are required - Surveys of non-gradient techniques - Real world applications of non-gradient learning algorithms INSTRUCTIONS This special issue will have a mix of invited and openly solicited papers. All contributed and invited papers will be refereed to ensure high quality and relevance to IJNS readers. Authors are encouraged to use LaTeX format. The IJNS LaTeX style files can be downloaded from http://www.wspc.com/others/style_files/journal/ijns/128-ijns.zip Submissions should be sent by e-mail or computer disk in Postscript or PDF format (other file formats cannot be accepted). They should be in English, not exceed 12 double-spaced pages (excluding Figures and Tables), and be formatted according to the Journal submission guidelines found at http://www.wspc.com/journals/ijns/ijns.html The title and the abstract of proposed papers should be sent separately in ASCII format by September 29th. With your submission, please provide a cover letter with: - The title of the paper; - The author names together with their affiliation address, email and telephone number. IMPORTANT DATES Submission of title and abstract (e-mail): Sept. 29th, 2000 Submission of final version deadline: Dec. 1st, 2000 Notification of acceptance: Jan. 5th, 2001 Expected publication date: Mid-to-late 2001. The materials should be submitted to one of the Guest Editors: GUEST EDITORS Dr. Andre de Carvalho Guelph Natural Computation Group Department of Computing and Information Science University of Guelph Guelph, Canada, N1G 2W1 E-mail: andre at snowhite.cis.uoguelph.ca Dr. Stefan C. Kremer Guelph Natural Computation Group Department of Computing and Information Science University of Guelph Guelph, Canada, N1G 2W1 E-mail: skremer at snowhite.cis.uoguelph.ca
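To illustrate the kind of method in scope - purely a sketch, with the task, the update rule and all names being this digest editor's assumptions rather than anything taken from the call - here is a tiny non-gradient learner: a (1+1)-style random hill-climber training a network of hard-threshold units on XOR, a setting in which gradients do not even exist:

  import numpy as np

  rng = np.random.default_rng(1)

  # XOR with step-function hidden units: the loss is piecewise constant,
  # so we search weight space directly by mutate-and-keep-if-no-worse.
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  y = np.array([0.0, 1.0, 1.0, 0.0])

  def loss(w):
      W1 = w[:6].reshape(3, 2)                 # input -> 3 hidden units
      b1, W2, b2 = w[6:9], w[9:12], w[12]      # biases and linear output
      h = (X @ W1.T + b1 > 0.0).astype(float)  # non-differentiable step
      return np.mean((h @ W2 + b2 - y) ** 2)

  w = rng.normal(0.0, 1.0, 13)
  best = loss(w)
  for _ in range(20000):
      cand = w + rng.normal(0.0, 0.1, 13)      # mutate every parameter
      l = loss(cand)
      if l <= best:                            # keep the child if no worse
          w, best = cand, l
  print("final mean squared error:", best)

Accepting equal-loss moves lets the search drift across exactly the flat error-surface regions the call mentions; population-based evolutionary algorithms with step-size adaptation are more robust relatives of this bare-bones scheme.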
From skremer at q.cis.uoguelph.ca Tue Aug 1 23:41:33 2000 From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer) Date: Tue, 1 Aug 2000 23:41:33 -0400 (EDT) Subject: No subject Message-ID: Dear Connectionists: We are pleased to announce the official start of the NIPS 2000 Unlabeled Data Supervised Learning Competition!
This competition is designed to compare algorithms and architectures that use unlabeled data to help supervised learning. The first four datasets, and more details, are now available at the competition web-site: http://q.cis.uoguelph.ca/~skremer/NIPS2000/ May the best algorithm win! Stefan, Deb, and Kristin -- -- Dr. Stefan C. Kremer, Assistant Prof., Dept. of Computing and Information Science University of Guelph, Guelph, Ontario N1G 2W1 WWW: http://hebb.cis.uoguelph.ca/~skremer Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323 E-mail: skremer at snowhite.cis.uoguelph.ca From herbert.jaeger at gmd.de Thu Aug 3 13:33:47 2000 From: herbert.jaeger at gmd.de (Herbert Jaeger) Date: Thu, 03 Aug 2000 19:33:47 +0200 Subject: PhD and PostDoc positions Message-ID: <3989ACFB.61243CE2@gmd.de> DOCTORATE AND POSTDOCTORATE POSITIONS FOR RESEARCH ON BIOLOGICALLY ORIENTED INFORMATION PROCESSING New doctorate and Postdoc research positions are open immediately at the German National Research Center for Information Technology (GMD, http://www.gmd.de/) in the Institute for Autonomous Intelligent Systems (AiS, http://ais.gmd.de/). In the interdisciplinary arena of neural information processing, robotics, cognitive science and neurobiology, AiS will intensify its research on biologically inspired neural information processing systems. Research will focus on the aspects of stochasticity, parallelism, asynchrony, self-organization and high-bandwidth communication between subsystems. The methodological focus within AiS is on mathematical modeling and robot implementations. The biological side is substantiated by collaborations with leading neuroscience and ethological research institutions. Within this wide field, we solicit applications of young researchers who combine a broad scientific curiosity with the desire and the training to achieve "hard" results (mathematical models, algorithms, robot implementations). The candidate should have an education in computational neuroscience, (neuro-)biology, mathematics, control engineering, computer science, physics, cognitive science or related areas. A characteristic of both PhD and postdoc positions at GMD is intellectual freedom. The positions are not connected to pre-defined projects but admit (indeed, require) that candidates define their own research objectives within the general research framework of AiS. For the PhD positions we expect a university degree (master, diploma). Payment will be in agreement with German BAT 2a salary (corresponding to university research assistants) in a part-time (19 hrs/week) contract. The position will be granted initially for one year, with a possible extension for two further years on mutual agreement. For the postdoc positions, we expect a PhD and a significant research record. Contracts will be full-time BAT 2a (equivalent to university research assistants) for 2 years initially with a possible extension of one further year. GMD is an equal opportunity employer. Full applications should comprise a CV, copies of university degrees (including marks), letters of reference, a publication list, and - importantly - a research plan. We encourage sending a preliminary application by email, consisting of a CV, a publication list, and a statement of research interests. Please direct applications and inquiries to Dr. 
Herbert Jaeger Phone +49-2241-14-2253 GMD, AiS.BE Fax 2241-14-2384 Schloss Birlinghoven D-53754 Sankt Augustin, Germany email: herbert.jaeger at gmd.de web: http://www.gmd.de/People/Herbert.Jaeger/ From oby at cs.tu-berlin.de Thu Aug 3 10:01:51 2000 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Thu, 3 Aug 2000 16:01:51 +0200 (MET DST) Subject: postdoctoral position Message-ID: <200008031401.QAA03258@pollux.cs.tu-berlin.de> Postdoctoral Position (salary level C1) Neural Information Processing Group, Department of Computer Science Technical University of Berlin, Berlin, Germany A postdoctoral position is available within the NI group at the Technical University of Berlin, Germany, beginning Oct. 1st 2000. The position is initially for a duration of three years, but an extension up to six years is possible. The successful candidate may choose between the following research areas: - computational models of visual cortex - theory of unsupervised learning - analysis of functional imaging data Teaching duties include 4 hours tutoring per week (in German) during the winter and summer terms, i.e. the successful candidate must be fluent in the German language. Interested candidates please send their CV, transcripts of their certificates, a short statement of their research interest and a list of publications to: Prof. Klaus Obermayer FR2-1, NI, Informatik, Technische Universitaet Berlin Franklinstrasse 28/29, 10587 Berlin, Germany phone: ++49-30-314-73120, fax: -73121, email: oby at cs.tu-berlin.de preferably by email. For a list of relevant publications and an overview of current research projects please refer to our web-page at: http://ni.cs.tu-berlin.de/ Cheers Klaus --------------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ From cjlin at csie.ntu.edu.tw Mon Aug 7 00:52:06 2000 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Mon, 7 Aug 2000 12:52:06 +0800 (CST) Subject: new version of two software for SVM Message-ID: <200008070452.MAA02961@ntucsa.csie.ntu.edu.tw> Dear Colleagues: We announce new versions of two software packages for support vector machines. 1. LIBSVM 2.0 http://www.csie.ntu.edu.tw/~cjlin/libsvm No longer only for large-scale classification, LIBSVM now also solves regression, plus two variants: nu-SVM and one-class SVM. Like version 1.0, the new release is still a simple and easy-to-use tool. All the different SVMs are implemented in one short file (1000-line C++ code). Graphical interfaces for the different SVMs are also provided. 2. BSVM 1.1 http://www.csie.ntu.edu.tw/~cjlin/bsvm The main improvement is in solving the optimization sub-problem of the decomposition method. This makes the use of larger working sets possible. Unlike the previous version, which was implemented by combining C and Fortran programs, BSVM 1.1 uses only C and the code is simpler. Any comments are very welcome. Sincerely, Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan cjlin at csie.ntu.edu.tw From gary at nd.com Mon Aug 7 11:54:22 2000 From: gary at nd.com (Gary Lynn) Date: Mon, 7 Aug 2000 11:54:22 -0400 Subject: JOB: Engineer needed in Florida Message-ID: NeuroDimension will be starting a research project in August 2000 to design, develop, and prototype a neural network controlled ventilator.
This project involves cutting-edge neural network and bioengineering technology and is completely funded for a minimum of 2 years. NeuroDimension is looking for full-time and part-time engineers or computer scientists with a bachelor's degree or better. In addition, we can also provide research assistantships for master's or Ph.D. students. Good candidates for these positions should have skills in ONE OR MORE of the following areas and be willing to learn many of the others: - Neural networks - Digital control - Digital signal processing - Bioengineering (especially knowledge of the respiratory system) - Electrical circuit design and microcontroller software - Software design and development NeuroDimension is located in Gainesville (http://www.state.fl.us/gvl) in north-central Florida. The pay for a full-time engineering position ranges from $37K to $45K per year. Benefits include health, dental, and retirement plans. If you are interested, please email a resume to neil at nd.com From lpk at ai.mit.edu Mon Aug 7 16:38:00 2000 From: lpk at ai.mit.edu (Leslie Pack Kaelbling) Date: Mon, 7 Aug 2000 16:38:00 -0400 (EDT) Subject: Announcing: The Journal of Machine Learning Research Message-ID: <200008072038.QAA14274@soggy-fibers.ai.mit.edu> Announcing the JOURNAL of MACHINE LEARNING RESEARCH The Journal of Machine Learning Research (JMLR) provides an international forum for the electronic and paper publication of high-quality scholarly articles in all areas of machine learning. JMLR seeks previously unpublished papers that contain: - new algorithms with empirical, theoretical, psychological, or biological justification; - experimental and/or theoretical studies yielding new insight into the design and behavior of learning in intelligent systems; - accounts of applications of existing techniques that shed light on the strengths and weaknesses of the methods; - formalization of new learning tasks (e.g., in the context of new applications) and of methods for assessing performance on those tasks; - development of new analytical frameworks that advance theoretical studies of practical learning methods; - computational models of data from natural learning systems at the behavioral or neural level; or - extremely well-written surveys of existing work. JMLR has a commitment to rigorous yet rapid reviewing; reviews are returned within six weeks of paper submission. Final versions are published electronically immediately upon receipt, and an annual paper volume is published by MIT Press and sold to libraries and individuals. JMLR is accepting new submissions. Please see http://www.jmlr.org for submission information. Accepted papers, when published on the web, will be announced via an email list. To subscribe, send an email message to majordomo at ai.mit.edu with body text "subscribe jmlr-announce" (without quotation marks). Editor: Leslie Pack Kaelbling Managing Editor: David Cohn Action Editors: Peter Bartlett, Australian National University, Australia Craig Boutilier, University of Toronto, Canada Claire Cardie, Cornell University, US Peter Dayan, University College, London, UK Thomas Dietterich, Oregon State University, US Donald Geman, University of Massachusetts at Amherst, US Michael Jordan, University of California at Berkeley, US Michael Kearns, AT&T Research, US John Lafferty, Carnegie Mellon University, US Heikki Mannila, Helsinki University of Technology, Finland Fernando Pereira, Whizbang!
Laboratories, US Pietro Perona, California Institute of Technology, US Stuart Russell, University of California at Berkeley, US Claude Sammut, University of New South Wales, Australia Bernhard Schoelkopf, Microsoft Research, Cambridge, UK Larry Wasserman, Carnegie Mellon University, US Stefan Wrobel, Otto-von-Guericke-Universitat Magdeburg, Germany Editorial Board: Naoki Abe, NEC Corporation, Japan Christopher M. Bishop, Microsoft Research, UK Andrew G. Barto, University of Massachusetts, Amherst, USA Henrik Bostrom, Stockholm University/KTH, Sweden Carla Brodley, Purdue University, USA Nello Cristianini, Royal Holloway, University of London, UK William W. Cohen, Whizbang! Laboratories, USA David Cohn, Burning Glass Technologies, USA Luc De Raedt, University of Freiburg, Germany Saso Dzeroski, Jozef Stefan Institute, Slovenia Nir Friedman, Hebrew University, Israel Dan Geiger, The Technion, Israel Zoubin Ghahramani, University College London, UK Sally Goldman, Washington University, St. Louis, USA Russ Greiner, University of Alberta, Canada David Heckerman, Microsoft Research, USA Thomas Hofmann, Brown University, USA Tommi Jaakkola, Massachusetts Institute of Technology, USA Daphne Koller, Stanford University, USA Michael Littman, AT&T Research, USA Sridhar Mahadevan, Michigan State University, USA Yishay Mansour, Tel-Aviv University, Israel Andrew McCallum, Whizbang! Laboratories, USA Raymond J. Mooney, University of Texas at Austin, USA Stephen Muggleton, York University, UK Foster Provost, New York University, USA Dana Ron, Tel-Aviv University, Israel Lawrence Saul, AT&T Labs, USA John Shawe-Taylor, Royal Holloway, University of London, UK Dale Schuurmans, University of Waterloo, Canada Yoram Singer, The Hebrew University, Israel Alex Smola, Australian National University, Australia Padhraic Smyth, University of California at Irvine, USA Moshe Tennenholtz, The Technion, Israel Sebastian Thrun, Carnegie Mellon University, USA Naftali Tishby, Hebrew University, Israel David Touretzky, Carnegie Mellon University, USA Chris Watkins, Royal Holloway, University of London, UK Robert C. Williamson, Australian National University, Australia Advisory Board: Shun-Ichi Amari, RIKEN Brain Science Institute, Japan Andrew Barto, University of Massachusetts at Amherst, USA Thomas Dietterich, Oregon State University, USA Jerome Friedman, Stanford University, USA Stuart Geman, Brown University, USA Geoffrey Hinton, University College London, UK Michael Jordan, University of California at Berkeley, USA Michael Kearns, AT&T Research, USA Steven Minton, University of Southern California, USA Thomas Mitchell, Carnegie Mellon University, USA Stephen Muggleton, University of York, UK Nils Nilsson, Stanford University, USA Tomaso Poggio, Massachusetts Institute of Technology Ross Quinlan, University of New South Wales, Australia Stuart Russell, University of California at Berkeley, USA Terrence Sejnowski, Salk Institute for Biological Studies, USA Richard Sutton, AT&T Research, USA Leslie Valiant, Harvard University, USA Stefan Wrobel, Otto-von-Guericke-Universitaet, Germany From degaris at starlab.net Tue Aug 8 08:19:55 2000 From: degaris at starlab.net (Hugo de Garis) Date: Tue, 08 Aug 2000 14:19:55 +0200 Subject: CFP,Special Issue,Neurocomputing Journal, "Evolutionary Neural Systems" References: <397ED7DD.4E575005@starlab.net> Message-ID: <398FFAEB.6405B464@starlab.net> > CALL FOR PAPERS > > NEUROCOMPUTING Journal (Elsevier) > > Special Issue > "Evolutionary Neural Systems" > > Guest Editor > > Prof. Dr. 
Hugo de Garis (Starlab, Belgium) > > Editorial Committee > > Dr. David Fogel (Natural Selection Inc., USA) > Prof. Dr. Michael Conrad (Wayne State University, USA) > Prof. Dr. Xin Yao (Birmingham University, UK) > > Submission deadline: October 1st, 2000 > > Evolutionary Neural Systems - The application of evolutionary > algorithms to the creation of neural network based systems, in > software and/or hardware. > > The Neurocomputing Journal (http://www.elsevier.nl/locate/neucom) > invites original contributions for a forthcoming special issue on > "Evolutionary Neural Systems". > > Examples of topics relevant to this special issue include : > > -- Evolved Neural Software Systems > -- Evolved Neural Hardware Systems > -- Discussion/Analysis of Neural System Evolvability > -- Scaling Issues of Evolved Multicomponent Neural Systems > -- Evolved Artificial Nervous Systems/Artificial Brains > -- Large Scale, Real World, Applications of Evolved Neural > Systems > -- Novel Evolutionary Computation Techniques for Neural Systems > -- Related Issues > > Manuscripts (in English) should not normally exceed 10,000 words > in length and should be formatted and submitted according to the > requirements found at the journal web site (above). > > Please provide a cover page containing > > (i) the title of the paper, > (ii) family names with initial(s) of the personal name(s), > (iii) address and email of each author, and > (iv) an abstract. > > Three to five keywords should be supplied after the abstract for > indexing purposes. Figures should be cited in the text and marked in > > the left margin of the manuscript. Four hard copies of the manuscript > > should be submitted to: > > Prof. Dr. Hugo de Garis > Starlab, Blvd. St. Michel 47, B-1040, > Brussels, Belgium, Europe. > > E-mails: > > degaris at starlab.net > dfogel at natural-selection.com > conrad at cs.wayne.edu > x.yao at cs.bham.ac.uk > > This special issue will consist of a mix of invited papers and openly > solicited papers. > > =================== From kagan at ptolemy.arc.nasa.gov Wed Aug 9 18:10:26 2000 From: kagan at ptolemy.arc.nasa.gov (Kagan Tumer) Date: Wed, 9 Aug 2000 15:10:26 -0700 (PDT) Subject: ML position at NASA Ames Research Center Message-ID: <200008092210.PAA17929@avogadro.arc.nasa.gov> *** JOB ANNOUNCEMENT *** Machine Learning Research Position at NASA Ames Research Center. The Automated Learning Group at NASA Ames Research Center is seeking applicants for the position of Principal Investigator. The successful applicant for this position would initiate new research in the general field of machine learning and statistical inference. Although not restricted to such problems, the main thrust of the investigator's work is expected to involve projects to be carried out by NASA. Applicants must have a doctoral degree in a relevant field and an outstanding research record. Current members of the Automated Learning Group include David Wolpert, Peter Cheeseman, Kagan Tumer, David Maluf, Vadim Smelyanskiy, Robin Morris and Esfandiar Bandari. The Automated Learning Group is part of the the Automated Software Engineering Research Area led by Michael Lowry. 
Applicants should send a curriculum vitae (including a list of publications), their citizenship status, and the names of at least three references to: Peter Cheeseman (cheesem at ptolemy.arc.nasa.gov), Kagan Tumer (kagan at ptolemy.arc.nasa.gov), and Mike Lowry (lowry at ptolemy.arc.nasa.gov) Post mail can be sent to all three at: Mail Stop 269-2 (For Kagan Tumer, use Mail Stop 269-3) NASA Ames Research Center Moffett Field, CA 94035 The current position is restricted to US citizens only. Future positions may be open to non-citizens. Women and minorities are strongly encouraged to apply. -- Kagan Tumer, PhD email: kagan at ptolemy.arc.nasa.gov NASA Ames Research Center http://ic.arc.nasa.gov/people/kagan/ Mail Stop 269-4 phone: (650) 604-4940 Moffett Field, CA 94035-1000 fax : (650) 604-3594 From stefan.wermter at sunderland.ac.uk Thu Aug 10 14:17:28 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Thu, 10 Aug 2000 19:17:28 +0100 Subject: 8 new positions at University of Sunderland, Computing and Engineering. Message-ID: <3992F1B8.D1720E29@sunderland.ac.uk> 8 new positions have been created at the University of Sunderland, Computing and Engineering. With respect to this list, we would like to hear from qualified researchers in intelligent systems, neural networks, hybrid systems or language processing, and would like to encourage them to apply for a position in intelligent systems. If you have specific questions or interests within intelligent systems, neural networks/neuroscience, hybrid systems or language processing please let me know. The brief general text for all positions is given below. Stefan Wermter stefan.wermter at sunderland.ac.uk --------------------------------------------- The School of Computing, Engineering and Technology draws together the disciplines of Computing, Information Systems, Engineering and Technology. It enjoys a strong student base and a growing research profile, and provides first-class computing facilities. The School has almost £2M per annum income from research and over 100 registered PhD research students. The School is looking to strengthen its staffing complement with a number of key appointments. All posts require an honours degree, and a Higher Degree in a relevant area. A relevant PhD and Membership/Fellowship of the British Computer Society would be a strong advantage. Excellent communication skills are essential. Professor of Software Engineering With an international research profile and a good publication record, you will provide strong leadership in Software Engineering. You will conduct teaching and research, and will make a major contribution to our research culture, demonstrating experience in leading research; developing external links; managing research funding; and supervising research students. Reference: CET Principal Lecturer in Computing £28,978 - £36,436 A proven team-worker with experience of leadership and module/programme design, you will lead, motivate and develop a team of staff on our MSc Programme. You must be currently research active, and able to deliver high quality, innovative teaching and research. You will create and develop external links in the UK and Europe, and some travel will be required. Reference: CET Lecturer / Senior Lecturer in Computing (6 posts) £14,902 - £30,636 Experienced in research and with proven teamworking, organisational and administrative skills, you will undertake high quality teaching, research and reach-out activities.
We require staff with specialised knowledge in: Computer Systems; Software Engineering; Multimedia/ Media Systems/ Human-Computer Interaction; Intelligent Systems; Information Systems; Systems Design. Reference: CET Informal enquiries are welcomed by Professor Peter Smith on tel (0191) 5152761 or email peter.smith at sunderland.ac.uk Application is by CV with a covering letter detailing current salary and full contact details of two referees, to the Personnel Department, University of Sunderland, Langham Tower, Ryhope Road, Sunderland, SR2 7EE or email employee.recruitment at sunderland.ac.uk quoting appropriate reference number. A relocation package is available in approved cases. Working Towards Equal Opportunities *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ http://www.his.sunderland.ac.uk/intelligent/ **************************************** From krichmar at nsi.edu Fri Aug 11 19:30:57 2000 From: krichmar at nsi.edu (Jeff Krichmar) Date: Fri, 11 Aug 2000 16:30:57 -0700 Subject: Postdoctoral position Message-ID: <000601c003ec$337a6b80$0db985c6@nsi.edu> Please post and circulate as you see fit. Thank you. POSTDOCTORAL FELLOWSHIP W.M. KECK MACHINE PSYCHOLOGY LABORATORY The Neurosciences Institute, located in San Diego, California, invites applications for a POSTDOCTORAL FELLOWSHIP to study biologically based models of behaving real world devices (robots) as part of the W.M. Keck Machine Psychology Laboratory. Continuing previous research conducted at the Institute, this Laboratory will be focusing on the construction of autonomous robots, the design of simulated models of large-scale neuronal networks that are capable of guiding behavior in the real world, and on developing methods for the simultaneous analysis of neural and behavioral states. Applicants should have a background in computational neuroscience, robotics, computer science, behavioral science or cognitive science. Fellows will receive stipends appropriate to their qualifications and experience. Submit a curriculum vitae, statement of research interests, and names of three references to: Dr. Jeffrey L. Krichmar The Neurosciences Institute 10640 John Jay Hopkins Drive San Diego, California 92121 Email: krichmar at nsi.edu Fax: 858-626-2099 For a description of the project, refer to http://www.nsi.edu/nomad/. For a description of The Neurosciences Institute, refer to http://www.nsi.edu. From oreilly at grey.colorado.edu Sat Aug 12 15:27:28 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Sat, 12 Aug 2000 13:27:28 -0600 Subject: PDP++ version 2.0 Message-ID: <200008121927.NAA19000@grey.colorado.edu> ANNOUNCING: The PDP++ Software, version 2.0 Authors: Randall C. O'Reilly, Chadley K. Dawson, and James L. McClelland The PDP++ software is a neural-network simulation system written in C++. It represents the next generation of the PDP software released with the McClelland and Rumelhart "Explorations in Parallel Distributed Processing Handbook", MIT Press, 1987. It is easy enough for novice users, but very powerful and flexible for research use. The current version is 2.0, released August, 2000, which is a major upgrade from previous versions, as detailed below. 
The software can be obtained by anonymous ftp from: Anonymous FTP Site: ftp://grey.colorado.edu/pub/oreilly/pdp++ *or* ftp://cnbc.cmu.edu/pub/pdp++/ *or* unix.hensa.ac.uk/mirrors/pdp++/ For more information, see our web page: WWW Page: http://www.cnbc.cmu.edu/PDP++/PDP++.html There is a 250 page (printed) manual and an HTML version available on-line at the above address. The new features in 2.0 include: --------------------------------- o MS Windows platform fully supported (using CYGWIN environment) o Project View window for GUI onto project/processes & specs o Enviro View rewritten, GUI for event/pattern layout, etc. o Grid View rewritten, interactively configurable grid layout o Easy viewing of entire network weights in grid log o Easy cluster plot interface, displayed in graph log o GUI for interactive construction improved o Context-senstive help via "Help" menu on all objects (via HTML) o Lots and lots of bug fixes, minor improvements: every known way to crash software has been fixed! Software Features: ================== o Full Graphical User Interface (GUI) based on the InterViews toolkit. Allows user-selected "look and feel". o Network Viewer shows network architecture and processing in real- time, allows network to be constructed with simple point-and-click actions. o Training and testing data can be graphed on-line and network state can be displayed over time numerically or using a wide range of color or size-based graphical representations. o Environment Viewer shows training patterns using color or size-based graphical representations; interactive configuration. o Flexible object-oriented design allows mix-and-match simulation construction and easy extension by deriving new object types from existing ones. o Built-in 'CSS' scripting language uses C++ syntax, allows full access to simulation object data and functions. Transition between script code and compiled code is simplified since both are C++. Script has command-line completion, source-level debugger, and provides standard C/C++ library functions and objects. o Scripts can control processing, generate training and testing patterns, automate routine tasks, etc. o Scripts can be generated from GUI actions, and the user can create GUI interfaces from script objects to extend and customize the simulation environment. Supported Algorithms: ===================== o Feedforward and recurrent error backpropagation. Recurrent BP includes continuous, real-time models, and Almeida-Pineda. o Constraint satisfaction algorithms and associated learning algorithms including Boltzmann Machine, Hopfield models, mean-field networks (DBM), Interactive Activation and Competition (IAC), and continuous stochastic networks. o Self-organizing learning including Competitive Learning, Soft Competitive Learning, simple Hebbian, and Self-organizing Maps ("Kohonen Nets"). o Leabra algorithm that combines error-driven and Hebbian learning with k-Winners-Take-All inhibitory competition. Over 40 research-grade simulations available for this algorithm in association with new book: "Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain", O'Reilly & Munakata, 2000, MIT Press. 
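As a generic reminder of what the self-organizing family listed above computes - a sketch by this digest's editor in plain Python, not PDP++ code and not its CSS scripting language - the core of hard competitive learning is simply: find the unit whose weight vector is closest to the input, then move that winner toward the input:

  import numpy as np

  rng = np.random.default_rng(2)

  # Hard competitive learning: k units chasing clusters in 2-D input space.
  k, dim, lr = 4, 2, 0.05
  W = rng.normal(0.0, 1.0, (k, dim))        # one weight vector per unit

  def train_step(x):
      d = np.linalg.norm(W - x, axis=1)     # distance of every unit to input
      win = int(np.argmin(d))               # winner-take-all competition
      W[win] += lr * (x - W[win])           # move the winner toward the input
      return win

  # Toy data: two Gaussian clusters; the units self-organize onto them.
  for _ in range(2000):
      center = rng.choice([-2.0, 2.0])
      train_step(center + rng.normal(0.0, 0.3, dim))
  print(W)

Soft competitive learning and self-organizing maps refine this rule by also updating the winner's neighbors, with strengths that decay with distance in the map.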
From jrw at cns.bu.edu Wed Aug 16 10:54:19 2000 From: jrw at cns.bu.edu (Jim Williamson) Date: Wed, 16 Aug 2000 10:54:19 -0400 Subject: Paper on self-organizing topographic networks for classification Message-ID: <399AAB1B.31DFF4F5@cns.bu.edu> I would like to announce the availability of a new paper, accepted for publication in Neural Computation. Available at: http://cns-web.bu.edu/pub/jrw/www/pubs.html ------------------------------------------------------------ SELF-ORGANIZATION OF TOPOGRAPHIC MIXTURE NETWORKS USING ATTENTIONAL FEEDBACK James R. Williamson Department of Cognitive and Neural Systems Boston University, Boston, MA 02215 Neural Computation, in press. ABSTRACT This paper proposes a neural network model of supervised learning which employs the biologically motivated constraints of local, on-line, constructive learning. The model possesses two novel learning mechanisms. The first is a network for learning topographic mixtures. The network's internal category nodes are the mixture components, which learn to encode smooth distributions in the input space by taking advantage of topography in the input feature maps. The second mechanism is an attentional biasing feedback circuit. When the network makes an incorrect output prediction, this feedback circuit modulates the learning rates of the category nodes, by amounts based on the sharpness of their tuning, in order to improve the network's prediction accuracy. The network is evaluated on several standard classification benchmarks and shown to perform well in comparison to other classifiers. From bgiven at gmu.edu Wed Aug 16 15:14:06 2000 From: bgiven at gmu.edu (Barbara K. Given) Date: Wed, 16 Aug 2000 15:14:06 -0400 Subject: post-doc position Message-ID: <399AE7FE.7414BCDE@gmu.edu> Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA. Post-doctoral Research Fellow in Neuropsychology / Event-Related Potential / EEG research for a 3-year or longer federally funded project on adolescent learning to:
* Identify neural characteristics of chronically disruptive youth with receptive language deficits,
* Compare these characteristics with academically and socially successful age mates,
* Determine if electrophysiological markers can be identified that clearly differentiate the two groups,
* Investigate frontal and temporal brain activity before and after educational intervention.
Candidates should have a doctorate in a field relevant to the cognitive neuroscience, psychological testing, quantitative analysis of EEG and ERP, and signal processing required to carry out this project. FTE position to $34,000. Available Sept 1, 2000 or as soon as possible thereafter. Research team includes: Steven Schiff (Co-PI), Paul Rapp, Martha Farah, Steven Weinstein, William Gaillard, and Kristen Jerger. Address inquiries and send a resume with two references to: bgiven at gmu.edu -- Barbara K. Given, Ph.D.
Director, Adolescent Learning Research Center, Krasnow Institute for Advanced Study, and Associate Professor, Graduate School of Education George Mason University Fairfax, VA 22030-4444 Phone: 703-993-4406 Fax: 703-993-4325 From vera at cs.cas.cz Fri Aug 18 14:26:56 2000 From: vera at cs.cas.cz (Vera Kurkova) Date: Fri, 18 Aug 00 14:26:56 CET Subject: ICANNGA 2001 (Prague) Message-ID: <52016.vera@uivt1.uivt.cas.cz> **************************************************************** * * * >>>>> ICANNGA 2001 <<<<< * * * * 5th International Conference * * on Artificial Neural Networks and Genetic Algorithms * * * * Prague, April 22-25, 2001 * * * * 2nd call for papers (due September 20, 2000) * * * * for details see * * http://www.cs.cas.cz/icannga * * * * Proposals for special sessions, tutorials * * and software demonstrations are welcome * * * **************************************************************** From p.tino at aston.ac.uk Fri Aug 18 12:56:46 2000 From: p.tino at aston.ac.uk (Peter Tino) Date: Fri, 18 Aug 2000 17:56:46 +0100 Subject: fractal representations of symbolic sequences Message-ID: <399D6ACE.33B0B27E@aston.ac.uk> Dear Connectionists, I would like to announce the availability of papers dealing with theoretical and practical aspects of fractal representations of (possibly very long and complex) symbolic sequences via iterative function systems. 1. P. Tino: Spatial Representation of Symbolic Sequences through Iterative Function Systems. IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans. 1999. - A theoretical study connecting multifractal properties of such representations with entropic measures on symbolic sequences 2. P. Tino, G. Dorffner: Predicting the future of discrete sequences from fractal representations of the past. Machine Learning, accepted. - Mostly empirical study of predictive models constructed on fractal representations. Predictive models are closely related to variable memory length Markov models. 3. P. Tino, G. Dorffner, Ch. Schittenkopf: Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics. In Hybrid Neural Symbolic Integration, 2000. - A connection between recurrent nets and fractal representations. 4. P. Tino, M. Koteles: Extracting finite state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE Transactions on Neural Networks, 1999. - Contains an example of using fractal representations to monitor the processes of training recurrent nets and extracting knowledge from trained nets. The papers can be downloaded from http://www.ncrg.aston.ac.uk/~tinop/my.publ.html Also available are some minor applications of this methodology in finance and natural language modeling. Best regards, Peter T. -- Peter Tino - Neural Computing Research Group Aston University, Aston Triangle, Birmingham, B4 7ET, UK (+44 (0)121) 359 3611 ext. 4285, fax: 333 6215 http://www.ncrg.aston.ac.uk/~tinop/ From steve at cns.bu.edu Fri Aug 18 18:19:22 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Fri, 18 Aug 2000 18:19:22 -0400 Subject: a neural model of learning to write Message-ID: The following article can be accessed at http://www.cns.bu.edu/Profiles/Grossberg Paper copies can also be gotten by writing Mr. Robin Amos, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215 or amos at cns.bu.edu. Grossberg S. and Paine R. W. (2000). 
A neural model of corticocerebellar interactions during attentive imitation and predictive learning of sequential handwriting movements. Special Issue of Neural Networks on "The Global Brain: Imaging and Neural Modeling", in press. A preliminary version is available as Boston University Technical Report CAS/CNS TR-2000-009. The paper is available in PDF format as GrossbergPaine2000.pdf, or in Gzipped postscript format as GrossbergPaine2000.ps.gz. ABSTRACT Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. How are such complex movements learned through attentive imitation? Novel movements may be made as a series of distinct segments, but a practiced movement can be made smoothly, with a continuous, often bell-shaped, velocity profile. How does learning of complex movements transform reactive imitation into predictive, automatic performance? A neural model is developed which suggests how parietal and motor cortical mechanisms, such as difference vector encoding, interact with adaptively-timed, predictive cerebellar learning during movement imitation and predictive performance. To initiate movement, visual attention shifts along the shape to be imitated and generates vector movement using motor cortical cells. During such an imitative movement, cerebellar Purkinje cells with a spectrum of delayed response profiles sample and learn the changing directional information and, in turn, send that learned information back to the cortex and eventually to the muscle synergies involved. If the imitative movement deviates from an attentional focus around a shape to be imitated, the visual system shifts attention, and may make an eye movement, back to the shape, thereby providing corrective directional information to the arm movement system. This imitative movement cycle repeats until the corticocerebellar system can accurately drive the movement based on memory alone. A cortical working memory buffer transiently stores the cerebellar output and releases it at a variable rate, allowing speed scaling of learned movements, which is limited by the rate of cerebellar memory readout. Movements can be learned at variable speeds if the density of the spectrum of delayed cellular responses in the cerebellum varies with speed. Learning at slower speeds facilitates learning at faster speeds. Size can be varied after learning while keeping the movement duration constant (isochrony). Context effects arise from the overlap of cerebellar memory outputs. The model is used to simulate key psychophysical and neural data about learning to make curved movements, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a Two-Thirds Power Law relation between angular velocity and curvature.
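For readers unfamiliar with it, the Two-Thirds Power Law mentioned above states that angular velocity A grows with curvature C as A = K * C^(2/3); equivalently, tangential velocity V = K * R^(1/3), where R = 1/C is the radius of curvature, so the pen slows down in highly curved segments. The short sketch below illustrates the law on an ellipse; it is our own illustration with an arbitrary gain K, not the authors' model code.

    import numpy as np

    # Two-Thirds Power Law: tangential velocity from a curvature profile.
    # V = K * R**(1/3) with R = 1/C; equivalently A = V * C = K * C**(2/3).
    def two_thirds_velocity(curvature, K=1.0):   # K is an arbitrary gain
        radius = 1.0 / np.maximum(curvature, 1e-9)
        return K * radius ** (1.0 / 3.0)

    # Example trace: an ellipse x = a*cos(t), y = b*sin(t) has curvature
    # C(t) = a*b / (a**2 * sin(t)**2 + b**2 * cos(t)**2)**1.5.
    t = np.linspace(0.0, 2.0 * np.pi, 200)
    a, b = 2.0, 1.0
    C = a * b / (a**2 * np.sin(t)**2 + b**2 * np.cos(t)**2) ** 1.5
    V = two_thirds_velocity(C)
    # V peaks where the ellipse is flattest, reproducing the inverse
    # curvature-speed relation the abstract describes.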
From steve at cns.bu.edu Fri Aug 18 18:53:33 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Fri, 18 Aug 2000 18:53:33 -0400 Subject: neural model of horizontal and interlaminar cortical development and adult perceptual grouping Message-ID: The following article can be accessed at http://www.cns.bu.edu/Profiles/Grossberg Paper copies can also be gotten by writing Mr. Robin Amos, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215 or amos at cns.bu.edu. Grossberg S. and Williamson J. R. (2000). A neural model of how horizontal and interlaminar connections of visual cortex develop into adult circuits that carry out perceptual grouping and learning. Cerebral Cortex, in press. The paper is available in PDF format GroWil00.pdf, or in Gzipped postscript format GroWil00.ps.gz. ABSTRACT: A neural model suggests how horizontal and interlaminar connections in visual cortical areas V1 and V2 develop within a laminar cortical architecture and give rise to adult visual percepts. The model suggests how mechanisms that control cortical development in the infant lead to properties of adult cortical anatomy, neurophysiology, and visual perception. The model clarifies how excitatory and inhibitory connections can develop stably by maintaining a balance between excitation and inhibition. The growth of long-range excitatory horizontal connections between layer 2/3 pyramidal cells is balanced against that of short-range disynaptic interneuronal connections. The growth of excitatory on-center connections from layer 6-to-4 is balanced against that of inhibitory interneuronal off-surround connections. These balanced connections interact via intracortical and intercortical feedback to realize properties of perceptual grouping, attention, and perceptual learning in the adult, and help to explain the observed variability in the number and temporal distribution of spikes emitted by cortical neurons. The model replicates cortical point spread functions and psychophysical data on the strength of real and illusory contours. The on-center off-surround layer 6-to-4 circuit enables top-down attentional signals from area V2 to modulate, or attentionally prime, layer 4 cells in area V1 without fully activating them. This modulatory circuit also enables adult perceptual learning within cortical areas V1 and V2 to proceed in a stable way. From max at maccs.mq.edu.au Fri Aug 18 20:25:25 2000 From: max at maccs.mq.edu.au (max) Date: Sat, 19 Aug 2000 10:25:25 +1000 Subject: Computational modeling of reading Message-ID: Dear Connectionists, I would like to announce the availability of: Coltheart, M., Rastle, K., Perry, C., Langdon, R. & Ziegler, J. DRC: A Dual Route Cascaded model of visual word recognition and reading aloud. Psychological Review, in press. This can be obtained (.pdf 340K, 183 pp.) from the DRC Home Page at http://www.maccs.mq.edu.au/~max/DRC/ where other papers relevant to this model are also listed. Here's the Abstract: In this paper we describe a computational model of visual word recognition and reading aloud, the Dual Route Cascaded (DRC) model, a computational realisation of the dual-route theory of reading. The two tasks most commonly used to study reading are lexical decision and reading aloud; the DRC model is the only computational model of reading which can perform both of these tasks.
The model is evaluated by comparing its lexical decision latency data to lexical decision latency data from various published studies of human subjects, and by comparing its reading aloud latency data to reading aloud latency data from various published studies of reading aloud by human subjects. For both tasks, it is found that a wide variety of variables which affect human latencies affect the DRC model's latencies in exactly the same way. We note a number of such effects which the DRC model simulates but which other computational models of reading do not, whereas as far as we are aware there are no effects which any other current computational model of reading can simulate but which the DRC model cannot. We conclude that the DRC model is the most successful of existing computational models of reading. M.C. Max Coltheart, Macquarie Centre for Cognitive Science, Macquarie University Sydney NSW 2109 Australia tel +61 2 9850 8086 (work) +61 2 9418 7269 (home) fax +61 2 9850 6059 (work) +61 2 9418 7101 (home) my diary http://calendar.yahoo.com/public/maxcoltheart my home page http://www.maccs.mq.edu.au/~max/ DRC's home page http://www.maccs.mq.edu.au/~max/DRC From Connectionists-Request at cs.cmu.edu Thu Aug 17 16:00:11 2000 From: Connectionists-Request at cs.cmu.edu (Connectionists-Request@cs.cmu.edu) Date: Thu, 17 Aug 2000 16:00:11 -0400 (EDT) Subject: CONNECTIONISTS Bi-Monthly reminder Message-ID: *** DO NOT FORWARD TO ANY OTHER LISTS *** This note was last updated August 15, 2000. This is an automatically posted bi-monthly reminder about how the CONNECTIONISTS list works and how to access various online resources. CONNECTIONISTS is a moderated forum for enlightened technical discussions and professional announcements. It is not a random free-for-all like comp.ai.neural-nets. The following posting guidelines are designed to reduce the amount of irrelevant messages sent to the list. Before you post, please remember that this list is distributed to thousands of busy people who don't want their time wasted on trivia. Also, some subscribers pay cash for each kbyte; they shouldn't be forced to pay for junk mail. -- Dave Touretzky & Mark C. Fuhs --------------------------------------------------------------------- To send mail to everyone on the list, address it to Connectionists at CS.CMU.EDU Note that in many mail programs a reply to a message is automatically "CC"-ed to all the addresses on the "To" and "CC" lines of the original message. If the mailer you use has this property, please make sure your personal response (request for a Tech Report etc.) is NOT broadcast over the net. Requests for address changes, deletions, etc., should be sent to: Connectionists-Request at CS.CMU.EDU The address list is now managed by Majordomo. If you are not familiar with Majordomo, send the word "help" by itself in the body of an email message to Connectionists-Request and you will receive a detailed explanation of its use. If you mention our mailing list to someone who may apply to be added to it, please make sure they use the "-requests" address and NOT "Connectionists at cs.cmu.edu". --------------------------------------------------------------------- What to post to CONNECTIONISTS ------------------------------ - The list is primarily intended to support the discussion of technical issues relating to neural computation. 
- We encourage people to post the abstracts of their latest papers and tech reports, provided that the report itself is available on-line (please give the URL) or the author is accepting requests for hardcopies. - Conferences and workshops should be announced on this list at most twice: once to send out a call for papers, and once to remind non-authors about the registration deadline. A flood of repetitive announcements about the same conference is not welcome here. For major neural net conferences (e.g., NIPS, IJCNN, INNS) we'll allow a second call for papers close (but not unreasonably close) to the deadline. - Announcements of job openings related to neural computation. - Announcements of new books related to neural computation. - Requests for ADDITIONAL references. This has been a particularly sensitive subject. Please try to demonstrate that you have already pursued the quick, obvious routes to finding the information you desire. You should also give people something back in return for bothering them. The easiest way to do both these things is to FIRST do the library work to find the basic references, then POST these as part of your query. Here's an example: WRONG WAY: "Can someone please mail me all references to cascade correlation?" RIGHT WAY: Enclosed is a bibliography I've compiled of papers referencing cascade correlation. If you are aware of additional papers not listed here, please send me the citations and I'll include them in the next version. What NOT to post to CONNECTIONISTS: ----------------------------------- * Requests for free software or databases. Try comp.ai.neural-nets. * Requests for reprints of papers, or for persons' email addresses. * Announcements of conferences not directly relevant to this list. Example: generic AI or computer vision conferences have their own newsgroups and mailing lists, and won't be advertised here. * Job postings, unless the posting makes specific mention of neural nets or a closely related topic (e.g., computational neuroscience.) * Postings not properly formatted. 80 columns is the maximum line width. Do not post HTML, LaTeX, Microsoft Word, or Postscript files. Do not include any attachments. ------------------------------------------------------------------------------- The CONNECTIONISTS Archive: --------------------------- All e-mail messages sent to "Connectionists at cs.cmu.edu" starting 27-Feb-88 are available for public perusal. A separate file exists for each month. The files' names are: arch.yymm where yymm stand for the obvious thing. Thus the earliest available data are in the file: arch.8802 Files ending with .Z are compressed using the standard unix compress program. The files ending with .gz are compressed using the GNU gzip program. In the event that you do not already have gzip, it is available via ftp from "prep.ai.mit.edu" in the "/pub/gnu" directory. To browse through these files (as well as through other files, see below) you must FTP them to your local machine. The file "current" in the same directory contains the archives for the current month. ------------------------------------------------------------------------------- How to Access Files from the CONNECTIONISTS Archive --------------------------------------------------- There are two ways to access the CONNECTIONISTS archive: 1. Using your World Wide Web browser. Enter the following location: http://www.cs.cmu.edu/afs/cs/project/connect/connect-archives/ 2. Using an FTP client. 
a) Open an FTP connection to host FTP.CS.CMU.EDU
b) Login as user anonymous with password your username.
c) 'cd' directly to the following directory: /afs/cs/project/connect/connect-archives
The archive directory is the ONLY one you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into this directory. Problems? - contact us at "Connectionists-Owner at cs.cmu.edu". From sml at essex.ac.uk Wed Aug 23 12:55:14 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Wed, 23 Aug 2000 17:55:14 +0100 Subject: OCR competition (first announcement) Message-ID: <7051D4D8783DD411A7EF00A0C9DD67E1664127@sernt14.essex.ac.uk> Dear All, I would like to draw your attention to an OCR contest that we intend to run in the near future. This will be sponsored by the Post Office (UK), who intend to contribute a modest prize for the winner - but the intention is also to give developers useful feedback on how their algorithms compare with others on a range of OCR datasets and a range of different criteria. Currently, all submissions must be in Java, either in the form of an archive of class files (jar, zip, gzip, etc.) or a single body of source code. The problem setup is described on Algoval: http://algoval.essex.ac.uk Follow the link to problems, then to Trainable OCR. We have yet to decide the exact criteria for winning, but they will likely be based on a combination of accuracy, speed and memory considerations - some function of the performance criteria already mentioned on the web page. The intention is to run the competition in October of this year. This announcement is to give researchers a chance to comply with the interface for the problem. If you wish to have solutions in languages other than Java considered, please send me a brief email to this effect. I don't think this will be possible for this competition, but if I get a strong enough response I'll increase the priority of allowing other-language submissions in the future. Best regards, Simon Lucas ps. After Thursday 24th August I'm away for two weeks, so I may be a bit slow responding to your emails. ------------------------------------------------ Dr. Simon Lucas Senior Lecturer Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom http://algoval.essex.ac.uk Email: sml at essex.ac.uk ------------------------------------------------- From schultz at cns.nyu.edu Wed Aug 23 13:26:58 2000 From: schultz at cns.nyu.edu (Simon Schultz) Date: Wed, 23 Aug 2000 13:26:58 -0400 Subject: Preprints on neural coding Message-ID: <39A40962.D16E8C0F@cns.nyu.edu> Dear Connectionists, The following two preprints are available for downloading: 1. ................................................................... "A UNIFIED APPROACH TO THE STUDY OF TEMPORAL, CORRELATIONAL AND RATE CODING", Stefano Panzeri* and Simon R. Schultz+. In press, Neural Computation. * Neural Systems Group, Department of Psychology, University of Newcastle upon Tyne. + Howard Hughes Medical Institute & Center for Neural Science, New York University. ------ Abstract: ------ We demonstrate that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each of which reflects something about potential coding mechanisms. This is possible in the coding régime in which few spikes are emitted in the relevant time window.
This approach allows us to study the additional information contributed by spike timing beyond that present in the spike counts; to examine the contributions to the whole information of different statistical properties of spike trains, such as firing rates and correlation functions; and forms the basis for a new quantitative procedure for the analysis of simultaneous multiple neuron recordings. It also provides theoretical constraints upon neural coding strategies. We find a transition between two coding régimes, depending upon the size of the relevant observation timescale. For time windows shorter than the timescale of the stimulus-induced response fluctuations, there exists a spike count coding phase, where the purely temporal information is of third order in time. For time windows much longer than the characteristic timescale, there can be additional timing information of first order, leading to a temporal coding phase in which timing information may affect the instantaneous information rate. In this new framework we study the relative contributions of the dynamic firing rate and correlation variables to the full temporal information; the interaction of signal and noise correlations in temporal coding; synergy between spikes and between cells; and the effect of refractoriness. We illustrate the utility of the technique by analysis of a few cells from the rat barrel cortex. Download alternatives: http://www.cns.nyu.edu/~schultz/tcode.ps.gz http://www.cns.nyu.edu/~schultz/tcode.pdf 2. ................................................................... "SYNCHRONISATION, BINDING AND THE ROLE OF CORRELATED FIRING IN FAST INFORMATION TRANSMISSION", Simon R. Schultz+, Huw D. R. Golledge* and Stefano Panzeri*. To be published in S. Wermter, J. Austin and D. Willshaw (Eds.), Emergent Neural Computational Architectures based on Neuroscience, Springer-Verlag, Heidelberg. + Howard Hughes Medical Institute & Center for Neural Science, New York University. * Neural Systems Group, Department of Psychology, University of Newcastle upon Tyne. ------ Abstract: ------ Does synchronisation between action potentials from different neurons in the visual system play a substantial role in solving the binding problem? The binding problem can be studied quantitatively in the broader framework of the information contained in neural spike trains about some external correlate, which in this case is object configurations in the visual field. We approach this problem by using a mathematical formalism that quantifies the impact of correlated firing in short time scales. Using a power series expansion, the mutual information an ensemble of neurons conveys about external stimuli is broken down into firing rate and correlation components. This leads to a new quantification procedure directly applicable to simultaneous multiple neuron recordings. It theoretically constrains the neural code, showing that correlations contribute less significantly than firing rates to rapid information processing. By using this approach to study the limits upon the amount of information that an ideal observer is able to extract from a synchrony code, it may be possible to determine whether the available amount of information is sufficient to support computational processes such as feature binding. Download alternatives: http://www.cns.nyu.edu/~schultz/emernet.ps.gz http://www.cns.nyu.edu/~schultz/emernet.pdf -- Dr. Simon R.
Schultz Howard Hughes Medical Institute & Center for Neural Science, New York University, 4 Washington Place, New York NY 10003, U.S.A. Phone: +1-212 998 3775 Fax: +1-212 995 4011 Email: schultz at cns.nyu.edu http://www.cns.nyu.edu/~schultz/ From ckiw at dai.ed.ac.uk Wed Aug 23 09:39:53 2000 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Wed, 23 Aug 2000 14:39:53 +0100 (BST) Subject: Faculty positions at University of Edinburgh, UK Message-ID: The Division of Informatics at the University of Edinburgh is seeking to hire two lecturers. I am keen to encourage people from the machine learning/probabilistic modelling fields to apply. Note that this area is highlighted in the advert (see below). Apologies if you receive this message multiple times. Informal questions and requests for information can be sent to c.k.i.williams at ed.ac.uk Chris Williams Dr Chris Williams c.k.i.williams at ed.ac.uk Institute for Adaptive and Neural Computation Division of Informatics, University of Edinburgh 5 Forrest Hill, Edinburgh EH1 2QL, Scotland, UK fax: +44 131 650 6899 tel: (direct) +44 131 651 1212 (department switchboard) +44 131 650 3090 http://anc.ed.ac.uk/ -------------------------------------------------------------------- LECTURESHIP IN INFORMATICS The Division of Informatics (http://www.informatics.ed.ac.uk) has inherited a very strong tradition of research in computer systems, theoretical computer science, cognitive science, artificial intelligence, robotics and neural networks. You will add to our existing strengths in research and teaching, encourage the integration of your own research with that of others and contribute to the development of Informatics. You can work in any area of Informatics, but we would particularly welcome applications where algorithms and complexity or machine learning is your primary interest. We would also especially welcome your application if you are working in bioinformatics, computer systems, virtual or augmented reality, knowledge representation or cognitive modelling. The appointment will be at the Lecturer (18,731 - 34,601 pounds) scale (salary scales under review). Please quote ref: 306630 Further particulars can be found at http://www.informatics.ed.ac.uk/events/vacancies/lectureship_306630.html and applications packs can be obtained from the PERSONNEL DEPARTMENT, The University of Edinburgh, 9-16 Chambers Street, Edinburgh EH1 1HT, UK Closing date: 22 September 2000. From Daniel.Memmi at imag.fr Thu Aug 24 06:10:34 2000 From: Daniel.Memmi at imag.fr (Daniel Memmi) Date: Thu, 24 Aug 2000 12:10:34 +0200 Subject: Report on Flexible Word Order Message-ID: The following technical report might be of interest to connectionists working in natural language processing and to computational linguists: "Processing Flexible Word Order Languages with Neural Networks" D. Memmi (LEIBNIZ-IMAG-CNRS, Grenoble, France) Abstract: Most research in computational linguistics is strongly inspired by the particular structure and rigid word order of the English language, although many common languages in fact exhibit a much freer syntax. These languages should be interpreted by using various other cues (morphological or semantic) besides word order. We will show how recurrent neural networks can learn to use a variety of cues as well as word order so as to interpret simple sentences. This connectionist approach will be applied to several different languages: Japanese, French and Spanish. In this way, the diversity of linguistic cues available for natural language processing will be demonstrated.
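As a purely illustrative companion to the report above (not the author's code), here is a generic Elman-style recurrent forward pass in NumPy, showing how a network can accumulate word-order and other cues across a sentence. The layer sizes, role outputs, and random data are invented for the sketch, and training (e.g., backpropagation through time) is omitted.

    import numpy as np

    # Elman-style recurrent layer: each word is a cue vector (identity,
    # morphology, position, ...); the hidden state carries context forward.
    rng = np.random.default_rng(1)
    n_in, n_hid, n_out = 8, 12, 3          # cue features, hidden units, roles
    W_in  = rng.normal(0, 0.3, (n_in, n_hid))
    W_rec = rng.normal(0, 0.3, (n_hid, n_hid))
    W_out = rng.normal(0, 0.3, (n_hid, n_out))

    def run_sentence(words):
        """words: list of cue vectors; returns role scores per time step."""
        h = np.zeros(n_hid)                # context is reset for each sentence
        outputs = []
        for x in words:
            h = np.tanh(x @ W_in + h @ W_rec)   # combine current cue and context
            outputs.append(h @ W_out)           # e.g. agent/patient/action scores
        return outputs

    sentence = [rng.normal(size=n_in) for _ in range(4)]   # dummy 4-word input
    print([np.round(o, 2) for o in run_sentence(sentence)])

Because the same hidden state integrates whatever cues the input vectors encode, the architecture itself is agnostic about whether word order, morphology, or semantics carries the interpretive burden.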
The report is available on-line at the following addresses:
- PDF version: http://www-leibniz.imag.fr/LesCahiers/CLLeib03.pdf
- Postscript version: http://www-leibniz.imag.fr/LesCahiers/CLLeib03.ps
_________________________________________________________ Daniel Memmi Neural Networks Group LEIBNIZ-IMAG 46, avenue Felix Viallet 38000 Grenoble (France) Tel: (33) 4 76 57 46 64 Fax: (33) 4 76 57 46 02 Email: memmi at imag.fr www-leibniz.imag.fr www-leibniz.imag.fr/RESEAUX/ _________________________________________________________ From bengio at idiap.ch Fri Aug 25 05:10:20 2000 From: bengio at idiap.ch (Samy Bengio) Date: Fri, 25 Aug 2000 11:10:20 +0200 (MET DST) Subject: new tech report and new version of SVMTorch Message-ID: We would like to announce the following regarding SVMTorch, our implementation of Support Vector Machines for Large-Scale Regression and Classification Problems, which was previously announced on this list:
(a) A new version of SVMTorch is now available on the web at the usual place: http://www.idiap.ch/learning/SVMTorch.html The main additions compared with the previous version are the following:
+ A Multiclass mode (one class against the others).
+ An input/output sparse mode (it runs faster on sparse data).
(b) A new technical report is also available, reporting a convergence proof for our regression method. It is available at ftp://ftp.idiap.ch/pub/reports/2000/rr00-17.ps.gz The abstract is as follows: Recently, many researchers have proposed decomposition algorithms for SVM regression problems (see for instance [11, 3, 6, 10]). In a previous paper [1], we also proposed such an algorithm, named SVMTorch. In this paper, we show that while, to our knowledge, there is no convergence proof for any other decomposition algorithm for SVM regression problems, such a proof does exist for SVMTorch for the particular case where no shrinking is used and the size of the working set is equal to 2, which is the size that gave the fastest results in most of the experiments we have done. This convergence proof is in fact mainly based on the convergence proof given by Keerthi and Gilbert [4] for their SVM classification algorithm. ----- Samy Bengio Research Director. Machine Learning Group Leader. IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland. tel: +41 27 721 77 39, fax: +41 27 721 77 12.
mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio From wichert at neuro.informatik.uni-ulm.de Tue Aug 29 12:07:26 2000 From: wichert at neuro.informatik.uni-ulm.de (Andreas Wichert) Date: Tue, 29 Aug 2000 18:07:26 +0200 (MET DST) Subject: PhD thesis ``Associative Computation'' Message-ID: <10008291607.AA13728@neuro.informatik.uni-ulm.de> Dear Connectionists, Remember, in August 1998 Dave Touretzky asked on the Connectionists mailing list: ``..I concluded that connectionist symbol processing had reached a plateau... No one is trying to build distributed connectionist reasoning systems any more, like the connectionist production system I built with Geoff Hinton...'' The PhD thesis ``Associative Computation'' tries to fill the explicatory gap... Abstract Currently neural networks are used in many different domains. But are neural networks also suitable for modeling problem solving, a domain which is traditionally reserved for the symbolic approach? This central question of cognitive science is answered in this work. It is affirmed by corresponding neural network models. The models have the same behavior as the symbolic models. However, additional properties resulting from the distributed representation also emerge. It is shown, by comparing these additional abilities with the basic behavior of the model, that the additional properties lead to a significant algorithmic improvement. This is verified by statistical hypothesis testing. The associative computer, a neural model for a reaction system based on the assembly theory, is introduced. It is shown that planning can be realized by a neural architecture that does not use symbolic representation. A crucial point is the description of states by pictures. The human ability to process images and understand what they mean in order to solve a problem holds an important clue to how the human thought process works. This clue is examined by empirical experiments with the associative computer. One general conclusion from the experiments is the claim that it is possible to systematically use associative structures to perform reasoning by forming chains of associations. In addition, besides symbolic problem solving, pictorial problem solving is possible. Available at:
1) web site, in gzipped postscript format at: http://www.informatik.uni-ulm.de/ni/mitarbeiter/AWichert.html
2) anonymous ftp, in gzipped postscript format or gzipped pdf format: ftp.neuro.informatik.uni-ulm.de, directory /ni/wichert
3) web site of the University of Ulm library, in pdf format: http://vts.uni-ulm.de/query/longview.meta.asp?document_id=533
4) cdrom (pdf + ps) upon request
Andrzej ---------------------------------------------------------------------------- Andrzej Wichert Department of Neural Information Processing Computer Science University of Ulm Oberer Eselsberg 89069 Ulm Germany Tel.: +49 731 502 4257 Fax: +49 731 502 4156 http://www.informatik.uni-ulm.de/ni/mitarbeiter/AWichert.html ---------------------------------------------------------------------------- From iiass.alfredo at tin.it Mon Aug 28 14:26:46 2000 From: iiass.alfredo at tin.it (Alfredo Petrosino) Date: Mon, 28 Aug 2000 20:26:46 +0200 Subject: Neural Nets School 2000 Message-ID: <39AAAEE6.3D14CCB9@tin.it> SECOND ANNOUNCEMENT 5th Course of the International Summer School "Neural Nets E. R.
Caianiello" on Visual Attention Mechanisms 23-29 October 2000 International Institute for Advanced Scientific Studies (IIASS) Vietri sul Mare, Salerno (Italy) DIRECTOR OF THE 5TH COURSE Virginio CANTONI (Pavia University, Italy) DIRECTORS OF THE SCHOOL Michael JORDAN (University of California, Berkely, USA) Maria MARINARO (Salerno University, Italy) The preliminary program of the School is avalilable at http://www.iiass.it/nnschool. Due to holidays the NEW DEADLINE for submitting applications and proposals of poster is SEPTEMBER 24 2000 The school, open to all suitably qualified scientists from around the world, is organized in lectures, panel discussions and poster presentations and will cover a number of broad themes relevant to Visual Attention, among them: - Foundation: Early vision, Visual streams, Perception and action, Log-map analysis - Attentional mechanisms: Pop-out theory, Texton theory, Contour integration and closure, Fuzzy engagement mechanisms - Visual search : Attentional control, Selective attention, Spatial attention, Detection versus discrimination - Multiresolution and planning : Complexity of search tasks, Hierarchical perceptual loops, Multiresolution and associative memory systems, Attention and action planning - Attentional Visual Architectures: Neural models of visual attention, Hierarchical and associative networks, Attentional pyramidal neural mechanisms - Experiences: Eyeputer and scanpath recorders, etc. INVITED SPEAKERS : Virginio CANTONI, Pavia University Leonardo CHELAZZI, Verona University, Italy Vito DI GESU`, Palermo University, Italy Hezy YESHURUN, Haifa University, Israel Zhaoping LI, Gatsby, University College, London, UK Luca LOMBARDI, Pavia University, Italy Carlo Alberto MARZI, Verona University, Italy Alain MERIGOT, University of Paris Sud, France Eliano PESSA, Roma University, Italy Alfredo PETROSINO, INFM-Salerno University, Italy Marco PIASTRA, Pavia University, Italy Vito ROBERTO, Udine University, Italy Dov SAGI, Weizmann University, Israel John TSOTSOS, Center for Computer Vision, Canada Daniela ZAMBARBIERI, Pavia University, Italy Harry WECHSLER, George Mason University, USA Steven YANTIS, Johns Hopkins University, USA SITE Vietri sul Mare is located within walking distance from Salerno and marks the beginning of the Amalfi coast. Visit http://www.comune.vietri-sul-mare.sa.it/ for more information. FEE The shool fee is 700 USD including the cofee breaks and a copy of the proceedings. The full school fee, including accommodation in twin room, meals, one day of excursion, and a copy of the proceedings of the school is 1200 USD, reduced to 1000 USD for students For further information please contact : Dr. 
A. Petrosino, Fax: +39 89 761189, Email: iiass.alfredo at tin.it
============================CUT======================================
APPLICATION FORM
Title:_______
Family Name: ________________________________________________________
Other Names:_________________________________________________________
Name to appear on badge: ____________________________________________
MAILING ADDRESS:
Institution _________________________________________________________
Department __________________________________________________________
Address _____________________________________________________________
State ____________________________ Country __________________________
Phone:____________________________ Fax: _____________________________
E-mail: _____________________________________________________________
Arrival date: __________________ Departure date: ____________________
Will you be applying for a scholarship? yes/no (Please include in your application the amount of bursary support and a justification for the request)
Will you submit a poster? yes/no (Please include a one page abstract for review by the organizers).
============================CUT======================================
Please send the application form by electronic mail to: iiass.alfredo at tin.it, subject: Neural Nets school; or by fax to: Neural Nets School, +39 89 761 189; or by ordinary mail to the address: Neural Nets School, IIASS, Via Pellegrino 19, I-84019 Vietri sul Mare (Sa), Italy.
From nnsp00 at neuro.kuleuven.ac.be Tue Aug 29 03:58:57 2000 From: nnsp00 at neuro.kuleuven.ac.be (NNSP2000, Sydney) Date: Tue, 29 Aug 2000 09:58:57 +0200 Subject: IEEE NNSP'00, Sydney Message-ID: <39AB6D41.B6B4C78F@neuro.kuleuven.ac.be> ***************************************************************** CALL FOR PARTICIPATION 2000 IEEE Workshop on Neural Networks for Signal Processing December 11-13, 2000, Sydney, Australia (Early Registration: September 15, 2000) Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council ***************************************************************** Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the tenth in a series of IEEE workshops on Neural Networks for Signal Processing will be held at the University of Sydney, Australia, December 11-13, 2000. The workshop will feature a high quality technical program, and three invited plenary speeches presented by experts in the field:
- Professor Tamás Roska (Hungarian Academy of Sciences): Signal Processing via Neural Networks becomes practical - via Analogic TeraOPS Visual Microprocessor Chips.
- Dr. David Fogel (Natural Selection, Inc.): Evolving Models for Signal Processing.
- Dr. Brian Ferguson (Defence Science and Technology Organisation): Signal Processing Applications for Submarines, Surveillance and Survival.
In addition, Professor Sun-Yuan Kung of Princeton University will present a joint keynote speech on "Adaptive Techniques for Intelligent Internet Multimedia Communication" to the workshop and the First IEEE Pacific-Rim Conference on Multimedia. The workshop is being held in conjunction with the First IEEE Pacific-Rim Conference on Multimedia (2000 International Symposium on Multimedia Information Processing). The public are cordially invited to participate in this important event.
Early registration can be made before September 15, 2000 through our homepage: http://eivind.imm.dtu.dk/nnsp2000/ For further information, please contact the NNSP 2000 Organizing Committee at: TEL: +61 2 9351 5642 Fax: +61 2 9351 3847 or Email: nnsp2000org at eivind.imm.dtu.dk
ORGANIZATION
Honorary Chair: Bernard WIDROW, Stanford University
General Chairs: Ling GUAN, University of Sydney, email: ling at ee.usyd.edu.au; Kuldip PALIWAL, Griffith University, email: kkp at shiva2.me.gu.edu.au
Program Chairs: Tülay ADALI, University of Maryland, Baltimore County, email: adali at umbc.edu; Jan LARSEN, Technical University of Denmark, email: jl at imm.dtu.dk
Finance Chair: Raymond Hau-San WONG, University of Sydney, email: hswong at ee.usyd.edu.au
Proceedings Chairs: Elizabeth J. WILSON, Raytheon Co., email: bwilson at ed.ray.com; Scott C. DOUGLAS, Southern Methodist University, email: douglas at seas.smu.edu
Publicity Chair: Marc van HULLE, Katholieke Universiteit, Leuven, email: marc at neuro.kuleuven.ac.be
Registration and Local Arrangements: Stuart PERRY, Defence Science and Technology Organisation, email: Stuart.Perry at dsto.defence.gov.au
Europe Liaison: Jean-Francois CARDOSO, ENST, email: cardoso at sig.enst.fr
America Liaison: Amir ASSADI, University of Wisconsin at Madison, email: ahassadi at facstaff.wisc.edu
Asia Liaison: Andrew BACK, Katestone Scientific, email: andrew.back at usa.net
PROGRAM COMMITTEE: Amir Assadi, Yianni Attikiouzel, John Asenstorfer, Andrew Back, Geoff Barton, Hervé Bourlard, Andy Chalmers, Zheru Chi, Andrzej Cichocki, Tharam Dillon, Tom Downs, Hsin Chia Fu, Suresh Hangenahally, Marwan Jabri, Haosong Kong, Shigeru Katagiri, Anthony Kuh, Yi Liu, Fa-Long Luo, David Miller, Christophe Molina, M Mohammadian, Erkki Oja, Soo-Chang Pei, Jose Principe, Ponnuthurai Suganthan, Ah Chung Tsoi, Marc Van Hulle, A.N. Venetsanopoulos, Yue Wang, Wilson Wen
From paolo at dsi.unifi.it Wed Aug 30 04:48:14 2000 From: paolo at dsi.unifi.it (Paolo Frasconi) Date: Wed, 30 Aug 2000 10:48:14 +0200 (Western Europe daylight saving time) Subject: Special issue on integration of symbolic and connectionist systems Message-ID: Integration of symbolic and connectionist systems Special issue of the journal Cognitive Systems Research CALL FOR PAPERS BACKGROUND A successful integration of connectionist and other statistical learning systems with symbol-based techniques could bridge reasoning and knowledge representation with empirical learning, significantly advancing our ability to model cognitive processes under a unified perspective. This area has attracted many researchers, both in computer and cognitive sciences, but is still replete with serious difficulties and challenges. While existing models can address particular aspects in this integration (often by making use of different assumptions and techniques), only a few unified approaches have been proposed, and they are still very limited, showing both the lack of a full understanding of the relevant aspects of this discipline and the broad complexity in scope and tasks. One of the main difficulties is that symbolic techniques can easily deal with rich and expressive representation languages, whereas connectionist/statistical learners are mostly effective in the case of simple propositional languages. As a result, it is customary to exploit the learning capabilities of these models by operating on "flat" (vector-based, or attribute-value) representations, but this often requires additional machinery for interfacing the learner with the actual domain of interest.
For example, if data are available in a structured or semi-structured way, a conversion into a lower level (propositional) representation is often performed. Similarly, since internal representations associated with the learner are not easily interpretable, a conversion into a higher level representation language is also necessary. Different aspects of integration have been investigated and probed independently of each other, and thus a higher level of cross-interaction among these issues is necessary, making use of all the computational tools we have available, such as deterministic and probabilistic approaches, event-based modeling, computational logic, computational learning theory, and so on. TOPICS In this special issue we aim to collect high-quality papers that show novel methods, ideas, and positions, advancing the state of the art in this area. We encourage submissions of papers addressing, in addition to other relevant issues, the following topics:
- Algorithms for extraction, injection and refinement of symbolic knowledge from, into and by neural networks.
- Inductive discovery/formation of structured knowledge.
- Classification, recognition, prediction, matching and manipulation of structured information.
- Relational learning using connectionist and belief network techniques.
- Comparisons between connectionist/statistical and symbolic learning systems that can exploit super-propositional representation languages.
- Applications of hybrid symbolic-connectionist models to real-world problems.
- Taxonomies that may be useful for selecting the best suited integration paradigms and techniques to be used in particular applications.
- Investigations of fundamental aspects of and approaches to the integration of symbolic and connectionist methods.
SUBMISSION GUIDELINES Papers should be submitted electronically (PostScript and PDF are the only acceptable formats) by using the anonymous ftp site ftp.dsi.unifi.it. Use the /csr directory to deposit submissions. Please choose a unique, clearly identifying filename. For convenience, common compression utilities (gzip, winzip, compress) can be used. We also require a follow-up email message to paolo at dsi.unifi.it to let us know that the file has been posted. In the email message please include the title, keywords, abstract, and full address of the contacting author. For details about the journal Cognitive Systems Research, and to download the journal template files (LaTeX, Word, etc.), please visit http://www.elsevier.nl/locate/cogsys. For updates about the special issue, please visit http://www.dsi.unifi.it/~paolo/csr.
IMPORTANT DATES
Submission Deadline: December 15, 2000
Notification to authors: March 15, 2001
GUEST EDITORS Paolo Frasconi, University of Florence, Italy (paolo at dsi.unifi.it) Marco Gori, University of Siena, Italy (marco at ing.unisi.it) Franz Kurfess, California Polytechnic State University (fkurfess at csc.calpoly.edu) Alessandro Sperduti, University of Pisa, Italy (perso at di.unipi.it)
From andre at snowhite.cis.uoguelph.ca Wed Aug 30 15:05:45 2000 From: andre at snowhite.cis.uoguelph.ca (andre de carvalho) Date: Wed, 30 Aug 2000 15:05:45 -0400 Subject: IJNS special issue: call for abstracts Message-ID: <39AD5B09.5BADD8F9@snowhite.cis.uoguelph.ca> **** CALL FOR PAPERS **** Special issue: "Non-Gradient Learning Techniques" International Journal of Neural Systems, IJNS http://www.wspc.com/journals/ijns/ijns.html Guest Editors: Andre de Carvalho and Stefan C. Kremer
Submission deadline: Abstract: Sept. 29th, 2000; Final version: Dec. 1st, 2000
BACKGROUND Many of the learning techniques currently employed to train Artificial Neural Networks are based on gradient descent of an error function. Although they provide solutions for a large number of applications, they have a number of limitations, including ending up in local minima, spending large amounts of time traversing flat regions of the error-space, and requiring differentiable activation functions. While modifications to gradient-based algorithms have been proposed to deal with these issues, there have also recently been a number of initiatives to develop non-gradient-based techniques for learning. The latter are the focus of this special issue. TOPICS The aim of this special issue is to solicit and publish valuable papers that provide a clear picture of the state of the art in this area. We encourage submissions of articles addressing, in addition to other relevant issues, the following topics:
- Analysis of limitations to gradient learning approaches
- Non-gradient based learning algorithms
- Evolutionary training
- Unsupervised learning
- Auto-associative memories
- Analysis and solutions for applications where non-gradient approaches are required
- Survey of non-gradient techniques
- Real world applications of non-gradient learning algorithms
INSTRUCTIONS This special issue will have a mix of invited and openly solicited papers. All contributed and invited papers will be refereed to ensure high quality and relevance to IJNS readers. Authors are encouraged to use LaTeX format. The IJNS LaTeX style files can be downloaded from http://www.wspc.com/others/style_files/journal/ijns/128-ijns.zip The submissions should be sent by e-mail or on computer disk in Postscript or PDF format (other file formats cannot be accepted). They should be in English, not exceed 12 double-spaced pages (excluding Figures and Tables), and be formatted according to the Journal submission guidelines found in http://www.wspc.com/journals/ijns/ijns.html The title and the abstract of proposed papers should be sent separately in ASCII format by September 29th. With your submission, please provide a cover letter with:
- The title of the paper;
- The author names together with their affiliation address, email and telephone number.
IMPORTANT DATES
Submission of title and abstract (e-mail): Sept. 29th, 2000
Submission of final version deadline: Dec. 1st, 2000
Notification of acceptance: Jan 5th, 2001
Expected publication date: Mid-to-late 2001.
The materials should be submitted to one of the Guest Editors:
GUEST EDITORS
Dr. Andre de Carvalho, Guelph Natural Computation Group, Department of Computing and Information Science, University of Guelph, Guelph, Canada, N1G 2W1. E-mail: andre at snowhite.cis.uoguelph.ca
Dr. Stefan C. Kremer, Guelph Natural Computation Group, Department of Computing and Information Science, University of Guelph, Guelph, Canada, N1G 2W1. E-mail: skremer at snowhite.cis.uoguelph.ca
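To ground the premise of the call above, here is a minimal non-gradient training sketch: (1+1)-style random hill-climbing over the weights of a tiny network with non-differentiable threshold units. It is our own illustration of one technique family the call solicits (all sizes and constants are arbitrary), not code associated with the journal or the editors.

    import numpy as np

    # (1+1) random hill-climbing on network weights: no gradients, and the
    # hard-threshold hidden units are not even differentiable.
    rng = np.random.default_rng(2)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([0.0, 1.0, 1.0, 0.0])            # XOR targets

    def forward(w, x):
        W1 = w[:8].reshape(2, 4); b1 = w[8:12]    # 2 inputs -> 4 step units
        W2 = w[12:16];            b2 = w[16]      # 4 hidden -> linear output
        h = (x @ W1 + b1 > 0).astype(float)       # non-differentiable step
        return h @ W2 + b2

    def loss(w):
        return np.mean((np.array([forward(w, x) for x in X]) - T) ** 2)

    w = rng.normal(0, 1, 17)
    best = loss(w)
    for step in range(20000):
        cand = w + rng.normal(0, 0.3, 17)         # random "mutation"
        cand_loss = loss(cand)
        if cand_loss <= best:                     # keep only if no worse
            w, best = cand, cand_loss
    print(best)                                   # typically reaches ~0 on XOR

Evolutionary variants maintain a population of such candidates; the point of the sketch is simply that learning can proceed without differentiable activations or an explicit gradient.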