From mzorzi at ux1.unipd.it Mon Oct 2 18:21:26 2000 From: mzorzi at ux1.unipd.it (Marco Zorzi) Date: Tue, 03 Oct 2000 00:21:26 +0200 Subject: graduate student position in connectionist modelling Message-ID: <4.3.2.7.0.20001003001819.00b1c940@ux1.unipd.it> Please post or pass this ad to anyone interested. Thanks, Marco Zorzi GRADUATE STUDENT POSITION A three-year postgraduate position in cognitive neuroscience is available with Dr. Marco Zorzi and Prof. Carlo Umiltà at the University of Padova (Italy), Department of Psychology. The project is part of an EU-funded Research Training Network of six sites (University College London, INSERM U334 Orsay, Université Catholique de Louvain, University of Innsbruck, University of Trieste, and University of Padova) investigating "Mathematics and the Brain". The Padova team will focus on the computational bases of mathematical cognition (basic numerical abilities and simple arithmetic), using both experimental and connectionist modelling techniques. Experience in connectionist modelling or neuropsychology, and good programming skills, are desirable. The successful applicant will register for a PhD in Psychology. The international network offers an excellent opportunity for gaining experience in a wide range of methodologies in cognitive neuroscience. Short visits and exchanges between laboratories are also planned. Salary and benefits are highly competitive (23000 Euro per year). Applicants must be citizens of an EU country (excluding Italy), EFTA-EEA states, Candidate States, or Israel. The application deadline is 31 October 2000. Send a CV, reprints or preprints, and the names of two references (preferably by email) to: Dr. Marco Zorzi Prof. Carlo Umiltà Dipartimento di Psicologia Generale Università di Padova Via Venezia 8 35131 Padova (Italy) mzorzi at ux1.unipd.it umilta at ux1.unipd.it Dr. Marco Zorzi Dipartimento di Psicologia Generale Università
di Padova via Venezia 8 35131 Padova tel: +39 049 8276635 fax: +39 049 8276600 email: mzorzi at psico.unipd.it (and) Institute of Cognitive Neuroscience voice: +44 171 3911151 University College London fax : +44 171 8132835 17 Queen Square London WC1N 3AR (UK) http://www.psychol.ucl.ac.uk/marco.zorzi/marco.html From maggini at dii.unisi.it Tue Oct 3 11:52:44 2000 From: maggini at dii.unisi.it (Marco Maggini) Date: Tue, 03 Oct 2000 17:52:44 +0200 Subject: ESANN 2001- Special Session on ANNs for web computing Message-ID: <39DA00CC.4F966CE6@dii.unisi.it> ---------------------------------------------------- | | | ESANN'2001 | | | | 9th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 25-26-27, 2001 | | | | Special session on | | Artificial neural networks for Web computing | | | | Call for papers | ---------------------------------------------------- The Internet represents a new and challenging field for the application of machine learning techniques to devise systems that improve access to the information available on the web. This domain is particularly appealing since it is easy to collect large amounts of data to be used as training sets, while it is usually difficult to manually write sets of rules that solve interesting tasks. The aim of this special session is to present the state of the art in the field of connectionist systems applied to web computing. Possible application fields include distributed information retrieval issues such as the design of thematic search engines, user modeling algorithms for the personalization of services to access information on the web, automatic security management, and the design and improvement of web servers through prediction of request patterns.
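As a toy illustration of the point above, that request-pattern models can be learned directly from logs rather than hand-written as rules, a first-order transition model of page requests can be estimated from session data. All page names and sessions below are invented for the example; this is a minimal sketch, not a method proposed in the call:

```python
from collections import Counter, defaultdict

def train_transitions(sessions):
    """Count page-to-page transitions across logged user sessions."""
    transitions = defaultdict(Counter)
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, page):
    """Most frequently observed successor of `page`, or None if unseen."""
    followers = transitions.get(page)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Hypothetical logged sessions (each a list of requested pages)
sessions = [
    ["/", "/search", "/results"],
    ["/", "/search", "/results", "/item"],
    ["/", "/help"],
]
model = train_transitions(sessions)
print(predict_next(model, "/search"))  # -> "/results"
```

A server could use such predictions, for instance, to prefetch or cache the likely next page; real systems in this area would of course use far richer models.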
In particular the suggested topics are: - Personalization of the access to information on the web - Recommender systems on the web - Crawling policies for search engines - Focussed crawlers - Analysis and prediction of requests to web servers - Intelligent caching and proxies - Security issues (e.g. intrusion detection) For submission, refer to the instructions for authors available on the web site of the conference http://www.dice.ucl.ac.be/esann Please note that contributions are to be sent to the address specified in the call for papers and not directly to me. In any case, you must specify the title of this special session in the accompanying letter. I would also appreciate it if you could send me an e-mail to announce your submission. Contributions are submitted to a review process and will be rated according to their scientific value. Deadlines --------- Submission of papers December 8, 2000 Notification of acceptance February 5, 2001 Contacts -------- Session Organizer: Marco Maggini Dipartimento di Ingegneria dell'Informazione Università di Siena Via Roma 56 I-53100 Siena (Italy) e-mail: maggini at dii.unisi.it Conference secretariat: Michel Verleysen D facto conference services phone: + 32 2 420 37 57 27 rue du Laekenveld Fax: + 32 2 420 02 55 B - 1080 Brussels (Belgium) E-mail: esann at dice.ucl.ac.be From rwessel at ucsd.edu Tue Oct 3 18:03:23 2000 From: rwessel at ucsd.edu (Ralf Wessel) Date: Tue, 3 Oct 2000 15:03:23 -0700 Subject: postdoctoral position Message-ID: A POSTDOCTORAL POSITION IN THE NEUROPHYSIOLOGY OF VISUAL MOTION ANALYSIS is available in the laboratory of Ralf Wessel at Washington University/St. Louis beginning in March 2001 (starting date flexible). We study the biophysical mechanisms of visual motion processing at the level of synapses, single neurons and dendrites, and networks. A slice preparation of the chick tectum is used as a model system for these studies.
In this slice preparation, visual motion can be simulated via the spatio-temporal stimulation of retinal ganglion cell axons. At the same time, motion-sensitive tectal neurons are accessible for electrical recordings, optical imaging, and pharmacological manipulation. Experimental approaches are complemented by computational analysis to understand how the interaction of biophysical mechanisms at multiple levels of neural organization leads to motion and motion contrast sensitivity in vertebrate brains (http://www.physics.wustl.edu/~rw/). The highly interactive Neuroscience community at Washington University provides opportunities for collaborative investigations with experts who study the nervous system at different levels of neural organization (http://www.dbbs.wustl.edu/). St. Louis offers a range of recreational activities and affordable living (http://medinfo.wustl.edu/pub/diso/). Please email a CV, a brief summary of research experience, contact information for two references, and a list of abstracts and publications to: rw at wuphys.wustl.edu. From tleen at cse.ogi.edu Tue Oct 3 21:46:06 2000 From: tleen at cse.ogi.edu (Todd K.
Leen) Date: Tue, 03 Oct 2000 18:46:06 -0700 Subject: Postdoc Position at Oregon Graduate Institute Message-ID: <39DA8BDE.C087A85E@cse.ogi.edu> Postdoctoral Research Position in Statistical Pattern Recognition and Machine Learning for Environmental Forecasting Systems Professors Todd Leen (Computer Science and Engineering, http://www.cse.ogi.edu/~tleen) and Antonio Baptista (Environmental Science and Engineering, http://www.ccalmr.ogi.edu/baptista/baptista.html) are seeking a highly qualified Postdoctoral Research Associate to develop and apply statistical signal processing and pattern recognition technology to Environmental Observation and Forecasting Systems (EOFS). The goals of this NSF ITR project are to develop techniques that enhance the utility of the massive amounts of observational and model data produced by EOFS. Specific aims include detecting sensor corruption in non-stationary spatial-temporal systems, estimating true signals from corrupted sensor data, and detecting and characterizing regimes where numerical model anomalies are likely. The project centers around the COlumbia RIver Estuary (CORIE), an EOFS system that includes numerical models and observing stations for the river estuary and adjacent coastal waters (http://www.ccalmr.ogi.edu/CORIE). The project offers an opportunity to work in an interdisciplinary team with expertise in statistical machine learning, two- and three-dimensional finite element models of river dynamics, and year-round data collection stations throughout the estuary. We expect that the project will provide significant challenges and advances in the combination of time-series with spatial statistics, and machine learning and data mining. Candidates should have a PhD in Statistics, EE, CS, or a related field, with experience in machine learning, nonparametric statistics, or neural network modeling. Experience in either time-series analysis or spatial statistics is an advantage.
The initial appointment is for one year, but may be extended depending upon the availability of funding. Candidates able to start by January 1, 2001 will be given preference, although highly qualified candidates who seek a later start date should not hesitate to apply. If you are interested in applying for this position, please mail, fax, or email your CV (ascii text, postscript or pdf file), a letter of application, and a list of three or more references (with name, address, email, and phone number) to: Position #CSE10-1-00 Human Resources Department Oregon Graduate Institute 20000 NW Walker Road Beaverton, OR 97006-8921 E-mail submissions are accepted at jobs at admin.ogi.edu and must include the above information and position number in the message subject line. EEO/AA employer: women, minorities, and individuals with disabilities are encouraged to apply. Questions (not application material) may be directed to Professor Todd Leen, tleen at cse.ogi.edu, 503-748-1160 and Professor Antonio Baptista, baptista at ccalmr.ogi.edu, 503-748-1147. ================================================================== Oregon Graduate Institute is an independent graduate school and research institute located 10 miles west of downtown Portland, Oregon. OGI offers Masters and PhD programs in Biochemistry and Molecular Biology, Computer Science and Engineering, Electrical and Computer Engineering, Environmental Science and Engineering, and a Masters program in Management in Science and Technology. Portland offers diverse cultural activities including art, music, theater, entertainment and sports. Portland's extensive park system includes the largest wilderness park within the limits of any city in the US. Beyond the city limits, a 1-2 hour drive brings one to downhill skiing, high desert, forest hiking trails, the Pacific Ocean, and numerous wineries.
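As a purely illustrative sketch of the sensor-corruption detection task described in the announcement above (the readings, window size, and threshold below are invented for the example, and the project itself targets far harder non-stationary spatial-temporal settings), a rolling z-score can flag samples that deviate sharply from their recent history:

```python
import statistics

def flag_corrupted(readings, window=5, threshold=3.0):
    """Return indices of readings that deviate more than `threshold`
    standard deviations from the mean of the preceding `window` samples."""
    flags = []
    for i in range(window, len(readings)):
        past = readings[i - window:i]
        mu = statistics.mean(past)
        sigma = statistics.pstdev(past) or 1e-9  # guard against zero variance
        if abs(readings[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Hypothetical salinity readings with one corrupted sample at index 6
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 25.0, 10.0]
print(flag_corrupted(readings))  # -> [6]
```

A per-sensor univariate rule like this is only a baseline; the spatial correlations between observing stations mentioned in the ad are exactly what a more serious statistical approach would exploit.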
From S.Singh at exeter.ac.uk Tue Oct 3 05:46:07 2000 From: S.Singh at exeter.ac.uk (S.Singh@exeter.ac.uk) Date: Tue, 3 Oct 2000 10:46:07 +0100 (GMT Daylight Time) Subject: 3 PhD Studentships Available Message-ID: UNIVERSITY OF EXETER, UK SCHOOL OF ENGINEERING AND COMPUTER SCIENCE Department of Computer Science We now invite applications for the following three PhD studentships within our department. Please note the different deadlines for application and project details. 1. PhD Studentship as a Graduate Teaching Assistant Deadline for Application: 13 October, 2000 "Digital video retrieval based on multiple sources of information." The project will develop student skills in areas including neural networks, image analysis, pattern recognition and multimedia information retrieval. Candidates for this studentship should have a degree in computer science, engineering or a related subject. They should have programming skills in C/C++/JAVA and knowledge of the Unix operating system. Previous knowledge of image processing, multimedia or information retrieval is useful but not absolutely necessary. Further information may be obtained from the project supervisors Dr. S Singh (S.Singh at exeter.ac.uk) and Dr. G. Jones (G.J.F.Jones at exeter.ac.uk). The student will also work as a Graduate Teaching Assistant within the department, helping with the teaching of core undergraduate modules as well as postgraduate modules in their area of research. Assistants are required to attend the SEDA training course at the University and will be expected to undertake up to 150 hours teaching per annum. The SEDA training is aligned to the new National Institute of Learning and Teaching's Associate Part 1 award. 2. PhD Studentship Deadline for Application: 30 October, 2000 "Adaptive and Autonomous Image Understanding Systems" The project will explore intelligent algorithms for adaptive image understanding including context based reasoning.
A generic methodology will be developed based on image processing and pattern recognition methods to allow automatic processing of scene analysis data using video sequences. The goal of the project is to develop a seamless system that continuously evolves in real-time with video images. It is expected that prospective applicants have a good mathematical and analytical background, with programming experience in C/C++ and Unix/Linux operating systems. 3. PhD Studentship Deadline for Application: 30 October, 2000 "Ultrasound based object recognition and integration with image analysis" Only recently has it been demonstrated that ultrasound can be used to detect objects in controlled environments. It has been used for face recognition and obstacle avoidance. In this project we develop techniques for ultrasound-based object recognition and couple this ability with image processing on a mobile platform (robot). The ultrasound method will provide a cheaper, low-resolution front end, triggering the image processing component for more detailed analysis. The research will be tested in indoor and outdoor environments. It is expected that prospective applicants have a good mathematical and analytical background, with programming experience in C/C++ and Unix/Linux operating systems. The studentship covers UK/EU fees and maintenance (currently £6,620 pa). Funding for project (1) is for up to four years, and for projects (2,3) for up to three years. The successful candidates should expect to take up the studentship no later than 1 December 2000, or as soon as possible as negotiated with the department. Applicants should send a letter of application and a CV, including the names and addresses of two referees, to the project supervisor, Dr. Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK (email S.Singh at ex.ac.uk). Applicants should ask their referees to send their references directly to the above address.
Informal enquiries can be made on 01392-264053; further details of the Department's work in the above areas may be found on the web at www.dcs.ex.ac.uk/research/pann ------------------------------------------- Sameer Singh Department of Computer Science University of Exeter Exeter EX4 4PT United Kingdom Tel: +44-1392-264053 Fax: +44-1392-264067 e-mail: s.singh at ex.ac.uk WEB: http://www.ex.ac.uk/academics/sameer/index.htm ------------------------------------------- From bert at mbfys.kun.nl Thu Oct 5 09:05:36 2000 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 05 Oct 2000 15:05:36 +0200 Subject: Tenure track senior researcher at SNN University Nijmegen Message-ID: <39DC7CA0.3EC96DFA@mbfys.kun.nl> Senior researcher position (tenure track) available at SNN, University of Nijmegen, the Netherlands. Extended deadline. Background: The group consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and Bayesian methods. The group is part of the Laboratory of Biophysics, which is involved in experimental brain science. Recent research of the group has focused on the theoretical description of learning processes using the theory of stochastic processes and the design of efficient learning rules for Boltzmann machines and other graphical models using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; and the design of robust methods for confidence estimation with neural networks. Applied research is conducted on computer assisted medical diagnosis, music modeling, genetics and time-series prediction tasks. In 1997, SNN Nijmegen founded a company which sells commercial services and products in the field of neural networks, AI and statistics.
For more information see http://www.smart-research.nl and http://www.mbfys.kun.nl/SNN Job specification: SENIOR RESEARCHER (TENURE TRACK) The tasks of the senior researcher will be to conduct independent research in one of the above areas. In addition, it is expected that he/she will initiate novel research and will assist in the supervision of PhD students. The senior researcher should have a PhD in physics, mathematics or computer science and a strong theoretical background in neural networks. The salary will be between Dfl. 5589 and Dfl. 7694 per month, depending on experience. The position is available for 2 years with possible extension to 4 years. After 4 years, a permanent position is anticipated. Applications: Interested candidates should send a letter with a CV and list of publications before 1 November 2000 to Dr. H.J. Kappen, Stichting Neurale Netwerken, University of Nijmegen, Geert Grooteplein 21, 6525 EZ Nijmegen. For information contact Dr. H.J. Kappen, +31 24 3614241. From s.holden at cs.ucl.ac.uk Thu Oct 5 10:58:57 2000 From: s.holden at cs.ucl.ac.uk (Sean Holden) Date: Thu, 05 Oct 2000 15:58:57 +0100 Subject: Chair and Lectureship Message-ID: <39DC9730.17F0A987@cs.ucl.ac.uk> Dear Colleagues, Appended is an announcement of opportunities for a Chair of Bioinformatics and a Lectureship in Intelligent Systems at University College London, part of London University. Readers should note that a Lectureship in the U.K. is roughly the equivalent of an Assistant Professorship in the U.S. Best wishes, Sean. ----------------------------------------------------------------- Dr. Sean B. Holden email: S.Holden at cs.ucl.ac.uk Department of Computer Science phone: +44 (0)20 7679 3708 University College London fax: +44 (0)20 7387 1397 Gower Street, London WC1E 6BT, U.K.
http://www.cs.ucl.ac.uk/staff/S.Holden/ ----------------------------------------------------------------- UNIVERSITY COLLEGE LONDON Department of Computer Science and Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX) CHAIR OF BIOINFORMATICS Applications are sought for this new chair, funded through a joint initiative of the UK Research Councils.* The appointee will be expected to create and lead a new, world-class Bioinformatics Unit in the Department of Computer Science at UCL, applying state-of-the-art mathematical and computer science techniques to problems now arising in the life sciences, driven by the post-genomic era. The Unit will also work closely with the UCL Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX), which has a proven track record for interdisciplinary collaboration in these areas. A joint appointment will be made in Computer Science and a Life Sciences department appropriate to the specific life sciences interest of the appointee. We are seeking an individual to lead innovative, interdisciplinary research at the interface between mathematics, computer science and the life sciences. Our aim is to provide the intellectual and practical tools for deriving models of the interactions that determine the function of molecules, cells, tissues, and organisms. The successful candidate will have an outstanding reputation, with experience of leading interdisciplinary research, working effectively with industry and, where appropriate, of successfully managing the commercial application of research. Applications from individuals in any relevant discipline are encouraged. The successful candidate will benefit from a rich environment of relevant collaborative work at UCL. The Research Councils are also providing funding for a lectureship and a significant level of infrastructure support. The appointed Chair will be involved in appointing this lecturer.
*In recognition of the crucial role of bioinformatics, the Biotechnology and Biological Sciences Research Council (BBSRC), the Engineering and Physical Sciences Research Council (EPSRC), the Medical Research Council (MRC), the Natural Environment Research Council (NERC) and the Particle Physics and Astronomy Research Council (PPARC) are funding a new Unit and Chair at UCL. Lectureship in Intelligent Systems In addition to the above, we are creating a further new lectureship in the area of Intelligent Systems or Bioinformatics. Applications are invited from candidates with research experience in either of these areas. Prospective applicants are invited to make informal enquiries to and can obtain further particulars from Professor Steve Wilbur, Head, Department of Computer Science, (telephone +44 (0)20 7679 1397, fax +44 (0)20 7387 1397, e-mail: s.wilbur at cs.ucl.ac.uk) or Professor Anne Warner, FRS, Director of CoMPLEX (telephone +44 (0)20 7679 7279, fax +44 (0)20 7387 3014, e-mail: a.warner at ucl.ac.uk), Gower Street, London WC1E 6BT, UK. Further details can also be found at http://www.ucl.ac.uk/CoMPLEX. Applications (10 copies for UK-based candidates, one copy for overseas candidates), including a curriculum vitae and the names and addresses of three referees (at least one of whom should be based in a country other than the candidate's country of residence), should be sent to the Provost and President, UCL, Gower Street, London, WC1E 6BT, UK, to arrive no later than 6 November 2000.
Working towards Equal Opportunity ------------------------------------------------------------------------------- From P.J.Lisboa at livjm.ac.uk Thu Oct 5 11:58:12 2000 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Thu, 5 Oct 2000 16:58:12 +0100 Subject: First Call for Papers for NNESMED'2001 Message-ID: NNESMED 2001 4th International Conference on Neural Networks and Expert Systems in Medicine and Healthcare Milos Island, Greece, 20-22 June 2001 Web site address: http://www.heliotopos.net/conf/nnesmed2001/ FIRST ANNOUNCEMENT AND CALL FOR PAPERS NNESMED 2001 is organised by the Department of Applied Informatics and Multimedia of the Technological Educational Institute of Crete. Previous conferences, held in Plymouth (94, 96) and in Pisa (98), attracted strong international participation. The conference brings together researchers from different AI modalities with a focus on integrated, practical healthcare applications. The traditional single-track format for all presentations fosters discussion and information exchange among those working in these inter-related fields, reflecting increasing interest from clinicians in healthcare informatics at every level. CONFERENCE CHAIR GM Papadourakis Technological Educational Institute of Crete INTERNATIONAL ORGANIZING COMMITTEE EC Ifeachor, B Jervis, P Ktonas, P Lisboa and A Starita INTERNATIONAL PROGRAMME COMMITTEE Y Attikiouzel, G Bebis, N Bourbakis, B Dawant, M Gori, B Jansen, N Karayiannis, D Koutsouris, T Leo, D Linkens, D Lowe, S Michelogiannis, K Rosen, C Schizas, A Sperduti, A Taktak and M Zervakis LOCATION The island of Milos, an entirely volcanic island, is located 86 nautical miles from the port of Piraeus, in the south-western part of the Cyclades group of islands, in the Aegean Sea. It is well known for the largest natural harbour in the Mediterranean, its mineral wealth, as well as for its unique beaches of white sand and crystal clear waters.
It is also famous for the marble statue known as the Venus de Milo, now located in the Louvre. TOPICS OF INTEREST * Artificial Neural Networks * Data mining * Expert Systems * Verification and validation * Fuzzy logic and fuzzy expert systems * Risk assessment * Neuro-fuzzy * Safety-critical issues * Uncertainty handling * User interface, human computer interaction * Causal probability networks * Computer aided tutoring * Model based reasoning * Intelligent signal processing * Genetic and adaptive search algorithms * Multimedia * Hybrid intelligent systems * Telemedicine and telecare * Chaos theory * Computer assisted diagnosis and prognosis * Machine learning * Auditing and evaluation of healthcare practices * Novelty detection CALL FOR PAPERS Authors are asked to submit, by email to the conference secretariat, 2 copies of extended abstracts, 2 pages in length with single spacing, in English, by 15 January 2001. Extended abstracts should clearly identify the healthcare context of the work, the methodology used, advances made and significance of the results obtained, as well as containing contact details including the corresponding author, mail address, fax numbers and email address. Authors whose abstracts are accepted will be asked to develop them into full papers of 4-6 pages in length for inclusion in the conference proceedings. Important deadlines are: Submission of extended abstracts: 15 January 2001 Notification of provisional acceptance: 15 March 2001 Submission of full papers (camera ready): 15 May 2001 The website contains a pre-registration form to ensure that anyone interested in attending the conference receives further information in electronic form.
From caruana+ at cs.cmu.edu Wed Oct 4 10:37:21 2000 From: caruana+ at cs.cmu.edu (Rich Caruana) Date: Wed, 04 Oct 2000 10:37:21 -0400 Subject: NIPS*2000 workshop information Message-ID: * * * Post-NIPS*2000 Workshops * * * * * * Breckenridge, Colorado * * * * * * December 1-2, 2000 * * * The NIPS*2000 Workshops will be held Friday and Saturday, December 1 and 2, in Breckenridge, Colorado, after the NIPS conference in Denver Monday-Thursday, November 27-30. This year there are 18 workshops: - Affective Computing - Algorithms and Technologies for Neuroprosthetics and Neurorobotics - Computation in the Cortical Column - Computational Molecular Biology - Computational Neuropsychology - Cross-Validation, Bootstrap, and Model Selection - Data Fusion -- Theory and Applications - Data Mining and Learning on the Web - Explorative Analysis and Data Modeling in Functional Neuroimaging - Geometric Methods in Learning Theory - Information and Statistical Structure in Spike Trains - Learn the Policy or Learn the Value-Function? - New Perspectives in Kernel-based Learning Methods - Quantum Neural Computing - Representing the Structure of Visual Objects - Real-Time Modeling for Complex Learning Tasks - Software Support for Bayesian Analysis Systems - Using Unlabeled Data for Supervised Learning All workshops are open to all registered attendees. Many workshops also invite submissions. Submissions, and questions about individual workshops, should be directed to each workshop's organizers. Included below is a short description of each workshop. Additional information is available at the NIPS*2000 Workshop Web Page: http://www.cs.cmu.edu/Groups/NIPS/NIPS2000/Workshops/ Information about registration, travel, and accommodations for the main conference and the workshops is available at: http://www.cs.cmu.edu/Web/Groups/NIPS/ Breckenridge is a ski resort a few hours' drive from Denver.
The daily workshop schedule is designed to allow participants to ski half days, or enjoy other extra-curricular activities. Some may wish to extend their visit to take advantage of the relatively low pre-season rates. We look forward to seeing you in Breckenridge. Rich Caruana and Virginia de Sa NIPS Workshops Co-chairs * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Affective Computing Workshop Co-Chairs: Javier R. Movellan, Institute for Neural Computation, UCSD Marian Bartlett, Institute for Neural Computation, UCSD Gary Cottrell, Computer Science, UCSD Rosalind W. Picard, Media Lab, MIT Description: The goal of this workshop is to explore and discuss the idea of affective computers, i.e., computers that have the ability to express emotions, recognize emotions, and whose behavior is modulated by emotional states. Emotions are a fundamental part of human intelligence. It may be argued that emotions provide an "operating system" for autonomous agents that need to handle the uncertainty of natural environments in a flexible and efficient manner. Connectionist models of emotion dynamics have been developed (Velasquez, 1996), providing examples of computational systems that incorporate emotional dynamics and that are being used in actual autonomous agents (e.g., robotic pets). Emotional skills, especially the ability to recognize and express emotions, are essential for natural communication between humans, and until recently have been absent from the computer side of the human-computer interaction. For example, autonomous teaching agents and pet robots would greatly benefit from detecting affective cues from the users (curiosity, frustration, insight, anger) and adjusting to them, and also from displaying emotion appropriate to the context. The workshop will bring together leaders in the main research areas of affective computing: emotion recognition, emotion synthesis, emotion dynamics, and applications.
Speakers from industry will discuss current applications of affective computing, including synthesizing facial expressions in the entertainment industry, increasing the appeal of pet robots through emotion recognition and synthesis, and measuring galvanic skin response through the mouse to determine user frustration. Format: This will be a one-day workshop. The speakers will be encouraged to talk about challenges and controversial topics both in their prepared talks and in the ensuing discussions. Since one of the goals of the workshop is to facilitate communication between researchers in different subfields, ample time will be given to questions. The last part of the workshop will be devoted to a discussion of the most promising approaches and ideas that will have emerged during the workshop. Contact Info: Javier R. Movellan Institute for Neural Computation University of California San Diego La Jolla, CA 92093-0515 movellan at inc.ucsd.edu Marian Stewart Bartlett Institute for Neural Computation University of California San Diego La Jolla, CA 92093-0515 marni at inc.ucsd.edu * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Algorithms, technologies, and neural representations for neuroprosthetics and neurorobotics Organizers: Simon F Giszter and Karen A Moxon http://neurobio.mcphu.edu/GiszterWeb/nips_ws_2k.html Goals and objectives: The goal of the workshop is to bring together researchers interested in neuroprosthetics, neurorobotics, and intermediate representations in the brain, with a view to generating a lively discussion of the design principles for a brain-to-artificial-device interface. Speakers will be charged to address (favourably or unfavourably) the idea that the nervous system is built around, or dynamically organizes, low dimensional representations which may be used in (or need to be designed into) neuroprosthetic interfaces and controllers. Some current prosthetics are built around explicit motor representations, e.g.
kinematic plans. Though controversial, the notion of primitives and low dimensional representations of input and output is gaining favor. These may or may not contain or be used in explicit plans. It is very likely that the appropriate choices of sensory and motor representations and motor elements are critical for the design of an integrated sensory-motor prosthesis that enables rapid adaptive learning, and creative construction of new motions, planning, and execution. With a burgeoning interest in neuroprosthetics, it is therefore timely to address how the interfaces to neuroprostheses should be conceptualized: what representations should be extracted, what control elements should be provided, and how these should be integrated. We hope to engage a wide range of perspectives to address needs for research and the possibilities enabled by neuroprosthetics. We intend to assemble presentations and discussions from the perspectives of both neural data and theory, of new technologies and algorithms, and of applications or experimental approaches enabled by new and current technologies. Anticipated or fully confirmed speakers/participants: John Chapin: Neurorobotics Nikos Hatzopoulos: Neural coding in cortex of primates Scott Makeig or Terry Sejnowski: EEG based controllers: representations James Abbas: peripheral FES with CPG models Warren Grill and Michel Lemay: Intraspinal FES and force-fields Karen Moxon: sensory prostheses Gerry Loeb: intramuscular prostheses and spinal controls Simon Giszter: spinal primitives and interface Emo Todorov: cortical encoding and representation Igo Krebs: rehabilitation with robots * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * COMPUTATION IN THE CORTICAL COLUMN Organizers: Henry Markram and Jennifer Linden URL: http://www.keck.ucsf.edu/~linden/ColumnWorkshop.html Understanding computation in the cortical column is a holy grail for both experimental and theoretical neuroscience.
The basic six-layered neocortical columnar microcircuit, implemented most extensively (and perhaps in its most sophisticated form) in the human brain, supports a huge variety of sensory, cognitive, and motor functions. The secret behind the incredible flexibility and power of cortical columns has remained elusive, but new insights are emerging from several different areas of research. It is a great time for cortical anatomists, physiologists, modellers, and theoreticians to join forces in attempting to decipher computation in the cortical column. In this workshop, leading experimental and theoretical neuroscientists will present their own visions of computation in the cortical column, and will debate their views with an interdisciplinary audience. During the morning session, speakers and panel members will analyze columnar computation from their perspectives as authorities on the anatomy, physiology, evolution, and network properties of cortical microcircuitry. Speakers and panelists in the afternoon session will consider the functional significance of the cortical column in light of their expert knowledge of two columnar systems which have attracted intensive experimental attention to date: the visual cortex of cats and primates, and the barrel cortex of rodents. The goal of the workshop will be to define answers to four questions. ANATOMY: Does a common denominator, a repeating microcircuit element, exist in all neocortex? PHYSIOLOGY: What are the electrical dynamics, the computations, of the six-layered cortical microcircuit? FUNCTION: How do cortical columns contribute to perception? EVOLUTION: How does the neocortex confer such immense adaptability? 
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Computational Molecular Biology Organizers: Tommi Jaakkola, MIT Nir Friedman, Hebrew University For more information contact the workshop organizers at: tommi at ai.mit.edu nir at cs.huji.ac.il * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * NIPS*2000 Workshop on Computational Neuropsychology Workshop Organizers: Sara Solla Northwestern University Michael Mozer University of Colorado Martha Farah University of Pennsylvania The 1980's saw two important developments in the sciences of the mind: The development of neural network models in cognitive psychology, and the rise of cognitive neuroscience. In the 1990's, these two separate approaches converged, and one of the results was a new field that we call "Computational Neuropsychology." In contrast to traditional cognitive neuropsychology, computational neuropsychology uses the concepts and methods of computational modeling to infer the normal cognitive architecture from the behavior of brain-damaged patients. In contrast to traditional neural network modeling in psychology, computational neuropsychology derives constraints on network architectures and dynamics from functional neuroanatomy and neurophysiology. Unfortunately, work in computational neuropsychology has had relatively little contact with the Neural Information Processing Systems (NIPS) community. Our workshop aims to expose the NIPS community to the unusual patient cases in neuropsychology and the sorts of inferences that can be drawn from these patients based on computational models, and to expose researchers in computational neuropsychology to some of the more sophisticated modeling techniques and concepts that have emerged from the NIPS community in recent years. 
We are interested in speakers from all aspects of neuropsychology, including: * attention (neglect) * visual and auditory perception (agnosia) * reading (acquired dyslexia) * face recognition (prosopagnosia) * memory (Alzheimer's, amnesia, category-specific deficits) * language (aphasia) * executive function (schizophrenia, frontal deficits). Further information about the workshop can be obtained at: http://www.cs.colorado.edu/~mozer/nips2000workshop.html Contact Sara Solla (solla at nwu.edu) or Mike Mozer (mozer at colorado.edu) if you are interested in speaking at the workshop. * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Call for Papers NIPS*2000 Workshop: Cross-Validation, Bootstrap and Model Selection Organizers: Rahul Sukthankar Compaq CRL and Robotics Institute, Carnegie Mellon Larry Wasserman Department of Statistics, Carnegie Mellon Rich Caruana Center for Automated Learning and Discovery, Carnegie Mellon Electronic Submission Deadline: October 18, 2000 (extended abstracts) Description Cross-validation and bootstrap are popular methods for estimating generalization error based on resampling a limited pool of data, and have become widely used for model selection. The aim of this workshop is to bring together researchers from both machine learning and statistics in an informal setting to discuss current issues in resampling-based techniques. These include: * Improving theoretical bounds on cross-validation, bootstrap or other resampling-based methods; * Empirical or theoretical comparisons between resampling-based methods and other forms of model selection; * Exploring the issue of overfitting in resampling-based methods; * Efficient algorithms for estimating generalization error; * Novel resampling-based approaches to model selection. The format for this one-day workshop consists of invited talks, a panel discussion, and short presentations from accepted submissions.
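The resampling idea at the heart of this topic can be sketched in a few lines. The following is an illustrative toy (standard-library Python, choosing between a constant and a linear fit on synthetic data), not any method or submission from the workshop:

```python
# Minimal k-fold cross-validation for model selection, standard library only.
# The models (constant vs. linear least-squares fit) and data are illustrative.
import random

def kfold_indices(n, k):
    """Shuffle indices 0..n-1 and split them into k disjoint validation folds."""
    idx = list(range(n))
    random.shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cv_error(xs, ys, fit, predict, k=5):
    """Mean squared validation error, averaged over k folds."""
    total = 0.0
    for fold in kfold_indices(len(xs), k):
        hold = set(fold)
        tr_x = [x for i, x in enumerate(xs) if i not in hold]
        tr_y = [y for i, y in enumerate(ys) if i not in hold]
        model = fit(tr_x, tr_y)
        total += sum((predict(model, xs[i]) - ys[i]) ** 2 for i in fold) / len(fold)
    return total / k

def fit_const(xs, ys):                      # model 1: always predict the mean
    return sum(ys) / len(ys)

def fit_linear(xs, ys):                     # model 2: least-squares line
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

random.seed(0)
xs = list(range(20))
ys = [2 * x + 1 for x in xs]                # noiseless line: linear model wins
err_const = cv_error(xs, ys, fit_const, lambda m, x: m)
err_linear = cv_error(xs, ys, fit_linear, lambda m, x: m[0] * x + m[1])
```

Model selection then amounts to picking the hypothesis class with the lower cross-validated error (here, the linear fit).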
Participants are encouraged to submit extended abstracts describing their current research in this area. Results presented at other conferences are eligible, provided that they are of broad interest to the community and clearly identified as such. Submissions for workshop presentations must be received by October 18, 2000, and should be sent to rahuls=nips at cs.cmu.edu. Extended abstracts should be in Postscript or Acrobat format and 1-2 pages in length. Contact Information The workshop organizers can be contacted by email at , or at the phone/fax numbers listed below. Organizer Email Phone Fax Rahul Sukthankar rahuls at cs.cmu.edu +1-617-551-7694 +1-617-551-7650 Larry Wasserman larry at stat.cmu.edu +1-412-268-8727 +1-412-268-7828 Rich Caruana caruana at cs.cmu.edu +1-412-268-7664 * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Workshop Title: Data Fusion -- Theory and Applications Multisensor data fusion refers to the acquisition, processing, and synergistic combination of information gathered by various knowledge sources and sensors to provide a better understanding of the phenomenon under consideration. The concept of fusion underlies many information processing mechanisms in machines and in biological systems. In biological/perceptual systems, information fusion seems to account for remarkable performance and robustness when confronted with a variety of uncertainties. The complexity of fusion processes is due to many factors, including the uncertainties associated with different information sources and the complementarity of individual sources. For example, modeling, processing, fusion, and interpretation of diverse sensor data for knowledge assimilation and inferencing pose challenging problems, especially when available information is incomplete, inconsistent, and/or imprecise.
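A standard textbook instance of such a synergistic combination (a generic technique, not one specific to this workshop) is minimum-variance fusion of independent, unbiased sensor estimates, weighting each by its inverse variance; the sensor readings below are made up for illustration:

```python
# Inverse-variance (minimum-variance linear) fusion of independent,
# unbiased sensor estimates of the same quantity. Numbers are illustrative.

def fuse(estimates, variances):
    """Return the fused estimate and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    return mean, 1.0 / total

# Two sensors observing the same quantity with different noise levels:
fused_mean, fused_var = fuse([10.2, 9.6], [0.5, 1.0])
```

The fused variance, 1/(1/0.5 + 1/1.0) = 1/3, is smaller than that of either sensor alone, which is the formal sense in which fusion yields the "remarkable performance and robustness" mentioned above.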
The potential for significantly enhanced performance and robustness has motivated vigorous ongoing research in both biological and artificial multisensor data fusion algorithms, architectures, and applications. Such efforts deal with fundamental issues including the modeling process, architectures and algorithms, information extraction, the fusion process, optimization of fused performance, and real-time (dynamic) fusion. The goal of this workshop is to bring together researchers from diverse fields (learning, human-computer interaction, vision, speech, neurobiology, etc.) to discuss both theoretical and application issues that are relevant across different fields. It also aims to make the NIPS community aware of the various aspects and current status of this field, as well as the problems that remain unsolved. We are calling for participation. Submissions should be sent to the workshop organizers. Workshop Organizers: Misha Pavel pavel at ece.ogi.edu (503) 748-1155 (o) Dept. of Electrical and Computer Engineering Oregon Graduate Institute of Science and Technology 20000 NW Walker Road Beaverton, OR 97006 Xubo Song xubosong at ece.ogi.edu (503) 748-1311 (o) Dept. of Electrical and Computer Engineering Oregon Graduate Institute of Science and Technology 20000 NW Walker Road Beaverton, OR 97006 Workshop Web Page: http://www.ece.ogi.edu/~xubosong/FusionWorkshop.html * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Data Mining and Learning on the Web Organizers: Gary William Flake (flake at research.nj.nec.com), Frans Coetzee (coetzee at research.nj.nec.com), and David Pennock (dpennock at research.nj.nec.com) No doubt about it, the web is big. So big, in fact, that many classical algorithms for databases and graphs cannot scale to the distributed multi-terabyte anarchy that is the web. How, then, do we best use, mine, and model this rich collection of data?
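One classical answer exploits the hyperlink graph directly. As a hedged illustration (a standard link-analysis technique, not necessarily one discussed at the workshop), here is PageRank by power iteration on a toy link graph; the graph and damping factor are made up:

```python
# PageRank by power iteration on a toy hyperlink graph (no dangling pages).
# Graph and damping factor are illustrative only.

def pagerank(links, damping=0.85, iters=100):
    """links maps each page to the list of pages it links to."""
    n = len(links)
    rank = {page: 1.0 / n for page in links}
    for _ in range(iters):
        new = {page: (1.0 - damping) / n for page in links}
        for page, outs in links.items():
            share = damping * rank[page] / len(outs)  # split rank over out-links
            for target in outs:
                new[target] += share
        rank = new
    return rank

toy_web = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}
ranks = pagerank(toy_web)   # page "b" outranks "a": it has more in-links
```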
Arguably, the best approach to developing scalable non-trivial applications and algorithms is to exploit the fact that, both as a graph and as a database, the web is highly non-random. Furthermore, since the web is mostly created and organized by humans, the graph structure (in the form of hyperlinks) encodes aspects of the content, and vice-versa. We will discuss methods that exploit these properties of the web. We will further consider how many of the classical algorithms, which were formulated to minimize worst-case performance over all possible problem instances, can be adapted to the more regular structure of the web. Finally, we will attempt to identify major open research directions. This workshop will be organized into three mini-sessions: (1) Systematic Web Regularities, (2) Web Mining Algorithms, and (3) Inferable Web Regularities. All speakers are invited, and a partial list of confirmed speakers includes: Albert-László Barabási, Justin Boyan, Rich Caruana, Soumen Chakrabarti, Monika Henzinger, Ravi Kumar, Steve Lawrence, and Andrew McCallum. * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * EXPLORATIVE ANALYSIS AND DATA MODELING IN FUNCTIONAL NEUROIMAGING: Arriving at, not starting with a hypothesis Advanced analysis methods and models of neuroimaging data are only marginally represented at the big international brain mapping meetings. This contrasts with the broad belief in the neuroimaging community that these approaches are crucial to the further development of the field. The purpose of this NIPS workshop is to bring together theoreticians developing and applying new methods of neuroimaging data interpretation. The workshop focuses on explorative analysis (a) and modeling (b) of neuroimaging data: a) Higher-order explorative analysis (for example: ICA/PCA, clustering algorithms) can reveal data properties in a data-driven, not hypothesis-driven manner. b) Models for neuroimaging data can guide the data interpretation.
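Point (a) can be made concrete with a small sketch: principal component analysis of a synthetic (scans x voxels) matrix recovers the dominant mode of variation without any prior hypothesis. This illustrates the general idea only, on made-up data, and assumes NumPy is available:

```python
# Data-driven (hypothesis-free) PCA of a synthetic data matrix:
# rows = observations (e.g. scans), columns = variables (e.g. voxels).
import numpy as np

def pca(data, n_components):
    centered = data - data.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    order = np.argsort(evals)[::-1][:n_components]
    return centered @ evecs[:, order], evals[order]

# Synthetic data with one dominant spatial mode plus small noise:
rng = np.random.default_rng(0)
timecourse = rng.normal(size=(100, 1))
pattern = np.array([[1.0, 2.0, 0.5]])
data = timecourse @ pattern + 0.01 * rng.normal(size=(100, 3))
scores, variances = pca(data, 2)
```

The leading eigenvalue dwarfs the rest, flagging the dominant mode; this is exactly the kind of structure a data-driven analysis is meant to reveal before any hypothesis is formed.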
The universe of possible functional hypotheses can be constrained by models linking the data to other empirical data, for instance to anatomical connectivity (pathway analysis to yield effective connectivity), to encephalography data, or to behavior (computational function). The talks will introduce the new approaches; in the discussions we hope to address the benefits and problems of the various methods and of their possible combination. It is intended to discuss not only the theory behind the various approaches, but also their value for improved data interpretation. Therefore, we strongly encourage the participation of neuroimaging experimenters concerned with functional paradigms suggesting the use of nonstandard data interpretation methods. More information can be found on the workshop webpage: http://www.informatik.uni-ulm.de/ni/staff/FSommer/workshops/nips_ws00.html For any requests please contact the workshop organizers: Fritz Sommer and Andrzej Wichert Department of Neural Information Processing University of Ulm D-89069 Ulm Germany Tel. 49(731)502-4154 49(731)502-4257 FAX 49(731)502-4156 * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Geometric and Quantum Methods in Learning Organization: Shunichi Amari, Amir Assadi (Chair), and Tomaso Poggio Description. The purpose of this workshop is to attract the attention of the learning community to geometric methods and to take on an endeavor to: 1. lay out a geometric paradigm for formulating profound ideas in learning; 2. facilitate the development of geometric methods suitable for the investigation of new ideas in learning theory. Today's continuing advances in computation make it possible to infuse geometric ideas into learning that would otherwise have been computationally prohibitive. Quantum computation has created great excitement, offering a broad spectrum of new ideas for the discovery of parallel-distributed algorithms, a hallmark of learning theory.
In addition, geometry and quantum computation together offer a more profound picture of the physical world and of how it interacts with the brain, the ultimate learning system. Among the discussion topics, we envision the following: information geometry; differential-topological and quantum methods for turning local estimates into global quantities and invariants; Riemannian geometry and Feynman path integration as a framework to explore nonlinearity; and information theory of massive data sets. We will also examine the potential impact of learning theory on the future development of geometry, and examples of how quantum computation has opened new vistas on the design of parallel-distributed algorithms. The participants of the Workshop on Quantum Computation will find this workshop's geometric ideas beneficial for the theoretical aspects of quantum algorithms and quantum information theory. We plan to prepare a volume based on the materials for the workshops and other contributions, to be proposed to the NIPS Program Committee. Contact Information Amir Assadi University of Wisconsin-Madison. URL: www.cms.wisc.edu/~cvg E-mail: ahassadi at facstaff.wisc.edu Partial List of Speakers and Panelists Shun-Ichi Amari Amir Assadi Zubin Ghahramani Geoffrey Hinton Tomaso Poggio Jose Principe Scott Makeig Naoki Saito (tentative) * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Information and Statistical Structure in Spike Trains: How can we calculate what we really want to know? Organizer: Jonathan D. Victor jdvicto at med.cornell.edu Advances in understanding how neurons represent and manipulate information in their spike trains will require a combination of appropriate theoretical, computational, and experimental strategies.
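As a concrete example of the information-theoretic machinery at issue, here is the simplest "plug-in" entropy estimate over binned spike words. It is illustrative only, on toy data, and its well-known sampling bias is precisely the kind of limitation that motivates better techniques:

```python
# Plug-in (maximum-likelihood) entropy estimate of a binned spike train:
# cut the binary train into non-overlapping words and estimate H from the
# observed word frequencies. Toy data; this estimator is biased for small samples.
from collections import Counter
from math import log2

def word_entropy(spikes, word_len):
    """Entropy in bits per word, over consecutive non-overlapping words."""
    words = [tuple(spikes[i:i + word_len])
             for i in range(0, len(spikes) - word_len + 1, word_len)]
    n = len(words)
    return -sum((c / n) * log2(c / n) for c in Counter(words).values())

# A perfectly regular train carries no word-level information,
# while a train alternating between two equiprobable words carries one bit/word:
regular = word_entropy([0, 1] * 8, word_len=2)
mixed = word_entropy([0, 0, 0, 1] * 4, word_len=2)
```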
The workshop has several aims: (1) By presenting currently available methods in a tutorial-like fashion, we hope to lower the energy barrier to experimentalists who are interested in using information-theoretic and related approaches, but have not yet done so. The presentation of current methods is to be done in a manner that emphasizes the theoretical underpinnings of different strategies and the assumptions and tradeoffs that they make. (2) By providing a forum for open discussion among current practitioners, we hope to make progress towards understanding the relationships of the available techniques, guidelines for their application, and the basis of the differences in findings across preparations. (3) By presenting the (not fully satisfactory) state of the art to an audience that includes theorists, we hope to spur progress towards the development of better techniques, with a particular emphasis on exploiting more refined hypotheses for spike train structure, and developing techniques that are applicable to multi-unit recordings. A limited number of slots are available for contributed presentations. Individuals interested in presenting a talk (approximately 20 minutes, with 10 to 20 minutes for discussion) should submit a title and abstract, 200-300 words, to the organizer by October 22, 2000. Please indicate projection needs (overheads, 2x2 slides, LCD data projector). For further information, please see http://www-users.med.cornell.edu/~jdvicto/nips2000.html * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Title: ====== Reinforcement Learning: Learn the Policy or Learn the Value-Function?
Organising Committee ==================== Peter Bartlett (Peter.Bartlett at anu.edu.au) Jonathan Baxter (jbaxter at whizbang.com) David McAllester (dmac at research.att.com) Home page ========= http://csl.anu.edu.au/~bartlett/rlworkshop Workshop Outline ============== There are essentially three main approaches to reinforcement learning in large state spaces: 1) Learn an approximate value function and use that to generate a policy, 2) Learn the parameters of the policy directly, typically using a Monte-Carlo estimate of the performance gradient, and 3) "Actor-Critic" methods that seek to combine the best features of 1) and 2). There has been a recent revival of interest in this area, with many new algorithms being proposed in the past two years. It seems the time is right to bring together researchers for an open discussion of the three different approaches. Submissions are sought on any topic of related interest, such as new algorithms for reinforcement learning, but we are particularly keen to solicit contributions that shed theoretical or experimental light on the relative merits of the three approaches, or that provide a synthesis or cross-fertilization between the different disciplines. Format ====== There will be invited talks and a series of short contributed talks (15 minutes), with plenty of discussion time. If you are interested in presenting at the workshop, please send a title and short abstract to jbaxter at whizbang.com * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * New Perspectives in Kernel-based Learning Methods Nello Cristianini, John Shawe-Taylor, Bob Williamson http://www.cs.rhbnc.ac.uk/colt/nips2000.html Abstract: The aim of the workshop is to present new perspectives and new directions in kernel methods for machine learning. Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel functions in learning systems. 
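Concretely, a kernel is just a similarity function whose pairwise evaluations form a Gram matrix. The sketch below (standard-library Python, illustrative points only) shows the Gaussian kernel and how its width acts as a model-selection parameter: a narrow kernel makes the Gram matrix nearly the identity (memorization), a wide one makes all points look alike (oversmoothing):

```python
# Gaussian (RBF) kernel and its Gram matrix, standard library only.
# The width sigma is the model-selection knob; the points are illustrative.
import math

def gaussian_kernel(x, y, sigma):
    dist2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-dist2 / (2.0 * sigma ** 2))

def gram_matrix(points, sigma):
    return [[gaussian_kernel(a, b, sigma) for b in points] for a in points]

points = [(0.0,), (1.0,), (2.0,)]
narrow = gram_matrix(points, sigma=0.1)    # off-diagonals nearly 0
wide = gram_matrix(points, sigma=100.0)    # off-diagonals nearly 1
```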
Support Vector Machines, Gaussian Processes, kernel PCA, kernel Gram-Schmidt, Bayes Point Machines, Relevance and Leverage Vector Machines, are just some of the algorithms that make crucial use of kernels for problems of classification, regression, density estimation, novelty detection and clustering. At the same time as these algorithms have been under development, novel techniques specifically designed for kernel-based systems have resulted in methods for assessing generalisation, implementing model selection, and analysing performance. The choice of model may be simply determined by parameters of the kernel, as for example the width of a Gaussian kernel. More recently, however, methods for designing and combining kernels have created a toolkit of options for choosing a kernel in a particular application. These methods have extended the applicability of the techniques beyond the natural Euclidean spaces to more general discrete structures. The field is witnessing growth on a number of fronts, with the publication of books, editing of special issues, organization of special sessions and web-sites. Moreover, a convergence of ideas and concepts from different disciplines is occurring. The growth is concentrated in four main directions: 1) design of novel kernel-based algorithms 2) design of novel types of kernel functions 3) development of new learning theory concepts 4) application of the techniques to new problem areas Extended abstracts may be submitted before October 30th to nello at dcs.rhbnc.ac.uk * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Quantum Neural Computing http://web.physics.twsu.edu/behrman/NIPS.htm Recently there has been a resurgence of interest in quantum computers because of their potential for being very much smaller and faster than classical computers, and because of their ability in principle to do heretofore impossible calculations, such as factorization of large numbers in polynomial time. 
This workshop will explore ways to implement quantum computing in network topologies, thus exploiting both the intrinsic advantages of quantum computing and the adaptability of neural computing. Aspects/approaches to be explored will include: quantum hardware, e.g., NMR, quantum dots, and molecular computing; theoretical and practical limits to quantum and quantum neural computing, e.g., noise and measurability; and simulations. Targeted groups: computer scientists, physicists and mathematicians interested in quantum computing and next-generation computing hardware. Invited speakers will include: Paul Werbos, NSF Program Director, Control, Networks & Computational Intelligence Program, Electrical and Communications Systems Division, who will keynote the workshop. Thaddeus Ladd, Stanford, "Crystal lattice quantum computation." Mitja Perus, Institute BION, Stegne 21, SI-1000 Ljubljana, Slovenia, "Quantum associative nets: A new phase processing model" Ron Spencer, Texas A&M University, "Spectral associative memories." E.C. Behrman, J.E. Steck, and S.R. Skinner, Wichita State University, "Simulations of quantum neural networks." Dan Ventura, Penn State: "Linear optics implementation of quantum algorithms." Ron Chrisley, TBA Send contributed papers, by October 20th, to: Co-chairs: Elizabeth C. Behrman behrman at wsuhub.uc.twsu.edu James E. Steck steck at bravo.engr.twsu.edu This Workshop is partially supported by the National Science Foundation, Grant #ECS-9820606.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Workshop title: Representing the Structure of Visual Objects Web page: http://kybele.psych.cornell.edu/~edelman/NIPS00/index.html Organizers: Nathan Intrator (Nathan_Intrator at brown.edu) Shimon Edelman (se37 at cornell.edu) Confirmed invited speakers: Ron Chrisley (Sussex) John Hummel (UCLA) Christoph von der Malsburg (USC) Pietro Perona (Caltech) Tomaso Poggio (MIT) Greg Rainer (Tuebingen) Manabu Tanifuji (RIKEN) Shimon Ullman (Weizmann) Description: The focus of theoretical discussion in visual object processing has recently started to shift from problems of recognition and categorization to the representation of object structure. The main challenges there are productivity and systematicity, two traits commonly attributed to human cognition. Intuitively, a cognitive system is productive if it is open-ended, that is, if the set of entities with which it can deal is, at least potentially, infinite. Systematicity, even more than productivity, is at the crux of the debate focusing on the representational theory of mind. A visual representation could be considered systematic if a well-defined change in the spatial configuration of the object (e.g., swapping top and bottom parts) were to cause a principled change in the representation (the representations of top and bottom parts are swapped). In vision, this issue (as well as compositionality, commonly seen as the perfect means of attaining systematicity) is, at present, wide open. The workshop will start with an introductory survey of the notions of productivity, systematicity and compositionality, and will consist of presentations by the proponents of some of the leading theories in the field of structure representation, interspersed with open-floor discussion. 
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Title: Real-Time Modeling for Complex Learning Tasks The goal of this workshop is to develop a better understanding of how to create new statistical learning techniques that can deal with complex, high-dimensional data sets, where (possibly redundant and/or irrelevant) data is received continuously from sensors and needs to be incorporated in learning models that may have to change their structure during learning, under real-time constraints. The workshop aims at bringing together researchers from various theoretical learning frameworks (Bayesian methods, nonparametric statistics, kernel methods, Gaussian processes, etc.) and application domains to discuss future research directions for principled approaches towards real-time learning. For further details, please refer to the URL: http://www-slab.usc.edu/events/NIPS2000 * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Software Support for Bayesian Analysis System URL: http://ase.arc.nasa.gov/nips2000 Bayesian analysis is an established technique for many data analysis applications. The development of application software for any specific analysis problem, however, is a difficult and time-consuming task. Programs must be tailored to the specific problem, need to represent the given statistical model correctly, and should preferably run efficiently. In recent years, a variety of libraries, shells, and synthesis systems for Bayesian data analysis have been implemented that are intended to simplify application software development. The goal of this workshop is to bring the developers of such generic Bayesian software packages and tools (e.g., JavaBayes, AutoClass, BayesPack, BUGS, BayesNet Toolbox, PDP++) together with the developers of generic algorithm schemas (more recent ones amenable to automated effort include Structural EM, the Fisher kernel method, mean-field methods, etc.), and with software engineering experts.
It is intended as a forum to discuss and exchange the different technical approaches, such as the usage of libraries, the interpretation of statistical models (e.g., Gibbs sampling), or software synthesis based on generic algorithm schemas. The workshop aims to discuss the potential and problems of generic tools for the development of efficient Bayesian data analysis software tailored towards specific applications. If you are planning to attend this workshop as a participant and/or are interested in presenting your work, please send a short (1-4 pages) system description, technical paper, or position paper to fisch at ptolemy.arc.nasa.gov no later than Wednesday, October 18, 2000. Preliminary PC: Organizers: L. Getoor, Stanford W. Buntine, Dynaptics P. Smyth, UC Irvine B. Fischer, RIACS/NASA Ames M. Turmon, JPL J. Schumann, RIACS/NASA Ames K. Murphy, UC Berkeley * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * NIPS 2000 Workshop and Unlabeled Data Supervised Learning Competition! We are pleased to announce the NIPS 2000 Unlabeled Data Supervised Learning Competition! This competition is designed to compare algorithms and architectures that use unlabeled data to help supervised learning, and will culminate in a NIPS workshop, where approaches and results will be compared. Round three begins soon, so don't delay (it is also still possible to submit results for rounds 1 and 2). More details are now available at the competition web-site: http://q.cis.uoguelph.ca/~skremer/NIPS2000/ May the best algorithm win!
Stefan, Deb, and Kristin * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * From Darryl.Charles at ntlworld.com Fri Oct 6 10:58:32 2000 From: Darryl.Charles at ntlworld.com (Darryl Charles) Date: Fri, 6 Oct 2000 15:58:32 +0100 Subject: Special session at ESANN'2001 on Artificial_Neural_Networks and Early Vision Processing Message-ID: Bruges (Belgium) April 25-26-27, 2001 Call for submission of papers to a special session at ESANN2001 on "Artificial Neural Networks and Early Vision Processing". Organized by Darryl Charles and Colin Fyfe from the University of Paisley. Submission of papers 8 December 2000 Notification of acceptance 5 February 2001 ESANN conference 25-27 April 2001 It is well known that biological visual systems, and in particular the human visual system, are extraordinarily good at deciphering very complex visual scenes. Certainly, if we consider the human visual system to be solving inverse graphics problems, then we have not really come close to building artificial systems that are as effective as biological ones. We have much to learn from studying biological visual architecture, and the implementation of practical vision-based products could be improved by gaining inspiration from these systems. The following are some suggested areas of interest: Unsupervised preprocessing methods, e.g. development of local filters, edge filtering. Statistical structure identification, e.g. Independent Component Analysis, Factor Analysis, Principal Components Analysis, Canonical Correlation Analysis, projection pursuit. Information-theoretic techniques for the extraction/preservation of information in visual data. Coding strategies, e.g. sparse coding, complexity reduction. Binocular disparity. Motion, invariances, colour encoding, e.g. optical flow, space/time filters. Topography preservation. The practical application of techniques relating to these topics. Submission to esann at dice.ucl.ac.be by the 8th December 2000.
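The "local filters, edge filtering" topic above can be illustrated in a few lines: a valid-mode 2-D convolution with a Sobel-style kernel responds only where image intensity changes. This is a generic sketch on a toy image, not material from the session itself:

```python
# Valid-mode 2-D correlation of a nested-list "image" with a small kernel;
# the Sobel-x kernel responds to vertical edges. Toy image, illustrative only.

def convolve2d(img, kern):
    kh, kw = len(kern), len(kern[0])
    return [[sum(kern[a][b] * img[i + a][j + b]
                 for a in range(kh) for b in range(kw))
             for j in range(len(img[0]) - kw + 1)]
            for i in range(len(img) - kh + 1)]

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]

# A dark-to-bright vertical edge between columns 2 and 3:
image = [[0, 0, 0, 1, 1, 1] for _ in range(5)]
response = convolve2d(image, SOBEL_X)   # each output row: [0, 4, 4, 0]
```

The response is zero in the uniform regions and peaks at the edge, which is the sense in which such local filters serve as unsupervised preprocessing.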
ESANN2001 web site http://www.dice.ucl.ac.be/esann Darryl Charles and Colin Fyfe darryl.charles at pailsey.ac.uk and colin.fyfe at pailsey.ac.uk Applied Computational Intelligence Research Group School of Information and Communication Technologies University of Paisley Scotland From atascill at ford.com Fri Oct 6 13:58:00 2000 From: atascill at ford.com (Tascillo, Anya (A.L.)) Date: Fri, 6 Oct 2000 13:58:00 -0400 Subject: INNS-IEEE IJCNN 2001 www.geocities.com/ijcnn Message-ID: Greetings Connectionists! www.geocities.com/ijcnn We are pleased to announce the website for the INNS-IEEE International Joint Conference on Neural Networks, 2001, in Washington, D.C., July 14-19, 2001. Mark your calendars! Several neural network challenge problems will be offered this year, and an automated abstract submission form will be online November 1. After the legal experts get done reviewing my T-shirt design (fingers crossed), it will be displayed and offered on the website for sale along with registration. One less shirt to pack! A reminder will be sent on or near November 1. Please email atascill at ford.com if you would like to be placed on the mailing list for reminders of all open and close dates. All comments and suggestions are welcome (sightseeing, venue, etc.), and anyone who feels that they should have been asked to help organize/referee, and hasn't heard from someone yet, email me now, and I'll forward your name (and qualifications) to the appropriate parties.....
Publicity Chair Anya Tascillo From magnus at cs.man.ac.uk Mon Oct 9 09:45:21 2000 From: magnus at cs.man.ac.uk (Magnus Rattray) Date: Mon, 09 Oct 2000 14:45:21 +0100 Subject: Chair and lectureship in Bioinformatics Message-ID: <39E1CBF1.AF14B817@cs.man.ac.uk> Readers of this list may be interested in the following posts: The Department of Computer Science and the School of Biological Sciences at the University of Manchester seek applications for a Research Councils-funded Chair in Bioinformatics, and for an associated lectureship. The University already has a strong collaborative research activity in Bioinformatics, and with the support of the UK Research Councils, is seeking to grow and strengthen that activity, specifically in the area of Post Genomic Bioinformatics. The Chair: The appointee will be expected to establish and lead collaborative research activities that bring advanced computational techniques to bear on problems involving information management, analysis and visualization, specifically in the context of genomic data. It is anticipated that the appointee will have an international reputation for work in bioinformatics, and will possess skills that complement those of people currently working at Manchester. The salary will be negotiable from £37,500 p.a. The Lectureship: The appointee will be expected to bring computational skills of relevance to post genomic bioinformatics. Specifically, applicants with experience of statistical modelling, machine learning and information visualization are particularly encouraged to apply, but these topics should not be considered as excluding other relevant areas. The salary will be in the range £18,731 - £23,256 or £24,227 - £30,967 per annum. For further particulars and an application form, please contact the Director of Personnel, The University of Manchester, Manchester M13 9PL (tel: (+44) 161 275 2028, fax: (+44) 161 275 2471), quoting the appropriate reference number. Informal enquiries can be made to Prof.
Norman Paton (tel: (+44) 161 275 6910, email: norm at cs.man.ac.uk) or Prof. Steve Oliver (tel: (+44) 161 606 7260, email: steve.oliver at man.ac.uk). Closing date: 27 October 2000.

From lemm at uni-muenster.de Mon Oct 9 11:50:16 2000 From: lemm at uni-muenster.de (Joerg_Lemm) Date: Mon, 9 Oct 2000 17:50:16 +0200 (CEST) Subject: Papers on Bayesian Quantum Theory Message-ID:

Dear Colleagues, The following papers are available at http://pauli.uni-muenster.de/~lemm/

1. Bayesian Reconstruction of Approximately Periodic Potentials for Quantum Systems at Finite Temperatures. (Lemm, Uhlig, Weiguny)
2. Inverse Time-Dependent Quantum Mechanics. (Lemm) (to appear in Phys. Lett. A)
3. Bayesian Inverse Quantum Theory. (Lemm, Uhlig) (to appear in Few-Body Systems)
4. Hartree-Fock Approximation for Inverse Many-Body Problems. (Lemm, Uhlig) Phys. Rev. Lett. 84, 4517-4520 (2000)
5. A Bayesian Approach to Inverse Quantum Statistics. (Lemm, Uhlig, Weiguny) Phys. Rev. Lett. 84, 2068-2071 (2000)

In this series of papers a nonparametric Bayesian approach is developed and applied to the inverse quantum problem of reconstructing potentials from observational data. While the specific likelihood model of quantum mechanics may be mainly of interest to physicists, the presented techniques for constructing adapted, situation-specific prior processes (Gaussian processes for approximate invariances, mixtures of Gaussian processes, hyperparameters and hyperfields) are also useful for general empirical learning problems, including density estimation, classification and regression.

PAPERS: ========================================================================

Bayesian Reconstruction of Approximately Periodic Potentials for Quantum Systems at Finite Temperatures. by Lemm, J. C., Uhlig, J., and A.
Weiguny. MS-TP1-00-4, arXiv:quant-ph/0005122 http://pauli.uni-muenster.de/~lemm/papers/pp.ps.gz

Abstract: The paper discusses the reconstruction of potentials for quantum systems at finite temperatures from observational data. A nonparametric approach, based on the framework of Bayesian statistics, is developed to solve such inverse problems. Besides the specific model of quantum statistics giving the probability of observational data, a Bayesian approach essentially relies on the "a priori" information available for the potential. Different possibilities to implement "a priori" information are discussed in detail, including hyperparameters, hyperfields, and non-Gaussian auxiliary fields. Special emphasis is put on the reconstruction of potentials with approximate periodicity. Such potentials might, for example, correspond to periodic surfaces modified by point defects and observed by atomic force microscopy. The feasibility of the approach is demonstrated for a numerical model.

========================================================================

Inverse Time-Dependent Quantum Mechanics. by Lemm, J. C. MS-TP1-00-1, arXiv:quant-ph/0002010 (to appear in Phys. Lett. A) http://pauli.uni-muenster.de/~lemm/papers/tdq.ps.gz

Abstract: Using a new Bayesian method for solving inverse quantum problems, potentials of quantum systems are reconstructed from time series obtained by coordinate measurements in non-stationary states. The approach is based on two basic inputs: 1. a likelihood model, providing the probabilistic description of the measurement process as given by the axioms of quantum mechanics, and 2. additional "a priori" information implemented in the form of stochastic processes over potentials.

========================================================================

Bayesian Inverse Quantum Theory. by Lemm, J. C. and Uhlig, J.
MS-TP1-99-15, arXiv:quant-ph/0006027 (to appear in Few-Body Systems) http://pauli.uni-muenster.de/~lemm/papers/biqt.ps.gz

Abstract: A Bayesian approach is developed to determine quantum mechanical potentials from empirical data. Bayesian methods, combining empirical measurements and "a priori" information, provide flexible tools for such empirical learning problems. The paper presents the basic theory, concentrating in particular on measurements of particle coordinates in quantum mechanical systems at finite temperature. The computational feasibility of the approach is demonstrated by numerical case studies. Finally, it is shown how the approach can be generalized to those many-body and few-body systems for which a mean field description is appropriate. This is done by means of a Bayesian inverse Hartree-Fock approximation.

========================================================================

Hartree-Fock Approximation for Inverse Many-Body Problems. by Lemm, J. C. and Uhlig, J. MS-TP1-99-10, arXiv:nucl-th/9908056 Phys. Rev. Lett. 84, 4517-4520 (2000) http://pauli.uni-muenster.de/~lemm/papers/ihf3.ps.gz

Abstract: A new method is presented to reconstruct the potential of a quantum mechanical many-body system from observational data, combining a nonparametric Bayesian approach with a Hartree-Fock approximation. "A priori" information is implemented as a stochastic process defined on the space of potentials. The method is computationally feasible and provides a general framework to treat inverse problems for quantum mechanical many-body systems.

========================================================================

A Bayesian Approach to Inverse Quantum Statistics. by Lemm, J. C., Uhlig, J., and Weiguny, A. MS-TP1-99-6, arXiv:cond-mat/9907013 Phys. Rev. Lett.
84, 2068-2071 (2000) http://pauli.uni-muenster.de/~lemm/papers/iqs.ps.gz

Abstract: A nonparametric Bayesian approach is developed to determine quantum potentials from empirical data for quantum systems at finite temperature. The approach combines the likelihood model of quantum mechanics with a priori information on potentials implemented in the form of stochastic processes. Its specific advantages are the possibilities to deal with heterogeneous data and to express a priori information explicitly in terms of the potential of interest. A numerical solution in maximum a posteriori approximation is obtained for one-dimensional problems. Because the number of measurements is small compared to the degrees of freedom of a nonparametric estimate, the results depend strongly on the implemented a priori information.

========================================================================

Dr. Joerg Lemm Universitaet Muenster Email: lemm at uni-muenster.de Institut fuer Theoretische Physik Phone: +49(251)83-34922 Wilhelm-Klemm-Str. 9 Fax: +49(251)83-36328 D-48149 Muenster, Germany http://pauli.uni-muenster.de/~lemm

========================================================================

From DominikD at cybernetics.com.au Tue Oct 10 20:44:24 2000 From: DominikD at cybernetics.com.au (Dominik Dersch) Date: Wed, 11 Oct 2000 11:44:24 +1100 Subject: career opportunities at Crux Cybernetics Message-ID: <2F8D50B141B3D311AF5100A0C9FC05580416EF@UCSMAILSVR>

We have two open positions at Crux Cybernetics, for a Research and Development Team Leader and a Research and Development Software Developer, at our Sydney CBD location.
Please reply only to recruitment at cybernetics.com.au

---------------------------------------------------

Research and Development Team Leader

The Research and Development Group (RDG) at Crux Cybernetics is responsible for research, development and commercial implementation of machine learning, artificial intelligence and constraint satisfaction application software using web, wireless, and e-commerce technologies. The group has a broad range of skills covering areas like speech recognition, image processing and financial time series analysis. To find out more about Crux Cybernetics go to our web page www.cybernetics.com.au

RDG requires the services of a person to act as a research and development specialist and team leader. The primary responsibilities are leading a team to carry out research, consultancy, design, and implementation of artificial intelligence software solutions. Utilisation of and interfacing with several related technologies, business process models and methodologies, and keeping up to date with relevant research will be required.

Essential:
· A Ph.D. in Physics, Engineering or Applied Mathematics on an Artificial Intelligence topic
· Publication track record in relevant areas
· At least three years' experience in applied research and development
· Excellent communication and presentation skills
· Able to write clear and accurate research proposals and reports
· Fluent in C++, C, Perl and shell scripting languages
· Familiar with at least one of the following prototyping and analysis tools: Matlab, IDL, Splus, Mathematica or Statit
· Experience in HTML and CGI
· NT, Unix & Linux operating system experience
· Leadership qualities
· Management potential

Desirable:
· Grant application experience
· Track record of successfully managing all phases of a project.
· Object-oriented design experience
· UML experience
· Business or Systems analysis
· Other OO languages
· Commercial e/m-commerce and e/m-finance experience

Reference: 091000 Posted: 091000 Location of Position: CBD, Sydney, Australia

To apply: All applicants must provide a letter of application that specifically addresses your suitability for the role described above. Please email this letter together with a current resume to recruitment at cybernetics.com.au

--------------------------------------------------------

Research and Development Software Developer

The Research and Development Group (RDG) at Crux Cybernetics is responsible for research, development and commercial implementation of machine learning, artificial intelligence and constraint satisfaction application software using web, wireless, and e-commerce technologies. The group has a broad range of skills covering areas like speech recognition, image processing and financial time series analysis. To find out more about Crux Cybernetics go to our web page www.cybernetics.com.au

RDG requires the services of a person to act as a research and development specialist. The primary responsibilities are to carry out research, consultancy, design, and implementation of artificial intelligence software solutions under the supervision of the team leader. Utilisation of and interfacing with several related technologies, business process models and methodologies, and keeping up to date with relevant research will be required.
Essential:
· A recent degree in Physics, Engineering or Applied Mathematics on an Artificial Intelligence topic
· Excellent communication and presentation skills
· Able to write clear and accurate research proposals and reports
· Fluent in C++, C, Perl and shell scripting languages
· Familiar with at least one of the following prototyping and analysis tools: Matlab, IDL, Splus, Mathematica or Statit
· NT, Unix & Linux operating system experience
· Willingness to explore new research areas
· Enthusiasm and genuine interest in commercialising AI research

Desirable:
· Publication track record in relevant areas
· Experience in applied research and development
· Experience in HTML and CGI
· Experience in managing a project
· Object-oriented design experience
· UML experience
· Business or Systems analysis
· Other OO languages
· Commercial e/m-commerce and e/m-finance experience

Reference: 091000B Posted: 091000 Location of Position: CBD, Sydney

To apply: All applicants must provide a letter of application that specifically addresses your suitability for the role described above. Please email this letter together with a current resume to recruitment at cybernetics.com.au

From philh at cogs.susx.ac.uk Wed Oct 11 09:08:02 2000 From: philh at cogs.susx.ac.uk (Phil Husbands) Date: Wed, 11 Oct 2000 14:08:02 +0100 Subject: Research Professorship Message-ID: <39E46620.82E99578@cogs.susx.ac.uk>

Research Professorship School Of Cognitive And Computing Sciences, University of Sussex, UK Professor Of Neural Computation Ref 484

Applications are invited for a permanent professorship within the Computer Science and Artificial Intelligence Subject Group of the School of Cognitive and Computing Sciences. The expected start date is 1 January 2001 or as soon as possible thereafter. Candidates should be able to show evidence of significant research achievement in Neural Computation.
The successful applicant will be expected to expand significantly the existing high research profile of the Group in this area. It is intended that the post will carry a reduced teaching load. The salary is negotiable; the current minimum professorial salary is £37,493 per annum. Informal enquiries may be made to Dr Des Watson, tel +44 1273 678045, email desw at cogs.susx.ac.uk. Details of the School are available at http://www.cogs.susx.ac.uk Closing date: Friday 27 October 2000. Application forms and further particulars are available from and should be returned to Staffing Services Office, Sussex House, University of Sussex, Falmer, Brighton, BN1 9RH, tel +44 1273 678706. Further details, downloadable forms etc. at http://www.susx.ac.uk/Units/staffing/personnl/vacs/vac484.shtml

From zhaoping at gatsby.ucl.ac.uk Thu Oct 12 09:38:47 2000 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Thu, 12 Oct 2000 14:38:47 +0100 (BST) Subject: Paper on computational design and nonlinear dynamics of a V1 model Message-ID:

Title: Computational design and nonlinear dynamics of a recurrent network model of the primary visual cortex Author: Zhaoping Li Accepted for publication in Neural Computation available at: http://www.gatsby.ucl.ac.uk/~zhaoping/preattentivevision.html

Abstract: Recurrent interactions in the primary visual cortex make its output a complex nonlinear transform of its input. This transform serves pre-attentive visual segmentation, i.e., it autonomously processes visual inputs to give outputs that selectively emphasize certain features for segmentation. An analytical understanding of the nonlinear dynamics of the recurrent neural circuit is essential to harness its computational power. We derive requirements on the neural architecture, components, and connection weights of a biologically plausible model of the cortex such that region segmentation, figure-ground segregation, and contour enhancement can be achieved simultaneously.
In addition, we analyze the conditions governing neural oscillations, illusory contours, and the absence of visual hallucinations. Many of our analytical techniques can be applied to other recurrent networks with translation invariant neural connection structures. From bvr at stanford.edu Thu Oct 12 21:24:17 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Thu, 12 Oct 2000 18:24:17 -0700 Subject: NIPS*2000: Reminder -- Registration and Accommodations Deadlines Message-ID: <4.2.0.58.20001012182340.00dd7cb0@bvr.pobox.stanford.edu> Neural Information Processing Systems -- Natural and Synthetic Monday November 27 - Saturday December 2, 2000 Denver and Breckenridge, Colorado Early registration for NIPS*2000 ends November 1. Hotel rooms for the main conference at the Denver Marriott City Center (1-800-228-9290) will be held at the special conference rate ($80/night single, $90/night double) only until November 6. Rooms for the workshops at the Beaver Run Resort (1-800-288-1282) will be held at the special conference rates only until October 31, and at the Great Divide Lodge (1-800-321-8444) until November 9. Detailed information on hotel accommodations, transportation, registration, and the conference program is available at the NIPS web site http://www.cs.cmu.edu/Web/Groups/NIPS From oreilly at grey.colorado.edu Fri Oct 13 13:54:23 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Fri, 13 Oct 2000 11:54:23 -0600 Subject: Paper on Generalization in Interactive Networks Message-ID: <200010131754.LAA13846@grey.colorado.edu> The following preprint is now available for downloading: ftp://grey.colorado.edu/pub/oreilly/papers/oreilly00_gen_nc.pdf *or* ftp://grey.colorado.edu/pub/oreilly/papers/oreilly00_gen_nc.ps Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning Randall C. 
O'Reilly Department of Psychology University of Colorado at Boulder In press at Neural Computation Abstract: Computational models in cognitive neuroscience should ideally use biological properties and powerful computational principles to produce behavior consistent with psychological findings. Error-driven backpropagation is computationally powerful, and has proven useful for modeling a range of psychological data, but is not biologically plausible. Several approaches to implementing backpropagation in a biologically plausible fashion converge on the idea of using bidirectional activation propagation in interactive networks to convey error signals. This paper demonstrates two main points about these error-driven interactive networks: (a) they generalize poorly due to attractor dynamics that interfere with the network's ability to systematically produce novel combinatorial representations in response to novel inputs; and (b) this generalization problem can be remedied by adding two widely used mechanistic principles, inhibitory competition and Hebbian learning, that can be independently motivated for a variety of biological, psychological and computational reasons. Simulations using the Leabra algorithm, which combines the generalized recirculation (GeneRec) biologically-plausible error-driven learning algorithm with inhibitory competition and Hebbian learning, show that these mechanisms can result in good generalization in interactive networks. These results support the general conclusion that cognitive neuroscience models that incorporate the core mechanistic principles of interactivity, inhibitory competition, and error-driven and Hebbian learning satisfy a wider range of biological, psychological and computational constraints than models employing a subset of these principles. - Randy +----------------------------------------------------------------+ | Dr. Randall C. 
O'Reilly | | | Assistant Professor | Phone: (303) 492-0054 | | Department of Psychology | Fax: (303) 492-2967 | | Univ. of Colorado Boulder | Home: (303) 448-1810 | | Muenzinger D251C | Cell: (720) 839-7751 | | 345 UCB | email: oreilly at psych.colorado.edu | | Boulder, CO 80309-0345 | www: psych.colorado.edu/~oreilly | +----------------------------------------------------------------+

From masuoka at flab.fujitsu.co.jp Sun Oct 15 18:48:27 2000 From: masuoka at flab.fujitsu.co.jp (Ryusuke Masuoka) Date: Sun, 15 Oct 2000 18:48:27 -0400 Subject: Ph.D. Thesis available: Neural networks learning differential data In-Reply-To: <200010131754.LAA13846@grey.colorado.edu> Message-ID:

Dear Connectionists, I am pleased to announce the availability of my Ph.D. thesis for download in electronic format. Comments are welcome. Regards, Ryusuke

------------------------------------------------------------

Thesis:
-------
Title: "Neural Networks Learning Differential Data"
Advisor: Michio Yamada
URL: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/thesis/thesis.html

Abstract:
--------
Learning systems, which learn from previous experiences and/or provided examples of appropriate behaviors, allow people to specify {\em what} the systems should do for each case, not {\em how} the systems should act at each step. That eases system users' burdens to a great extent. For supervised learning systems such as neural networks, it is essential for efficient and accurate learning to be able to utilize knowledge in forms such as logical expressions, probability distributions, and constraints on differential data, along with provided pairs of desirable inputs and outputs. Neural networks that can learn constraints on differential data have already been applied to pattern recognition and differential equations. Other applications, such as robotics, have been suggested as applications of neural networks learning differential data.
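The core idea above, fitting a network to target derivative values as well as target function values, can be sketched in miniature. This is an illustrative sketch only, not Masuoka's architecture or algorithm: the target function, network size, and the use of numerical parameter gradients are assumptions made here for brevity, while the network's own derivative dy/dx is computed exactly.

```python
import numpy as np

# Hypothetical task: learn f(x) = sin(x) together with f'(x) = cos(x).
rng = np.random.default_rng(0)
xs = np.linspace(-2.0, 2.0, 21)
f_vals, df_vals = np.sin(xs), np.cos(xs)

H = 16  # hidden units; parameters packed into one flat vector

def unpack(p):
    return p[:H], p[H:2*H], p[2*H:3*H], p[3*H]

def net(p, x):
    """Return network output y(x) and its exact derivative dy/dx."""
    w1, b1, w2, b2 = unpack(p)
    h = np.tanh(np.outer(x, w1) + b1)   # hidden activations, shape (n, H)
    y = h @ w2 + b2                     # network output
    dy = ((1.0 - h**2) * w1) @ w2       # d/dx of tanh layer, chained exactly
    return y, dy

def loss(p):
    y, dy = net(p, xs)
    # Combined error on function values AND first derivatives.
    return np.mean((y - f_vals)**2) + np.mean((dy - df_vals)**2)

# Plain gradient descent with central-difference parameter gradients
# (chosen for brevity; a real implementation would backpropagate).
p = rng.normal(scale=0.5, size=3*H + 1)
initial = loss(p)
eps, lr = 1e-6, 0.01
for _ in range(800):
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p); e[i] = eps
        g[i] = (loss(p + e) - loss(p - e)) / (2.0 * eps)
    p -= lr * g
final = loss(p)
print(initial, final)
```

The derivative term in the loss is what distinguishes this from ordinary regression: the network is penalized whenever its analytically computed slope disagrees with the target slope, which is the first-order case of the constraint the thesis generalizes to higher orders.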
In this dissertation, we investigate an extended framework that introduces constraints on differential data into neural networks' learning. We also investigate other items that form the foundations for applications of neural networks learning differential data. First, a new and very general architecture and an algorithm are introduced for multilayer perceptrons to learn differential data. The algorithm is applicable to learning differential data not only of first order but also of orders higher than first, and, like the backpropagation algorithm, it is completely localized to each unit in the multilayer perceptron. The architecture and the algorithm are then implemented as computer programs. This required high programming skills and a great amount of care. The main module is programmed in C++. The implementation is used to conduct experiments that, among other things, show convergence of neural networks with differential data of up to third order. Along with the architecture and the algorithm, we give analyses of neural networks learning differential data, such as a comparison with the extra-pattern scheme, how learning works, sample complexity, effects of irrelevant features, and noise robustness.

A new application of neural networks learning differential data to continuous action generation in reinforcement learning, together with experiments using the implementation, is described. The problem is reduced to the realization of a random vector generator for a given probability distribution, which corresponds to solving a differential equation of first order. In addition to the above application to reinforcement learning, two other possible applications of neural networks learning differential data are proposed: differential equations and simulation of the human arm. For differential equations, we propose a very general framework which unifies differential equations, boundary conditions, and other constraints.
For the simulation, we propose a natural neural network implementation of the minimum-torque-change model. Finally, we present results on higher-order extensions of radial basis function (RBF) networks: minimizing solutions with differential error terms, the best approximation property of the above solutions, and a proof of $C^l$ denseness of RBF networks. Through these detailed accounts of an architecture, an algorithm, an implementation, analyses, and applications, this dissertation as a whole lays the foundations for applications of neural networks learning differential data as learning systems and will help promote their further applications.

------------------------------------------------------------

Ryusuke Masuoka, Ph.D. Senior Researcher Intelligent Systems Laboratory Fujitsu Laboratories Ltd. 1-4-3 Nakase, Mihama-ku Chiba, 261-8588, Japan Email: masuoka at flab.fujitsu.co.jp Web: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/

From jf218 at hermes.cam.ac.uk Mon Oct 16 08:53:31 2000 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Mon, 16 Oct 2000 13:53:31 +0100 (BST) Subject: Five-year post-doc position in Comput. Neurosci. In-Reply-To: Message-ID:

The Babraham Institute, Cambridge

Two Postdoctoral positions available in Neuroscience

Applications are invited for two postdoctoral scientists to join a group of systems neuroscientists within the Laboratory of Cognitive and Developmental Neuroscience investigating how the brain encodes visual and olfactory cues associated with recognition of both social and non-social objects. Current research employs behavioural, computational, neurophysiological (including multi-array electrophysiological recording) and functional neuroanatomical approaches using rodent and sheep models.

Behavioural Neuroscientist/In vivo Electrophysiologist (Ref.
KK/EP/3) Applications are invited for a post-doctoral scientist to work on a BBSRC-funded grant investigating both behavioural and neurophysiological aspects of face perception and imagery in sheep. Candidates must be experienced either in behavioural assessments of perception/learning in animals or with in vivo electrophysiological recording methods. Training in aspects of behavioural or electrophysiological methodologies required, but not covered by the applicant's previous experience, will be provided by existing members of the group. Experience with computational analysis of complex data would be an advantage. The appointment is for 3 years in the first instance.

Computational Neuroscientist (Ref. JF/P/2) This post is available initially for 5 years. It would suit an individual with experience in computational analysis and modelling of sensory system functioning, and will mainly involve utilisation of electrophysiological data from multi-array recording experiments. The individual would also be expected to work closely with electrophysiologists both within the group and in the USA, and to co-ordinate with other UK-based computational neuroscientists involved with the projects. The group already has excellent computational facilities to deal with the large amounts of data associated with multi-array recording experiments.

Informal enquiries on these Neuroscience vacancies should be directed to Dr. Keith Kendrick, Head of Neurobiology Programme: tel: 44(0) 1223 496385, fax: 44(0)1223 496028, e-mail keith.kendrick at bbsrc.ac.uk

Starting salary for both positions in the range £19,500 - £23,000 per annum. Benefits include a non-contributory pension scheme, 25 days' leave and 10½ public holidays a year. On-site Refectory, Nursery and Sports & Social Club as well as free car parking. Further details and an application form available from the Personnel Office, The Babraham Institute, Babraham, Cambridge CB2 4AT. Tel. 01223 496000, e-mail babraham.personnel at bbsrc.ac.uk.
The closing date for these positions is 23rd October 2000. AN EQUAL OPPORTUNITIES EMPLOYER An Institute supported by the Biotechnology and Biological Sciences Research Council

Jianfeng Feng The Babraham Institute Cambridge CB2 4AT UK http://www.cus.cam.ac.uk/~jf218

From j-patton at northwestern.edu Mon Oct 16 15:55:02 2000 From: j-patton at northwestern.edu (Jim Patton) Date: Mon, 16 Oct 2000 14:55:02 -0500 Subject: postdoc - Rehab Robotics Message-ID: <4.2.0.58.20001016143514.00aca940@merle.acns.nwu.edu>

Postdoctoral Fellowship in Rehabilitation Robotics

Position: Postdoctoral Fellow
Organization: Sensory Motor Performance Program, Northwestern University and the Rehabilitation Institute of Chicago
Location: Chicago, Illinois
Posted: 10/16/00
Deadline: 12/16/00

Description: The Sensory Motor Performance Program (SMPP) is a multidisciplinary research laboratory located at the Rehabilitation Institute of Chicago (RIC) and affiliated with the Northwestern University Medical and Engineering Schools. Members of SMPP perform basic research in the areas of musculoskeletal biomechanics and neural control of motion. Specific emphasis is placed on the study of musculoskeletal and neurological diseases that influence movement control. We are currently seeking a Postdoctoral Research Associate to join our study on movement control in normal and hemiparetic stroke subjects. The research involves the application of robotics technology to the study of the neuropathology following stroke. The applicant will benefit from the mentorship of W. Z. Rymer and F. A. Mussa-Ivaldi, both established leaders in the field. Salary is contingent on educational background and experience. More information about this position and our research is available at http://www.smpp.nwu.edu/.

Qualifications: Applicants will be expected to hold an earned doctorate in Biomedical Engineering or a related discipline, with a record of research in motor control, robotics, biomechanics, neuroscience, or a related field.
The ideal candidate will augment our group's expertise in clinical neuromechanical analysis, modeling of multijoint limb movement, control theory, haptics, system identification, neural networks, or learning theory. Emphasis will be placed on an early start time. RIC is an Affirmative Action/Equal Opportunity Employer. Women and minority applicants are encouraged to apply. Hiring is contingent on eligibility to work in the United States.

Contact: Please send a letter, vita, and the names, addresses and email addresses of 3-4 references to: James L. Patton, Ph.D. Rehabilitation Institute of Chicago Room 1406 345 East Superior, Chicago, Illinois USA 60611

______________________________________________________________________ J A M E S P A T T O N , P H . D . Research Associate, Sensory Motor Performance Program Rehabilitation Institute of Chicago. Postdoctoral Fellow Physical Medicine & Rehabilitation Northwestern University Medical School 345 East Superior Room 1406 Chicago, IL 60611 NOTE: my phone number has changed! 312-238-1277 (OFFICE) -2208 (FAX) -1232 (LAB) WEB: EMAIL: _______________________________________________________________________

From jsteil at TechFak.Uni-Bielefeld.DE Tue Oct 17 02:04:32 2000 From: jsteil at TechFak.Uni-Bielefeld.DE (Jochen Jakob Steil) Date: Tue, 17 Oct 2000 08:04:32 +0200 Subject: Job offer: Neural Methods in Robotics and Visualisation Message-ID: <39EBEBF0.1B327B1B@techfak.uni-bielefeld.de>

Dear Colleagues: The research group Neuroinformatics (Prof. Helge Ritter) at the University of Bielefeld is offering two research project positions for Research Assistants with salary according to BAT-IIa. The positions will be affiliated with the Special Collaborative Research Unit (Sonderforschungsbereich) SFB 360: Situated Artificial Communicators ( http://www.sfb360.uni-bielefeld.de/sfbengl.html ).
Research topics of the two projects are (i) neural approaches for visual robot arm instruction based on teaching by showing, and (ii) development of a human-machine interface for state visualization and configuration of distributed system components of the situated communicator prototype system. Both projects provide the opportunity for a dissertation. Applicants should have a university degree (Master's or German diploma) in computer science, electrical engineering or physics, and should have a good knowledge of Unix/C/C++ programming. A good background in the fields of robotics/computer vision/visualization/neural networks is desirable. The University of Bielefeld follows a policy of increasing the proportion of female employees in fields where women are underrepresented and therefore particularly encourages women to apply. Applications from suitably qualified disabled persons are welcome. Further information can be obtained from our group homepage: http://www.TechFak.Uni-Bielefeld.DE/techfak/ags/ni/ the university homepage: http://www.Uni-Bielefeld.DE Applications should be sent to: Prof. Dr. Helge Ritter Arbeitsgruppe Neuroinformatik Technische Fakultaet Universitaet Bielefeld 33501 Bielefeld mailto:helge at techfak.uni-bielefeld.de

From sylee at ee.kaist.ac.kr Tue Oct 17 02:46:56 2000 From: sylee at ee.kaist.ac.kr (Soo-Young Lee) Date: Tue, 17 Oct 2000 15:46:56 +0900 Subject: Faculty Position at KAIST Message-ID: <007901c03806$0ab27280$329ef88f@kaist.ac.kr>

A faculty position is open at the Department of Electrical Engineering at the Korea Advanced Institute of Science and Technology (KAIST). Although all areas of electrical engineering are eligible, top researchers in new interdisciplinary areas such as neural networks, biologically-motivated signal processing, artificial life, and bioelectronics will have higher priority. Unlike the BK (Brain Korea 21) Research Professor positions, this is a regular faculty position with full faculty responsibility.
The new faculty member may also work closely with the Brain Science Research Center, the main research organization of the Korean Brain Science and Engineering Research Program sponsored by the Ministry of Science and Technology. The application deadline is October 27th, 2000. For details please visit www.kaist.ac.kr.

Soo-Young Lee Professor, Department of Electrical Engineering Director, Brain Science Research Center Korea Advanced Institute of Science and Technology 373-1 Kusong-dong, Yusong-gu Taejon 305-701 Korea (South) Tel: +82-42-869-3431 / Fax: +82-42-869-8570

From wolfskil at MIT.EDU Tue Oct 17 16:06:40 2000 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Tue, 17 Oct 2000 16:06:40 -0400 Subject: ANN: Advances in Neural Information Processing Systems 12 Message-ID:

I thought readers of this list might be interested in this book. For more information please visit http://mitpress.mit.edu/promotions/books/SOLDHF00

Advances in Neural Information Processing Systems 12 edited by Sara A. Solla, Todd K. Leen, and Klaus-Robert Müller

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented. Sara A. Solla is Professor of Physics and Astronomy at Northwestern University and of Physiology at Northwestern University Medical School. Todd K.
Leen is Professor of Computer Science and Engineering, and of Electrical and Computer Engineering, at Oregon Graduate Institute of Science and Technology. Klaus-Robert Müller is Associate Professor of Computer Science at the University of Potsdam and Senior Researcher at GMD-FIRST. 7 x 10, 1098 pp., cloth ISBN 0-262-19450-3 -------------------------------------------------------------------------------- Jud Wolfskill 617.253.2079 phone Associate Publicist 617.253.1709 fax MIT Press wolfskil at mit.edu 5 Cambridge Center http://mitpress.mit.edu Fourth Floor Cambridge, MA 02142 From sml at essex.ac.uk Wed Oct 18 09:07:57 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Wed, 18 Oct 2000 14:07:57 +0100 Subject: paper available: fast dictionary search Message-ID: <6A8CC2D6487ED411A39F00D0B7847B66E77E64@sernt14.essex.ac.uk> Dear All, The following paper on graph-based dictionary search (due to appear in Pattern Recognition Letters) is available at: http://algoval.essex.ac.uk/papers/dictionary/prl.ps As far as I am aware, the method described is unique in that the retrieval speed is independent of the size (number of entries) of the dictionary. The core of the method is based on a lazy matrix parser for probabilistic context-free grammars, which are a type of probabilistic graphical model. Therefore, it may be interesting to explore other possible uses of the method in that area. As always, comments are welcome. Best regards, Simon Lucas ---------------------------------------------------- Title: Efficient graph-based dictionary search and its application to text-image searching Keywords: dictionary search, text image indexing, graph search Abstract --------- This paper describes a novel method for applying dictionary knowledge to optimally interpret the confidence-rated hypothesis sets produced by lower-level pattern classifiers.
This problem arises whenever image or video databases need to be scanned for textual content, and where some of the text strings are expected to be strings from a dictionary. The method is especially appropriate for large dictionaries, as might occur in vehicle registration number recognition, for example. The problem is cast as enumerating the paths in a graph in best-first order, given the constraint that each complete path is a word in some specified dictionary. The solution described here is of particular interest due to its generality and flexibility, and because the time to retrieve each path is independent of the size of the dictionary. Synthetic results are presented for searching dictionaries of up to 1 million UK postcodes given graphs that correspond to insertion, deletion and substitution errors. We also present initial results from processing real noisy text images. -------------------------------------------------- Dr. Simon Lucas Senior Lecturer and MSc E-commerce Director Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom Email: sml at essex.ac.uk http://cswww.essex.ac.uk -------------------------------------------------- From angelo at soc.plymouth.ac.uk Wed Oct 18 09:51:13 2000 From: angelo at soc.plymouth.ac.uk (Angelo Cangelosi) Date: Wed, 18 Oct 2000 06:51:13 -0700 (MST) Subject: Postdoc positions in experimental and connectionist models of spatial language Message-ID: Two 2-year Post-doc Research Fellows (1 neural net modelling + 1 experimental psychology) Salary: £18,222 - pay award pending - RF Scale The Development of a Psychologically Plausible Computational Model for Spatial Language Use and Comprehension.
Two post-doctoral fellows are required for a period of two years in the first instance to work on a collaborative project with Dr Kenny Coventry (Centre for Thinking and Language, Department of Psychology) and Dr Angelo Cangelosi (Centre for Neural and Adaptive Systems, School of Computing) at the University of Plymouth. The objective of the research is to develop a computational model for spatial language using neural networks based on experimental data with human participants. One of the positions will primarily involve the design and collection of experimental data from human participants, and the other post will primarily involve connectionist computational modelling. However, candidates possessing both computational and experimental skills are particularly encouraged to apply. Ref : 3918/HSCI - (based in the Centre for Neural and Adaptive Systems, School of Computing), the applicant must have a PhD in one of the Cognitive Sciences (Computer Science/Psychology/Linguistics), with expertise in cognitive and neural network modelling and programming skills. Ref : 3917/HSCI - (based in the Centre for Thinking and Language, Department of Psychology), the applicant must have a PhD in one of the Cognitive Sciences (Psychology/Linguistics/Computer Science), with expertise in experimental methodologies and analyses. The appointees will be working collaboratively as part of a larger interdisciplinary research team (see http://psy.plym.ac.uk/research/slg/slg.html and www.tech.plymouth.ac.uk/soc/staff/angelo). Anyone wishing to discuss the posts should contact Dr Kenny Coventry (kcoventry at plymouth.ac.uk) or Dr Angelo Cangelosi (acangelosi at plymouth.ac.uk). Application details can be obtained from The Personnel Department (personnel at plymouth.ac.uk, 6 Portland Villas, Drake Circus, Plymouth, PL4 8AA), or from Drs. Coventry and Cangelosi. Plymouth University is an Equal Opportunities employer. 
CLOSING DATE: THURSDAY 2 NOVEMBER 2000 From heiko.wersing at hre-ftr.f.rd.honda.co.jp Thu Oct 19 11:32:27 2000 From: heiko.wersing at hre-ftr.f.rd.honda.co.jp (Heiko Wersing) Date: Thu, 19 Oct 2000 17:32:27 +0200 Subject: Paper on stability conditions for design of linear threshold recurrent networks Message-ID: <39EF140B.A939BB63@hre-ftr.f.rd.honda.co.jp> Dear Connectionists, the following paper has recently been accepted for publication in Neural Computation. It can be downloaded from my university homepage at http://www.techfak.uni-bielefeld.de/~hwersing/ Comments and questions are highly appreciated. ---------------------------------------------- "Dynamical stability conditions for recurrent neural networks with unsaturating piecewise linear transfer functions" by Heiko Wersing, Wolf-Juergen Beyn, and Helge Ritter. Abstract: We establish two conditions which ensure the non-divergence of additive recurrent networks with unsaturating piecewise linear transfer functions, also called linear threshold or semilinear transfer functions. As was recently shown by Hahnloser (Nature 405, 2000), networks of this type can be efficiently built in silicon and exhibit the coexistence of digital selection and analogue amplification in a single circuit. To obtain this behaviour, the network must be multistable and non-divergent, and our conditions allow one to determine the regimes where this can be achieved with maximal recurrent amplification. The first condition can be applied to nonsymmetric networks and has a simple interpretation: the strength of local inhibition must match the sum of the excitatory weights converging onto a neuron. The second condition is restricted to symmetric networks, but can also take into account the stabilizing effect of non-local inhibitory interactions. We demonstrate the application of the conditions to a simple example and to the orientation-selectivity model of Ben-Yishai et al. (1995).
We show that the conditions can be used to identify, in their model, regions of maximal orientation-selective amplification and symmetry breaking. ------------------------------------------------- Best wishes -- Heiko Wersing Future Technology Research HONDA R&D EUROPE (DEUTSCHLAND) GmbH Carl-Legien-Str. 30 63073 Offenbach /Main Germany Tel.: +49-69-89011741 Fax: +49-69-89011749 e-mail: heiko.wersing at hre-ftr.f.rd.honda.co.jp From erol at starlab.net Fri Oct 20 03:46:15 2000 From: erol at starlab.net (Erol Sahin) Date: Fri, 20 Oct 2000 09:46:15 +0200 (CEST) Subject: Jobs@Starlab: Brussels and Barcelona Message-ID: Dear colleague, Starlab Brussels and Starlab Barcelona are now actively recruiting researchers in various fields from all over the world. Below I append a list of positions that may be of interest to the connectionists community. For a full list of positions--both at Brussels and Barcelona--please check this site: http://www.starlab.org/jobs/ Please distribute and post as appropriate (or disregard if not interested!) This site will be updated regularly. Best regards, thanks for your time, Erol Sahin, Ph.D. Chief Scientist Starlab Research Laboratories Tel: +32-2-740 0768 Boulevard Saint-Michel 47 http://www.starlab.org Brussels 1040 BELGIUM E-mail:erol at starlab.net ------------------------------------------------------------ STARLAB IS LOOKING FOR INQUISITIVE AND ENTREPRENEURIAL MINDS TO WORK AT THE ROBOTICS LAB IN STARLAB BRUSSELS, BELGIUM To apply, please e-mail your resume and the names of three references along with a paragraph of your DREAM ROBOTICS PROJECT to: Dr Erol Sahin E-mail: erol at starlab.net Phone : +32-2-740 0768 Fax : +32-2-742 9654 Interest in interdisciplinary research activities a must. Remuneration: the best in the market. Consideration will be given to any outstanding research proposals. No part-timers, please. Starlab is a place where 100 years means nothing.
Here, you will find world-class research scientists not only doing what they do best, but also creating wealth through the entrepreneurship and spin-offs of their scientific ideas and inventions. Starlab's strength lies in the cross-fertilization of multiple disciplines grouped according to Bits, Atoms, Neurons, and Genes (BANG). Book a place on this adventure as Starlab takes you through time and space, riding on the wave of an impending IPO. Senior Scientist (Robotics): The candidate is expected to help lead the robotics research at Starlab. The candidate should have PhD-level or greater experience in robotics with an active research record. The ideal candidate has been involved in the writing and leading of robotics projects, is experienced in the electromechanical building of robots and has a good grasp of ADAPTIVE METHODS such as NEURAL NETWORKS and genetic algorithms. Research Scientist (Robotics): The candidate is expected to be involved in research projects, which will include electromechanical robot building and programming. The candidate should have MS-level or greater experience in robotics, with experience in the electromechanical building of robots and proven competence in programming. Having experience in building robot simulators and using ADAPTIVE METHODS such as NEURAL NETWORKS and genetic algorithms would be a plus. Research Scientist (Robotics): In this position you will be developing hybrid robots. You will be working on the boundary between wearables research, intelligent clothing, augmented reality and robotics, and will be linking into BIOLOGY and NEUROSCIENCE as well. You will need knowledge of and experience in wearables research, embedded computing, sensor technology and DSP. You will also need an interest in PHYSIOLOGY and BRAIN SCIENCES.
------------------------------------------------------------ STARLAB IS LOOKING FOR INQUISITIVE AND ENTREPRENEURIAL MINDS TO WORK IN THE NEW STARLAB LABORATORY IN BARCELONA, SPAIN To apply, please send your CV and a letter of interests/research plan to Dr Giulio Ruffini E-mail: giulio at starlab.net Phone : +32-2-740 07 40 Fax : +32-2-742 96 54 Interest in interdisciplinary research activities a must. Remuneration: the best in the market. Consideration will be given to any outstanding research proposals. No part-timers, please. Senior Scientist to lead the Marine biology group: The tasks include setting up a research team. This includes ordering and setting up equipment, possibly including a small research submarine (!), and creating and leading a small (3-5-person) research group. Starlab wants to create a marine biology research center in Barcelona, focusing on extremophiles and ocean pollution studies. Applications are invited from candidates with a Ph.D. in marine biology. The applicant should have a proven record of scholarly publication, as well as several years of postdoctoral experience in the field. Senior Scientist to lead the Genes group: The tasks include creating and leading a small (3-5-person) research group. Our goal is to build a center for gene therapy ("virus domestication lab" in the case of viral therapy) in Starlab Barcelona. Applications are invited from candidates with a Ph.D. in genetics and experience in viral therapy. Robotics or AI Senior Scientist: The candidate is expected to lead the robotics and AI research at Starlab Barcelona. The candidate should have Ph.D.-level or greater experience in robotics and/or AI with an active research record. The ideal candidate has been involved in the writing and leading of AI/robotics projects, is experienced in the electromechanical building of robots and has a good grasp of adaptive/evolutionary methods. Possible focus on multi-robotics and emergence.
This research will partially be conducted remotely using the CAM-BRAIN computer at Starlab Brussels. Electronics Hardware Engineer for the Earth Observation Group: The tasks include ordering components and setting up the relevant equipment; the position requires the skills to develop RF hardware prototypes for GPS applications and to write the appropriate embedded software. Applications are invited from candidates with an M.Sc. or PhD in Electrical/Electronics Engineering with GPS experience. Senior Scientist EEG Analysis/Sleep Lab: We are seeking a project leader to build an EEG lab to study memory formation, develop novel diagnostic tools, and carry out research in sleep/lucidity. The chosen candidate will have the resources to put a team together and buy the necessary equipment. Candidates should have the necessary experience, including directing the construction work needed (Faraday cage, etc.). The EEG team will have the opportunity to work closely together with outstanding signal processing specialists from the Future Earth Observation Systems group, as well as with the AI and neurology teams. Lab technician position in the Neurons Group: The applicant will be involved in neurodegeneration and neurogenesis research projects by combining neuroanatomical and molecular approaches. We are looking for highly motivated and open-minded people. Preference is given to technicians with experience in morphological and/or molecular labs. Systems Manager: The tasks include ordering components and setting up the relevant equipment; the position requires the skills to set up and maintain an up-to-date Linux and Windows network as well as Internet connectivity. Applications are invited from candidates with strong experience. Ph.D. and experience with numerical computation highly desired. From mpessk at guppy.mpe.nus.edu.sg Fri Oct 20 04:40:35 2000 From: mpessk at guppy.mpe.nus.edu.sg (S.
Sathiya Keerthi) Date: Fri, 20 Oct 2000 16:40:35 +0800 (SGT) Subject: TR on Efficient Incremental SVM Computations Message-ID: A Useful Bound for Incremental Computations in SVM Algorithms S.S. Keerthi and C.J. Ong National University of Singapore Abstract: A simple bound is given that is useful for checking the optimality of points whose Lagrange multipliers take bound values. This bound is very inexpensive to compute and is useful in various scenarios. To download a gzipped postscript file containing the report, go to: http://guppy.mpe.nus.edu.sg/~mpessk/svm.shtml From anderson at cs.colostate.edu Fri Oct 20 13:26:52 2000 From: anderson at cs.colostate.edu (Chuck Anderson) Date: Fri, 20 Oct 2000 11:26:52 -0600 Subject: Tenure-track positions open at Colorado State University, Fort Collins, CO Message-ID: <39F0805C.B87096D1@cs.colostate.edu> Several tenure-track positions are open in the Computer Science Department at Colorado State University. We have a strong AI group in neural networks, reinforcement learning, planning, genetic algorithms, and computer vision, involving six of our 14 faculty. We have collaborative research projects with math, engineering and neurobiology departments at CSU and with local industry. Read more about our AI program at http://www.cs.colostate.edu/aigroup.html Chuck Anderson associate professor Department of Computer Science anderson at cs.colostate.edu Colorado State University http://www.cs.colostate.edu/~anderson Fort Collins, CO 80523-1873 office: 970-491-7491, FAX: 970-491-2466 Tenure-Track Faculty Positions Colorado State University Department of Computer Science The Department of Computer Science at Colorado State University solicits applications for at least two tenure-track faculty positions, beginning Fall 2001. The appointments will preferably be made at the level of assistant professor, but appointment at a more senior level is also possible for candidates who can demonstrate a strong connection to ongoing department research.
Applicants must have a Ph.D. in computer science, computer engineering, or a related field. Applicants will be expected to teach undergraduate and graduate courses, and they must demonstrate potential for excellence in research and teaching. The Computer Science Department has 700 undergraduate majors and 80 graduate students enrolled in Master's and doctoral programs. The department currently has 17 tenure-track faculty, with strong research programs in artificial intelligence, software engineering, and parallel and distributed computation. Computer facilities are excellent, and there are ample opportunities for research collaborations with local industry. Colorado State University, with an enrollment of 22,000 students, is located in Fort Collins, Colorado, an attractive community of over 100,000 people, at the base of the Front Range of the Rocky Mountains, 65 miles north of Denver. The northern Front Range offers a wide range of outdoor recreational activities. More information about the department and its research programs can be obtained from the department's home page at http://www.cs.colostate.edu Applicants should send a curriculum vitae and letters from at least three professional references to: Faculty Search Committee, Computer Science Department, Colorado State University, Fort Collins, CO 80523. Please include a statement indicating how your background and interests match the expectations of the position(s) described above. The department's telephone number is 970-491-5862, and email inquiries should be directed to faculty-search at cs.colostate.edu. Screening of applications will begin November 1, 2000, and continue until the position is filled. Colorado State University is an EEO/AA employer. Office of Equal Opportunity: 101 Student Services. From kasigvardt at ucdavis.edu Fri Oct 13 16:19:01 2000 From: kasigvardt at ucdavis.edu (Karen A. 
Sigvardt) Date: Fri, 13 Oct 2000 13:19:01 -0700 Subject: Postdoc position at UC Davis: modeling basal ganglia in Parkinson's Message-ID: POSTDOCTORAL FELLOW University of California Davis, Center for Neuroscience and Department of Neurology. A postdoctoral fellow with strong quantitative skills is sought to perform analysis and modeling of neural activity recorded from basal ganglia and thalamus in Parkinson's Disease patients. The goal of the project is to understand the dynamics of the basal ganglia-thalamocortical network and how it relates to behavior and to the motor symptoms of PD. The project is led by a collaborative team of investigators: Drs. Karen Sigvardt of UC Davis, Charles Gray of Montana State University and Nancy Kopell of Boston University. UCD Center for Neuroscience has a strong, diverse faculty with research interests ranging from cellular to cognitive neuroscience (http://neuroscience.ucdavis.edu ). Candidates with a Ph.D. in Neuroscience, Physics, Computer Science or related fields are encouraged to apply. Salary is commensurate with experience. Send current CV, cover letter, relevant reprints and two references to: Dr. Karen Sigvardt, Center for Neuroscience, UC Davis, 1544 Newton Court, Davis CA 95616 or e-mail kasigvardt at ucdavis.edu. UC Davis is an EOE/AA employer. Thanks Karen Karen A. Sigvardt, Ph.D. 
Adjunct Professor of Neurology Center for Neuroscience University of California Davis 1544 Newton Court Davis CA 95616 Phone: (530) 757-8520 Lab: (530) 754-5022 Fax: (530) 757-8827 email: kasigvardt at ucdavis.edu From halici at metu.edu.tr Sat Oct 21 13:57:07 2000 From: halici at metu.edu.tr (Ugur HALICI) Date: Sat, 21 Oct 2000 20:57:07 +0300 Subject: 2ndCFP: Brain-Machine'2000-Notification of deadline & additional information Message-ID: <39F1D8F3.68B9F3DA@metu.edu.tr> SECOND CALL FOR PAPERS ---------------------------------------------------------- Notification of deadline & additional information ---------------------------------------------------------- BRAIN - MACHINE WORKSHOP 20-22 December 2000, Ankara, Turkey ---------------------------------------------------------- Conference Homepage: http://heaven.eee.metu.edu.tr/~vision/brainmachine.html ---------------------------------------------------------- The workshop aims to bring together researchers working in Brain Research, Vision and Machine Intelligence ---------------------------------------------------------- Detailed list of topics, key speakers/invited talks, information on paper submission, proceedings/journals, hotels & tours are available on the conference home page. ---------------------------------------------------------- Important Dates: November 1, 2000 Paper submission deadline November 10, 2000 Notification of acceptance November 20, 2000 Early registration December 20-22, 2000 Workshop ---------------------------------------------------------- Registration Fee: 200 USD, Before Nov. 20 250 USD, After Nov. 20 Students: 120 USD, Before Nov. 20 150 USD, After Nov.
20 ------------------------------------------------------------- Tours to be organised to ISTANBUL (the former capital of three successive empires - Roman, Byzantine and Ottoman) CAPPADOCIA (seven layer underground cities, fairy chimneys, churches carved out of the tough rocks) -------------------------------------------------------------- Contact Person: UGUR HALICI, Computer Vision and Artificial Neural Networks Res.Lab. Prof. of Dept. of Electrical and Electronics Eng. Middle East Technical University, 06531, Ankara, Turkey email: halici at metu.edu.tr fax: (+90) 312 210 1261 http://heaven.eee.metu.edu.tr/~halici/ http://heaven.eee.metu.edu.tr/~vision/ -------------------------------------------------------------- From pr230 at cus.cam.ac.uk Sun Oct 22 03:15:01 2000 From: pr230 at cus.cam.ac.uk (P. Roper) Date: Sun, 22 Oct 2000 08:15:01 +0100 (BST) Subject: TWO YEAR POSTDOCTORAL POSITION AT THE UNIVERSITY OF UTAH Message-ID: Applications are invited for a two-year postdoctoral position in mathematical/computational neuroscience to work with Paul Bressloff who joins the mathematical biology group at the University of Utah in Jan 2001. The successful candidate will also interact with Jenny Lund and colleagues in the newly-formed systems neuroscience group within the Medical School. RESEARCH AREA: Mathematical/computational models of visual cortex QUALIFICATIONS: Experience in computational modeling of complex systems essential. Some background in computational neuroscience desirable but not essential. START DATE: after Jan 1st 2001 Anyone interested in this position should e-mail Paul Bressloff at P.C.Bressloff at Lboro.ac.uk ---------------------------------------- Dr. 
Peter Roper Laboratory of Computational Neuroscience Babraham Institute Cambridge University CAMBS, CB2 4AT, UK email pr230 at cam.ac.uk From mozer at cs.colorado.edu Sun Oct 22 15:31:33 2000 From: mozer at cs.colorado.edu (Mike Mozer) Date: Sun, 22 Oct 2000 13:31:33 -0600 Subject: positions in machine learning at University of Colorado at Boulder Message-ID: <200010221931.e9MJVXf09998@neuron.cs.colorado.edu> University of Colorado at Boulder Department of Computer Science Tenure Track Positions The Department of Computer Science of the University of Colorado at Boulder is seeking applications for a number of tenure-track faculty positions. While we expect most of the appointments to be at the Assistant Professor level, we will consider outstanding candidates at all levels. The Department is currently recruiting to fill two positions in the area of machine learning. We are particularly interested in applicants whose work expands the theoretical foundations of machine learning, and applicants who apply theoretically-grounded techniques to solving practical, real-world problems and/or understanding the brain from a cognitive neuroscience perspective. The University of Colorado has a diverse faculty interested in issues of neural and statistical computation (http://www.cs.colorado.edu/~mozer/ns.html), and is particularly strong in the areas of speech recognition, natural language processing, computational modeling of human cognition, and human-machine interfaces. An explosion of local start-ups with research activities in machine learning enriches the university community and furnishes a wealth of collaborative opportunities. The Department has 39 faculty (including five new members who have joined this year), 191 graduate students, and 565 undergraduates. The Department has received four successive five-year NSF CER and RI awards to support its computing infrastructure and collaborative research among its faculty, most recently for the period 2000-2005.
In the recent NSF ITR competition our faculty participated in proposals funded for more than $13.5M. The faculty is also proud of its record in offering an outstanding educational experience to our students. We are seeking new colleagues who share our commitment to the ideals of the research university as a uniquely valuable institution in our society, combining the creation of new knowledge with the shaping of future leaders. Our location in Boulder offers a pleasant college town ambience, easy access to the great outdoors, and participation in a vibrant, rapidly-growing high tech industry community. A recent study by a major telecommunications startup identified Boulder County as the most attractive setting nationally for recruiting staff. Our alumni are playing important roles in creating the next generation of technology and technology companies here and elsewhere. More information about the Department can be found at http://www.cs.colorado.edu. Review of applications will begin immediately, and continue as long as positions are open. We expect to have positions in future years as well as for appointments beginning in academic year 2001-2002, and we welcome inquiries about these future opportunities. Because we have a number of positions available, we would be glad to receive inquiries from research collaborators interested in joining our faculty together. We also welcome applications from academic couples wishing to co-locate. Applicants should send a current curriculum vitae, the names of four references, and one-page statements of research and teaching interests to Professor Clayton Lewis, Search Committee Chair, Department of Computer Science, Campus Box 430, University of Colorado, Boulder, CO 80309-0430. The University of Colorado at Boulder is committed to diversity and equality in education and employment.
From celiasmith at uwaterloo.ca Mon Oct 23 12:28:12 2000 From: celiasmith at uwaterloo.ca (Chris Eliasmith) Date: Mon, 23 Oct 2000 11:28:12 -0500 Subject: Postdoc in Computational Neuroscience Message-ID: <01C03CE4.5D01E830.celiasmith@uwaterloo.ca> ************************************************************************ A POSTDOCTORAL POSITION in COMPUTATIONAL NEUROSCIENCE is available in Charles Anderson's Computational Neuroscience Research Group (CNRG) in the Dept. of Anatomy and Neurobiology, Washington University School of Medicine in St. Louis. This position is available immediately and for a duration of up to 3 years. A competitive salary package will be offered depending on the qualifications of the candidate. We have active research in a wide range of topics that are focused on the development of a general framework for realistic modeling of neural systems. Our approach is based on fundamental principles of signal processing, motor control theory and statistical inference. Ongoing projects in close collaboration with experimentalists include models of visual cortex (with David Van Essen and Greg DeAngelis), parietal working memory (Larry Snyder), and the vestibular ocular motor system (Dora Angelaki and Steve Highstein). We would like to expand our models of cortical processing of visual information and are open to modeling the function of the cerebellum (with Tom Thach). This research position requires strong analytic and computer modeling skills. Candidates with training in engineering or physics are especially encouraged to apply even if they have no background in neuroscience. What is required is a strong interest in learning about these fascinating computational systems (see http://stp.wustl.edu/~compneuro for more information). Washington University is a leading research institute in all areas of neuroscience. Dr. Anderson's research group is in the Dept. of Anatomy and Neurobiology, chaired by David Van Essen. 
Besides strong collaborations with the many experimental neuroscientists in the department, there are active interactions with the Neural Engineering division of the rapidly growing Biomedical Engineering Department at Washington University, as well as the Biology, Physics, and Electrical Engineering Departments. Please send a CV, a brief statement of research experience and interests, and the names and contact information of two references to: cha at shifter.wustl.edu. Applications can also be mailed to: Dr. Charles H. Anderson Dept. Anatomy and Neurobiology Campus Box 8108 Washington University School of Medicine 660 S. Euclid Ave. St. Louis, MO 63124 ************************************************************************ From rampon at tin.it Tue Oct 24 02:42:50 2000 From: rampon at tin.it (Salvatore Rampone) Date: Tue, 24 Oct 2000 08:42:50 +0200 Subject: Paper on Function approximation from noisy data Message-ID: <001101c03d85$ac39fcc0$35c9d8d4@brainstorm> From stefan.wermter at sunderland.ac.uk Tue Oct 24 09:19:58 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Tue, 24 Oct 2000 14:19:58 +0100 Subject: cognitive systems research Message-ID: <39F58C7E.B186E744@sunderland.ac.uk> The journal Cognitive Systems Research invites recent and new books to be sent for review, e.g. in cognitive sciences, neural networks, hybrid systems, language processing, vision etc. We are also interested in hearing from researchers who would like to write a review, which will be published as a brief article in the journal Cognitive Systems Research. Also, if you have recently read a new interesting, challenging, controversial book in the scope of the journal, please let us know at the address below and send two copies of the book to the address below. We are particularly interested in new books and new review article writers.
More details and a list of books for review can be found on http://www.his.sunderland.ac.uk/cognitive.html If you are interested in writing an article, please let me know. best wishes, Stefan Wermter *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From ASIM.ROY at asu.edu Tue Oct 24 13:03:51 2000 From: ASIM.ROY at asu.edu (Asim Roy) Date: Tue, 24 Oct 2000 10:03:51 -0700 Subject: Summary of panel discussion at IJCNN'2000 on the question: DOES CONNECTIONISM PERMIT READING OF RULES FROM A NETWORK? Message-ID: A panel discussion on the question: "DOES CONNECTIONISM PERMIT READING OF RULES FROM A NETWORK?" took place this July at IJCNN'2000 (International Joint Conference on Neural Networks) in Como, Italy. The following persons were on the panel: 1) DAN LEVINE; 2) LEE GILES; 3) NOEL SHARKEY; 4) ALESSANDRO SPERDUTI; 5) RON SUN; 6) JOHN TAYLOR; 7) STEFAN WERMTER; 8) PAUL WERBOS; 9) ASIM ROY. This was the fourth panel discussion at these IJCNN conferences on the fundamental ideas of connectionism.
The abstract below summarizes the issues/questions that were addressed by this panel. The fundamental contention was that the basic connectionist framework as outlined by Rumelhart et al. in their many books and papers has no mechanism for rule extraction (reading of weights, etc. from a network) or rule insertion (constructing and embedding rules into a neural network) as is required by many rule-learning mechanisms (both symbolic and fuzzy ones). This is not a dispute about whether humans can and do indeed learn rules from examples or whether such rules can indeed be embedded in a neural network. It is about whether the connectionist framework lets one do that; that is, whether it allows one to create the algorithms required for doing such things as rule insertion and rule extraction. As pointed out in the abstract below, the cell-based distributed control mechanism of connectionism empowers only individual cells (neurons) with the capability to modify/access the connection strengths and other parameters of a network; no other outside agent can do that, as is required by rule insertion and extraction techniques. These rule-learning techniques are, in fact, moving away from the cell-based distributed control notions of connectionism and using broader control theoretic notions where it is assumed that there are parts of the brain that control other parts. In fact, it can be shown that every connectionist algorithm, from back-propagation to ART to SOM, goes beyond the original framework of cell-based distributed control and uses the broader control theoretic notion that there are parts of the brain that control other parts. There is, of course, nothing wrong with this broader control theoretic notion because there is enough neurobiological evidence about neurotransmitters and neuromodulators to support it. This debate once more points out the limitations of connectionism. Ron Sun notes that "Clearly the death knell of strong connectionism has been sounded."
With regard to rule extractors in rule-learning schemes, John Taylor notes that "There do not seem to be there similar rule extractors of the connection strengths." On the same issue, Paul Werbos says: "But I would not call it a "neural network" method exactly (even though neural net learning is used) because I do not believe that real organic brains contain that kind of hardwired readout device." Noel Sharkey says: "Currently there seems little reason (or evidence) to even think about the idea of extracting rules from our neural synapses - otherwise why can we not extract our bicycle riding rules from our brain" and "However, the relationship between symbolic rules and how they emerge from connectionist nets or even whether or not they really exist has never been resolved in connectionism." For those interested, summaries of prior debates on the basic ideas of connectionism are available at the CompNeuro website at Caltech. Here is a partial list of the debate summaries available there.

www.bbb.caltech.edu/compneuro/cneuro99/0079.html - Some more questions in the search for sources of control in the brain
www.bbb.caltech.edu/compneuro/cneuro98/0088.html - BRAINS INTERNAL MECHANISMS - THE NEED FOR A NEW PARADIGM
www.bbb.caltech.edu/compneuro/cneuro97/0069.html - COULD THERE BE REAL-TIME, INSTANTANEOUS LEARNING IN THE BRAIN?
www.bbb.caltech.edu/compneuro/cneuro97/0043.html - CONNECTIONIST LEARNING: IS IT TIME TO RECONSIDER THE FOUNDATIONS?
www.bbb.caltech.edu/compneuro/cneuro97/0040.html - DOES PLASTICITY IMPLY LOCAL LEARNING? AND OTHER QUESTIONS
www.bbb.caltech.edu/compneuro/cneuro96/0047.html - Connectionist Learning - Some New Ideas/Questions

Some of the summaries are also available at the CONNEC_L website. Asim Roy Arizona State University
--------------------------------------------------------------------------
DOES CONNECTIONISM PERMIT READING OF RULES FROM A NETWORK?
Many scientists believe that the symbolic (crisp) and fuzzy (imprecise and vague) rules learned, used and expressed by humans are embedded in the networks of neurons in the brain - that these rules exist in the connection weights, the node functions and in the structure of the network. It is also believed that when humans verbalize these rules, they simply "read" the rules from the corresponding neural networks in their brains. Thus there is a growing body of work that shows that both fuzzy and symbolic rule systems can be implemented using neural networks. This body of work also shows that these fuzzy and symbolic rules can be retrieved from these networks, once they have been learned, by procedures that generally fall under the category of rule extraction. But the idea of rule extraction from a neural network involves certain procedures - specifically the reading of parameters from a network - that are not allowed by the connectionist framework that these neural networks are based on. Such rule extraction procedures imply a greater freedom and latitude about the internal mechanisms of the brain than is permitted by connectionism, as explained below. In general, the idea of reading (extracting) rules from a neural network has a fundamental conflict with the ideas of connectionism. This is because the connectionist networks by "themselves" are inherently incapable of producing the "rules," that are embedded in the network, as output, since the "rules" are not supposed to be the outputs of connectionist networks. And in connectionism, there is no provision for an external source (a neuron or a network of neurons), in a sense a third party, to read the rules embedded in a particular connectionist network. Some more clarification perhaps is needed on this point. 
The connectionist framework, in the use mode, has provision only for providing certain inputs (real, binary) to a network through its input nodes and obtaining certain outputs (real, binary) from the network through its output nodes. That is, in fact, the only "mode of operation" of a connectionist network. In other words, that is all one can get from a connectionist network in terms of output - nothing else is allowed in the connectionist framework. So no symbolic or fuzzy rules can be "output" or "read" by a connectionist network. The connectionist network, in a sense, is a "closed entity" in the use mode; no other type of operation, other than the regular input-output operation, can be performed by or with the network. There is no provision for any "extra or outside procedures" in the connectionist framework to examine and interpret a network, to look into the rules it's using or the internal representation it has learned or created. So, for example, the connectionist framework has no provision for "reading" a weight from a network or for finding out the kind of rule/constraint learned by a node. The existence of any such "outside procedure", residing outside the network where the rules are, would go against the basic connectionist philosophy. Connectionism has never stated that the networks can be "examined and accessed in ways" other than the input-output mode. So there is nothing in the connectionist framework that lets one develop procedures to read and extract rules from a network. A rule extraction procedure therefore violates the principles of connectionism in a major way, by invoking a means of extracting the weights, rules, and other information from a network; there is no provision/mechanism in the connectionist framework for doing that. So the whole notion of rules existing in a network, that can be accessed and verbalized as necessary, is contradictory to the connectionist philosophy.
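The "reading of a weight" at issue can be made concrete. The following is a minimal sketch, illustrative only (the single threshold unit and its hand-chosen weights are invented for this example, not any panelist's method): an outside procedure inspects the parameters of the unit directly and re-states them as a symbolic "M-of-N" rule - exactly the kind of access the abstract argues the connectionist framework does not provide.

```python
import math

def unit_output(weights, bias, inputs):
    """The only sanctioned "mode of operation": inputs in, output out."""
    return int(sum(w * x for w, x in zip(weights, inputs)) + bias > 0)

def read_rule_from_unit(weights, bias):
    """An "outside procedure" that reads the unit's parameters directly.
    For equal positive weights w and a negative bias, the unit fires when
    at least m inputs are active, where m is the smallest integer with
    m * w + bias > 0: an "M-of-N" rule."""
    w = weights[0]
    assert w > 0 and all(abs(wi - w) < 1e-9 for wi in weights), \
        "sketch assumes equal positive weights"
    m = math.floor(-bias / w) + 1  # smallest active-input count that fires
    return f"IF at least {m} of {len(weights)} inputs are active THEN output 1"

# Hand-chosen parameters: the unit fires when >= 2 of its 3 inputs are on.
weights, bias = [1.0, 1.0, 1.0], -1.5
print(read_rule_from_unit(weights, bias))
```

The rule comes from inspecting the weights and bias, not from the network's input-output mode; that inspection step is precisely what the abstract says connectionism has no provision for.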
There is absolutely no provision for "accessing networks/rules" in the connectionist framework. Connectionism forgot about the need to extract rules.
--------------------------------------------------------------------------
LEE GILES

Early Connectionism/NNs
* McCulloch, Pitts (MC) 40's: models that were basically circuit design, suggestive but very primitive.
* Kleene, 50's: MC networks as regular expressions and grammars. Early AI.
* Minsky, 60's: MC networks as logic, automata, sequential machines, design rules. More AI. (not the perceptron work!) Foundations of early high level VLSI design.
Early connectionism/NNs always had rules and logic as part of their philosophy and implementation.

Late 20th Century Connectionism/NNs
* Rules - vital part of AI.
* Empirical & theoretical work on rules extractable, encodeable and trainable in many if not all connectionist systems.
* Most recent work in data mining
* Future work in SVMs
Rules are more important but not essential in some applications; natural language processing, expert systems and speech processing systems used in many applications.

21st Century Connectionism/NNs
* Knowledge & information discovery and extraction
* Knowledge prediction
Rules and laws are important
* New connectionism/NN - challenges
* Applications will continue to be important
* Cheap and plentiful data will be everywhere
* Text
* Nontext - sensor, audio, video, etc.
* Pervasive computing and information access
* New connectionism/NN future?
* Integration philosophically, theoretically and empirically with other areas of AI, computer and engineering science continues (rules will become more important)
* Biology and chips will play a new role

Foundations of Connectionism/NNs
* Rules were always a theoretical and philosophical part of the connectionist/nn models.
* If/then rules, automata, graphs, logic
* Importance?
* Comfort factor
* New knowledge
* Autonomous systems - communication
--------------------------------------------------------------------------
DAN LEVINE

There are really two basic types of problems that involve encoding rules in neural networks. One type involves inferring rules from a series of interactions with the environment that entail some regularity. An example would be a cognitive task (such as the Wisconsin Card Sorting Test used by clinical neuropsychologists) in which a subject is positively reinforced for certain types of actions and needs to discern the general rule guiding the actions which will be rewarded. The other type of problem involves successfully performing cognitive tasks that are guided by an externally given rule. One example is the Rapid Information Processing task, in which the subject is instructed to press a key when he or she sees three odd or three even digits in a row. In other words, the neural network needs to translate the explicit verbal rule into connection strengths that will guide motor performance in a way that accords with that rule. The first type of problem has already been simulated in a range of neural networks including some by my own group and by John Taylor's group. These networks typically require modulatory transmitters that allow reward signals to bias selective attention or to selectively strengthen or weaken existing rule representations. Modulatory transmitters are also involved in models in progress of the second type of problem. In this case, activation of a rule selectively biases appropriate representations of and connections among working memory representations of objects, object categories, and motor actions relevant to following the particular rule. The framing of the question, "Does connectionism allow ...," suggests implicitly that "connectionism" refers to a specified class of neural architectures.
However, connectionist and neural network models have been in the last several years increasingly less restricted in their structures. Modelers who began working within specific "schools" such as adaptive resonance (like my own group and Stephen Grossberg's) or back propagation (like Jonathan Cohen's group) have developed models that are guided as much, if not more, by known brain physiology and anatomy as by the original "modeling school." Hence the question should be rephrased "What form of connectionism allows ... ." -------------------------------------------------------------------------- NOEL SHARKEY One of the main themes of 1980s connectionism was the unconscious application of rules. This comes from Cognitive Psychology (where many of the major players started). There are very many charted behaviors, such as reading or riding a bicycle, where participants can perform extremely well and yet cannot explicitly state the rules that they are using. One of the main goals of Cognitive Psychologists was to find tasks that would enable them to probe at unconscious skills. Such skills were labeled as "automatic" in contrast to the slower, more intensive "controlled" or "strategic" processes. From the limited amount that psychologists talk beyond their data, strategic processes were vaguely considered to have something to do with awareness or conscious processes. This was a hot potato to be avoided. But there is no single coherent philosophy covering all of connectionism. Researchers in the 1980s, including myself, began to experiment with methods for extracting the rules from connectionist networks. This was partly motivated by psychological considerations but mainly to advance the field in computing and engineering terms. In my lab, the interest in rule extraction was to help to specify the behavior of physical systems, such as engines, which are notoriously difficult to specify in other ways. 
For example, if a neural network could learn to perform a difficult-to-specify task, then, if rules could be extracted from that net, a crude specification could be begun. However, the relationship between symbolic rules and how they emerge from connectionist nets or even whether or not they really exist has never been resolved in connectionism. It seems clear that we can propositionalize our rules and pass them on to other people. Simple rules such as, "eating is not allowed in this room" appear to be learned instantly from reading them in linguistic form, yet we have not seen a universally accepted connectionist explanation for this type of phenomenon and we certainly do not extract these rules from our nervous system after the fact. Now imagine we do extract rules directly from our brains; how would this be done? If we follow the lessons of rule extraction techniques for neural networks, there are two distinctive methods which may be called internal and external. Internal is where the process of extraction operates on network parameters such as weight values or hidden unit activations. External is where the mechanism uses the input and output relation to calculate the rules - it is assumed that the neural network will have filtered out most of the noise. Currently there seems little reason (or evidence) to even think about the idea of extracting rules from our neural synapses - otherwise why can we not extract our bicycle riding rules from our brain. It seems that the only real option would be to "run the net" and calculate the rules from the input output relations. Nonetheless, this is not a well informed answer, nor is there likely to be one at present. This is an issue that needs considerably more research.
--------------------------------------------------------------------------
ALESSANDRO SPERDUTI

Before arguing about the possibility of extracting rules from a neural network, the concept of "rule" itself should be clarified.
In fact, when talking about rules in this context, everybody has in mind the concept of "symbolic rule", i.e., a rule that involves discrete entities. Moreover, the semantics of these entities is defined by a subjective "interpretation" function. However, the concept of rule is far more general: it can involve any kind of entity or variable, subject to the constraint that a finite description (i.e., representation) of the entity exists and can be used. Thus, a rule involving continuous variables and/or entities is legitimate as well, and in many cases useful. Consequently, the question about the capability of a connectionist system to capture "symbolic rules" in such a form as to permit easy reading is conditional on the nature of the learned function: if it is discrete, the posed question is meaningful. Assuming a discrete nature for the learned function, however, there is no guarantee that a trained neural network will encode the function in a way that allows easy reading of "symbolic rules", whose representation, by the way, is in principle arbitrary with respect to the representational primitives (neurons) of the neural network.
--------------------------------------------------------------------------
RON SUN

Many early connectionist models have some significant shortcomings. For example, the limitations due to the regularity of their structures led to difficulty in representing and interpreting symbolic structures (despite some limited successes that we have seen). Other limitations are due to the learning algorithms used by such models, which led to lengthy training (requiring many repeated trials), the need for complete I/O mappings to be known a priori, etc. There are also limitations in terms of biological relevance. For example, these models may bear only remote resemblance to biological processes; they are far less complex than biological NNs, and so on.
In coping with these difficulties, two forms of connectionism emerged: Strong connectionism adheres strictly to the precepts of connectionism, which may be unnecessarily restrictive and incur huge cost for some symbolic processing. On the other hand, weak connectionism (or hybrid connectionism) encourages the incorporation of both symbolic and subsymbolic processes: reaping the benefit of connectionism while avoiding its shortcomings. There have been many theoretical and practical arguments for hybrid connectionism; see e.g. Sun (1994). In light of this background, how do we answer the question of whether there can be "rule reading" in connectionist models? Here is my three-fold answer: (1) Psychologically speaking, the answer is yes. For example, Smith, Langston and Nisbett (1992), Hadley (1990), and Sun (1995) presented strong cases for the existence of EXPLICIT rules in psychological processes, based on psychological experimental data, theoretical arguments, thought experiments, and cognitive modeling. If connectionist models are to become general cognitive models, they should be able to handle the use and the learning of such explicit rules too. (2) Methodologically speaking, the answer is also yes. Connectionism is merely a methodology, and not an exclusive one to be used to the exclusion of other methodologies. Considering our lack of sufficient neurobiological understanding at present, a dogmatic or strict view on "neural plausibility" is not warranted. (3) Computationally speaking, the answer is again yes. By now, we know that we can implement "rule reading" in many ways computationally, e.g., (a) in symbolic forms (which leads to hybrid connectionism), or (b) in connectionist forms (which leads to connectionist implementationalism). Some such implementations may have as good neurobiological plausibility as any other connectionist models.
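One computational route to "rule reading" can be sketched in a few lines. The toy two-input threshold net and its hand-set weights below are invented for illustration (they stand in for a trained network): the net is treated purely as a black box and rules are calculated from its input-output relation alone, the "external" route in Noel Sharkey's terms ("run the net").

```python
from itertools import product

def net(x):
    """Stand-in for a trained network (hand-set weights; illustration only)."""
    w, bias = [2.0, -1.0], -0.5
    return int(sum(wi * xi for wi, xi in zip(w, x)) + bias > 0)

def rules_from_io(net, n_inputs):
    """ "Run the net": query every binary input pattern and emit one
    IF-THEN rule per pattern that turns the output on. Exhaustive
    enumeration is feasible only for tiny input spaces, which is the
    point of the sketch; practical extractors must generalize."""
    rules = []
    for x in product([0, 1], repeat=n_inputs):
        if net(x):
            conds = " AND ".join(f"x{i}={v}" for i, v in enumerate(x))
            rules.append(f"IF {conds} THEN output=1")
    return rules

for rule in rules_from_io(net, 2):
    print(rule)
```

Nothing here inspects a weight; the rules are computed entirely from the network's sanctioned input-output mode, which is why the external route sits more comfortably within the connectionist framework than the internal one.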
The key point is to remove the strait-jacket of strong connectionism: we should advocate (1) methodological connectionism, treating it as one possible approach, not to the exclusion of others, and (2) weak connectionism (hybrid connectionism), encouraging the incorporation of non-NN representations and processes. Clearly, the death knell of strong connectionism has been sounded. It's time for a more open-minded framework in which we conduct our research. My own group has been conducting research in this way for more than a decade. For the work by my group along these lines, see http://www.cecs.missouri.edu/~rsun
--------------------------------------------------------------------------
JOHN TAYLOR

Getting a Connectionist Network to Explain its Rules. JG Taylor, Dept of Mathematics, King's College, Strand, London WC2R2LS, UK. email: john.g.taylor at kcl.ac.uk

Accepting that rules can be extracted from trained neural networks by a range of techniques, I first addressed the problem of how this might occur in the brain. There do not seem to be there similar rule extractors of the connection strengths. In the brain there are two extremes: implicit and explicit rules. Implicit skills, which implement rules in motor responses, are not based on an explicit knowledge of the rules implemented by the neural networks of the motor cortex. It is in explicit rules, as supported by language and the inductive/deductive process, that rules are created by human experience. Turning to language, I described what is presently known from brain imaging about the coding of semantics and syntax in sites in the brain. These both make heavy use of the frontal recurrent cortico-thalamo-NRT circuits, and these can be conjectured to be the architecture used to build phrase structure analysers (through suitable recurrence), guided by 'virtual actions'. These are the basis for syntactic rules and also rules for causal inference, as seen in what is called 'predictive coding' in frontal lobes in monkeys.
Thus rule development is undoubtedly supported by such architectures and styles of processing. There is no reason why it cannot ultimately be implemented in a connectionist framework. Such a methodology would enable a neural system to learn to talk about, and develop, its own explicit rules (although never the implicit ones), and hence solve part of the problem raised by Asim Roy. Implicit rules can be determined by the rule-extraction methods I noted at the beginning.
--------------------------------------------------------------------------
PAUL WERBOS

Asim Roy has asked us to address many very different, though related issues. A condensed response: (1) A completely seamless interface between rule-based "white box" descriptions and neural net learning techniques already exists. I have a patent on "elastic fuzzy logic" (see Gupta and Sinha eds); Fukuda and Yaeger have effectively applied essentially the same method. But I would not call it a "neural network" method exactly (even though neural net learning is used) because I do not believe that real organic brains contain that kind of hardwired readout device. (2) Where, in fact, DOES symbolic reasoning arise in biology? Some of my views are summarized in the book "The Evolution of Human Intelligence" (see www.futurefoundation.org). Curiously, there is a connection to a previous panel Asim organized, addressing memory-based learning. In the most primitive mammal brains, I theorized back in 1977 that there is an interplay between two levels of learning: (1) a slow-learning but powerful system which generalizes from current experience AND from memory; (2) a fast-learning but poorly-generalizing heteroassociative memory system. (e.g. See my chapter in Roychowdhury et al, Theoretical Advances...). At IJCNN2000, Mike Denham described the "what when where" system of the brain.
I theorize that some (or all) primates extended the heteroassociative memory system, to include "who what when where," using mirror neurons to provide an encoding of the experience of other primates. In other words, even monkeys probably have the power to generalize from the (directly observed) experience of OTHER MONKEYS, which they can reconstruct without any higher reasoning faculties. I theorize that human intelligence is basically an extension of this underlying capability, based on a biological system to reconstruct experience of others communicated first by dance (as in the Bushman dance), and later by "word movies." Symbolic reasoning ala Aristotle and Plato, and propositional language ala English, are not really biologically based as such, but learned based on modern culture, and rooted in the biology which supports dance and word movies. If there can be such a thing as truly biologically rooted symbolic/semiotic intelligence, we aren't there yet; modern humanity is only a kind of missing link, a halfway house between other primates and that next level. (For more detail, see the last chapter of my book "The Roots of Backpropagation," Wiley 1994, which also includes the first published work on true backpropagation, and the chapter in Kunio Yasue et al eds, ...Consciousness... forthcoming from John Benjamins.) -------------------------------------------------------------------------- STEFAN WERMTER Linking Neuroscience, Connectionism and Symbolic Processing Does connectionism permit reading of rules from a network? There are at least two main answers to this question. Researchers from Knowledge Engineering and Representation would argue that it has been done successfully, that it is useful if it helps to understand the networks, or they might not even care whether reading is part of connectionism or external symbolic processes. Connectionist representations can be represented as symbolic knowledge at higher abstraction levels. 
Symbolic extraction may not be part of connectionism in the strict sense, but symbolic knowledge can emerge from connectionist networks. This may lead to a better understanding and also to the possibility of combining connectionist knowledge with symbolic knowledge sources. Researchers from Cognitive Science and Neuroscience, on the other hand, would argue that in real neurons in the brain there is no symbolic reading mechanism, that symbolic processing emerges from the dynamics of spreading activation in cortical cell assemblies, and that there may be rule-like behavior emerging from neural elements. It would be useful in the future to explore constraints and principles from cognitive neuroscience for building more plausible neural network architectures, since there is a lot of new evidence from fMRI, EEG, and MEG experiments. Furthermore, new computational models of spiking neural networks, pulse neural networks, and cell assemblies have been designed which promise to link neuroscience with connectionist and even symbolic processing. We are leading the exploration of such efforts in the EmerNet project www.his.sunderland.ac.uk/emernet/. Computational models can benefit from emerging vertical hybridization and abstraction:
1. The symbolic abstraction level is useful for abstract reasoning but lacks preferences.
2. The connectionist knowledge has preferences but still lacks neuroscience reality.
3. Neuroscience knowledge is biologically plausible but architecture and dynamic processing are computationally extremely complex.
Therefore we argue for an integration of all three levels for building neural and intelligent systems in the future.
**************************************************************************
BIOSKETCHES
--------------------------------------------------------------------------
DAN LEVINE
Web site: www.uta.edu/psychology/faculty/levine
--------------------------------------------------------------------------
LEE GILES
http://www.neci.nj.nec.com/homepages/giles/html/bio.html
--------------------------------------------------------------------------
NOEL SHARKEY
Noel Sharkey is an interdisciplinary researcher. Currently a full Professor in the Department of Computer Science at the University of Sheffield, he holds a Doctorate in Experimental Psychology, is a Fellow of the British Computer Society, a Fellow of the Institution of Electrical Engineers, and a member of the British Experimental Psychology Society. He has worked as a research associate in Computer Science at Yale University, USA, with the AI and Cognitive Science groups, and as a senior research associate in psychology at Stanford University, USA, where he has also twice served as a visiting assistant professor. His other jobs have included a "new blood" lectureship (English assistant professor) in Language and Linguistics at Essex University, U.K. and a Readership in Computer Science at Exeter. His editorial work includes Editor-in-Chief of the journal Connection Science, editorial board of Robotics and Autonomous Systems, and editorial board of AI Review. He was Chairman of the IEE professional group A4 (AI) and founding chairman of IEE professional group A9 (Evolutionary and Neural Computing). He has edited special issues on modern developments in autonomous robotics for the journals Robotics and Autonomous Systems, Connection Science, and Autonomous Robots. Noel's intellectual pursuits are in the area of biologically inspired adaptive robotics. In recent years Noel has been involved with the public understanding of science, engineering, technology and the arts.
He makes regular appearances on TV as judge and commentator of robot competitions and is director of the Creative Robotics Unit at Magna (CRUM), with projects in flying swarms of robots and in the evolution of cooperation in collective robots.
--------------------------------------------------------------------------
ALESSANDRO SPERDUTI
Alessandro Sperduti received his education from the University of Pisa, Italy ("laurea" and Doctoral degrees in 1988 and 1993, respectively, both in Computer Science). In 1993 he spent a period at the International Computer Science Institute, Berkeley, supported by a postdoctoral fellowship. In 1994 he moved back to the Computer Science Department, University of Pisa, where he was Assistant Professor, and where he is presently Associate Professor. His research interests include pattern recognition, image processing, neural networks, and hybrid systems. In the field of hybrid systems his work has focused on the integration of symbolic and connectionist systems. He contributed to the organization of several workshops on this subject and also served on the program committees of conferences on Neural Networks. Alessandro Sperduti is the author or co-author of around 70 refereed papers, mainly in the areas of Neural Networks, Fuzzy Systems, Pattern Recognition, and Image Processing. Moreover, he gave several tutorials within international schools and conferences, such as IJCAI `97 and IJCAI `99. He acted as Guest Co-Editor of the IEEE Transactions on Knowledge and Data Engineering for a special issue on Connectionist Models for Learning in Structured Domains, and of the journal Cognitive Systems Research for a special issue on Integration of Symbolic and Connectionist Information Processing Systems.
--------------------------------------------------------------------------
RON SUN
Ron Sun is an associate professor of computer engineering and computer science at the University of Missouri-Columbia.
He received his Ph.D. in 1991 from Brandeis University. Dr. Sun's research interests center around the study of intelligence and cognition, especially in the areas of hybrid neural network models, machine learning, and connectionist knowledge representation and reasoning. He is the author of over 100 papers, and has written, edited or contributed to 15 books, including authoring the book Integrating Rules and Connectionism for Robust Commonsense Reasoning and co-editing Computational Architectures Integrating Neural and Symbolic Processes. For his paper on models of human reasoning, he received the 1991 David Marr Award from the Cognitive Science Society. He organized and chaired the Workshop on Integrating Neural and Symbolic Processes, 1992, and the Workshop on Connectionist-Symbolic Integration, 1995, as well as co-chairing the Workshop on Cognitive Modeling, 1996 and the Workshop on Hybrid Neural Symbolic Systems, 1998. He has also been on the program committees of the National Conference on Artificial Intelligence (AAAI-93, AAAI-97, AAAI-99), International Joint Conference on Neural Networks (IJCNN-99 and IJCNN-2000), International Two-Stream Conference on Expert Systems and Neural Networks, and other conferences, and has been an invited/plenary speaker for some of them. Dr. Sun is the editor-in-chief of Cognitive Systems Research (Elsevier). He also serves on the editorial boards of Connection Science, Applied Intelligence, and Neural Computing Surveys. He was a guest editor of a special issue of the journal Connection Science and a special issue of IEEE Transactions on Neural Networks, both on hybrid intelligent models. He is a senior member of IEEE.
--------------------------------------------------------------------------
JOHN TAYLOR
Trained as a theoretical physicist in the Universities of London and Cambridge. Positions in Universities in the UK, USA, Europe in physics and mathematics.
Created the Centre for Neural Networks at King's College, London, in 1990, and is still its Director. Appointed Professor of Mathematics, King's College London in 1972, and became Emeritus Professor of Mathematics of London University in 1996. Was Guest Scientist at the Research Centre in Juelich, Germany, 1996-8, working on brain imaging and data analysis. Has been consultant in Neural Networks to several companies. Is presently Director of Research on Global Bond products and Tactical Asset Allocation for a financial investment company involved in time series prediction and European Editor-in-Chief of the journal Neural Networks. He was President of the International Neural Network Society (1995) and the European Neural Network Society (1993/4). He is also editor of the series Perspectives in Neural Computing. Has been on the Advisory Board of the Brain Sciences Institute, RIKEN in Tokyo since 1997. Has published over 500 scientific papers (in theoretical physics, astronomy, particle physics, pure mathematics, neural networks, higher cognitive processes, brain imaging, consciousness), authored 12 books, edited 13 others, including the titles When the Clock Struck Zero (Picador Press, 1994), Artificial Neural Networks (ed, North-Holland, 1992), The Promise of Neural Networks (Springer, 1993), Mathematical Approaches to Neural Networks (ed, Elsevier, 1994), Neural Networks (ed, A Waller, 1995) and The Race for Consciousness (MIT Press, 1999). Started research in neural networks in 1969. Present research interests are: financial and industrial applications; dynamics of learning processes and multi-state synapses; stochastic neural chips and their applications (the pRAM chip); brain imaging and its relation to neural networks; neural modelling of higher cognitive brain processes, including consciousness. 
Has funded research projects from the EC (on building a hybrid symbolic/subsymbolic processor), from British Telecom (on Intelligent Agents), and from EPSRC (on Building a Neural Network Language System to learn syntax and semantics). -------------------------------------------------------------------------- STEFAN WERMTER Stefan Wermter is Full Professor of Computer Science and Research Chair in Intelligent Systems at the University of Sunderland, UK. He is an Associate Editor of the journal Connection Science and serves on the editorial boards of the journals Cognitive Systems Research and Neural Computing Surveys. He has written or edited three books as well as more than 70 articles. He is also Coordinator of the international EmerNet network for neural architectures based on neuroscience and head of the intelligent systems group http://www.his.sunderland.ac.uk/. He holds a Diplom from Dortmund University, an MSc from the University of Massachusetts, and a PhD and Habilitation from Hamburg University. Stefan Wermter's research interests are in Neural Networks, Hybrid Systems, Cognitive Neuroscience, Natural Language Processing, Artificial Intelligence and Bioinformatics. The motivation for this research is twofold: How is it possible to bridge the large gap between real neural networks in the brain and high-level cognitive performance? How is it possible to build more effective systems which integrate neural and symbolic technologies in hybrid systems? Based on this motivation Wermter has directed and worked on several projects, e.g. on hybrid neural/symbolic systems for text processing and speech/language integration. Furthermore, he has research interests in Knowledge Extraction from Neural Networks, Interactive Neural Network Agents, Cognitive Neuroscience, Fuzzy Systems, as well as the Integration of Speech/Language/Image Processing. 
-------------------------------------------------------------------------- From stiber at u.washington.edu Thu Oct 26 14:49:50 2000 From: stiber at u.washington.edu (Prof. Michael Stiber) Date: Thu, 26 Oct 2000 11:49:50 -0700 (PDT) Subject: Faculty positions at the University of Washington, Bothell Message-ID: <14840.31950.401515.882178@kasei.bothell.washington.edu> UNIVERSITY OF WASHINGTON, BOTHELL Computing and Software Systems http://www.bothell.washington.edu/CSS Assistant, Associate & Full Professor Positions Innovative and growing computer science program has multiple openings for tenure-track faculty. The Computing & Software Systems (CSS) program at the University of Washington, Bothell (UWB) offers opportunities to: * Establish new research programs, participate in the on-going research efforts, and/or collaborate with local high-tech companies; * Teach state-of-the-art, interdisciplinary computing courses; * Expand program options for undergraduates (e.g., embedded systems, graphics design); and * Participate in designing a new Master's degree program in CSS. Well-qualified candidates in all related areas, including but not limited to databases, networking, distributed systems, and web technology, are encouraged to apply. The CSS Program, in its fifth year of existence, emphasizes and rewards interdisciplinary scholarship and a balance between teaching, research and service. Current research interests include: biocomputing, embedded systems, computer graphics, digital multimedia in distributed environments, HCI, and software engineering. Successful candidates will have the options of participating in these areas and/or establishing new research programs. The undergraduate degree program stresses development of competencies associated with life-long learning, critical thinking, problem solving, communication and management -- as well as contemporary programming and technical skills. 
Situated in the midst of the Northwest's technology corridor, the fastest growing information technology region in the country, UWB offers opportunities for collaborative work with industry. The University of Washington, Bothell is part of the three-campus University of Washington -- one of the premier institutions of higher education in the U.S. QUALIFICATIONS AND DUTIES Candidates will have a doctorate (required prior to date of appointment) in a relevant field. Successful candidates will be expected to teach a variety of courses, conduct scholarly research, and contribute to program and institution building in this innovative, interdisciplinary environment. Faculty are also expected to advise students during their internships with industry/community partners. A determination of rank and tenure at the assistant, associate or full professor level will be commensurate with the qualifications of the individual. APPLICATION INFORMATION Please send a cover letter, statement of teaching philosophy, statement of research interests, and vita. Also, please arrange for three letters of reference to be sent to: Janet McDaniel Coordinator, CSS Search Committee University of Washington, Bothell Campus mail: Box 358534 18115 Campus Way NE Bothell, WA 98011-8246 The University of Washington is building a culturally diverse faculty and strongly encourages applications from female and minority candidates. Review of applications will begin November 15, 2000, and positions will remain open until they are filled. The University of Washington is an Equal Opportunity/Affirmative Action Employer From nello at dcs.rhbnc.ac.uk Fri Oct 27 08:31:43 2000 From: nello at dcs.rhbnc.ac.uk (Nello Cristianini) Date: Fri, 27 Oct 2000 13:31:43 +0100 (BST) Subject: Support Vector Book: Reprint Now Available Message-ID: The second reprint of the Support Vector Book is now finally available via Cambridge University Press, Amazon (US, UK, DE), and other online bookshops. 
Apologies for the delay in reprinting; we did not anticipate such high demand. Details at: http://www.support-vector.net/ AN INTRODUCTION TO SUPPORT VECTOR MACHINES (and other kernel-based learning methods) N. Cristianini and J. Shawe-Taylor Cambridge University Press 2000 ISBN: 0 521 78019 5 From adr at adrlab.ahc.umn.edu Fri Oct 27 12:29:37 2000 From: adr at adrlab.ahc.umn.edu (A. David Redish) Date: Fri, 27 Oct 2000 11:29:37 -0500 Subject: Postdoctoral position available - Please post Message-ID: <200010271629.LAA03243@adrlab.ahc.umn.edu> Please post. adr POSTDOCTORAL POSITIONS AVAILABLE Postdoctoral positions are immediately available in my laboratory in the Department of Neuroscience at the University of Minnesota. Research interests include behavioral neuroscience, with specific interests in spatial cognition in rats. My laboratory does both experimental and computational work, and postdocs would be expected to participate in both aspects. Experimentally, we record from ensembles of neurons in awake, behaving animals. Computationally, we model at the systems level using models that are tightly constrained by experimental data. Current projects include understanding aspects of hippocampal activity, of the head direction system, and of the basal ganglia. Candidates should have recently completed (or be about to complete) their PhD or MD and should have some experience in either computational or experimental neuroscience. Pay will be commensurate with NIH standards. Those interested should contact me via email at the following address: redish at ahc.umn.edu The University of Minnesota is an equal-opportunity employer. ----------------------------------------------------- A. 
David Redish redish at ahc.umn.edu Assistant Professor http://www.cbc.umn.edu/~redish Department of Neuroscience, University of Minnesota 6-145 Jackson Hall 321 Church St SE Minneapolis MN 55455 ----------------------------------------------------- From Emmanuel.Dauce at cert.fr Fri Oct 27 12:44:52 2000 From: Emmanuel.Dauce at cert.fr (Emmanuel Dauce) Date: Fri, 27 Oct 2000 18:44:52 +0200 (MET DST) Subject: DYNN'2000 Program Message-ID: <200010271644.SAA09955@bigorre.cert.fr> ************************************************************ INTERNATIONAL WORKSHOP ON "DYNAMICAL NEURAL NETWORKS AND APPLICATIONS" Bielefeld, November 20-24, 2000 http://www.cert.fr/anglais/deri/dauce/DYNN2000 ************************************************************ Dear colleagues, The organizing committee of DYNN'2000 is pleased to announce the program of the international workshop to be held in Bielefeld, Germany, from Monday, Nov. 20 to Friday, Nov. 24, 2000. This multidisciplinary workshop is devoted to the study of natural and artificial neural networks viewed as dynamical systems evolving under environmental pressure. For further information, visit our website. After the previous editions in Toulouse (1996) and Stockholm (1998), DYNN'2000 will take place at the ZIF of Bielefeld University in the frame of the Forschungsgruppe "The Sciences of Complexity: From Mathematics to Technology to a Sustainable World." Students are strongly encouraged to attend DYNN'2000. The registration fees are especially low for doctoral students, and some university rooms are reserved for their lodging during the workshop. Details regarding how students can apply for these rooms are available on the DYNN'2000 web site. We apologize, in advance, for any redundant messages that you may receive. Thank you for your time in reading this message. If you have any further questions, you may contact me at: samuelid at supaero.fr. 
Manuel Samuelides DYNN'2000 Workshop Chair ------------------------------------------------------------------------ Program of DYNN'2000 INTERNATIONAL WORKSHOP ON "DYNAMICAL NEURAL NETWORKS AND APPLICATIONS" Bielefeld, November 20-24, 2000 ------------------------------------------------------------------------ Monday November 20, 2000 16:00-16:30 Registration and welcome of registered participants 16:30-19:00 Opening Session Chair: Manuel Samuelides, ENSAE/UPS Toulouse, France. ---------------------------------------------------- 16:30 Opening Allocution for the millennium edition of the DYNN workshop Philippe Blanchard, ZIF, Bielefeld, Germany 17:00 Information Theoretic Approach to Neural Coding Jean-Pierre Nadal, Ecole Normale Supérieure, Paris, France 18:00 Modeling the dynamical construction of meaning in language Bernard Victorri, Ecole Normale Supérieure, Paris, France ------------------------------------------------------------------------ Tuesday November 21, 2000 9:00-12:00 Dynamics and Self-organization 1 Chair: Philippe Blanchard, Bielefeld, Germany. 
---------------------------------------------------- 09:00 Spike-Time dependent learning rules Wulfram Gerstner, EPFL, Lausanne, Switzerland 09:50 An introduction to Self-Organized Criticality Bruno Cessac, Institut Non-Linéaire de Nice, France 10:30 Coffee Break 10:50 Avalanches of activity in neural networks: finite size effects Christian Eurich, Universitaet Bremen, Germany 11:40 SOC properties of PCNN and application to image processing Xavier Clastres, ONERA/UPS, Toulouse, France 12:00-14:00 Lunch at ZIF 14:00-17:00 Control 1 Chair: Jean-Pierre Nadal, ENS, Paris, France ---------------------------------------------------- 14:00 A Control approach framework for attentional brain processing John Taylor, King's College, London, United Kingdom 15:00 Cognition and Autonomy of Robots from the Dynamical Systems Jun Tani, Sony Computer Laboratory, Tokyo, Japan 15:40 Dynamical systems for perception and action in autonomous robots. Michael Herrmann, MPI fuer Stroemungsforschung, Germany 16:20 Information transmission of arms coordination and movement direction in the motor cortex. Valeria Del Prete, SISSA, Trieste, Italy. 16:40 Information transmission of arms coordination and movement Jim Vaccaro, US Air Force Rome Laboratory, USA 17:00-17:30 Coffee break 17:30-19:00 Mathematical modelling of the nervous system. Chair: Jacques Demongeot, IUF, Grenoble, France ---------------------------------------------------- 17:30 A Control approach framework for attentional brain processing John Taylor, King's College, London, United Kingdom 18:30 Fractal model for neural network organization in the breathing center in mouse. Arthur Luczak, Jagiellonian University, Krakow, Poland. 
------------------------------------------------------------------------ Wednesday, November 22, 2000 9:00-12:00 Dynamics and Self-organization 2 Chair: Bruno Cessac, INL, Nice, France ---------------------------------------------------- 09:00 Positive regulation cycles and dynamics of neural network Jacques Demongeot, Grenoble, Institut Universitaire de France. 09:40 Estimation in interacting point process models: a Gibbsian approximation approach. Thierry Hervé, IMAG, Grenoble, France. 10:20 Coffee Break 10:40 Dynamical memories on large recurrent neural networks. Sebastiano Stramaglia, Istituto Elaborazione Segnali ed Immagini, Bari, Italy. 11:20 Dynamical memories on large recurrent neural networks. Emmanuel Daucé, ONERA/DTIM, Toulouse, France. 11:40 Using small parameter changes to access various limit-cycles in random networks. Patrick Mc Guire, ZIF Universität Bielefeld, University of Arizona, USA. 12:00-14:00 Lunch at ZIF 14:00-16:00 Control 2 Chair: John Taylor, King's College, London, UK ---------------------------------------------------- 14:00 Importance of the interaction dynamics in the imitation processes: From temporal sequence learning to implicit reward communication. Philippe Gaussier, ETIS/ENSEA, Cergy-Pontoise, France. 14:40 Recurrent neural networks for robotic control and planning Mathias Quoy, ETIS/ENSEA, Cergy-Pontoise, France. 15:20 Dynamic Neural Fields for Robot Control Axel Steinhage, Ruhr-Universität, Bochum, Germany. 16:00-16:30 Coffee break 16:30-18:30 Spiking Neurons 1 Chair: Jacques Demongeot, IUF, Grenoble, France ---------------------------------------------------- 16:30 Recurrent competition, synaptic dynamics and optimal coding in primary visual cortex Klaus Obermayer, Technische Universität Berlin, Germany. 17:30 Influence of noise on neuronal dynamics: from single units to interacting systems Khashayar Pakdaman, INSERM-U244, Paris, France. 18:10 Back to spike coding in networks trained with sigmoidal neurons. 
Christelle Godin, CEA Grenoble, France. ------------------------------------------------------------------------ Thursday, November 23, 2000 9:00-12:20 Dynamics and Self-organization 3 Chair: Philippe Blanchard, Bielefeld, Germany ---------------------------------------------------- 09:00 A large deviation approach to random recurrent neural network dynamics. Manuel Samuelides, ENSAE, ONERA, Toulouse, France. 10:00 A mathematical approach to the problem of synchronization in the cortex Ruedi Stoop, University/ETH, Zurich. 10:40 Coffee Break 11:00 Frustrated dynamics in small Hopfield networks. Hugues Bersini, IRIDIA/ULB, Bruxelles, Belgium. 11:40 Geometrical analysis of associative memory dynamics and practical application. Toshiki Kindo, Matsushita Research Institute Tokyo Inc., Japan. 12:00-14:00 Lunch at ZIF 14:00-17:00 Spiking Neurons 2 Chair: Jacques Demongeot, IUF, Grenoble, France ---------------------------------------------------- 14:00 Reverse Engineering the Visual System with Networks of Asynchronously Spiking Neurons Simon Thorpe, CNRS/CERCO, Toulouse, France. 15:00 Synfire Chain in Coincidence Detectors. Ikeda Kazushi, Kyoto University, Japan. 15:40 A generative model for temporal Hebbian synaptic plasticity. Laurent Perrinet, ONERA/DTIM, Toulouse, France. 16:00 Rapid dynamic feature binding in feedforward spiking neural vector networks. Sander Bohte, University of Amsterdam, Netherlands. 16:20 Toward a realistic neuron learning rule. Jianfeng Feng, Sussex University, Brighton, UK. 16:40-17:00 Coffee break 17:30-19:00 Vision Chair: Philippe Gaussier, ETIS/ENSEA, Cergy-Pontoise, France ---------------------------------------------------- 17:00 Advantages in Image Processing using Dynamic Neurons Jason Kinser, George Mason University, USA. 17:40 Optical flow based on the Pulse Coupled Neural Network. Watanabe Takashi, Saitama University, Japan. 
18:20 Analysis of Development of Oriented Receptive Fields in Linsker's Model Tadashi Yamazaki, Tokyo Institute of Technology, Japan. 18:40 A Neural Network Approach for Cancer Diagnosis Based on Dynamical Data Sets Andreas Degenhard, Institute of Cancer Research, Great Britain. ------------------------------------------------------------------------ Friday, November 24, 2000 9:00-12:20 Mean-field theory Chair: John Taylor, King's College, London, UK ----------------------------------------------------- 9:50 Mean field theory for stochastic neural networks and graphical models. Bert Kappen, University of Nijmegen, Netherlands. 10:40 Coffee Break 11:00 On the landscape of attractors in Hopfield-like neural networks. Daniel Gandolfo, CPT-CNRS, Luminy, France. 11:40 Dynamics of associative memory networks with synaptic depression. Dimitri Bibitchkov, Max Planck Institute, Göttingen, Germany. From Frank.Pasemann at rz.uni-jena.de Sat Oct 28 07:43:36 2000 From: Frank.Pasemann at rz.uni-jena.de (Frank Pasemann) Date: Sat, 28 Oct 2000 13:43:36 +0200 Subject: OPEN POSITION PhD/Postdoc Message-ID: <39FABBE8.C1979173@rz.uni-jena.de> Please post: The TheoLab - Research Unit for Structure Dynamics and the Evolution of Systems - Friedrich Schiller University Jena, Germany - invites PhD candidates and PhD recipients with pronounced interests in the field of "Evolution and Dynamics of Neural Controllers for Autonomous Systems" to apply for a PhD Studentship (BAT IIa/2-O) / Postdoctoral Fellowship (BAT IIa-O). Positions are open immediately and are limited to May 31, 2002. Extensions may be possible. Applicants should have experience in computer simulations (C, C++, Graphics, Linux). A background in the fields of Dynamical Systems Theory, Neural Networks, and/or Embodied Cognitive Science is favorable. Experiments can be done with Khepera robots or software agents. 
The theoretically oriented activities will be concerned with the relationship between the behavior of agents and the driving network structures/dynamics, the theory of evolutionary processes, the optimization of evolutionary algorithms, and the development of behavior-oriented learning strategies for recurrent networks. For further information see http://www.theorielabor.de. Successful applicants may benefit from TheoLab's association with the Max Planck Institute for Mathematics in the Sciences, Leipzig, and the Institute for Medical Statistics, Informatics and Documentation, University Jena. Applications (CV, names of two academic referees) should be sent as soon as possible to Prof. Dr. Frank Pasemann TheorieLabor Friedrich-Schiller-Universität Jena Ernst-Abbe-Platz 4, D-07740 Jena Tel: +49-3641-949530 frank.pasemann at rz.uni-jena.de From Dave_Touretzky at cs.cmu.edu Mon Oct 30 00:36:35 2000 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Mon, 30 Oct 2000 00:36:35 -0500 Subject: graduate study in the neural basis of cognition Message-ID: <18291.972884195@skinner.boltz.cs.cmu.edu> The Center for the Neural Basis of Cognition (CNBC) offers interdisciplinary doctoral training in affiliation with seven PhD programs at Carnegie Mellon University and the University of Pittsburgh. See our web site http://www.cnbc.cmu.edu for detailed information. The Center is dedicated to the study of the neural basis of cognitive processes. The Center's goals are: 1) To elucidate cellular and molecular mechanisms of information coding and processing by neural circuits. 2) To understand how neural circuits interact in functional systems that form the substrate for cognitive processes, including perception, thought, behavior and language. 3) To identify mechanisms that enable genes, development and diseases to shape cognition. 4) To promote the application of research results to problems in medicine, artificial intelligence and robotics. 
In order to foster innovative research, the Center encourages and supports cross-disciplinary research between workers in the fields of neuroscience, psychology, mathematics, computer science, and medicine. In this environment, students have access to world-class research facilities for cellular and systems neurophysiology, functional brain imaging, confocal imaging, transgenic methods, gene-chip technology and high performance computing. A variety of different experimental models are in use, ranging from simple in vitro systems, to whole animal work on transgenic animals, primates and neurologically and psychologically-defined patient populations. As part of the Center's commitment to cross-disciplinary research, we have recently instituted an IGERT program with support from the National Science Foundation. IGERT, which stands for Integrative Graduate Education and Research Training, allows students to receive hands-on training in a relevant crossover discipline. For example, a computer scientist specializing in neural modeling could learn to do neurophysiology, or a psychology student working on functional brain imaging could receive training in advanced statistical analysis of fMRI data. The IGERT option is available to all CNBC students. Students are admitted jointly to a home training program and to the CNBC Training Program. Applications are encouraged from students having interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, or robotics. For a brochure describing the program and application materials, contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu Application materials are also available online. 
The affiliated PhD programs are: Computer Science, Psychology, Robotics, and Statistics at Carnegie Mellon University; and the Center for Neuroscience, Mathematics, and Psychology at the University of Pittsburgh. The CNBC training faculty includes: John Anderson (CMU Psychology): ACT-R theory of cognition, problem-solving German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving Cameron Carter (Pitt Neurosci., Psych): fMRI, attention, memory, schizophrenia Carson Chow (Pitt Mathematics): spatiotemporal dynamics in neural networks Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease William Eddy (CMU Statistics): analysis of fMRI data Bard Ermentrout (Pitt Mathematics): oscillations in neural systems Julie Fiez (Pitt Psychology): fMRI studies of language Chris Genovese (CMU Statistics): making inferences from scientific data Lori Holt (CMU Psychology): mechanisms of speech perception, perceptual learning John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex Marcel Just (CMU Psychology): visual thinking, language comprehension Robert Kass (CMU Statistics): transmission of info. by collections of neurons Eric Klann (Pitt Neuroscience): hippocampal LTP and LTD Roberta Klatzky (CMU Psychology): human perception and cognition Richard Koerber (Pitt Neurobiology): devel. and plasticity of spinal networks Tai Sing Lee (CMU Comp. Sci.): primate visual cortex; computer vision Pat Levitt (Pitt Neurobiology): molecular basis of cortical development Michael Lewicki (CMU Comp. 
Sci.): learning and representation David Lewis (Pitt Neuroscience): anatomy of frontal cortex Brian MacWhinney (CMU Psychology): models of language acquisition Yoky Matsuoka (CMU Robotics): rehabilitation robotics, motor psychophysics James McClelland (CMU Psychology): connectionist models of cognition Paula Monaghan Nichols (Pitt Neurobiology): vertebrate CNS development Carl Olson (CNBC): spatial representations in primate frontal cortex David Plaut (CMU Psychology): connectionist models of reading Michael Pogue-Geile (Pitt Psychology): development of schizophrenia Lynn Reder (CMU Psychology): organization of memory, cognitive processing Jonathan Rubin (Pitt Mathematics): oscillations in networks of coupled neurons Walter Schneider (Pitt Psych.): fMRI, models of attention & skill acquisition Charles Scudder (Pitt Neurobiology): motor learning in cerebellum Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex William Skaggs (Pitt Neuroscience): representations in rodent hippocampus Peter Strick (Pitt Neurobiology): basal ganglia & cerebellum, motor system David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning See http://www.cnbc.cmu.edu/people for further details. From icann at ai.univie.ac.at Mon Oct 30 11:35:46 2000 From: icann at ai.univie.ac.at (ICANN conference server) Date: Mon, 30 Oct 2000 17:35:46 +0100 Subject: Call for Papers: ICANN 2001 Message-ID: <39FDA362.735EF3C0@ai.univie.ac.at> Call for Papers ============================================================ ICANN 2001 International Conference on Artificial Neural Networks Aug. 
21-25, 2001 Vienna, Austria http://www.ai.univie.ac.at/icann the annual conference of the European Neural Network Society ============================================================ We invite submission of full-length papers on work in neural computation covering theory, algorithms, applications or implementation in one of the following areas: - Computational Neuroscience - Data Analysis and Pattern Recognition - Vision and Image Processing - Signal and Time Series Processing - Robotics and Control - Connectionist Cognitive Science - Selforganization and Dynamical Systems Deadline for receipt of papers: !!!!!!!!!!!!!!! Feb 16, 2001 !!!!!!!!!!!!!!! (electronic submission will be possible) Please see our web page for more details. http://www.ai.univie.ac.at/icann ___________________________________________________________ Furthermore, we invite proposals for tutorials, special sessions or workshops on selected topics in neural computation. Deadline for receipt of proposals: !!!!!!!!!!!!! Feb 1, 2001 !!!!!!!!!!!!! Please see our web page for more details http://www.ai.univie.ac.at/icann ___________________________________________________________ ICANN 2001 is organized by the Austrian Research Institute for Artificial Intelligence in cooperation with the Vienna University of Technology's Pattern Recognition and Image Processing Group and Center for Computational Intelligence. 
General Chair: Georg Dorffner, Austria Program Co-Chairs: Horst Bischof, Austria Kurt Hornik, Austria Organizing Committee: Arthur Flexer, Austria Karin Hraby, Austria Friedrich Leisch, Austria Andreas Weingessel, Austria Workshop Chair: Christian Schittenkopf, Austria Program Committee: Shun-Ichi Amari, Japan Peter Auer, Austria Pierre Baldi, USA Peter Bartlett, Australia Shai Ben David, Israel Matthias Budil, Austria Joachim Buhmann, Germany Rodney Cotterill, Denmark Gary Cottrell, USA Kostas Diamantaras, Greece Wlodek Duch, Poland Peter Erdi, Hungary Patrick Gallinari, France Wulfram Gerstner, Switzerland Stan Gielen, The Netherlands Stefanos Kollias, Greece Vera Kurkova, Czech Republic Anders Lansner, Sweden Ales Leonardis, Slovenia Hans-Peter Mallot, Germany Christoph von der Malsburg, Germany Klaus-Robert Müller, Germany Thomas Natschläger, Austria Lars Niklasson, Sweden Stefano Nolfi, Italy Erkki Oja, Finland Gert Pfurtscheller, Austria Stephen Roberts, UK Mariagiovanna Sami, Italy Jürgen Schmidhuber, Switzerland Olli Simula, Finland Peter Sincak, Slovakia John Taylor, UK Volker Tresp, Germany Michel Verleysen, Belgium ___________________________________________________________ For questions contact icann at ai.univie.ac.at From David.Cohn at acm.org Mon Oct 30 21:11:01 2000 From: David.Cohn at acm.org (David 'Pablo' Cohn) Date: Mon, 30 Oct 2000 21:11:01 -0500 Subject: Two new papers in the Journal of Machine Learning Research Message-ID: <4.2.0.58.20001030210422.00af08f0@ux7.sp.cs.cmu.edu> [Apologies for the broad distribution. Future announcements of this type will only be posted to jmlr-announce at ai.mit.edu. See the end of this message for information on subscribing.] The Journal of Machine Learning Research is pleased to announce the availability of two papers in electronic form. ---------------------------------------- Learning with Mixtures of Trees Marina Meila and Michael I. Jordan. Journal of Machine Learning Research 1 (October 2000) pp. 1-48. 
Abstract This paper describes the mixtures-of-trees model, a probabilistic model for discrete multidimensional domains. Mixtures-of-trees generalize the probabilistic trees of Chow and Liu (1968) in a different and complementary direction to that of Bayesian networks. We present efficient algorithms for learning mixtures-of-trees models in maximum likelihood and Bayesian frameworks. We also discuss additional efficiencies that can be obtained when data are "sparse," and we present data structures and algorithms that exploit such sparseness. Experimental results demonstrate the performance of the model for both density estimation and classification. We also discuss the sense in which tree-based classifiers perform an implicit form of feature selection, and demonstrate a resulting insensitivity to irrelevant attributes. ---------------------------------------- Dependency Networks for Inference, Collaborative Filtering, and Data Visualization David Heckerman, David Maxwell Chickering, Christopher Meek, Robert Rounthwaite, and Carl Kadie. Journal of Machine Learning Research 1 (October 2000), pp. 49-75. Abstract We describe a graphical model for probabilistic relationships--an alternative to the Bayesian network--called a dependency network. The graph of a dependency network, unlike a Bayesian network, is potentially cyclic. The probability component of a dependency network, like a Bayesian network, is a set of conditional distributions, one for each node given its parents. We identify several basic properties of this representation and describe a computationally efficient procedure for learning the graph and probability components from data. We describe the application of this representation to probabilistic inference, collaborative filtering (the task of predicting preferences), and the visualization of acausal predictive relationships. 
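The tree-learning step of Chow and Liu (1968), which the mixtures-of-trees abstract above builds on, can be sketched in a few lines: weight every pair of variables by its empirical mutual information, then keep a maximum-weight spanning tree. This is only an illustrative sketch under our own toy data and function names, not the authors' implementation.

```python
# Sketch of Chow-Liu tree learning: pairwise mutual information plus a
# maximum-weight spanning tree (Kruskal with union-find). Illustration
# only; all names and the toy data below are our own assumptions.
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete columns."""
    mi = 0.0
    for vx in np.unique(x):
        for vy in np.unique(y):
            pxy = np.mean((x == vx) & (y == vy))
            if pxy > 0:
                mi += pxy * np.log(pxy / (np.mean(x == vx) * np.mean(y == vy)))
    return mi

def chow_liu_tree(data):
    """Return edges of a maximum-weight spanning tree over the columns of
    `data`, with edge weights given by pairwise mutual information."""
    n_vars = data.shape[1]
    edges = sorted(((mutual_information(data[:, i], data[:, j]), i, j)
                    for i, j in combinations(range(n_vars), 2)), reverse=True)
    parent = list(range(n_vars))
    def find(u):                      # union-find with path halving
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = []
    for _, i, j in edges:             # greedily add heaviest acyclic edges
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy data: column 2 is a noisy copy of column 0; column 1 is independent.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 500)
b = rng.integers(0, 2, 500)
c = (a ^ (rng.random(500) < 0.05)).astype(int)
data = np.column_stack([a, b, c])
print(chow_liu_tree(data))  # the learned tree links columns 0 and 2
```

A mixture of such trees, as in the paper, would fit several trees to responsibility-weighted data inside an EM loop; the spanning-tree step above is the inner building block.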
These first two papers of Volume 1 are available at http://www.jmlr.org in PostScript, PDF and HTML formats; a bound, hardcopy edition of Volume 1 will be available in the next year. -David Cohn, Managing Editor, Journal of Machine Learning Research ------- This message has been sent to the mailing list "jmlr-announce at ai.mit.edu", which is maintained automatically by majordomo. To subscribe to the list, send mail to listserv at ai.mit.edu with the line "subscribe jmlr-announce" in the body; to unsubscribe send email to listserv at ai.mit.edu with the line "unsubscribe jmlr-announce" in the body. From melnik at cs.brandeis.edu Mon Oct 30 23:16:53 2000 From: melnik at cs.brandeis.edu (Ofer Melnik) Date: Mon, 30 Oct 2000 23:16:53 -0500 (EST) Subject: DIBA source code release Message-ID: The Decision Intersection Boundary Algorithm describes what a threshold feed-forward neural network is doing in the form of polytopic decision regions. It can be used to visualize and analyze the exact computation being performed by these types of networks. Examples of the kind of information that can be extracted by this algorithm can be seen on the webpage: http://www.demo.cs.brandeis.edu/pr/DIBA The basic DIBA algorithm is being released as C++ source code under the GNU General Public License, without support. It has been compiled under Linux, but should compile under any standard C++ compiler and libraries. The three-dimensional output formats available are Matlab, Povray and MHD. -Ofer Melnik ----------------------------------------------------------------- Ofer Melnik PhD melnik at cs.brandeis.edu Volen Center for Complex Systems Ph: (781)-736-2719 Brandeis University (781)-736-DEMO Waltham MA 02254 From masuoka at flab.fujitsu.co.jp Tue Oct 31 00:01:28 2000 From: masuoka at flab.fujitsu.co.jp (Ryusuke Masuoka) Date: Tue, 31 Oct 2000 00:01:28 -0500 Subject: FW: Ph.D. 
Thesis available: Neural networks learning differential data Message-ID: Dear Connectionists, Some people seem to have trouble accessing the URL provided in my attached email. It seems that ~ (tilde) has been replaced with ? (question mark) somewhere. (I know some do not have this problem, and I do not yet know why it happened.) The URL is, again, the following: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/thesis/thesis.html If you ever see a question mark in front of "masuoka" in the above URL, please replace it with ~ (tilde). Sorry for any trouble I might have caused you, and thank you to those who kindly notified me of the problem. Regards, Ryusuke -------------- next part -------------- An embedded message was scrubbed... From: unknown sender Subject: no subject Date: no date Size: 38 Url: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/19e87a36/attachment.mht From masuoka at flab.fujitsu.co.jp Sun Oct 15 18:48:27 2000 From: masuoka at flab.fujitsu.co.jp (Ryusuke Masuoka) Date: Sun, 15 Oct 2000 17:48:27 -0500 Subject: Ph.D. Thesis available: Neural networks learning differential data In-Reply-To: <200010131754.LAA13846@grey.colorado.edu> Message-ID: Dear Connectionists, I am pleased to announce the availability of my Ph.D. thesis for download in electronic format. Comments are welcome. Regards, Ryusuke ------------------------------------------------------------ Thesis: ------- Title: "Neural Networks Learning Differential Data" Advisor: Michio Yamada URL: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/thesis/thesis.html Abstract: -------- Learning systems, which learn from previous experiences and/or provided examples of appropriate behaviors, allow people to specify {\em what} the systems should do for each case, not {\em how} the systems should act at each step. This eases users' burdens to a great extent. 
It is essential for efficient and accurate learning that supervised learning systems such as neural networks be able to utilize knowledge in forms such as logical expressions, probability distributions, and constraints on differential data, along with the provided pairs of desirable inputs and outputs. Neural networks that can learn constraints on differential data have already been applied to pattern recognition and differential equations. Other applications, such as robotics, have also been suggested for neural networks learning differential data. In this dissertation, we investigate an extended framework that introduces constraints on differential data into neural networks' learning. We also investigate other items that form the foundations for applications of neural networks learning differential data. First, a new and very general architecture and an algorithm are introduced for multilayer perceptrons to learn differential data. The algorithm is applicable to learning differential data not only of first order but also of orders higher than first, and, like the back-propagation algorithm, is completely localized to each unit in the multilayer perceptron. The architecture and the algorithm are then implemented as computer programs, which required high programming skill and a great amount of care; the main module is programmed in C++. The implementation is used to conduct experiments, among others, to show convergence of neural networks with differential data of up to third order. Along with the architecture and the algorithm, we give analyses of neural networks learning differential data, such as a comparison with the extra-pattern scheme, how the learning works, sample complexity, effects of irrelevant features, and noise robustness. A new application of neural networks learning differential data to continuous action generation in reinforcement learning, and experiments using the implementation, are described. 
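The core idea described above can be illustrated with a toy sketch (this is not the thesis's C++ implementation; the network size, learning rate, and the sin/cos target are assumptions): a one-hidden-layer perceptron is trained to match both target values and target first derivatives by adding a derivative error term to the loss, using the network's analytic dy/dx.

```python
import math
import random

H = 5  # number of hidden units (assumed size for this toy example)

def unpack(p):
    # p is a flat parameter list: w1 (H), b1 (H), w2 (H), b2 (1)
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def net(p, x):
    # y(x) = sum_j w2[j] * tanh(w1[j]*x + b1[j]) + b2
    w1, b1, w2, b2 = unpack(p)
    return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(H)) + b2

def net_dx(p, x):
    # Analytic dy/dx of the network, used in the differential error term.
    w1, b1, w2, b2 = unpack(p)
    return sum(w2[j] * w1[j] * (1.0 - math.tanh(w1[j] * x + b1[j]) ** 2)
               for j in range(H))

XS = [i / 5.0 for i in range(-5, 6)]  # training inputs on [-1, 1]

def loss(p):
    # Value error against sin(x) plus first-order differential error
    # against its derivative cos(x).
    return sum((net(p, x) - math.sin(x)) ** 2 +
               (net_dx(p, x) - math.cos(x)) ** 2 for x in XS)

def train(steps=100, lr=0.05, eps=1e-5):
    random.seed(0)
    p = [random.uniform(-0.5, 0.5) for _ in range(3 * H + 1)]
    for _ in range(steps):
        # Central-difference gradient of the combined loss w.r.t. each
        # parameter (slow but transparent for a sketch).
        g = []
        for k in range(len(p)):
            p[k] += eps
            up = loss(p)
            p[k] -= 2 * eps
            down = loss(p)
            p[k] += eps
            g.append((up - down) / (2 * eps))
        trial = [pk - lr * gk for pk, gk in zip(p, g)]
        if loss(trial) < loss(p):
            p = trial
        else:
            lr *= 0.5  # backtrack so the loss never increases
    return p
```

With the derivative term included, the network is pushed toward the target slope as well as the target values; dropping that term recovers ordinary supervised training on input-output pairs.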
The problem is reduced to the realization of a random vector generator for a given probability distribution, which corresponds to solving a differential equation of first order. In addition to the above application to reinforcement learning, two other possible applications of neural networks learning differential data are proposed: differential equations and simulation of the human arm. For differential equations, we propose a very general framework which unifies differential equations, boundary conditions, and other constraints. For the simulation, we propose a natural neural-network implementation of the minimum-torque-change model. Finally, we present results on higher-order extensions to radial basis function (RBF) networks: minimizing solutions with differential error terms, the best-approximation property of the above solutions, and a proof of $C^l$ denseness of RBF networks. Through these detailed accounts of an architecture, an algorithm, an implementation, analyses, and applications, this dissertation as a whole lays the foundations for applications of neural networks learning differential data as learning systems and will help promote their further application. ------------------------------------------------------------ Ryusuke Masuoka, Ph.D. Senior Researcher Intelligent Systems Laboratory Fujitsu Laboratories Ltd. 
1-4-3 Nakase, Mihama-ku Chiba, 261-8588, Japan Email: masuoka at flab.fujitsu.co.jp Web: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/ From golden at utdallas.edu Tue Oct 31 13:03:37 2000 From: golden at utdallas.edu (Richard Golden) Date: Tue, 31 Oct 2000 12:03:37 -0600 (CST) Subject: FUNDING OPPORTUNITIES FOR GRADUATE WORK IN MATHEMATICAL AND COMPUTATIONAL MODELING (fwd) Message-ID: Students interested in pursuing state-of-the-art graduate research in Mathematical and Computational Modeling are encouraged to apply to the Cognition and Neuroscience Doctoral Program in the School of Human Development at the University of Texas at Dallas (UTD). Our School consists of 39 active faculty members whose research interests span areas in neural, perceptual, and cognitive processes throughout the lifespan, from young infants to adults and onward into old age. We sincerely hope that you will inform students about the outstanding opportunities for graduate study at UTD. Our faculty have distinguished themselves at the world-class level in the following areas in mathematical and computational modeling. * MODELS OF FACE PERCEPTION AND COGNITION (Abdi, O'Toole) * SPEECH PERCEPTION MODELS (Assmann) * COCHLEAR IMPLANT MODELS (Tobey) * CABLE EQUATION MODELS OF NEURAL CIRCUITRY USING "GENESIS" (Cauller) * PROBABILISTIC AND CONNECTIONIST LANGUAGE MODELS (Golden) * MATHEMATICAL ANALYSIS AND DESIGN OF ARTIFICIAL NEURAL NETS (Abdi, Golden) * ADVANCED STATISTICAL DATA ANALYSIS METHODS (Abdi, Golden, Rollins) Assistantships and support for conference travel are available to qualified students on a competitive basis. For more information regarding our doctoral program in this area please see our web site located at: www.utdallas.edu/dept/hd/ or contact me (golden at utdallas.edu) for additional information regarding our program. Application materials are available online (www.utdallas.edu/dept/graddean/forms/gradapp.html). 
AREA FACULTY: Herve Abdi, Ph.D., University of Aix-en-Provence 1980 (www.utdallas.edu/~herve) Peter Assmann, Ph.D., University of Alberta 1985 (www.utdallas.edu/~assmann) Larry Cauller, Ph.D., Northeastern Ohio Universities College of Medicine (www.utdallas.edu/~lcauller) Richard M. Golden, Ph.D., Brown University 1987 (www.utdallas.edu/~golden) Alice O'Toole, Ph.D., Brown University 1988 (www.utdallas.edu/~otoole) Pamela Rollins, Ph.D., Harvard Graduate School of Education 1994 (www.callier.utdallas.edu) Emily Tobey, Ph.D., City University of New York 1981 (www.callier.utdallas.edu) In particular the suggested topics are: - Personalization of the access to information on the web - Recommender systems on the web - Crawling policies for search engines - Focussed crawlers - Analysis and prediction of requests to web servers - Intelligent caching and proxies - Security issues (e.g. intrusion detection) For submissions, refer to the instructions for authors available on the web site of the conference http://www.dice.ucl.ac.be/esann Please note that contributions are to be sent to the address specified in the call for papers and not directly to me. In any case, you must specify the title of this special session in the accompanying letter. I would also appreciate an e-mail announcing your submission. Contributions are submitted to a review process and will be rated according to their scientific value. Deadlines --------- Submission of papers December 8, 2000 Notification of acceptance February 5, 2001 Contacts -------- Session Organizer: Marco Maggini Dipartimento di Ingegneria dell'Informazione Università di Siena Via Roma 56 I-53100 Siena (Italy) e-mail: maggini at dii.unisi.it Conference secretariat: Michel Verleysen D facto conference services phone: + 32 2 420 37 57 27 rue du Laekenveld Fax: + 32 2 420 02 55 B - 1080 Brussels (Belgium) E-mail: esann at dice.ucl.ac.be From rwessel at ucsd.edu Tue Oct 3 18:03:23 2000 From: rwessel at ucsd.edu (Ralf Wessel) Date: Tue, 3 Oct 2000 15:03:23 -0700 Subject: postdoctoral position Message-ID: A POSTDOCTORAL POSITION IN THE NEUROPHYSIOLOGY OF VISUAL MOTION ANALYSIS is available in the laboratory of Ralf Wessel at Washington University/St. 
Louis beginning in March 2001 (starting date flexible). We study the biophysical mechanisms of visual motion processing at the level of synapses, single neurons and dendrites, and networks. A slice preparation of the chick tectum is used as a model system for these studies. In this slice preparation, visual motion can be simulated via the spatio-temporal stimulation of retinal ganglion cell axons. At the same time, motion-sensitive tectal neurons are accessible for electrical recordings, optical imaging, and pharmacological manipulation. Experimental approaches are complemented by computational analysis to understand how the interaction of biophysical mechanisms at multiple levels of neural organization leads to motion and motion contrast sensitivity in vertebrate brains (http://www.physics.wustl.edu/~rw/). The highly interactive Neuroscience community at Washington University provides opportunities for collaborative investigations with experts who study the nervous system at different levels of neural organization (http://www.dbbs.wustl.edu/). St. Louis offers a range of recreational activities and affordable living (http://medinfo.wustl.edu/pub/diso/). Please email CV, brief summary of research experience, contact information of two references, and list of abstracts and publications to: rw at wuphys.wustl.edu. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: text/enriched Size: 1576 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/67ea602b/attachment-0001.bin From tleen at cse.ogi.edu Tue Oct 3 21:46:06 2000 From: tleen at cse.ogi.edu (Todd K. 
Leen) Date: Tue, 03 Oct 2000 18:46:06 -0700 Subject: Postdoc Position at Oregon Graduate Institute Message-ID: <39DA8BDE.C087A85E@cse.ogi.edu> Postdoctoral Research Position in Statistical Pattern Recognition and Machine Learning for Environmental Forecasting Systems Professors Todd Leen (Computer Science and Engineering, http://www.cse.ogi.edu/~tleen) and Antonio Baptista (Environmental Science and Engineering, http://www.ccalmr.ogi.edu/baptista/baptista.html) are seeking a highly qualified Postdoctoral Research Associate to develop and apply statistical signal processing and pattern recognition technology to Environmental Observation and Forecasting Systems (EOFS). The goals of this NSF ITR project are to develop techniques that enhance the utility of the massive amounts of observational and model data produced by EOFS. Specific aims include detecting sensor corruption in non-stationary spatial-temporal systems, estimating true signals from corrupted sensor data, and detecting and characterizing regimes where numerical model anomalies are likely. The project centers around the COlumbia RIver Estuary (CORIE), an EOFS system that includes numerical models and observing stations for the river estuary and adjacent coastal waters (http://www.ccalmr.ogi.edu/CORIE). The project offers an opportunity to work in an interdisciplinary team with expertise in statistical machine learning, two- and three-dimensional finite element models of river dynamics, and year-round data collection stations throughout the estuary. We expect that the project will provide significant challenges and advances in the combination of time series with spatial statistics, and machine learning and data mining. Candidates should have a PhD in Statistics, EE, CS, or a related field with experience in machine learning, nonparametric statistics, or neural network modeling. Experience in either time-series analysis or spatial statistics is an advantage. 
The initial appointment is for one year, but may be extended depending upon the availability of funding. Candidates able to start by January 1, 2001 will be given preference, although highly qualified candidates who seek a later start date should not hesitate to apply. If you are interested in applying for this position, please mail, fax, or email your CV (ascii text, postscript or pdf file), a letter of application, and a list of three or more references (with name, address, email, and phone number) to: Position #CSE10-1-00 Human Resources Department Oregon Graduate Institute 20000 NW Walker Road Beaverton, OR 97006-8921 E-mail submissions are accepted at jobs at admin.ogi.edu and must include the above information and position number in the message subject line. EEO/AA employer: women, minorities, and individuals with disabilities are encouraged to apply. Questions (not application material) may be directed to Professor Todd Leen, tleen at cse.ogi.edu, 503-748-1160 and Professor Antonio Baptista, baptista at ccalmr.ogi.edu, 503-748-1147. ================================================================== Oregon Graduate Institute is an independent graduate school and research institute located 10 miles west of downtown Portland, Oregon. OGI offers Masters and PhD programs in Biochemistry and Molecular Biology, Computer Science and Engineering, Electrical and Computer Engineering, Environmental Science and Engineering, and a Masters program in Management in Science and Technology. Portland offers diverse cultural activities including art, music, theater, entertainment and sports. Portland's extensive park system includes the largest wilderness park within the limits of any city in the US. Beyond the city limits, a 1-2 hour drive brings one to downhill skiing, high desert, forest hiking trails, the Pacific Ocean, and numerous wineries. 
From S.Singh at exeter.ac.uk Tue Oct 3 05:46:07 2000 From: S.Singh at exeter.ac.uk (S.Singh@exeter.ac.uk) Date: Tue, 3 Oct 2000 10:46:07 +0100 (GMT Daylight Time) Subject: 3 PhD Studentships Available Message-ID: UNIVERSITY OF EXETER, UK SCHOOL OF ENGINEERING AND COMPUTER SCIENCE Department of Computer Science We now invite applications for the following three PhD studentships within our department. Please note the different deadlines for application and project details. 1. PhD Studentship as a Graduate Teaching Assistant Deadline for Application: 13 October, 2000 "Digital video retrieval based on multiple sources of information." The project will develop student skills in areas including neural networks, image analysis, pattern recognition and multimedia information retrieval. Candidates for this studentship should have a degree in computer science, engineering or a related subject. They should have programming skills in C/C++/JAVA and knowledge of the Unix operating system. Previous knowledge of image processing, multimedia or information retrieval is useful but not absolutely necessary. Further information may be obtained from the project supervisors Dr. S Singh (S.Singh at exeter.ac.uk) and Dr. G. Jones (G.J.F.Jones at exeter.ac.uk). The student will also work as a Graduate Teaching Assistant within the department, helping with the teaching of core undergraduate modules as well as postgraduate modules in their area of research. Assistants are required to attend the SEDA training course at the University and will be expected to undertake up to 150 hours teaching per annum. The SEDA training is aligned to the new National Institute of Learning and Teaching's Associate Part 1 award. 2. PhD Studentship Deadline for Application: 30 October, 2000 "Adaptive and Autonomous Image Understanding Systems" The project will explore intelligent algorithms for adaptive image understanding including context based reasoning. 
Generic methodology will be developed based on image processing and pattern recognition methods to allow automatic processing of scene analysis data using video sequences. The goal of the project is to develop a seamless system that continuously evolves in real-time with video images. It is expected that prospective applicants have a good mathematical and analytical background, with programming experience in C/C++ and Unix/Linux operating systems. 3. PhD Studentship Deadline for Application: 30 October, 2000 "Ultrasound based object recognition and integration with image analysis" It has only recently been demonstrated that ultrasound can be used to detect objects in controlled environments. It has been used for face recognition and obstacle avoidance. In this project we develop techniques for ultrasound based object recognition and couple this ability with image processing on a mobile platform (robot). The ultrasound method will provide a cheaper, lower-resolution front end, triggering the image processing component for more detailed analysis. The research will be tested in indoor and outdoor environments. It is expected that prospective applicants have a good mathematical and analytical background, with programming experience in C/C++ and Unix/Linux operating systems. The studentship covers UK/EU fees and maintenance (currently £6,620 pa). Funding for project (1) is for up to four years, and for projects (2,3) for up to three years. The successful candidates should expect to take up the studentship no later than 1 December, 2000, or as soon as possible thereafter as negotiated with the department. Applicants should send a letter of application and a CV, including the names and addresses of two referees, to the project supervisor, Dr. Sameer Singh, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK (email S.Singh at ex.ac.uk). Applicants should ask their referees to send their references directly to the above address. 
Informal enquiries can be made on 01392-264053; further details of the Department's work in the above areas may be found on the web at www.dcs.ex.ac.uk/research/pann ------------------------------------------- Sameer Singh Department of Computer Science University of Exeter Exeter EX4 4PT United Kingdom Tel: +44-1392-264053 Fax: +44-1392-264067 e-mail: s.singh at ex.ac.uk WEB: http://www.ex.ac.uk/academics/sameer/index.htm ------------------------------------------- From bert at mbfys.kun.nl Thu Oct 5 09:05:36 2000 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 05 Oct 2000 15:05:36 +0200 Subject: Tenure track senior researcher at SNN University Nijmegen Message-ID: <39DC7CA0.3EC96DFA@mbfys.kun.nl> Senior researcher position (tenure track) available at SNN, University of Nijmegen, the Netherlands. Extended deadline. Background: The group consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and Bayesian methods. The group is part of the Laboratory of Biophysics which is involved in experimental brain science. Recent research of the group has focused on the theoretical description of learning processes using the theory of stochastic processes; the design of efficient learning rules for Boltzmann machines and other graphical models using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; and the design of robust methods for confidence estimation with neural networks. Applied research is conducted on computer assisted medical diagnosis, music modeling, genetics and time-series prediction tasks. In 1997, SNN Nijmegen founded a company which sells commercial services and products in the field of neural networks, AI and statistics. 
For more information see http://www.smart-research.nl and http://www.mbfys.kun.nl/SNN Job specification: SENIOR RESEARCHER (TENURE TRACK) The tasks of the senior researcher will be to conduct independent research in one of the above areas. In addition, it is expected that he/she will initiate novel research and will assist in the supervision of PhD students. The senior researcher should have a PhD in physics, mathematics or computer science and a strong theoretical background in neural networks. The salary will be between Dfl. 5589 and Dfl. 7694 per month, depending on experience. The position is available for 2 years with possible extension to 4 years. After 4 years, a permanent position is anticipated. Applications: Interested candidates should send a letter with a CV and list of publications before November 1, 2000 to dr. H.J. Kappen, Stichting Neurale Netwerken, University of Nijmegen, Geert Grooteplein 21, 6525 EZ Nijmegen. For information contact dr. H.J. Kappen, +31 24 3614241. From s.holden at cs.ucl.ac.uk Thu Oct 5 10:58:57 2000 From: s.holden at cs.ucl.ac.uk (Sean Holden) Date: Thu, 05 Oct 2000 15:58:57 +0100 Subject: Chair and Lectureship Message-ID: <39DC9730.17F0A987@cs.ucl.ac.uk> Dear Colleagues, Appended is an announcement of opportunities for a Chair of Bioinformatics and a Lectureship in Intelligent Systems at University College London, part of London University. Readers should note that a Lectureship in the U.K. is roughly the equivalent of an Assistant Professorship in the U.S. Best wishes, Sean. ----------------------------------------------------------------- Dr. Sean B. Holden email: S.Holden at cs.ucl.ac.uk Department of Computer Science phone: +44 (0)20 7679 3708 University College London fax: +44 (0)20 7387 1397 Gower Street, London WC1E 6BT, U.K. 
http://www.cs.ucl.ac.uk/staff/S.Holden/ ----------------------------------------------------------------- UNIVERSITY COLLEGE LONDON Department of Computer Science and Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX) CHAIR OF BIOINFORMATICS Applications are sought for this new chair, funded through a joint initiative of the UK Research Councils.* The appointee will be expected to create and lead a new, world-class Bioinformatics Unit in the Department of Computer Science at UCL, applying state-of-the-art mathematical and computer science techniques to problems now arising in the life sciences, driven by the post-genomic era. The Unit will also work closely with the UCL Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX) which has a proven track record for interdisciplinary collaboration in these areas. A joint appointment will be made in Computer Science and a Life Sciences department appropriate to the specific life sciences interest of the appointee. We are seeking an individual to lead innovative, interdisciplinary research at the interface between mathematics, computer science and the life sciences. Our aim is to provide the intellectual and practical tools for deriving models of the interactions that determine the function of molecules, cells, tissues, and organisms. The successful candidate will have an outstanding reputation, with experience of leading interdisciplinary research, working effectively with industry and, where appropriate, of successfully managing the commercial application of research. Applications from individuals in any relevant discipline are encouraged. The successful candidate will benefit from a rich environment of relevant collaborative work at UCL. The Research Councils are also providing funding for a lectureship and a significant level of infrastructure support. The appointed Chair will be involved in appointing this lecturer. 
*In recognition of the crucial role of bioinformatics, the Biotechnology and Biological Sciences Research Council (BBSRC), the Engineering and Physical Sciences Research Council (EPSRC), the Medical Research Council (MRC), the Natural Environment Research Council (NERC) and the Particle Physics and Astronomy Research Council (PPARC) are funding a new Unit and Chair at UCL. Lectureship in Intelligent Systems In addition to the above, we are creating a further new lectureship in the area of Intelligent Systems or Bioinformatics. Applications are invited from candidates with research experience in either of these areas. Prospective applicants are invited to make informal enquiries to and can obtain further particulars from Professor Steve Wilbur, Head, Department of Computer Science, (telephone +44 (0)20 7679 1397, fax +44 (0)20 7387 1397, e-mail: s.wilbur at cs.ucl.ac.uk) or Professor Anne Warner, FRS, Director of CoMPLEX (telephone +44 (0)20 7679 7279, fax +44 (0)20 7387 3014, e-mail: a.warner at .ucl.ac.uk), Gower Street, London WC1E 6BT, UK. Further details can also be found at http://www.ucl.ac.uk/CoMPLEX. Applications (10 copies for UK-based candidates, one copy for overseas candidates), including a curriculum vitae and the names and addresses of three referees (at least one of whom should be based in a country other than the candidate's country of residence), should be sent to the Provost and President, UCL, Gower Street, London, WC1E 6BT, UK, to arrive no later than 6 November 2000. 
Working towards Equal Opportunity ------------------------------------------------------------------------------- From P.J.Lisboa at livjm.ac.uk Thu Oct 5 11:58:12 2000 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Thu, 5 Oct 2000 16:58:12 +0100 Subject: First Call for Papers for NNESMED'2001 Message-ID: NNESMED 2001 4th International Conference on Neural Networks and Expert Systems in Medicine and Healthcare Milos Island, Greece, 20-22 June 2001 Web site address: http://www.heliotopos.net/conf/nnesmed2001/ FIRST ANNOUNCEMENT AND CALL FOR PAPERS NNESMED 2001 is organised by the Department of Applied Informatics and Multimedia of the Technological Educational Institute of Crete. Previous conferences, held in Plymouth (94, 96) and in Pisa (98), attracted strong international participation. The conference brings together researchers from different AI modalities with a focus on integrated, practical healthcare applications. The traditional single-track format for all presentations fosters discussion and information exchange among those working in these inter-related fields, reflecting increasing interest from clinicians in healthcare informatics at every level. CONFERENCE CHAIR GM Papadourakis Technological Educational Institute of Crete INTERNATIONAL ORGANIZING COMMITTEE EC Ifeachor, B Jervis, P Ktonas, P Lisboa and A Starita INTERNATIONAL PROGRAMME COMMITTEE Y Attikiouzel, G Bebis, N Bourbakis, B Dawant, M Gori, B Jansen, N Karayiannis, D Koutsouris, T Leo, D Linkens, D Lowe, S Michelogiannis, K Rosen, C Schizas, A Sperduti, A Taktak and M Zervakis LOCATION The island of Milos, an entirely volcanic island, is located 86 nautical miles from the port of Piraeus, in the south-western part of the Cyclades group of islands, in the Aegean Sea. It is well known for the largest natural harbour in the Mediterranean, its mineral wealth, as well as for its unique beaches of white sand and crystal clear waters. 
It is also famous for the marble statue known as the Venus de Milo, now located in the Louvre.

TOPICS OF INTEREST
* Artificial Neural Networks
* Data mining
* Expert systems
* Verification and validation
* Fuzzy logic and fuzzy expert systems
* Risk assessment
* Neuro-fuzzy systems
* Safety-critical issues
* Uncertainty handling
* User interfaces, human-computer interaction
* Causal probability networks
* Computer-aided tutoring
* Model-based reasoning
* Intelligent signal processing
* Genetic and adaptive search algorithms
* Multimedia
* Hybrid intelligent systems
* Telemedicine and telecare
* Chaos theory
* Computer-assisted diagnosis and prognosis
* Machine learning
* Auditing and evaluation of healthcare practices
* Novelty detection

CALL FOR PAPERS
Authors are asked to submit, by email to the conference secretariat, 2 copies of extended abstracts, 2 pages in length with single spacing, in English, by 15 January 2001. Extended abstracts should clearly identify the healthcare context of the work, the methodology used, the advances made and the significance of the results obtained, and should contain contact details including the corresponding author, mail address, fax number and email address. Authors whose abstracts are accepted will be asked to develop them into full papers of 4-6 pages for inclusion in the conference proceedings.

Important deadlines:
Submission of extended abstracts: 15 January 2001
Notification of provisional acceptance: 15 March 2001
Submission of full papers (camera ready): 15 May 2001

The website contains a pre-registration form to ensure that anyone interested in attending the conference receives further information in electronic form.
From caruana+ at cs.cmu.edu Wed Oct 4 10:37:21 2000 From: caruana+ at cs.cmu.edu (Rich Caruana) Date: Wed, 04 Oct 2000 10:37:21 -0400 Subject: NIPS*2000 workshop information Message-ID:

* * * Post-NIPS*2000 Workshops * * *
* * * Breckenridge, Colorado * * *
* * * December 1-2, 2000 * * *

The NIPS*2000 Workshops will be held Friday and Saturday, December 1 and 2, in Breckenridge, Colorado, after the NIPS conference in Denver, Monday-Thursday, November 27-30. This year there are 18 workshops:

- Affective Computing
- Algorithms and Technologies for Neuroprosthetics and Neurorobotics
- Computation in the Cortical Column
- Computational Molecular Biology
- Computational Neuropsychology
- Cross-Validation, Bootstrap, and Model Selection
- Data Fusion -- Theory and Applications
- Data Mining and Learning on the Web
- Explorative Analysis and Data Modeling in Functional Neuroimaging
- Geometric Methods in Learning Theory
- Information and Statistical Structure in Spike Trains
- Learn the Policy or Learn the Value-Function?
- New Perspectives in Kernel-based Learning Methods
- Quantum Neural Computing
- Representing the Structure of Visual Objects
- Real-Time Modeling for Complex Learning Tasks
- Software Support for Bayesian Analysis Systems
- Using Unlabeled Data for Supervised Learning

All workshops are open to all registered attendees. Many workshops also invite submissions. Submissions, and questions about individual workshops, should be directed to each workshop's organizers. Included below is a short description of each workshop. Additional information is available at the NIPS*2000 Workshop Web Page: http://www.cs.cmu.edu/Groups/NIPS/NIPS2000/Workshops/ Information about registration, travel, and accommodations for the main conference and the workshops is available at: http://www.cs.cmu.edu/Web/Groups/NIPS/

Breckenridge is a ski resort a few hours' drive from Denver.
The daily workshop schedule is designed to allow participants to ski half days, or enjoy other extra-curricular activities. Some may wish to extend their visit to take advantage of the relatively low pre-season rates. We look forward to seeing you in Breckenridge. Rich Caruana and Virginia de Sa, NIPS Workshops Co-chairs

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Affective Computing

Workshop Co-Chairs: Javier R. Movellan, Institute for Neural Computation, UCSD; Marian Bartlett, Institute for Neural Computation, UCSD; Gary Cottrell, Computer Science, UCSD; Rosalind W. Picard, Media Lab, MIT

Description: The goal of this workshop is to explore and discuss the idea of affective computers, i.e., computers that have the ability to express emotions, recognize emotions, and whose behavior is modulated by emotional states. Emotions are a fundamental part of human intelligence. It may be argued that emotions provide an "operating system" for autonomous agents that need to handle the uncertainty of natural environments in a flexible and efficient manner. Connectionist models of emotion dynamics have been developed (Velasquez, 1996), providing examples of computational systems that incorporate emotional dynamics and that are being used in actual autonomous agents (e.g., robotic pets). Emotional skills, especially the ability to recognize and express emotions, are essential for natural communication between humans and, until recently, have been absent from the computer side of human-computer interaction. For example, autonomous teaching agents and pet robots would greatly benefit from detecting affective cues from users (curiosity, frustration, insight, anger) and adjusting to them, and also from displaying emotion appropriate to the context. The workshop will bring together leaders in the main research areas of affective computing: emotion recognition, emotion synthesis, emotion dynamics, and applications.
Speakers from industry will discuss current applications of affective computing, including synthesizing facial expressions in the entertainment industry, increasing the appeal of pet robots through emotion recognition and synthesis, and measuring galvanic skin response through the mouse to determine user frustration.

Format: This will be a one-day workshop. The speakers will be encouraged to talk about challenges and controversial topics both in their prepared talks and in the ensuing discussions. Since one of the goals of the workshop is to facilitate communication between researchers in different subfields, ample time will be given to questions. The last part of the workshop will be devoted to a discussion of the most promising approaches and ideas that will have emerged during the workshop.

Contact Info: Javier R. Movellan, Institute for Neural Computation, University of California San Diego, La Jolla, CA 92093-0515, movellan at inc.ucsd.edu; Marian Stewart Bartlett, Institute for Neural Computation, University of California San Diego, La Jolla, CA 92093-0515, marni at inc.ucsd.edu

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Algorithms, Technologies, and Neural Representations for Neuroprosthetics and Neurorobotics

Organizers: Simon F Giszter and Karen A Moxon http://neurobio.mcphu.edu/GiszterWeb/nips_ws_2k.html

Goals and objectives: The goal of the workshop is to bring together researchers interested in neuroprosthetics, neurorobotics, and intermediate representations in the brain, with a view to generating a lively discussion of the design principles for a brain-to-artificial-device interface. Speakers will be charged to address (favourably or unfavourably) the idea that the nervous system is built around, or dynamically organizes, low-dimensional representations which may be used in (or need to be designed into) neuroprosthetic interfaces and controllers. Some current prosthetics are built around explicit motor representations, e.g.
kinematic plans. Though controversial, the notion of primitives and low-dimensional representations of input and output is gaining favor. These may or may not contain or be used in explicit plans. It is very likely that the appropriate choice of sensory and motor representations and motor elements is critical for the design of integrated sensorimotor prostheses that enable rapid adaptive learning and the creative construction, planning, and execution of new motions. With the burgeoning interest in neuroprosthetics, it is therefore timely to address how the interfaces to neuroprostheses should be conceptualized: what representations should be extracted, what control elements should be provided, and how these should be integrated. We hope to engage a wide range of perspectives to address needs for research and the possibilities enabled by neuroprosthetics. We intend to assemble presentations and discussions from the perspectives of both neural data and theory, of new technologies and algorithms, and of applications or experimental approaches enabled by new and current technologies.

Anticipated or fully confirmed speakers/participants:
John Chapin: Neurorobotics
Nikos Hatzopoulos: Neural coding in cortex of primates
Scott Makeig or Terry Sejnowski: EEG-based controllers: representations
James Abbas: peripheral FES with CPG models
Warren Grill and Michel Lemay: Intraspinal FES and force-fields
Karen Moxon: sensory prostheses
Gerry Loeb: intramuscular prostheses and spinal controls
Simon Giszter: spinal primitives and interfaces
Emo Todorov: cortical encoding and representation
Igo Krebs: rehabilitation with robots

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

COMPUTATION IN THE CORTICAL COLUMN

Organizers: Henry Markram and Jennifer Linden URL: http://www.keck.ucsf.edu/~linden/ColumnWorkshop.html

Understanding computation in the cortical column is a holy grail for both experimental and theoretical neuroscience.
The basic six-layered neocortical columnar microcircuit, implemented most extensively (and perhaps in its most sophisticated form) in the human brain, supports a huge variety of sensory, cognitive, and motor functions. The secret behind the incredible flexibility and power of cortical columns has remained elusive, but new insights are emerging from several different areas of research. It is a great time for cortical anatomists, physiologists, modellers, and theoreticians to join forces in attempting to decipher computation in the cortical column. In this workshop, leading experimental and theoretical neuroscientists will present their own visions of computation in the cortical column, and will debate their views with an interdisciplinary audience. During the morning session, speakers and panel members will analyze columnar computation from their perspectives as authorities on the anatomy, physiology, evolution, and network properties of cortical microcircuitry. Speakers and panelists in the afternoon session will consider the functional significance of the cortical column in light of their expert knowledge of two columnar systems which have attracted intensive experimental attention to date: the visual cortex of cats and primates, and the barrel cortex of rodents. The goal of the workshop will be to define answers to four questions. ANATOMY: Does a common denominator, a repeating microcircuit element, exist in all neocortex? PHYSIOLOGY: What are the electrical dynamics, the computations, of the six-layered cortical microcircuit? FUNCTION: How do cortical columns contribute to perception? EVOLUTION: How does the neocortex confer such immense adaptability? 
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Computational Molecular Biology Organizers: Tommi Jaakkola, MIT Nir Friedman, Hebrew University For more information contact the workshop organizers at: tommi at ai.mit.edu nir at cs.huji.ac.il * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * NIPS*2000 Workshop on Computational Neuropsychology Workshop Organizers: Sara Solla Northwestern University Michael Mozer University of Colorado Martha Farah University of Pennsylvania The 1980's saw two important developments in the sciences of the mind: The development of neural network models in cognitive psychology, and the rise of cognitive neuroscience. In the 1990's, these two separate approaches converged, and one of the results was a new field that we call "Computational Neuropsychology." In contrast to traditional cognitive neuropsychology, computational neuropsychology uses the concepts and methods of computational modeling to infer the normal cognitive architecture from the behavior of brain-damaged patients. In contrast to traditional neural network modeling in psychology, computational neuropsychology derives constraints on network architectures and dynamics from functional neuroanatomy and neurophysiology. Unfortunately, work in computational neuropsychology has had relatively little contact with the Neural Information Processing Systems (NIPS) community. Our workshop aims to expose the NIPS community to the unusual patient cases in neuropsychology and the sorts of inferences that can be drawn from these patients based on computational models, and to expose researchers in computational neuropsychology to some of the more sophisticated modeling techniques and concepts that have emerged from the NIPS community in recent years. 
We are interested in speakers from all aspects of neuropsychology, including:
* attention (neglect)
* visual and auditory perception (agnosia)
* reading (acquired dyslexia)
* face recognition (prosopagnosia)
* memory (Alzheimer's, amnesia, category-specific deficits)
* language (aphasia)
* executive function (schizophrenia, frontal deficits)

Further information about the workshop can be obtained at: http://www.cs.colorado.edu/~mozer/nips2000workshop.html Contact Sara Solla (solla at nwu.edu) or Mike Mozer (mozer at colorado.edu) if you are interested in speaking at the workshop.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Call for Papers
NIPS*2000 Workshop: Cross-Validation, Bootstrap and Model Selection

Organizers: Rahul Sukthankar, Compaq CRL and Robotics Institute, Carnegie Mellon; Larry Wasserman, Department of Statistics, Carnegie Mellon; Rich Caruana, Center for Automated Learning and Discovery, Carnegie Mellon

Electronic Submission Deadline: October 18, 2000 (extended abstracts)

Description: Cross-validation and bootstrap are popular methods for estimating generalization error based on resampling a limited pool of data, and have become widely used for model selection. The aim of this workshop is to bring together researchers from both machine learning and statistics in an informal setting to discuss current issues in resampling-based techniques. These include:
* Improving theoretical bounds on cross-validation, bootstrap or other resampling-based methods;
* Empirical or theoretical comparisons between resampling-based methods and other forms of model selection;
* Exploring the issue of overfitting in sampling-based methods;
* Efficient algorithms for estimating generalization error;
* Novel resampling-based approaches to model selection.

The format for this one-day workshop consists of invited talks, a panel discussion and short presentations from accepted submissions.
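The resampling idea at the heart of this workshop can be made concrete with a minimal k-fold cross-validation sketch. Everything below is illustrative: the data are synthetic and the "model" is just a training-set mean predictor, chosen only to show how held-out error is pooled across folds.

```python
import random

def k_fold_cv(xs, ys, k, fit, predict):
    """Estimate generalization error as the mean held-out squared error over k folds."""
    idx = list(range(len(xs)))
    random.Random(0).shuffle(idx)           # fixed seed: deterministic folds
    folds = [idx[i::k] for i in range(k)]   # k disjoint index sets
    fold_errors = []
    for fold in folds:
        held_out = set(fold)
        train = [i for i in idx if i not in held_out]
        model = fit([xs[i] for i in train], [ys[i] for i in train])
        errs = [(predict(model, xs[i]) - ys[i]) ** 2 for i in fold]
        fold_errors.append(sum(errs) / len(errs))
    return sum(fold_errors) / k

# A deliberately trivial "model": always predict the training-set mean of y.
fit = lambda xs, ys: sum(ys) / len(ys)
predict = lambda model, x: model

xs = list(range(20))
ys = [x % 3 for x in xs]                    # toy targets in {0, 1, 2}
cv_error = k_fold_cv(xs, ys, 5, fit, predict)
print(cv_error)
```

The bootstrap variant replaces the disjoint folds with training sets resampled with replacement; the workshop topics above (bias of such estimates, overfitting during model selection) arise as soon as an estimate like this is used to choose among competing models.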
Participants are encouraged to submit extended abstracts describing their current research in this area. Results presented at other conferences are eligible, provided that they are of broad interest to the community and clearly identified as such. Submissions for workshop presentations must be received by October 18, 2000, and should be sent to rahuls=nips at cs.cmu.edu. Extended abstracts should be in Postscript or Acrobat format and 1-2 pages in length.

Contact Information: The workshop organizers can be contacted by email or at the phone/fax numbers listed below.
Rahul Sukthankar  rahuls at cs.cmu.edu  phone +1-617-551-7694  fax +1-617-551-7650
Larry Wasserman  larry at stat.cmu.edu  phone +1-412-268-8727  fax +1-412-268-7828
Rich Caruana  caruana at cs.cmu.edu  phone +1-412-268-7664

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Workshop Title: Data Fusion -- Theory and Applications

Multisensor data fusion refers to the acquisition, processing, and synergistic combination of information gathered by various knowledge sources and sensors to provide a better understanding of the phenomenon under consideration. The concept of fusion underlies many information processing mechanisms in machines and in biological systems. In biological/perceptual systems, information fusion seems to account for remarkable performance and robustness when confronted with a variety of uncertainties. The complexity of fusion processes is due to many factors, including the uncertainties associated with different information sources and the complementarity of individual sources. For example, modeling, processing, fusion, and interpretation of diverse sensor data for knowledge assimilation and inferencing pose challenging problems, especially when available information is incomplete, inconsistent, and/or imprecise.
The potential for significantly enhanced performance and robustness has motivated vigorous ongoing research in both biological and artificial multisensor data fusion algorithms, architectures, and applications. Such efforts deal with fundamental issues including process modeling, architectures and algorithms, information extraction, the fusion process itself, optimization of fused performance, real-time (dynamic) fusion, etc. The goal of this workshop is to bring together researchers from diverse fields (learning, human-computer interaction, vision, speech, neurobiology, etc.) to discuss both theoretical and application issues that are relevant across different fields. It also aims to make the NIPS community aware of the various aspects and current status of this field, as well as the problems that remain unsolved. We are calling for participation. Submissions should be sent to the workshop organizers.

Workshop Organizers:
Misha Pavel, pavel at ece.ogi.edu, (503) 748-1155 (o), Dept. of Electrical and Computer Engineering, Oregon Graduate Institute of Science and Technology, 20000 NW Walker Road, Beaverton, OR 97006
Xubo Song, xubosong at ece.ogi.edu, (503) 748-1311 (o), Dept. of Electrical and Computer Engineering, Oregon Graduate Institute of Science and Technology, 20000 NW Walker Road, Beaverton, OR 97006

Workshop Web Page: http://www.ece.ogi.edu/~xubosong/FusionWorkshop.html

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Data Mining and Learning on the Web

Organizers: Gary William Flake (flake at research.nj.nec.com), Frans Coetzee (coetzee at research.nj.nec.com), and David Pennock (dpennock at research.nj.nec.com)

No doubt about it, the web is big. So big, in fact, that many classical algorithms for databases and graphs cannot scale to the distributed multi-terabyte anarchy that is the web. How, then, do we best use, mine, and model this rich collection of data?
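One concrete example of exploiting the web's hyperlink structure is link analysis. The sketch below is a generic PageRank-style power iteration on a made-up four-page graph; the graph, damping factor, and iteration count are all hypothetical, and this is not any particular speaker's algorithm.

```python
def link_rank(links, damping=0.85, iters=50):
    """links: dict mapping each page to its list of outgoing links.
    Returns a dict of PageRank-style scores that sum to 1."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - damping) / n for u in nodes}
        for u in nodes:
            out = links[u] or nodes         # dangling pages spread mass evenly
            share = damping * rank[u] / len(out)
            for v in out:
                new[v] += share
        rank = new
    return rank

toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
scores = link_rank(toy_web)
print(max(scores, key=scores.get))          # the most heavily linked-to page
```

The point of the toy: the score of a page depends not just on its in-degree but on the scores of the pages linking to it, which is exactly the kind of graph regularity that random-graph assumptions miss.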
Arguably, the best approach to developing scalable non-trivial applications and algorithms is to exploit the fact that, both as a graph and as a database, the web is highly non-random. Furthermore, since the web is mostly created and organized by humans, the graph structure (in the form of hyperlinks) encodes aspects of the content, and vice versa. We will discuss methods that exploit these properties of the web. We will further consider how many of the classical algorithms, which were formulated to minimize worst-case performance over all possible problem instances, can be adapted to the more regular structure of the web. Finally, we will attempt to identify major open research directions. This workshop will be organized into three mini-sessions: (1) Systematic Web Regularities, (2) Web Mining Algorithms, and (3) Inferable Web Regularities. All speakers are invited, and a partial list of confirmed speakers includes: Albert-László Barabási, Justin Boyan, Rich Caruana, Soumen Chakrabarti, Monika Henzinger, Ravi Kumar, Steve Lawrence, and Andrew McCallum.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

EXPLORATIVE ANALYSIS AND DATA MODELING IN FUNCTIONAL NEUROIMAGING: Arriving at, not starting with, a hypothesis

Advanced analysis methods and models of neuroimaging data are only marginally represented at the big international brain mapping meetings. This contrasts with the broad belief in the neuroimaging community that these approaches are crucial to the further development of the field. The purpose of this NIPS workshop is to bring together theoreticians developing and applying new methods of neuroimaging data interpretation. The workshop focuses on (a) explorative analysis and (b) modeling of neuroimaging data: a) Higher-order explorative analysis (for example, ICA/PCA or clustering algorithms) can reveal data properties in a data-driven, not hypothesis-driven, manner. b) Models for neuroimaging data can guide the data interpretation.
The universe of possible functional hypotheses can be constrained by models linking the data to other empirical data, for instance anatomical connectivity (pathway analysis to yield effective connectivity), encephalography data, or behavior (computational function). The talks will introduce the new approaches; in the discussions we hope to address the benefits and problems of the various methods and of their possible combination. It is intended to discuss not only the theory behind the various approaches, but also their value for improved data interpretation. Therefore, we strongly encourage the participation of neuroimaging experimenters concerned with functional paradigms suggesting the use of non-standard data interpretation methods. More information can be found on the workshop webpage: http://www.informatik.uni-ulm.de/ni/staff/FSommer/workshops/nips_ws00.html For any requests please contact the workshop organizers: Fritz Sommer and Andrzej Wichert, Department of Neural Information Processing, University of Ulm, D-89069 Ulm, Germany. Tel. +49 (731) 502-4154 / +49 (731) 502-4257, FAX +49 (731) 502-4156

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Geometric and Quantum Methods in Learning

Organization: Shunichi Amari, Amir Assadi (Chair), and Tomaso Poggio

Description: The purpose of this workshop is to attract the attention of the learning community to geometric methods and to take on an endeavor to: 1. lay out a geometric paradigm for formulating profound ideas in learning; 2. facilitate the development of geometric methods suitable for the investigation of new ideas in learning theory. Today's continuing advances in computation make it possible to infuse geometric ideas into learning that would otherwise have been computationally prohibitive. Quantum computation has created great excitement, offering a broad spectrum of new ideas for the discovery of parallel-distributed algorithms, a hallmark of learning theory.
In addition, geometry and quantum computation together offer a more profound picture of the physical world, and of how it interacts with the brain, the ultimate learning system. Among the discussion topics, we envision the following: information geometry; differential-topological and quantum methods for turning local estimates into global quantities and invariants; Riemannian geometry and Feynman path integration as a framework to explore nonlinearity; and the information theory of massive data sets. We will also examine the potential impact of learning theory on the future development of geometry, and examples of how quantum computation has opened new vistas on the design of parallel-distributed algorithms. The participants of the Workshop on Quantum Computation will find this workshop's geometric ideas beneficial for the theoretical aspects of quantum algorithms and quantum information theory. We plan to prepare a volume based on the materials for the workshops and other contributions, to be proposed to the NIPS Program Committee.

Contact Information: Amir Assadi, University of Wisconsin-Madison. URL: www.cms.wisc.edu/~cvg E-mail: ahassadi at facstaff.wisc.edu

Partial List of Speakers and Panelists: Shun-Ichi Amari, Amir Assadi, Zubin Ghahramani, Geoffrey Hinton, Tomaso Poggio, Jose Principe, Scott Makeig, Naoki Saito (tentative)

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Information and Statistical Structure in Spike Trains: How can we calculate what we really want to know?

Organizer: Jonathan D. Victor jdvicto at med.cornell.edu

Advances in understanding how neurons represent and manipulate information in their spike trains will require a combination of appropriate theoretical, computational, and experimental strategies.
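As a small illustration of the kind of quantity at stake, here is a naive "plug-in" estimate of the entropy of binary words in a binned spike train. The spike pattern and word length are made up, and the estimator's well-known small-sample bias is exactly the sort of issue this workshop addresses; nothing here reflects the speakers' actual methods.

```python
from collections import Counter
from math import log2

def word_entropy(spike_bins, word_len):
    """Plug-in entropy (bits) of overlapping binary words of length word_len."""
    words = [tuple(spike_bins[i:i + word_len])
             for i in range(len(spike_bins) - word_len + 1)]
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A hypothetical binned spike train: 1 = spike in the bin, 0 = no spike.
spikes = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1] * 4
print(word_entropy(spikes, 3))   # bits per 3-bin word; biased low on short data
```

Because the empirical word frequencies fluctuate around the true ones, this estimator systematically underestimates entropy when the data are short relative to 2^word_len, which is one reason more refined estimators and explicit spike-train models are needed.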
The workshop has several aims: (1) By presenting currently available methods in a tutorial-like fashion, we hope to lower the energy barrier for experimentalists who are interested in using information-theoretic and related approaches but have not yet done so. The presentation of current methods is to be done in a manner that emphasizes the theoretical underpinnings of the different strategies and the assumptions and tradeoffs that they make. (2) By providing a forum for open discussion among current practitioners, we hope to make progress towards understanding the relationships among the available techniques, guidelines for their application, and the basis of the differences in findings across preparations. (3) By presenting the (not fully satisfactory) state of the art to an audience that includes theorists, we hope to spur progress towards the development of better techniques, with a particular emphasis on exploiting more refined hypotheses for spike train structure, and on developing techniques that are applicable to multi-unit recordings. A limited number of slots are available for contributed presentations. Individuals interested in presenting a talk (approximately 20 minutes, with 10 to 20 minutes for discussion) should submit a title and abstract, 200-300 words, to the organizer by October 22, 2000. Please indicate projection needs (overheads, 2x2 slides, LCD data projector). For further information, please see http://www-users.med.cornell.edu/~jdvicto/nips2000.html

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Title:
======
Reinforcement Learning: Learn the Policy or Learn the Value-Function?
Organising Committee ==================== Peter Bartlett (Peter.Bartlett at anu.edu.au) Jonathan Baxter (jbaxter at whizbang.com) David McAllester (dmac at research.att.com) Home page ========= http://csl.anu.edu.au/~bartlett/rlworkshop Workshop Outline ============== There are essentially three main approaches to reinforcement learning in large state spaces: 1) Learn an approximate value function and use that to generate a policy, 2) Learn the parameters of the policy directly, typically using a Monte-Carlo estimate of the performance gradient, and 3) "Actor-Critic" methods that seek to combine the best features of 1) and 2). There has been a recent revival of interest in this area, with many new algorithms being proposed in the past two years. It seems the time is right to bring together researchers for an open discussion of the three different approaches. Submissions are sought on any topic of related interest, such as new algorithms for reinforcement learning, but we are particularly keen to solicit contributions that shed theoretical or experimental light on the relative merits of the three approaches, or that provide a synthesis or cross-fertilization between the different disciplines. Format ====== There will be invited talks and a series of short contributed talks (15 minutes), with plenty of discussion time. If you are interested in presenting at the workshop, please send a title and short abstract to jbaxter at whizbang.com * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * New Perspectives in Kernel-based Learning Methods Nello Cristianini, John Shawe-Taylor, Bob Williamson http://www.cs.rhbnc.ac.uk/colt/nips2000.html Abstract: The aim of the workshop is to present new perspectives and new directions in kernel methods for machine learning. Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel functions in learning systems. 
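For readers new to the area, the common ingredient of these methods can be shown in a few lines: a kernel evaluated pairwise into a Gram matrix, which is all a kernel-based learner ever sees of the data. The Gaussian kernel, the sample points, and the width parameter below are purely illustrative.

```python
from math import exp

def rbf(x, y, width=1.0):
    """Gaussian (RBF) kernel: k(x, y) = exp(-||x - y||^2 / (2 * width^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return exp(-sq_dist / (2.0 * width ** 2))

def gram_matrix(points, kernel=rbf):
    """The symmetric matrix K[i][j] = kernel(points[i], points[j])."""
    return [[kernel(p, q) for q in points] for p in points]

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
K = gram_matrix(pts)
# Diagonal entries are 1 (each point is maximally similar to itself), and
# nearby pairs get larger entries than distant ones.
print(K[0][0], K[0][1] > K[0][2])
```

Model selection "determined by parameters of the kernel", as the abstract puts it, corresponds here to choosing the width; kernel design means replacing rbf with a function suited to the data, including kernels defined on discrete structures rather than Euclidean vectors.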
Support Vector Machines, Gaussian Processes, kernel PCA, kernel Gram-Schmidt, Bayes Point Machines, Relevance and Leverage Vector Machines, are just some of the algorithms that make crucial use of kernels for problems of classification, regression, density estimation, novelty detection and clustering. At the same time as these algorithms have been under development, novel techniques specifically designed for kernel-based systems have resulted in methods for assessing generalisation, implementing model selection, and analysing performance. The choice of model may be simply determined by parameters of the kernel, as for example the width of a Gaussian kernel. More recently, however, methods for designing and combining kernels have created a toolkit of options for choosing a kernel in a particular application. These methods have extended the applicability of the techniques beyond the natural Euclidean spaces to more general discrete structures. The field is witnessing growth on a number of fronts, with the publication of books, editing of special issues, organization of special sessions and web-sites. Moreover, a convergence of ideas and concepts from different disciplines is occurring. The growth is concentrated in four main directions: 1) design of novel kernel-based algorithms 2) design of novel types of kernel functions 3) development of new learning theory concepts 4) application of the techniques to new problem areas Extended abstracts may be submitted before October 30th to nello at dcs.rhbnc.ac.uk * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Quantum Neural Computing http://web.physics.twsu.edu/behrman/NIPS.htm Recently there has been a resurgence of interest in quantum computers because of their potential for being very much smaller and faster than classical computers, and because of their ability in principle to do heretofore impossible calculations, such as factorization of large numbers in polynomial time. 
This workshop will explore ways to implement quantum computing in network topologies, thus exploiting both the intrinsic advantages of quantum computing and the adaptability of neural computing. Aspects/approaches to be explored will include: quantum hardware, e.g. NMR, quantum dots, and molecular computing; theoretical and practical limits to quantum and quantum neural computing, e.g. noise and measurability; and simulations. Targeted groups: computer scientists, physicists and mathematicians interested in quantum computing and next-generation computing hardware.

Invited speakers will include:
Paul Werbos, NSF Program Director, Control, Networks & Computational Intelligence Program, Electrical and Communications Systems Division, who will keynote the workshop.
Thaddeus Ladd, Stanford, "Crystal lattice quantum computation."
Mitja Perus, Institute BION, Stegne 21, SI-1000 Ljubljana, Slovenia, "Quantum associative nets: A new phase processing model."
Ron Spencer, Texas A&M University, "Spectral associative memories."
E.C. Behrman, J.E. Steck, and S.R. Skinner, Wichita State University, "Simulations of quantum neural networks."
Dan Ventura, Penn State, "Linear optics implementation of quantum algorithms."
Ron Chrisley, TBA

Send contributed papers, by October 20th, to the co-chairs: Elizabeth C. Behrman, behrman at wsuhub.uc.twsu.edu; James E. Steck, steck at bravo.engr.twsu.edu

This workshop is partially supported by the National Science Foundation, Grant #ECS-9820606.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Workshop title: Representing the Structure of Visual Objects Web page: http://kybele.psych.cornell.edu/~edelman/NIPS00/index.html Organizers: Nathan Intrator (Nathan_Intrator at brown.edu) Shimon Edelman (se37 at cornell.edu) Confirmed invited speakers: Ron Chrisley (Sussex) John Hummel (UCLA) Christoph von der Malsburg (USC) Pietro Perona (Caltech) Tomaso Poggio (MIT) Greg Rainer (Tuebingen) Manabu Tanifuji (RIKEN) Shimon Ullman (Weizmann) Description: The focus of theoretical discussion in visual object processing has recently started to shift from problems of recognition and categorization to the representation of object structure. The main challenges there are productivity and systematicity, two traits commonly attributed to human cognition. Intuitively, a cognitive system is productive if it is open-ended, that is, if the set of entities with which it can deal is, at least potentially, infinite. Systematicity, even more than productivity, is at the crux of the debate focusing on the representational theory of mind. A visual representation could be considered systematic if a well-defined change in the spatial configuration of the object (e.g., swapping top and bottom parts) were to cause a principled change in the representation (the representations of top and bottom parts are swapped). In vision, this issue (as well as compositionality, commonly seen as the perfect means of attaining systematicity) is, at present, wide open. The workshop will start with an introductory survey of the notions of productivity, systematicity and compositionality, and will consist of presentations by the proponents of some of the leading theories in the field of structure representation, interspersed with open-floor discussion. 
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Title: Real-Time Modeling for Complex Learning Tasks The goal of this workshop is to develop a better understanding of how to create new statistical learning techniques that can deal with complex, high-dimensional data sets, where (possibly redundant and/or irrelevant) data is received continuously from sensors and needs to be incorporated into learning models that may have to change their structure during learning under real-time constraints. The workshop aims at bringing together researchers from various theoretical learning frameworks (Bayesian methods, nonparametric statistics, kernel methods, Gaussian processes, etc.) and application domains to discuss future research directions for principled approaches towards real-time learning. For further details, please refer to the URL: http://www-slab.usc.edu/events/NIPS2000 * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * Software Support for Bayesian Analysis System URL: http://ase.arc.nasa.gov/nips2000 Bayesian analysis is an established technique for many data analysis applications. The development of application software for any specific analysis problem, however, is a difficult and time-consuming task. Programs must be tailored to the specific problem, need to represent the given statistical model correctly, and should preferably run efficiently. Over the last few years, a variety of libraries, shells, and synthesis systems for Bayesian data analysis have been implemented that are intended to simplify application software development. The goal of this workshop is to bring developers of such generic Bayesian software packages and tools (e.g., JavaBayes, AutoClass, BayesPack, BUGS, BayesNet Toolbox, PDP++) together with developers of generic algorithm schemas (more recent ones amenable to automated effort include Structural EM, the Fisher kernel method, mean-field methods, etc.) and software engineering experts. 
It is intended as a forum to discuss and exchange the different technical approaches, such as the use of libraries, interpretation of statistical models (e.g., Gibbs sampling), or software synthesis based on generic algorithm schemas. The workshop aims to discuss the potential and problems of generic tools for the development of efficient Bayesian data analysis software tailored towards specific applications. If you are planning to attend this workshop as a participant and/or are interested in presenting your work, please send a short (1-4 pages) system description, technical paper, or position paper to fisch at ptolemy.arc.nasa.gov no later than Wednesday, October 18, 2000. Preliminary PC: Organizers: L. Getoor, Stanford W. Buntine, Dynaptics P. Smyth, UC Irvine B. Fischer, RIACS/NASA Ames M. Turmon, JPL J. Schumann, RIACS/NASA Ames K. Murphy, UC Berkeley * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * NIPS 2000 Unlabeled Data Supervised Learning Workshop and Competition! We are pleased to announce the NIPS 2000 Unlabeled Data Supervised Learning Competition! This competition is designed to compare algorithms and architectures that use unlabeled data to help supervised learning, and will culminate in a NIPS workshop, where approaches and results will be compared. Round three begins soon, so don't delay (it is also still possible to submit results for rounds 1 and 2). More details are now available at the competition web site: http://q.cis.uoguelph.ca/~skremer/NIPS2000/ May the best algorithm win! 
Stefan, Deb, and Kristin * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * From Darryl.Charles at ntlworld.com Fri Oct 6 10:58:32 2000 From: Darryl.Charles at ntlworld.com (Darryl Charles) Date: Fri, 6 Oct 2000 15:58:32 +0100 Subject: Special session at ESANN'2001 on Artificial_Neural_Networks and Early Vision Processing Message-ID: Bruges (Belgium) April 25-26-27, 2001 Call for submission of papers to a special session at ESANN'2001 on "Artificial Neural Networks and Early Vision Processing". Organized by Darryl Charles and Colin Fyfe from the University of Paisley. Submission of papers 8 December 2000 Notification of acceptance 5 February 2001 ESANN conference 25-27 April 2001 It is well known that biological visual systems, and in particular the human visual system, are extraordinarily good at deciphering very complex visual scenes. Certainly, if we consider the human visual system to be solving inverse graphics problems, then we have not really come close to building artificial systems which are as effective as biological ones. We have much to learn from studying biological visual architecture, and the implementation of practical vision-based products could be improved by gaining inspiration from these systems. The following are some suggested areas of interest: Unsupervised preprocessing methods, e.g. development of local filters, edge filtering. Statistical structure identification, e.g. Independent Component Analysis, Factor Analysis, Principal Components Analysis, Canonical Correlation Analysis, projection pursuit. Information-theoretic techniques for the extraction/preservation of information in visual data. Coding strategies, e.g. sparse coding, complexity reduction. Binocular disparity. Motion, invariances, colour encoding, e.g. optical flow, space/time filters. Topography preservation. The practical application of techniques relating to these topics. Submission to esann at dice.ucl.ac.be by the 8th December 2000. 
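[Editor's illustration, not part of the call: "development of local filters" via unsupervised preprocessing, one of the topics listed above, can be sketched in a few lines as PCA on image patches. Random noise patches are used here only to keep the example self-contained; with patches drawn from natural images, the leading components resemble oriented edge and blob filters.]

```python
import numpy as np

# Toy sketch: learn "local filters" from image patches by PCA.
rng = np.random.default_rng(0)
patches = rng.standard_normal((1000, 64))   # stand-in for 8x8 image patches
patches -= patches.mean(axis=0)             # centre the data
cov = patches.T @ patches / len(patches)    # sample covariance (64 x 64)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
filters = eigvecs[:, ::-1][:, :16]          # top 16 components as filters
print(filters.shape)                        # (64, 16)
```

Each column of `filters` can be reshaped to 8x8 and used as a linear preprocessing filter; ICA and projection pursuit, also listed in the call, replace the eigendecomposition step with criteria beyond second-order statistics.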
ESANN'2001 web site http://www.dice.ucl.ac.be/esann Darryl Charles and Colin Fyfe darryl.charles at paisley.ac.uk and colin.fyfe at paisley.ac.uk Applied Computational Intelligence Research Group School of Information and Communication Technologies University of Paisley Scotland From atascill at ford.com Fri Oct 6 13:58:00 2000 From: atascill at ford.com (Tascillo, Anya (A.L.)) Date: Fri, 6 Oct 2000 13:58:00 -0400 Subject: INNS-IEEE IJCNN 2001 www.geocities.com/ijcnn Message-ID: Greetings Connectionists! www.geocities.com/ijcnn We are pleased to announce the website for the INNS-IEEE International Joint Conference on Neural Networks, 2001, in Washington, D.C., July 14-19, 2001. Mark your calendars! Several neural network challenge problems will be offered this year, and an automated abstract submission form will be online November 1. After the legal experts finish reviewing my T-shirt design (fingers crossed), it will be displayed and offered for sale on the website along with registration. One less shirt to pack! A reminder will be sent on or near November 1. Please email atascill at ford.com if you would like to be placed on the mailing list for reminders of all open and close dates. All comments and suggestions are welcome (sightseeing, venue, etc.), and anyone who feels that they should have been asked to help organize/referee and hasn't heard from someone yet, email me now and I'll forward your name (and qualifications) to the appropriate parties. 
Publicity Chair Anya Tascillo From magnus at cs.man.ac.uk Mon Oct 9 09:45:21 2000 From: magnus at cs.man.ac.uk (Magnus Rattray) Date: Mon, 09 Oct 2000 14:45:21 +0100 Subject: Chair and lectureship in Bioinformatics Message-ID: <39E1CBF1.AF14B817@cs.man.ac.uk> Readers of this list may be interested in the following posts: The Department of Computer Science and the School of Biological Sciences at the University of Manchester seek applications for a Research Councils-funded Chair in Bioinformatics, and for an associated lectureship. The University already has a strong collaborative research activity in Bioinformatics and, with the support of the UK Research Councils, is seeking to grow and strengthen that activity, specifically in the area of Post Genomic Bioinformatics. The Chair: The appointee will be expected to establish and lead collaborative research activities that bring advanced computational techniques to bear on problems involving information management, analysis and visualization, specifically in the context of genomic data. It is anticipated that the appointee will have an international reputation for work in bioinformatics, and will possess skills that complement those of people currently working at Manchester. The salary will be negotiable from £37,500 p.a. The Lectureship: The appointee will be expected to bring computational skills of relevance to post genomic bioinformatics. Specifically, applicants with experience of statistical modelling, machine learning and information visualization are particularly encouraged to apply, but these topics should not be considered as excluding other relevant areas. The salary will be in the range £18,731 - £23,256 or £24,227 - £30,967 per annum. For further particulars and an application form, please contact the Director of Personnel, The University of Manchester, Manchester M13 9PL (tel: (+44) 161 275 2028, fax: (+44) 161 275 2471), quoting the appropriate reference number. Informal enquiries can be made to Prof. 
Norman Paton (tel: (+44) 161 275 6910, email: norm at cs.man.ac.uk) or Prof. Steve Oliver (tel: (+44) 161 606 7260, email steve.oliver at man.ac.uk). Closing date: 27 October 2000. From lemm at uni-muenster.de Mon Oct 9 11:50:16 2000 From: lemm at uni-muenster.de (Joerg_Lemm) Date: Mon, 9 Oct 2000 17:50:16 +0200 (CEST) Subject: Papers on Bayesian Quantum Theory Message-ID: Dear Colleagues, The following papers are available at http://pauli.uni-muenster.de/~lemm/ 1. Bayesian Reconstruction of Approximately Periodic Potentials for Quantum Systems at Finite Temperatures. (Lemm, Uhlig, Weiguny) 2. Inverse Time--Dependent Quantum Mechanics. (Lemm) (to appear in Phys. Lett. A) 3. Bayesian Inverse Quantum Theory. (Lemm, Uhlig) (to appear in Few-Body Systems) 4. Hartree-Fock Approximation for Inverse Many-Body Problems. (Lemm, Uhlig) Phys. Rev. Lett. 84, 4517--4520 (2000) 5. A Bayesian Approach to Inverse Quantum Statistics. (Lemm, Uhlig, Weiguny) Phys. Rev. Lett. 84, 2068-2071 (2000) In this series of papers a nonparametric Bayesian approach is developed and applied to the inverse quantum problem of reconstructing potentials from observational data. While the specific likelihood model of quantum mechanics may be mainly of interest for physicists, the presented techniques for constructing adapted, situation-specific prior processes (Gaussian processes for approximate invariances, mixtures of Gaussian processes, hyperparameters and hyperfields) are also useful for general empirical learning problems, including density estimation, classification and regression. PAPERS: ======================================================================== Bayesian Reconstruction of Approximately Periodic Potentials for Quantum Systems at Finite Temperatures. by Lemm, J. C., Uhlig, J., and A. 
Weiguny MS-TP1-00-4, arXiv:quant-ph/0005122 http://pauli.uni-muenster.de/~lemm/papers/pp.ps.gz Abstract: The paper discusses the reconstruction of potentials for quantum systems at finite temperatures from observational data. A nonparametric approach is developed, based on the framework of Bayesian statistics, to solve such inverse problems. Besides the specific model of quantum statistics giving the probability of observational data, a Bayesian approach is essentially based on "a priori" information available for the potential. Different possibilities to implement "a priori" information are discussed in detail, including hyperparameters, hyperfields, and non--Gaussian auxiliary fields. Special emphasis is put on the reconstruction of potentials with approximate periodicity. Such potentials might, for example, correspond to periodic surfaces modified by point defects and observed by atomic force microscopy. The feasibility of the approach is demonstrated for a numerical model. ======================================================================== Inverse Time--Dependent Quantum Mechanics. by Lemm, J. C. MS-TP1-00-1, arXiv:quant-ph/0002010 (to appear in Phys. Lett. A) http://pauli.uni-muenster.de/~lemm/papers/tdq.ps.gz Abstract: Using a new Bayesian method for solving inverse quantum problems, potentials of quantum systems are reconstructed from time series obtained by coordinate measurements in non--stationary states. The approach is based on two basic inputs: 1. a likelihood model, providing the probabilistic description of the measurement process as given by the axioms of quantum mechanics, and 2. additional "a priori" information implemented in the form of stochastic processes over potentials. ======================================================================== Bayesian Inverse Quantum Theory. by Lemm, J. C. and Uhlig, J. 
MS-TP1-99-15, arXiv:quant-ph/0006027 (to appear in Few-Body Systems) http://pauli.uni-muenster.de/~lemm/papers/biqt.ps.gz Abstract: A Bayesian approach is developed to determine quantum mechanical potentials from empirical data. Bayesian methods, combining empirical measurements and "a priori" information, provide flexible tools for such empirical learning problems. The paper presents the basic theory, concentrating in particular on measurements of particle coordinates in quantum mechanical systems at finite temperature. The computational feasibility of the approach is demonstrated by numerical case studies. Finally, it is shown how the approach can be generalized to such many--body and few--body systems for which a mean field description is appropriate. This is done by means of a Bayesian inverse Hartree--Fock approximation. ======================================================================== Hartree-Fock Approximation for Inverse Many-Body Problems. by Lemm, J. C. and Uhlig, J. MS-TP1-99-10, arXiv:nucl-th/9908056 Phys. Rev. Lett. 84, 4517--4520 (2000) http://pauli.uni-muenster.de/~lemm/papers/ihf3.ps.gz Abstract: A new method is presented to reconstruct the potential of a quantum mechanical many--body system from observational data, combining a nonparametric Bayesian approach with a Hartree--Fock approximation. "A priori" information is implemented as a stochastic process, defined on the space of potentials. The method is computationally feasible and provides a general framework to treat inverse problems for quantum mechanical many--body systems. ======================================================================== A Bayesian Approach to Inverse Quantum Statistics. by Lemm, J. C., Uhlig, J., and Weiguny, A. MS-TP1-99-6, arXiv:cond-mat/9907013 Phys. Rev. Lett. 
84, 2068-2071 (2000) http://pauli.uni-muenster.de/~lemm/papers/iqs.ps.gz Abstract: A nonparametric Bayesian approach is developed to determine quantum potentials from empirical data for quantum systems at finite temperature. The approach combines the likelihood model of quantum mechanics with a priori information on potentials implemented in the form of stochastic processes. Its specific advantages are the ability to deal with heterogeneous data and to express a priori information explicitly in terms of the potential of interest. A numerical solution in maximum a posteriori approximation is obtained for one--dimensional problems. Because the number of measurements is small compared to the degrees of freedom of a nonparametric estimate, the results depend strongly on the implemented a priori information. ======================================================================== Dr. Joerg Lemm Universitaet Muenster Email: lemm at uni-muenster.de Institut fuer Theoretische Physik Phone: +49(251)83-34922 Wilhelm-Klemm-Str.9 Fax: +49(251)83-36328 D-48149 Muenster, Germany http://pauli.uni-muenster.de/~lemm ======================================================================== From DominikD at cybernetics.com.au Tue Oct 10 20:44:24 2000 From: DominikD at cybernetics.com.au (Dominik Dersch) Date: Wed, 11 Oct 2000 11:44:24 +1100 Subject: career opportunities at Crux Cybernetics Message-ID: <2F8D50B141B3D311AF5100A0C9FC05580416EF@UCSMAILSVR> We have two open positions at Crux Cybernetics for a Research and Development Team Leader and a Research and Development Software Developer at our Sydney CBD location. 
Please reply only to recruitment at cybernetics.com.au --------------------------------------------------- Research and Development Team Leader The Research and Development Group (RDG) at Crux Cybernetics is responsible for research, development and commercial implementation of machine learning, artificial intelligence and constraint satisfaction application software using web, wireless, and e-commerce technologies. The group has a broad range of skills covering areas like speech recognition, image processing and financial time series analysis. To find out more about Crux Cybernetics go to our web page www.cybernetics.com.au RDG requires the services of a person to act as a research and development specialist and team leader. The primary responsibilities are leading a team to carry out research, consultancy, design, and implementation of artificial intelligence software solutions. The role also requires utilising and interfacing with several related technologies, business process models and methodologies, and keeping up to date with relevant research. Essential: · A Ph.D. in Physics, Engineering or applied Mathematics on an Artificial Intelligence topic · Publication track record in relevant areas · At least three years' experience in applied research and development · Excellent communication and presentation skills · Able to write clear and accurate research proposals and reports · Fluent in C++, C, Perl and shell scripting languages · Familiar with at least one of the following prototyping and analysis tools: Matlab, IDL, Splus, Mathematica or Statit · Experience in HTML and CGI · NT, Unix & Linux operating system experience · Leadership qualities · Management potential Desirable: · Grant application experience · Track record of successfully managing all phases of a project. 
· Object-oriented design experience · UML experience · Business or systems analysis · Other OO languages · Commercial e/m-commerce and e/m-finance experience Reference: 091000 Posted: 091000 Location of Position: CBD, Sydney, Australia To apply: All applicants must provide a letter of application that specifically addresses your suitability for the role described above. Please email this letter together with a current resume to recruitment at cybernetics.com.au -------------------------------------------------------- Research and Development Software Developer The Research and Development Group (RDG) at Crux Cybernetics is responsible for research, development and commercial implementation of machine learning, artificial intelligence and constraint satisfaction application software using web, wireless, and e-commerce technologies. The group has a broad range of skills covering areas like speech recognition, image processing and financial time series analysis. To find out more about Crux Cybernetics go to our web page www.cybernetics.com.au RDG requires the services of a person to act as a research and development specialist. The primary responsibilities are to carry out research, consultancy, design, and implementation of artificial intelligence software solutions under the supervision of the team leader. The role also requires utilising and interfacing with several related technologies, business process models and methodologies, and keeping up to date with relevant research. 
Essential: · A recent degree in Physics, Engineering or applied Mathematics on an Artificial Intelligence topic · Excellent communication and presentation skills · Able to write clear and accurate research proposals and reports · Fluent in C++, C, Perl and shell scripting languages · Familiar with at least one of the following prototyping and analysis tools: Matlab, IDL, Splus, Mathematica or Statit · NT, Unix & Linux operating system experience · Willingness to explore new research areas · Enthusiasm and genuine interest in commercialising AI research Desirable: · Publication track record in relevant areas · Experience in applied research and development · Experience in HTML and CGI · Experience in managing a project · Object-oriented design experience · UML experience · Business or systems analysis · Other OO languages · Commercial e/m-commerce and e/m-finance experience Reference: 091000B Posted: 091000 Location of Position: CBD, Sydney To apply: All applicants must provide a letter of application that specifically addresses your suitability for the role described above. Please email this letter together with a current resume to recruitment at cybernetics.com.au From philh at cogs.susx.ac.uk Wed Oct 11 09:08:02 2000 From: philh at cogs.susx.ac.uk (Phil Husbands) Date: Wed, 11 Oct 2000 14:08:02 +0100 Subject: Research Professorship Message-ID: <39E46620.82E99578@cogs.susx.ac.uk> Research Professorship School of Cognitive and Computing Sciences, University of Sussex, UK Professor of Neural Computation Ref 484 Applications are invited for a permanent professorship within the Computer Science and Artificial Intelligence Subject Group of the School of Cognitive and Computing Sciences. The expected start date is 1 January 2001 or as soon as possible thereafter. Candidates should be able to show evidence of significant research achievement in Neural Computation. 
The successful applicant will be expected to expand significantly the existing high research profile of the Group in this area. It is intended that the post will carry a reduced teaching load. The salary is negotiable - the current minimum professorial salary is £37,493 per annum. Informal enquiries may be made to Dr Des Watson, tel +44 1273 678045, email desw at cogs.susx.ac.uk. Details of the School are available at http://www.cogs.susx.ac.uk Closing date: Friday 27 October 2000. Application forms and further particulars are available from and should be returned to Staffing Services Office, Sussex House, University of Sussex, Falmer, Brighton, BN1 9RH, tel +44 1273 678706. Further details, downloadable forms etc. at http://www.susx.ac.uk/Units/staffing/personnl/vacs/vac484.shtml From zhaoping at gatsby.ucl.ac.uk Thu Oct 12 09:38:47 2000 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Thu, 12 Oct 2000 14:38:47 +0100 (BST) Subject: Paper on computational design and nonlinear dynamics of a V1 model Message-ID: Title: Computational design and nonlinear dynamics of a recurrent network model of the primary visual cortex Author: Zhaoping Li Accepted for publication in Neural Computation available at: http://www.gatsby.ucl.ac.uk/~zhaoping/preattentivevision.html Abstract: Recurrent interactions in the primary visual cortex make its output a complex nonlinear transform of its input. This transform serves pre-attentive visual segmentation, i.e., autonomously processing visual inputs to give outputs that selectively emphasize certain features for segmentation. An analytical understanding of the nonlinear dynamics of the recurrent neural circuit is essential to harness its computational power. We derive requirements on the neural architecture, components, and connection weights of a biologically plausible model of the cortex such that region segmentation, figure-ground segregation, and contour enhancement can be achieved simultaneously. 
In addition, we analyze the conditions governing neural oscillations, illusory contours, and the absence of visual hallucinations. Many of our analytical techniques can be applied to other recurrent networks with translation invariant neural connection structures. From bvr at stanford.edu Thu Oct 12 21:24:17 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Thu, 12 Oct 2000 18:24:17 -0700 Subject: NIPS*2000: Reminder -- Registration and Accommodations Deadlines Message-ID: <4.2.0.58.20001012182340.00dd7cb0@bvr.pobox.stanford.edu> Neural Information Processing Systems -- Natural and Synthetic Monday November 27 - Saturday December 2, 2000 Denver and Breckenridge, Colorado Early registration for NIPS*2000 ends November 1. Hotel rooms for the main conference at the Denver Marriott City Center (1-800-228-9290) will be held at the special conference rate ($80/night single, $90/night double) only until November 6. Rooms for the workshops at the Beaver Run Resort (1-800-288-1282) will be held at the special conference rates only until October 31, and at the Great Divide Lodge (1-800-321-8444) until November 9. Detailed information on hotel accommodations, transportation, registration, and the conference program is available at the NIPS web site http://www.cs.cmu.edu/Web/Groups/NIPS From oreilly at grey.colorado.edu Fri Oct 13 13:54:23 2000 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Fri, 13 Oct 2000 11:54:23 -0600 Subject: Paper on Generalization in Interactive Networks Message-ID: <200010131754.LAA13846@grey.colorado.edu> The following preprint is now available for downloading: ftp://grey.colorado.edu/pub/oreilly/papers/oreilly00_gen_nc.pdf *or* ftp://grey.colorado.edu/pub/oreilly/papers/oreilly00_gen_nc.ps Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning Randall C. 
O'Reilly Department of Psychology University of Colorado at Boulder In press at Neural Computation Abstract: Computational models in cognitive neuroscience should ideally use biological properties and powerful computational principles to produce behavior consistent with psychological findings. Error-driven backpropagation is computationally powerful, and has proven useful for modeling a range of psychological data, but is not biologically plausible. Several approaches to implementing backpropagation in a biologically plausible fashion converge on the idea of using bidirectional activation propagation in interactive networks to convey error signals. This paper demonstrates two main points about these error-driven interactive networks: (a) they generalize poorly due to attractor dynamics that interfere with the network's ability to systematically produce novel combinatorial representations in response to novel inputs; and (b) this generalization problem can be remedied by adding two widely used mechanistic principles, inhibitory competition and Hebbian learning, that can be independently motivated for a variety of biological, psychological and computational reasons. Simulations using the Leabra algorithm, which combines the generalized recirculation (GeneRec) biologically-plausible error-driven learning algorithm with inhibitory competition and Hebbian learning, show that these mechanisms can result in good generalization in interactive networks. These results support the general conclusion that cognitive neuroscience models that incorporate the core mechanistic principles of interactivity, inhibitory competition, and error-driven and Hebbian learning satisfy a wider range of biological, psychological and computational constraints than models employing a subset of these principles. - Randy +----------------------------------------------------------------+ | Dr. Randall C. 
O'Reilly | | | Assistant Professor | Phone: (303) 492-0054 | | Department of Psychology | Fax: (303) 492-2967 | | Univ. of Colorado Boulder | Home: (303) 448-1810 | | Muenzinger D251C | Cell: (720) 839-7751 | | 345 UCB | email: oreilly at psych.colorado.edu | | Boulder, CO 80309-0345 | www: psych.colorado.edu/~oreilly | +----------------------------------------------------------------+ From masuoka at flab.fujitsu.co.jp Sun Oct 15 18:48:27 2000 From: masuoka at flab.fujitsu.co.jp (Ryusuke Masuoka) Date: Sun, 15 Oct 2000 18:48:27 -0400 Subject: Ph.D. Thesis available: Neural networks learning differential data In-Reply-To: <200010131754.LAA13846@grey.colorado.edu> Message-ID: Dear Connectionists, I am pleased to announce the availability of my Ph.D. thesis for download in electronic format. Comments are welcome. Regards, Ryusuke ------------------------------------------------------------ Thesis: ------- Title: "Neural Networks Learning Differential Data" Advisor: Michio Yamada URL: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/thesis/thesis.html Abstract: -------- Learning systems, which learn from previous experiences and/or provided examples of appropriate behaviors, allow people to specify {\em what} the systems should do for each case, not {\em how} the systems should act at each step. That eases system users' burdens to a great extent. For efficient and accurate learning, it is essential that supervised learning systems such as neural networks be able to utilize knowledge in forms such as logical expressions, probability distributions, and constraints on differential data, along with provided desirable input and output pairs. Neural networks that can learn constraints on differential data have already been applied to pattern recognition and differential equations. Other applications, such as robotics, have been suggested as applications of neural networks learning differential data. 
In this dissertation, we investigate an extended framework that introduces constraints on differential data into neural network learning. We also investigate other items that form the foundations for the applications of neural networks learning differential data. First, a new and very general architecture and an algorithm are introduced for multilayer perceptrons to learn differential data. The algorithm is applicable to learning differential data not only of first order but also of orders higher than first, and is completely localized to each unit in the multilayer perceptron, like the backpropagation algorithm. The architecture and the algorithm are then implemented as computer programs, which required high programming skill and a great amount of care. The main module is programmed in C++. The implementation is used to conduct experiments that show, among other things, convergence of neural networks with differential data of up to third order. Along with the architecture and the algorithm, we give analyses of neural networks learning differential data, such as a comparison with the extra-pattern scheme, how learning works, sample complexity, effects of irrelevant features, and noise robustness. A new application of neural networks learning differential data to continuous action generation in reinforcement learning, and its experiments using the implementation, are described. The problem is reduced to the realization of a random vector generator for a given probability distribution, which corresponds to solving a differential equation of first order. In addition to the above application to reinforcement learning, two other possible applications of neural networks learning differential data are proposed: differential equations and simulation of the human arm. For differential equations, we propose a very general framework, which unifies differential equations, boundary conditions, and other constraints. 
For the simulation, we propose a natural neural network implementation of the minimum-torque-change model. Finally, we present results on higher-order extensions to radial basis function (RBF) networks of minimizing solutions with differential error terms, the best approximation property of the above solutions, and a proof of $C^l$ denseness of RBF networks. Through these detailed accounts of an architecture, an algorithm, an implementation, analyses, and applications, this dissertation as a whole lays the foundations for applications of neural networks learning differential data as learning systems and will help promote their further applications. ------------------------------------------------------------ Ryusuke Masuoka, Ph.D. Senior Researcher Intelligent Systems Laboratory Fujitsu Laboratories Ltd. 1-4-3 Nakase, Mihama-ku Chiba, 261-8588, Japan Email: masuoka at flab.fujitsu.co.jp Web: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/ From jf218 at hermes.cam.ac.uk Mon Oct 16 08:53:31 2000 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Mon, 16 Oct 2000 13:53:31 +0100 (BST) Subject: Five-year post-doc position on Comput. Neurosci. In-Reply-To: Message-ID: The Babraham Institute, Cambridge Two Postdoctoral positions available in Neuroscience Applications are invited for two postdoctoral scientists to join a group of systems neuroscientists within the Laboratory of Cognitive and Developmental Neuroscience investigating how the brain encodes visual and olfactory cues associated with recognition of both social and non-social objects. Current research employs behavioural, computational, neurophysiological (including multi-array electrophysiological recording) and functional neuroanatomical approaches using rodent and sheep models. Behavioural Neuroscientist/In vivo Electrophysiologist (Ref. 
KK/EP/3) Applications are invited for a post-doctoral scientist to work on a BBSRC funded grant investigating both behavioural and neurophysiological aspects of face perception and imagery in sheep. Candidates must be experienced either in behavioural assessments of perception/learning in animals or in in vivo electrophysiological recording methods. Training in aspects of behavioural or electrophysiological methodologies required, but not covered by the applicant's previous experience, will be provided by existing members of the group. Experience with computational analysis of complex data would be an advantage. The appointment is for 3 years in the first instance. Computational Neuroscientist (Ref. JF/P/2) This post is available initially for 5 years. It would suit an individual with experience in computational analysis and modelling of sensory system functioning and will mainly involve utilisation of electrophysiological data from multi-array recording experiments. The individual would also be expected to work closely with electrophysiologists both within the group and in the USA, and to co-ordinate with other UK-based computational neuroscientists involved with the projects. The group already has excellent computational facilities to deal with the large amounts of data associated with multi-array recording experiments. Informal enquiries on these Neuroscience vacancies should be directed to Dr. Keith Kendrick, Head of Neurobiology Programme: tel: 44(0) 1223 496385, fax: 44(0)1223 496028, e-mail keith.kendrick at bbsrc.ac.uk Starting salary for both positions in the range £19,500 - £23,000 per annum. Benefits include a non-contributory pension scheme, 25 days leave and 10½ public holidays a year. On-site Refectory, Nursery and Sports & Social Club, as well as free car parking. Further details and an application form are available from the Personnel Office, The Babraham Institute, Babraham, Cambridge CB2 4AT. Tel. 01223 496000, e-mail babraham.personnel at bbsrc.ac.uk.
The closing date for these positions is 23rd October 2000. AN EQUAL OPPORTUNITIES EMPLOYER An Institute supported by the Biotechnology and Biological Sciences Research Council Jianfeng Feng The Babraham Institute Cambridge CB2 4AT UK http://www.cus.cam.ac.uk/~jf218 From j-patton at northwestern.edu Mon Oct 16 15:55:02 2000 From: j-patton at northwestern.edu (Jim Patton) Date: Mon, 16 Oct 2000 14:55:02 -0500 Subject: postdoc - Rehab Robotics Message-ID: <4.2.0.58.20001016143514.00aca940@merle.acns.nwu.edu> Postdoctoral Fellowship in Rehabilitation Robotics Position: Postdoctoral Fellow Organization: Sensory Motor Performance Program, Northwestern University and the Rehabilitation Institute of Chicago Location: Chicago, Illinois Posted: 10/16/00 Deadline: 12/16/00 Description: The Sensory Motor Performance Program (SMPP) is a multidisciplinary research laboratory located at the Rehabilitation Institute of Chicago (RIC) and affiliated with the Northwestern University Medical and Engineering Schools. Members of SMPP perform basic research in the areas of musculoskeletal biomechanics and neural control of motion. Specific emphasis is placed on the study of musculoskeletal and neurological diseases that influence movement control. We are currently seeking a Postdoctoral Research Associate to join our study on movement control in normal and hemiparetic stroke subjects. The research involves the application of robotics technology to the study of the neuropathology following stroke. The applicant will benefit from the mentorship of W. Z. Rymer and F. A. Mussa-Ivaldi, both established leaders in the field. Salary is contingent on educational background & experience. More information about this position and our research is available at http://www.smpp.nwu.edu/. Qualifications: Applicants will be expected to hold an earned doctorate in Biomedical Engineering or a related discipline, with a record of research in motor control, robotics, biomechanics, neuroscience, or a related field.
The ideal candidate will augment our group's expertise in clinical neuromechanical analysis, modeling of multijoint limb movement, control theory, haptics, system identification, neural networks, or learning theory. Emphasis will be placed on an early start date. RIC is an Affirmative Action/Equal Opportunity Employer. Women and minority applicants are encouraged to apply. Hiring is contingent on eligibility to work in the United States. Contact: Please send a letter, vita, and the names, addresses and email addresses of 3-4 references to: James L. Patton, Ph.D. Rehabilitation Institute of Chicago Room 1406 345 East Superior, Chicago, Illinois USA 60611 ______________________________________________________________________ J A M E S P A T T O N , P H . D . Research Associate, Sensory Motor Performance Program Rehabilitation Institute of Chicago. Postdoctoral Fellow Physical Medicine & Rehabilitation Northwestern University Medical School 345 East Superior Room 1406 Chicago, IL 60611 NOTE: my phone number has changed! 312-238-1277 (OFFICE) -2208 (FAX) -1232 (LAB) WEB: EMAIL: _______________________________________________________________________ From jsteil at TechFak.Uni-Bielefeld.DE Tue Oct 17 02:04:32 2000 From: jsteil at TechFak.Uni-Bielefeld.DE (Jochen Jakob Steil) Date: Tue, 17 Oct 2000 08:04:32 +0200 Subject: Job offer: Neural Methods in Robotics and Visualisation Message-ID: <39EBEBF0.1B327B1B@techfak.uni-bielefeld.de> Dear Colleagues: The research group Neuroinformatics (Prof. Helge Ritter) at the University of Bielefeld is offering two research project positions for a Research Assistant with salary according to BAT-IIa. The positions will be affiliated with the Special Collaborative Research Unit (Sonderforschungsbereich) SFB 360: Situated Artificial Communicators ( http://www.sfb360.uni-bielefeld.de/sfbengl.html ).
Research topics of the two projects are (i) neural approaches for visual robot arm instruction based on teaching by showing and (ii) development of a human-machine interface for state visualization and configuration of distributed system components of the situated communicator prototype system. Both projects provide the opportunity for a dissertation. Applicants should have a university degree (Masters or German diploma) in computer science, electrical engineering or physics and should have a good knowledge of Unix/C/C++ programming. A good background in the fields of robotics/computer vision/ visualization/neural networks is desirable. The University of Bielefeld follows a policy to increase the proportion of female employees in fields where women are underrepresented and therefore particularly encourages women to apply. Applications from suitably qualified disabled persons are welcome. Further information can be obtained from our group homepage: http://www.TechFak.Uni-Bielefeld.DE/techfak/ags/ni/ the university homepage: http://www.Uni-Bielefeld.DE Applications should be sent to: Prof. Dr. Helge Ritter Arbeitsgruppe Neuroinformatik Technische Fakultaet Universitaet Bielefeld 33 501 Bielefeld mailto:helge at techfak.uni-bielefeld.de From sylee at ee.kaist.ac.kr Tue Oct 17 02:46:56 2000 From: sylee at ee.kaist.ac.kr (Soo-Young Lee) Date: Tue, 17 Oct 2000 15:46:56 +0900 Subject: Faculty Position at KAIST Message-ID: <007901c03806$0ab27280$329ef88f@kaist.ac.kr> A faculty position is open at the Department of Electrical Engineering at the Korea Advanced Institute of Science and Technology (KAIST). Although all areas of electrical engineering are eligible, top researchers in new interdisciplinary areas such as neural networks, biologically-motivated signal processing, artificial life, and bioelectronics will have higher priority. Unlike the BK (Brain Korea 21) Research Professor positions, this is a regular faculty position with full faculty responsibilities.
The new faculty may also work closely with the Brain Science Research Center, the main research organization of the Korean Brain Science and Engineering Research Program sponsored by the Ministry of Science and Technology. The application deadline is October 27th, 2000. For details please visit www.kaist.ac.kr. Soo-Young Lee Professor, Department of Electrical Engineering Director, Brain Science Research Center Korea Advanced Institute of Science and Technology 373-1 Kusong-dong, Yusong-gu Taejon 305-701 Korea (South) Tel: +82-42-869-3431 / Fax: +82-42-869-8570 From wolfskil at MIT.EDU Tue Oct 17 16:06:40 2000 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Tue, 17 Oct 2000 16:06:40 -0400 Subject: ANN: Advances in Neural Information Processing Systems 12 Message-ID: I thought readers of this list might be interested in this book. For more information please visit http://mitpress.mit.edu/promotions/books/SOLDHF00 Advances in Neural Information Processing Systems 12 edited by Sara A. Solla, Todd K. Leen, and Klaus-Robert Müller The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented. Sara A. Solla is Professor of Physics and Astronomy at Northwestern University and of Physiology at Northwestern University Medical School. Todd K.
Leen is Professor of Computer Science and Engineering, and of Electrical and Computer Engineering, at Oregon Graduate Institute of Science and Technology. Klaus-Robert Müller is Associate Professor of Computer Science at the University of Potsdam and Senior Researcher at GMD-FIRST. 7 x 10, 1098 pp., cloth ISBN 0-262-19450-3 -------------------------------------------------------------------------------- Jud Wolfskill 617.253.2079 phone Associate Publicist 617.253.1709 fax MIT Press wolfskil at mit.edu 5 Cambridge Center http://mitpress.mit.edu Fourth Floor Cambridge, MA 02142 From sml at essex.ac.uk Wed Oct 18 09:07:57 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Wed, 18 Oct 2000 14:07:57 +0100 Subject: paper available: fast dictionary search Message-ID: <6A8CC2D6487ED411A39F00D0B7847B66E77E64@sernt14.essex.ac.uk> Dear All, The following paper on graph-based dictionary search (due to appear in Pattern Recognition Letters) is available at: http://algoval.essex.ac.uk/papers/dictionary/prl.ps As far as I am aware, the method described is unique in that the retrieval speed is independent of the size (number of entries) of the dictionary. The core of the method is based on a lazy matrix parser for probabilistic context-free grammars, which are a type of probabilistic graphical model. Therefore, it may be interesting to explore other possible uses of the method in that area. As always, comments are welcome. Best regards, Simon Lucas ---------------------------------------------------- Title: Efficient graph-based dictionary search and its application to text-image searching Keywords: dictionary search, text image indexing, graph search Abstract --------- This paper describes a novel method for applying dictionary knowledge to optimally interpret the confidence-rated hypothesis sets produced by lower-level pattern classifiers.
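To make the retrieval problem concrete, here is a hypothetical sketch of dictionary-constrained best-first search over per-position character scores, using a simple trie plus a priority queue. Note the hedges: the paper's own method is a lazy matrix parser for probabilistic context-free grammars that also handles insertions and deletions and retrieves in time independent of dictionary size; this toy handles substitutions only and does depend on dictionary size. All names and costs below are made up.

```python
import heapq
import itertools

# Toy illustration: enumerate dictionary words in best-first order of
# summed per-position character costs (e.g. -log classifier confidences).

def make_trie(words):
    root = {}
    for w in words:
        node = root
        for ch in w:
            node = node.setdefault(ch, {})
        node["$"] = True  # end-of-word marker
    return root

def best_words(costs, trie, k=3, miss=9.0):
    """costs[i][ch] = penalty for reading character ch at position i."""
    tie = itertools.count()  # tiebreaker so the heap never compares dicts
    heap = [(0.0, next(tie), 0, "", trie)]
    out = []
    while heap and len(out) < k:
        c, _, i, prefix, node = heapq.heappop(heap)
        if i == len(costs):
            if "$" in node:
                # costs are non-negative, so completed paths pop in
                # globally best-first order
                out.append((prefix, c))
            continue
        for ch, child in node.items():
            if ch != "$":
                heapq.heappush(
                    heap,
                    (c + costs[i].get(ch, miss), next(tie), i + 1, prefix + ch, child),
                )
    return out

trie = make_trie(["cat", "car", "cot", "dog"])
costs = [{"c": 0.1, "d": 1.0}, {"a": 0.2, "o": 0.4}, {"t": 0.3, "r": 0.25}]
print([w for w, _ in best_words(costs, trie)])
```

Each complete path through the cost graph is forced to be a dictionary word by walking the trie in lockstep with the positions, which is the constraint the abstract describes.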
This problem arises whenever image or video databases need to be scanned for textual content, and where some of the text strings are expected to be strings from a dictionary. The method is especially appropriate for large dictionaries, as might occur in vehicle registration number recognition, for example. The problem is cast as enumerating the paths in a graph in best-first order given the constraint that each complete path is a word in some specified dictionary. The solution described here is of particular interest due to its generality and flexibility, and because the time to retrieve each path is independent of the size of the dictionary. Synthetic results are presented for searching dictionaries of up to 1 million UK postcodes given graphs that correspond to insertion, deletion and substitution errors. We also present initial results from processing real noisy text images. -------------------------------------------------- Dr. Simon Lucas Senior Lecturer and MSc E-commerce Director Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom Email: sml at essex.ac.uk http://cswww.essex.ac.uk -------------------------------------------------- From angelo at soc.plymouth.ac.uk Wed Oct 18 09:51:13 2000 From: angelo at soc.plymouth.ac.uk (Angelo Cangelosi) Date: Wed, 18 Oct 2000 06:51:13 -0700 (MST) Subject: Postdoc positions in experimental and connectionist models of spatial language Message-ID: Two 2-year Post-doc Research Fellows (1 neural net modelling + 1 experimental psychology) Salary: £18,222 - pay award pending - RF Scale The Development of a Psychologically Plausible Computational Model for Spatial Language Use and Comprehension.
Two post-doctoral fellows are required for a period of two years in the first instance to work on a collaborative project with Dr Kenny Coventry (Centre for Thinking and Language, Department of Psychology) and Dr Angelo Cangelosi (Centre for Neural and Adaptive Systems, School of Computing) at the University of Plymouth. The objective of the research is to develop a computational model for spatial language using neural networks based on experimental data with human participants. One of the positions will primarily involve the design and collection of experimental data from human participants, and the other post will primarily involve connectionist computational modelling. However, candidates possessing both computational and experimental skills are particularly encouraged to apply. Ref : 3918/HSCI - (based in the Centre for Neural and Adaptive Systems, School of Computing), the applicant must have a PhD in one of the Cognitive Sciences (Computer Science/Psychology/Linguistics), with expertise in cognitive and neural network modelling and programming skills. Ref : 3917/HSCI - (based in the Centre for Thinking and Language, Department of Psychology), the applicant must have a PhD in one of the Cognitive Sciences (Psychology/Linguistics/Computer Science), with expertise in experimental methodologies and analyses. The appointees will be working collaboratively as part of a larger interdisciplinary research team (see http://psy.plym.ac.uk/research/slg/slg.html and www.tech.plymouth.ac.uk/soc/staff/angelo). Anyone wishing to discuss the posts should contact Dr Kenny Coventry (kcoventry at plymouth.ac.uk) or Dr Angelo Cangelosi (acangelosi at plymouth.ac.uk). Application details can be obtained from The Personnel Department (personnel at plymouth.ac.uk, 6 Portland Villas, Drake Circus, Plymouth, PL4 8AA), or from Drs. Coventry and Cangelosi. Plymouth University is an Equal Opportunities employer. 
CLOSING DATE: THURSDAY 2 NOVEMBER 2000 From heiko.wersing at hre-ftr.f.rd.honda.co.jp Thu Oct 19 11:32:27 2000 From: heiko.wersing at hre-ftr.f.rd.honda.co.jp (Heiko Wersing) Date: Thu, 19 Oct 2000 17:32:27 +0200 Subject: Paper on stability conditions for design of linear threshold recurrent networks Message-ID: <39EF140B.A939BB63@hre-ftr.f.rd.honda.co.jp> Dear Connectionists, the following paper has recently been accepted for publication in Neural Computation. It can be downloaded from my university homepage at http://www.techfak.uni-bielefeld.de/~hwersing/ Comments and questions are highly appreciated. ---------------------------------------------- "Dynamical stability conditions for recurrent neural networks with unsaturating piecewise linear transfer functions" by Heiko Wersing, Wolf-Juergen Beyn, and Helge Ritter. Abstract: We establish two conditions which ensure the non-divergence of additive recurrent networks with unsaturating piecewise linear transfer functions, also called linear threshold or semilinear transfer functions. As was recently shown by Hahnloser (Nature 405, 2000), networks of this type can be efficiently built in silicon and exhibit the coexistence of digital selection and analogue amplification in a single circuit. To obtain this behaviour, the network must be multistable and non-divergent, and our conditions allow one to determine the regimes where this can be achieved with maximal recurrent amplification. The first condition can be applied to nonsymmetric networks and has a simple interpretation: it requires that the strength of local inhibition match the sum of the excitatory weights converging onto a neuron. The second condition is restricted to symmetric networks, but can also take into account the stabilizing effect of non-local inhibitory interactions. We demonstrate the application of the conditions with a simple example and with the orientation-selectivity model of Ben-Yishai et al. (1995).
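As an illustration only, one can simulate such an additive network with the unsaturating linear-threshold transfer function sigma(x) = max(0, x) and check a paraphrase of the first condition above. The network size, weights, and the exact inequality used here are simplified assumptions made for this sketch, not the paper's precise conditions.

```python
import numpy as np

# Hypothetical sketch: Euler-integrate dx/dt = -x + max(0, W x + h) and
# check (a paraphrase of) the first condition: local inhibition at each
# unit should at least match the summed excitatory weights converging
# onto it. See Wersing, Beyn & Ritter for the exact inequality.

def simulate(W, h, T=500, dt=0.05):
    x = np.zeros(len(h))
    for _ in range(T):
        x = x + dt * (-x + np.maximum(0.0, W @ x + h))
    return x

n = 5
rng = np.random.default_rng(1)
E = rng.uniform(0.0, 0.3, size=(n, n))   # excitatory lateral weights
np.fill_diagonal(E, 0.0)
inhib = E.sum(axis=1) + 0.1              # local inhibition >= incoming excitation
W = E - np.diag(inhib)

# paraphrased condition: self-inhibition dominates excitatory row sums
ok = np.all(np.diag(-W) >= np.maximum(W, 0.0).sum(axis=1))

x_final = simulate(W, np.ones(n))
print(ok, x_final)
```

With the inhibition margin removed (e.g. strong positive self-excitation on the diagonal), the same simulation can diverge, which is the failure mode the paper's conditions rule out.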
We show that the conditions can be used to identify, in their model, regions of maximal orientation-selective amplification and symmetry breaking. ------------------------------------------------- Best wishes -- Heiko Wersing Future Technology Research HONDA R&D EUROPE (DEUTSCHLAND) GmbH Carl-Legien-Str. 30 63073 Offenbach /Main Germany Tel.: +49-69-89011741 Fax: +49-69-89011749 e-mail: heiko.wersing at hre-ftr.f.rd.honda.co.jp From erol at starlab.net Fri Oct 20 03:46:15 2000 From: erol at starlab.net (Erol Sahin) Date: Fri, 20 Oct 2000 09:46:15 +0200 (CEST) Subject: Jobs@Starlab: Brussels and Barcelona Message-ID: Dear colleague, Starlab Brussels and Starlab Barcelona are now actively recruiting researchers in various fields from all over the world. Below I append a list of positions that may be of interest to the connectionists community. For a full list of positions--both at Brussels and Barcelona please check this site: http://www.starlab.org/jobs/ Please distribute and post as appropriate (or disregard if not interested!) This site will be updated regularly. Best regards, thanks for your time, Erol Sahin, Ph.D. Chief Scientist Starlab Research Laboratories Tel: +32-2-740 0768 Boulevard Saint-Michel 47 http://www.starlab.org Brussels 1040 BELGIUM E-mail:erol at starlab.net ------------------------------------------------------------ STARLAB IS LOOKING FOR INQUISITIVE AND ENTREPRENEURIAL MINDS TO WORK AT THE ROBOTICS LAB IN STARLAB BRUSSELS, BELGIUM To apply, please e-mail your resume and the names of three references along with a paragraph of your DREAM ROBOTICS PROJECT to: Dr Erol Sahin E-mail: erol at starlab.net Phone : +32-2-740 0768 Fax : +32-2-742 9654 Interest in interdisciplinary research activities a must. Remuneration: the best in the market. Consideration will be given to any outstanding research proposals. No part-timers, please. Starlab is a place where 100 years means nothing.
Here, you will find world-class research scientists not only doing what they do best, but also creating wealth through entrepreneurship and spin-offs of their scientific ideas and inventions. Starlab's strength lies in the cross-fertilization of multiple disciplines grouped according to Bits, Atoms, Neurons, and Genes (BANG). Book a place on this adventure as Starlab takes you through time and space, riding on the wave of an impending IPO. Senior Scientist (Robotics): The candidate is expected to help lead the robotics research at Starlab. He should have PhD-level or greater experience in robotics with an active research record. The ideal candidate has been involved in the writing and leading of robotics projects, is experienced in electromechanical building of robots and has a good grasp of ADAPTIVE METHODS such as NEURAL NETWORKS and genetic algorithms. Research Scientist (Robotics): The candidate is expected to be involved in research projects that will include electromechanical robot building and programming. He should have MS-level or greater experience in robotics, with experience in electromechanical building of robots and a proven competence in programming. Having experience in building robot simulators and using ADAPTIVE METHODS such as NEURAL NETWORKS and genetic algorithms would be a plus. Research Scientist (Robotics): In this position you will be developing hybrid robots. You will be working on the boundary between wearables research, intelligent clothing, augmented reality and robotics, and will link into BIOLOGY and NEUROSCIENCE as well. You will need knowledge of and experience in wearables research, embedded computing, sensor technology and DSP. You will also need an interest in PHYSIOLOGY and BRAIN SCIENCES.
------------------------------------------------------------ STARLAB IS LOOKING FOR INQUISITIVE AND ENTREPRENEURIAL MINDS TO WORK IN THE NEW STARLAB LABORATORY IN BARCELONA, SPAIN To apply, please send your CV and a letter of interests/research plan to Dr Giulio Ruffini E-mail: giulio at starlab.net Phone : +32-2-740 07 40 Fax : +32-2-742 96 54 Interest in interdisciplinary research activities a must. Remuneration: the best in the market. Consideration will be given to any outstanding research proposals. No part-timers, please. Senior Scientist to lead the Marine biology group: The tasks include setting up a research team. This includes ordering and setting up equipment, possibly including a small research submarine (!), and creating and leading a small (3-5-person) research group. Starlab wants to create a marine biology research center in Barcelona, focusing on extremophiles and ocean pollution studies. Applications are invited from candidates with a Ph.D. in marine biology. The applicant should have a proven record of scholarly publication, as well as several years of postdoctoral experience in the field. Senior Scientist to lead the Genes group: The tasks include creating and leading a small (3-5-person) research group. Our goal is to build a center for gene therapy ("virus domestication lab" in the case of viral therapy) in Starlab Barcelona. Applications are invited from candidates with a Ph.D. in genetics and experience in viral therapy. Robotics or AI Senior Scientist: The candidate is expected to lead the robotics and AI research at Starlab Barcelona. He should have Ph.D.-level or greater experience in robotics and/or AI, with an active research record. The ideal candidate has been involved in the writing and leading of AI/robotics projects, is experienced in electromechanical building of robots and has a good grasp of adaptive/evolutive methods. Possible focus on multi-robotics and emergence.
This research will partially be conducted remotely using the CAM-BRAIN computer at Starlab Brussels. Electronics Hardware Engineer for the Earth Observation Group: The tasks include ordering components and setting up the relevant equipment, and require the skills to develop RF hardware prototypes for GPS applications as well as to write the appropriate embedded software. Applications are invited from candidates with an M.Sc. or PhD in Electrical/Electronics Engineering with GPS experience. Senior Scientist EEG Analysis/Sleep Lab: We are seeking a project leader to build an EEG lab to study memory formation, develop novel diagnostic tools, and carry out research in sleep/lucidity. The chosen candidate will have the resources to put a team together and buy the necessary equipment. Candidates should have the necessary experience, including directing the construction work needed (Faraday cage, etc). The EEG team will have the opportunity to work closely together with outstanding signal processing specialists from the Future Earth Observation Systems group, as well as with the AI and neurology teams. Lab technician position in the Neurons Group: The applicant will be involved in neurodegeneration and neurogenesis research projects combining neuroanatomical and molecular approaches. We are looking for highly motivated and open-minded people. Preference is given to technicians with experience in morphological and/or molecular labs. Systems Manager: The tasks include ordering components and setting up the relevant equipment, and require the skills to set up and maintain an up-to-date Linux and Windows network as well as Internet connectivity. Applications are invited from candidates with strong experience. Ph.D. and experience with numerical computation highly desired. From mpessk at guppy.mpe.nus.edu.sg Fri Oct 20 04:40:35 2000 From: mpessk at guppy.mpe.nus.edu.sg (S.
Sathiya Keerthi) Date: Fri, 20 Oct 2000 16:40:35 +0800 (SGT) Subject: TR on Efficient Incremental SVM Computations Message-ID: A Useful Bound for Incremental Computations in SVM Algorithms S.S. Keerthi and C.J. Ong National University of Singapore Abstract: A simple bound is given that is useful for checking the optimality of points whose Lagrange multipliers take bound values. This bound is very inexpensive to compute and is useful in various scenarios. To download a gzipped postscript file containing the report, go to: http://guppy.mpe.nus.edu.sg/~mpessk/svm.shtml From anderson at cs.colostate.edu Fri Oct 20 13:26:52 2000 From: anderson at cs.colostate.edu (Chuck Anderson) Date: Fri, 20 Oct 2000 11:26:52 -0600 Subject: Tenure-track positions open at Colorado State University, Fort Collins, CO Message-ID: <39F0805C.B87096D1@cs.colostate.edu> Several tenure-track positions are open in the Computer Science Department at Colorado State University. We have a strong AI group in neural networks, reinforcement learning, planning, genetic algorithms, and computer vision, involving six of our 14 faculty. We have collaborative research projects with the math, engineering and neurobiology departments at CSU and with local industry. Read more about our AI program at http://www.cs.colostate.edu/aigroup.html Chuck Anderson associate professor Department of Computer Science anderson at cs.colostate.edu Colorado State University http://www.cs.colostate.edu/~anderson Fort Collins, CO 80523-1873 office: 970-491-7491, FAX: 970-491-2466 Tenure-Track Faculty Positions Colorado State University Department of Computer Science The Department of Computer Science at Colorado State University solicits applications for at least two tenure-track faculty positions, beginning Fall 2001. The appointments will preferably be made at the level of assistant professor, but appointment at a more senior level is also possible for candidates who can demonstrate a strong connection to ongoing department research.
Applicants must have a Ph.D. in computer science, computer engineering, or a related field. Applicants will be expected to teach undergraduate and graduate courses, and they must demonstrate potential for excellence in research and teaching. The Computer Science Department has 700 undergraduate majors and 80 graduate students enrolled in Master's and doctoral programs. The department currently has 17 tenure-track faculty, with strong research programs in artificial intelligence, software engineering, and parallel and distributed computation. Computer facilities are excellent, and there are ample opportunities for research collaborations with local industry. Colorado State University, with an enrollment of 22,000 students, is located in Fort Collins, Colorado, an attractive community of over 100,000 people, at the base of the Front Range of the Rocky Mountains, 65 miles north of Denver. The northern Front Range offers a wide range of outdoor recreational activities. More information about the department and its research programs can be obtained from the department's home page at http://www.cs.colostate.edu Applicants should send a curriculum vitae and letters from at least three professional references to: Faculty Search Committee, Computer Science Department, Colorado State University, Fort Collins, CO 80523. Please include a statement indicating how your background and interests match the expectations of the position(s) described above. The department's telephone number is 970-491-5862, and email inquiries should be directed to faculty-search at cs.colostate.edu. Screening of applications will begin November 1, 2000, and continue until the position is filled. Colorado State University is an EEO/AA employer. Office of Equal Opportunity: 101 Student Services. From kasigvardt at ucdavis.edu Fri Oct 13 16:19:01 2000 From: kasigvardt at ucdavis.edu (Karen A. 
Sigvardt) Date: Fri, 13 Oct 2000 13:19:01 -0700 Subject: Postdoc position at UC Davis: modeling basal ganglia in Parkinson's Message-ID: POSTDOCTORAL FELLOW University of California Davis, Center for Neuroscience and Department of Neurology. A postdoctoral fellow with strong quantitative skills is sought to perform analysis and modeling of neural activity recorded from basal ganglia and thalamus in Parkinson's Disease patients. The goal of the project is to understand the dynamics of the basal ganglia-thalamocortical network and how it relates to behavior and to the motor symptoms of PD. The project is led by a collaborative team of investigators: Drs. Karen Sigvardt of UC Davis, Charles Gray of Montana State University and Nancy Kopell of Boston University. UCD Center for Neuroscience has a strong, diverse faculty with research interests ranging from cellular to cognitive neuroscience (http://neuroscience.ucdavis.edu ). Candidates with a Ph.D. in Neuroscience, Physics, Computer Science or related fields are encouraged to apply. Salary is commensurate with experience. Send current CV, cover letter, relevant reprints and two references to: Dr. Karen Sigvardt, Center for Neuroscience, UC Davis, 1544 Newton Court, Davis CA 95616 or e-mail kasigvardt at ucdavis.edu. UC Davis is an EOE/AA employer. Thanks Karen Karen A. Sigvardt, Ph.D. 
Adjunct Professor of Neurology Center for Neuroscience University of California Davis 1544 Newton Court Davis CA 95616 Phone: (530) 757-8520 Lab: (530) 754-5022 Fax: (530) 757-8827 email: kasigvardt at ucdavis.edu From halici at metu.edu.tr Sat Oct 21 13:57:07 2000 From: halici at metu.edu.tr (Ugur HALICI) Date: Sat, 21 Oct 2000 20:57:07 +0300 Subject: 2ndCFP: Brain-Machine'2000-Notification of deadline & additional information Message-ID: <39F1D8F3.68B9F3DA@metu.edu.tr> SECOND CALL FOR PAPERS ---------------------------------------------------------- Notification of deadline & additional information ---------------------------------------------------------- BRAIN - MACHINE WORKSHOP 20-22 December 2000, Ankara, Turkey ---------------------------------------------------------- Conference Homepage: http://heaven.eee.metu.edu.tr/~vision/brainmachine.html ---------------------------------------------------------- The workshop aims to bring together researchers working in Brain Research, Vision and Machine Intelligence ---------------------------------------------------------- Detailed list of topics, key speakers/invited talks, information on paper submission, proceedings/journals, hotels & tours are available on the conference home page. ---------------------------------------------------------- Important Dates: November 1, 2000 Paper submission deadline November 10, 2000 Notification of acceptance November 20, 2000 Early registration December 20-22, 2000 Workshop ---------------------------------------------------------- Registration Fee: 200 USD, Before Nov. 20 250 USD, After Nov. 20 Students: 120 USD, Before Nov. 20 150 USD, After Nov.
20 ------------------------------------------------------------- Tours to be organised to ISTANBUL (the former capital of three successive empires - Roman, Byzantine and Ottoman) CAPPADOCIA (seven layer underground cities, fairy chimneys, churches carved out of the tough rocks) -------------------------------------------------------------- Contact Person: UGUR HALICI, Computer Vision and Artificial Neural Networks Res.Lab. Prof. of Dept. of Electrical and Electronics Eng. Middle East Technical University, 06531, Ankara, Turkey email: halici at metu.edu.tr fax: (+90) 312 210 1261 http://heaven.eee.metu.edu.tr/~halici/ http://heaven.eee.metu.edu.tr/~vision/ -------------------------------------------------------------- From pr230 at cus.cam.ac.uk Sun Oct 22 03:15:01 2000 From: pr230 at cus.cam.ac.uk (P. Roper) Date: Sun, 22 Oct 2000 08:15:01 +0100 (BST) Subject: TWO YEAR POSTDOCTORAL POSITION AT THE UNIVERSITY OF UTAH Message-ID: Applications are invited for a two-year postdoctoral position in mathematical/computational neuroscience to work with Paul Bressloff who joins the mathematical biology group at the University of Utah in Jan 2001. The successful candidate will also interact with Jenny Lund and colleagues in the newly-formed systems neuroscience group within the Medical School. RESEARCH AREA: Mathematical/computational models of visual cortex QUALIFICATIONS: Experience in computational modeling of complex systems essential. Some background in computational neuroscience desirable but not essential. START DATE: after Jan 1st 2001 Anyone interested in this position should e-mail Paul Bressloff at P.C.Bressloff at Lboro.ac.uk ---------------------------------------- Dr. 
Peter Roper Laboratory of Computational Neuroscience Babraham Institute Cambridge University CAMBS, CB2 4AT, UK email pr230 at cam.ac.uk From mozer at cs.colorado.edu Sun Oct 22 15:31:33 2000 From: mozer at cs.colorado.edu (Mike Mozer) Date: Sun, 22 Oct 2000 13:31:33 -0600 Subject: positions in machine learning at University of Colorado at Boulder Message-ID: <200010221931.e9MJVXf09998@neuron.cs.colorado.edu> University of Colorado at Boulder Department of Computer Science Tenure Track Positions The Department of Computer Science of the University of Colorado at Boulder is seeking applications for a number of tenure track faculty positions. While we expect most of the appointments to be at the Assistant Professor level, we will consider outstanding candidates at all levels. The Department is currently recruiting to fill two positions in the area of machine learning. We are particularly interested in applicants whose work expands the theoretical foundations of machine learning, and in applicants who apply theoretically-grounded techniques to solving practical, real-world problems and/or understanding the brain from a cognitive neuroscience perspective. The University of Colorado has a diverse faculty interested in issues of neural and statistical computation (http://www.cs.colorado.edu/~mozer/ns.html), and is particularly strong in the areas of speech recognition, natural language processing, computational modeling of human cognition, and human-machine interfaces. An explosion of local start-ups with research activities in machine learning enriches the university community and furnishes a wealth of collaborative opportunities. The Department has 39 faculty (including five new members who have joined this year), 191 graduate students, and 565 undergraduates. The Department has received four successive five-year NSF CER and RI awards to support its computing infrastructure and collaborative research among its faculty, most recently for the period 2000-2005.
In the recent NSF ITR competition our faculty participated in proposals funded for more than $13.5M. The faculty is also proud of its record in offering an outstanding educational experience to our students. We are seeking new colleagues who share our commitment to the ideals of the research university as a uniquely valuable institution in our society, combining the creation of new knowledge with the shaping of future leaders. Our location in Boulder offers a pleasant college town ambience, easy access to the great outdoors, and participation in a vibrant, rapidly-growing high tech industry community. A recent study by a major telecommunications startup identified Boulder County as the most attractive setting nationally to which to recruit staff. Our alumni are playing important roles in creating the next generation of technology and technology companies here and elsewhere. More information about the Department can be found at http://www.cs.colorado.edu. Review of applications will begin immediately, and continue as long as positions are open. We expect to have positions in future years as well as for appointments beginning in academic year 2001-2002, and we welcome inquiries about these future opportunities. Because we have a number of positions available, we would be glad to receive inquiries from research collaborators interested in joining our faculty together. We also welcome applications from academic couples wishing to co-locate. Applicants should send a current curriculum vitae, the names of four references, and one-page statements of research and teaching interests to Professor Clayton Lewis, Search Committee Chair, Department of Computer Science, Campus Box 430, University of Colorado, Boulder, CO 80309-0430. The University of Colorado at Boulder is committed to diversity and equality in education and employment. 
From celiasmith at uwaterloo.ca Mon Oct 23 12:28:12 2000 From: celiasmith at uwaterloo.ca (Chris Eliasmith) Date: Mon, 23 Oct 2000 11:28:12 -0500 Subject: Postdoc in Computational Neuroscience Message-ID: <01C03CE4.5D01E830.celiasmith@uwaterloo.ca> ************************************************************************ A POSTDOCTORAL POSITION in COMPUTATIONAL NEUROSCIENCE is available in Charles Anderson's Computational Neuroscience Research Group (CNRG) in the Dept. of Anatomy and Neurobiology, Washington University School of Medicine in St. Louis. This position is available immediately and for a duration of up to 3 years. A competitive salary package will be offered depending on the qualifications of the candidate. We have active research in a wide range of topics that are focused on the development of a general framework for realistic modeling of neural systems. Our approach is based on fundamental principles of signal processing, motor control theory and statistical inference. Ongoing projects in close collaboration with experimentalists include models of visual cortex (with David Van Essen and Greg DeAngelis), parietal working memory (Larry Snyder), and the vestibular ocular motor system (Dora Angelaki and Steve Highstein). We would like to expand our models of cortical processing of visual information and are open to modeling the function of the cerebellum (with Tom Thach). This research position requires strong analytic and computer modeling skills. Candidates with training in engineering or physics are especially encouraged to apply even if they have no background in neuroscience. What is required is a strong interest in learning about these fascinating computational systems (see http://stp.wustl.edu/~compneuro for more information). Washington University is a leading research institute in all areas of neuroscience. Dr. Anderson's research group is in the Dept. of Anatomy and Neurobiology, chaired by David Van Essen. 
Besides strong collaborations with the many experimental neuroscientists in the department, there are active interactions with the Neural Engineering division of the rapidly growing Biomedical Engineering Department at Washington University, as well as the Biology, Physics, and Electrical Engineering Departments. Please send a CV, a brief statement of research experience and interests, and the names and contact information of two references to: cha at shifter.wustl.edu. Applications can also be mailed to: Dr. Charles H. Anderson Dept. Anatomy and Neurobiology Campus Box 8108 Washington University School of Medicine 660 S. Euclid Ave. St. Louis, MO 63124 ************************************************************************ From rampon at tin.it Tue Oct 24 02:42:50 2000 From: rampon at tin.it (Salvatore Rampone) Date: Tue, 24 Oct 2000 08:42:50 +0200 Subject: Paper on Function approximation from noisy data Message-ID: <001101c03d85$ac39fcc0$35c9d8d4@brainstorm> From stefan.wermter at sunderland.ac.uk Tue Oct 24 09:19:58 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Tue, 24 Oct 2000 14:19:58 +0100 Subject: cognitive systems research Message-ID: <39F58C7E.B186E744@sunderland.ac.uk> The journal Cognitive Systems Research invites recent and new books to be sent for review, e.g. in cognitive science, neural networks, hybrid systems, language processing, vision, etc. We are also interested in hearing from researchers who would like to write a review, which will be published as a brief article in the journal. Also, if you have recently read a new, interesting, challenging, or controversial book within the scope of the journal, please let us know and send two copies of the book to the address below. We are particularly interested in new books and new review article writers.
More details and a list of books for review can be found on http://www.his.sunderland.ac.uk/cognitive.html If you are interested in writing an article, please let me know. best wishes, Stefan Wermter *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From ASIM.ROY at asu.edu Tue Oct 24 13:03:51 2000 From: ASIM.ROY at asu.edu (Asim Roy) Date: Tue, 24 Oct 2000 10:03:51 -0700 Subject: [repost] Summary of panel discussion at IJCNN'2000 on the question: DOES CONNECTIONISM PERMIT READING OF RULES FROM A NETWORK? Message-ID: [ Reposted to correct a formatting error. -- moderator ] A panel discussion on the question: "DOES CONNECTIONISM PERMIT READING OF RULES FROM A NETWORK?" took place this July at IJCNN'2000 (International Joint Conference on Neural Networks) in Como, Italy. The following persons were on the panel: 1) DAN LEVINE; 2) LEE GILES; 3) NOEL SHARKEY; 4) ALESSANDRO SPERDUTI; 5) RON SUN; 6) JOHN TAYLOR; 7) STEFAN WERMTER; 8) PAUL WERBOS; 9) ASIM ROY. This was the fourth panel discussion at these IJCNN conferences on the fundamental ideas of connectionism.
The abstract below summarizes the issues/questions that were addressed by this panel. The fundamental contention was that the basic connectionist framework as outlined by Rumelhart et al. in their many books and papers has no mechanism for rule extraction (reading of weights, etc. from a network) or rule insertion (constructing and embedding rules into a neural network) as is required by many rule-learning mechanisms (both symbolic and fuzzy ones). This is not a dispute about whether humans can and do indeed learn rules from examples or whether such rules can indeed be embedded in a neural network. It is about whether the connectionist framework lets one do that; that is, whether it allows one to create the algorithms required for doing such things as rule insertion and rule extraction. As pointed out in the abstract below, the cell-based distributed control mechanism of connectionism empowers only individual cells (neurons) with the capability to modify/access the connection strengths and other parameters of a network; no other outside agent can do that, as is required by rule insertion and extraction techniques. These rule-learning techniques are, in fact, moving away from the cell-based distributed control notions of connectionism and using broader control theoretic notions where it is assumed that there are parts of the brain that control other parts. In fact, it can be shown that every connectionist algorithm, from back-propagation to ART to SOM, goes beyond the original framework of cell-based distributed control and uses the broader control theoretic notion that there are parts of the brain that control other parts. There is, of course, nothing wrong with this broader control theoretic notion because there is enough neurobiological evidence about neurotransmitters and neuromodulators to support it. This debate once more points out the limitations of connectionism. Ron Sun notes that "Clearly the death knell of strong connectionism has been sounded."
With regard to rule extractors in rule-learning schemes, John Taylor notes that "There do not seem to be similar rule extractors of the connection strengths there." On the same issue, Paul Werbos says: "But I would not call it a "neural network" method exactly (even though neural net learning is used) because I do not believe that real organic brains contain that kind of hardwired readout device." Noel Sharkey says: "Currently there seems little reason (or evidence) to even think about the idea of extracting rules from our neural synapses - otherwise why can we not extract our bicycle riding rules from our brain" and "However, the relationship between symbolic rules and how they emerge from connectionist nets or even whether or not they really exist has never been resolved in connectionism." For those interested, summaries of prior debates on the basic ideas of connectionism are available at the CompNeuro website at Caltech. Here is a partial list of the debate summaries available there. www.bbb.caltech.edu/compneuro/cneuro99/0079.html - Some more questions in the search for sources of control in the brain www.bbb.caltech.edu/compneuro/cneuro98/0088.html - BRAINS INTERNAL MECHANISMS - THE NEED FOR A NEW PARADIGM www.bbb.caltech.edu/compneuro/cneuro97/0069.html - COULD THERE BE REAL-TIME, INSTANTANEOUS LEARNING IN THE BRAIN? www.bbb.caltech.edu/compneuro/cneuro97/0043.html - CONNECTIONIST LEARNING: IS IT TIME TO RECONSIDER THE FOUNDATIONS? www.bbb.caltech.edu/compneuro/cneuro97/0040.html - DOES PLASTICITY IMPLY LOCAL LEARNING? AND OTHER QUESTIONS www.bbb.caltech.edu/compneuro/cneuro96/0047.html - Connectionist Learning - Some New Ideas/Questions Some of the summaries are also available at the CONNEC_L website. Asim Roy Arizona State University -------------------------------------------------------------------------- DOES CONNECTIONISM PERMIT READING OF RULES FROM A NETWORK?
Many scientists believe that the symbolic (crisp) and fuzzy (imprecise and vague) rules learned, used and expressed by humans are embedded in the networks of neurons in the brain - that these rules exist in the connection weights, the node functions and in the structure of the network. It is also believed that when humans verbalize these rules, they simply "read" the rules from the corresponding neural networks in their brains. Thus there is a growing body of work that shows that both fuzzy and symbolic rule systems can be implemented using neural networks. This body of work also shows that these fuzzy and symbolic rules can be retrieved from these networks, once they have been learned, by procedures that generally fall under the category of rule extraction. But the idea of rule extraction from a neural network involves certain procedures - specifically the reading of parameters from a network - that are not allowed by the connectionist framework that these neural networks are based on. Such rule extraction procedures imply a greater freedom and latitude about the internal mechanisms of the brain than is permitted by connectionism, as explained below. In general, the idea of reading (extracting) rules from a neural network has a fundamental conflict with the ideas of connectionism. This is because the connectionist networks by "themselves" are inherently incapable of producing, as output, the "rules" that are embedded in the network, since the "rules" are not supposed to be the outputs of connectionist networks. And in connectionism, there is no provision for an external source (a neuron or a network of neurons), in a sense a third party, to read the rules embedded in a particular connectionist network. Some more clarification perhaps is needed on this point.
The connectionist framework, in the use mode, has provision only for providing certain inputs (real, binary) to a network through its input nodes and obtaining certain outputs (real, binary) from the network through its output nodes. That is, in fact, the only "mode of operation" of a connectionist network. In other words, that is all one can get from a connectionist network in terms of output - nothing else is allowed in the connectionist framework. So no symbolic or fuzzy rules can be "output" or "read" by a connectionist network. The connectionist network, in a sense, is a "closed entity" in the use mode; no other type of operation, other than the regular input-output operation, can be performed by or with the network. There is no provision for any "extra or outside procedures" in the connectionist framework to examine and interpret a network, to look into the rules it's using or the internal representation it has learned or created. So, for example, the connectionist framework has no provision for "reading" a weight from a network or for finding out the kind of rule/constraint learned by a node. Any such "outside procedure," existing outside of the network where the rules are, would go against the basic connectionist philosophy. Connectionism has never stated that the networks can be "examined and accessed in ways" other than the input-output mode. So there is nothing in the connectionist framework that lets one develop procedures to read and extract rules from a network. So a rule extraction procedure violates in a major way the principles of connectionism by invoking a means of extracting the weights and rules and other information from a network. There is no provision/mechanism in the connectionist framework for doing that. So the whole notion of rules existing in a network, that can be accessed and verbalized as necessary, is contradictory to the connectionist philosophy.
There is absolutely no provision for "accessing networks/rules" in the connectionist framework. Connectionism forgot about the need to extract rules.
--------------------------------------------------------------------------
LEE GILES

Early Connectionism/NNs
* McCulloch, Pitts (MC) 40's: models that were basically circuit design, suggestive but very primitive.
* Kleene, 50's: MC networks as regular expressions and grammars. Early AI.
* Minsky, 60's: MC networks as logic, automata, sequential machines, design rules. More AI. (not the perceptron work!) Foundations of early high level VLSI design.
Early connectionism/NNs always had rules and logic as part of their philosophy and implementation.

Late 20th Century Connectionism/NNs
* Rules - vital part of AI.
* Empirical & theoretical work on rules extractable, encodeable and trainable in many if not all connectionist systems.
* Most recent work in data mining
* Future work in SVMs
Rules are more important but not essential in some applications; natural language processing, expert systems and speech processing systems used in many applications.

21st Century Connectionism/NNs
* Knowledge & information discovery and extraction
* Knowledge prediction
Rules and laws are important
* New connectionism/NN - challenges
* Applications will continue to be important
* Cheap and plentiful data will be everywhere
  * Text
  * Nontext - sensor, audio, video, etc.
* Pervasive computing and information access
* New connectionism/NN future?
* Integration philosophically, theoretically and empirically with other areas of AI, computer and engineering science continues (rules will become more important)
* Biology and chips will play a new role

Foundations of Connectionism/NNs
* Rules were always a theoretical and philosophical part of the connectionist/nn models.
* If/then rules, automata, graphs, logic
* Importance?
* Comfort factor
* New knowledge
* Autonomous systems - communication
--------------------------------------------------------------------------
DAN LEVINE

There are really two basic types of problems that involve encoding rules in neural networks. One type involves inferring rules from a series of interactions with the environment that entail some regularity. An example would be a cognitive task (such as the Wisconsin Card Sorting Test used by clinical neuropsychologists) in which a subject is positively reinforced for certain types of actions and needs to discern the general rule guiding the actions which will be rewarded. The other type of problem involves successfully performing cognitive tasks that are guided by an externally given rule. One example is the Rapid Information Processing task, in which the subject is instructed to press a key when he or she sees three odd or three even digits in a row. In other words, the neural network needs to translate the explicit verbal rule into connection strengths that will guide motor performance in a way that accords with that rule. The first type of problem has already been simulated in a range of neural networks including some by my own group and by John Taylor's group. These networks typically require modulatory transmitters that allow reward signals to bias selective attention or to selectively strengthen or weaken existing rule representations. Modulatory transmitters are also involved in models in progress of the second type of problem. In this case, activation of a rule selectively biases appropriate representations of and connections among working memory representations of objects, object categories, and motor actions relevant to following the particular rule. The framing of the question, "Does connectionism allow ...," suggests implicitly that "connectionism" refers to a specified class of neural architectures.
However, connectionist and neural network models have been in the last several years increasingly less restricted in their structures. Modelers who began working within specific "schools" such as adaptive resonance (like my own group and Stephen Grossberg's) or back propagation (like Jonathan Cohen's group) have developed models that are guided as much, if not more, by known brain physiology and anatomy as by the original "modeling school." Hence the question should be rephrased "What form of connectionism allows ... ." -------------------------------------------------------------------------- NOEL SHARKEY One of the main themes of 1980s connectionism was the unconscious application of rules. This comes from Cognitive Psychology (where many of the major players started). There are many well-charted behaviors, such as reading or riding a bicycle, where participants can perform extremely well and yet cannot explicitly state the rules that they are using. One of the main goals of Cognitive Psychologists was to find tasks that would enable them to probe unconscious skills. Such skills were labeled as "automatic" in contrast to the slower, more intensive "controlled" or "strategic" processes. To the limited extent that psychologists talk beyond their data, strategic processes were vaguely considered to have something to do with awareness or conscious processes. This was a hot potato to be avoided. But there is no single coherent philosophy covering all of connectionism. Researchers in the 1980s, including myself, began to experiment with methods for extracting the rules from connectionist networks. This was partly motivated by psychological considerations but mainly to advance the field in computing and engineering terms. In my lab, the interest in rule extraction was to help to specify the behavior of physical systems, such as engines, which are notoriously difficult to specify in other ways.
For example, if a neural network could learn to perform a difficult-to-specify task, then, if rules could be extracted from that net, a crude specification could be begun. However, the relationship between symbolic rules and how they emerge from connectionist nets or even whether or not they really exist has never been resolved in connectionism. It seems clear that we can propositionalize our rules and pass them on to other people. Simple rules such as, "eating is not allowed in this room" appear to be learned instantly from reading them in linguistic form, yet we have not seen a universally accepted connectionist explanation for this type of phenomenon and we certainly do not extract these rules from our nervous system after the fact. Now imagine we do extract rules directly from our brains: how would this be done? If we follow the lessons of rule extraction techniques for neural networks, there are two distinct methods, which may be called internal and external. Internal is where the process of extraction operates on network parameters such as weight values or hidden unit activations. External is where the mechanism uses the input and output relation to calculate the rules - it is assumed that the neural network will have filtered out most of the noise. Currently there seems little reason (or evidence) to even think about the idea of extracting rules from our neural synapses - otherwise why can we not extract our bicycle riding rules from our brain? It seems that the only real option would be to "run the net" and calculate the rules from the input-output relations. Nonetheless, this is not a well-informed answer, nor is there likely to be one at present. This is an issue that needs considerably more research. -------------------------------------------------------------------------- ALESSANDRO SPERDUTI Before arguing about the possibility of extracting rules from a neural network, the concept of a "rule" itself should be clarified.
In fact, when talking about rules in this context, everybody has in mind the concept of "symbolic rule", i.e., a rule that involves discrete entities. Moreover, the semantics of these entities is defined by a subjective "interpretation" function. However, the concept of rule is far more general and it can in general involve any kind of entity or variable, subject to the constraint that a finite description (i.e., representation) of the entity exists and can be used. Thus, a rule involving continuous variables and/or entities is legitimate as well, and in many cases useful. Consequently, the question about the capability of a connectionist system to capture "symbolic rules" in such a form as to permit easy reading is conditional on the nature of the learned function: if it is discrete, the posed question is meaningful. Assuming a discrete nature for the learned function, however, there is no guarantee that a trained neural network will encode the function in a way that allows easy reading of "symbolic rules", whose representation, by the way, is in principle arbitrary with respect to the representational primitives (neurons) of the neural network. -------------------------------------------------------------------------- RON SUN Many early connectionist models have some significant shortcomings. For example, the limitations due to the regularity of their structures led to, e.g., difficulty in representing and interpreting symbolic structures (despite some limited successes that we have seen). Other limitations are due to the learning algorithms used by such models, which led to, e.g., lengthy training (requiring many repeated trials) and the requirement that complete I/O mappings be known a priori. There are also limitations in terms of biological relevance. For example, these models may bear only remote resemblance to biological processes; they are far less complex than biological NNs, and so on.
In coping with these difficulties, two forms of connectionism emerged: Strong connectionism adheres strictly to the precepts of connectionism, which may be unnecessarily restrictive and incur huge cost for some symbolic processing. On the other hand, weak connectionism (or hybrid connectionism) encourages the incorporation of both symbolic and subsymbolic processes: reaping the benefit of connectionism while avoiding its shortcomings. There have been many theoretical and practical arguments for hybrid connectionism; see e.g. Sun (1994). In light of this background, how do we answer the question of whether there can be ``rule reading" in connectionist models? Here is my three-fold answer: (1) Psychologically speaking, the answer is yes. For example, Smith, Langston and Nisbett (1992), Hadley (1990), and Sun (1995) presented strong cases for the existence of EXPLICIT rules in psychological processes, based on psychological experimental data, theoretical arguments, thought experiments, and cognitive modeling. If connectionist models are to become general cognitive models, they should be able to handle the use and the learning of such explicit rules too. (2) Methodologically speaking, the answer is also yes. Connectionism is merely a methodology, and not an exclusive one to be used to the exclusion of other methodologies. Considering our lack of sufficient neurobiological understanding at present, a dogmatic or strict view on ``neural plausibility" is not warranted. (3) Computationally speaking, the answer is again yes. By now, we know that we can implement ``rule reading" in many ways computationally, e.g., (a) in symbolic forms (which leads to hybrid connectionism), or (b) in connectionist forms (which leads to connectionist implementationalism). Some such implementations may have as good neurobiological plausibility as any other connectionist models.
The key point is to remove the strait-jacket of strong connectionism: we should advocate (1) methodological connectionism, treating it as one possible approach, not to the exclusion of others, and (2) weak connectionism (hybrid connectionism), encouraging the incorporation of non-NN representations and processes. Clearly, the death knell of strong connectionism has been sounded. It's time for a more open-minded framework in which we conduct our research. My own group has been conducting research in this way for more than a decade. For the work by my group along these lines, see http://www.cecs.missouri.edu/~rsun -------------------------------------------------------------------------- JOHN TAYLOR Getting a Connectionist Network to Explain its Rules. JG Taylor, Dept of Mathematics, King's College, Strand, London WC2R2LS, UK. email: john.g.taylor at kcl.ac.uk Accepting that rules can be extracted from trained neural networks by a range of techniques, I first addressed the problem of how this might occur in the brain. There do not seem to be similar rule extractors of the connection strengths there. In the brain are two extremes: implicit and explicit rules. Implicit skills, which implement rules in motor responses, are not based on an explicit knowledge of the rules implemented by the neural networks of the motor cortex. It is in explicit rules, as supported by language and the inductive/deductive process, that rules are created by human experience. Turning to language, I described what is presently known from brain imaging about the coding of semantics and syntax in sites in the brain. These both make heavy use of the frontal recurrent cortico-thalamo-NRT circuits, and this can be conjectured to be the architecture used to build phrase structure analysers (through suitable recurrence), guided by 'virtual actions'. These are the basis for syntactic rules and also rules for causal inference, as seen in what is called 'predictive coding' in frontal lobes in monkeys.
Thus rule development is undoubtedly supported by such architectures and styles of processing. There is no reason why it cannot ultimately be implemented in a connectionist framework. Such a methodology would enable a neural system to learn to talk about, and develop, its own explicit rules (although never the implicit ones), and hence solve part of the problem raised by Asim Roy. Implicit rules can be determined by the rule-extraction methods I noted at the beginning. -------------------------------------------------------------------------- PAUL WERBOS Asim Roy has asked us to address many very different, though related, issues. A condensed response: (1) A completely seamless interface between rule-based "white box" descriptions and neural net learning techniques already exists. I have a patent on "elastic fuzzy logic" (see Gupta and Sinha eds); Fukuda and Yaeger have effectively applied essentially the same method. But I would not call it a "neural network" method exactly (even though neural net learning is used) because I do not believe that real organic brains contain that kind of hardwired readout device. (2) Where, in fact, DOES symbolic reasoning arise in biology? Some of my views are summarized in the book "The Evolution of Human Intelligence" (see www.futurefoundation.org). Curiously, there is a connection to a previous panel Asim organized, addressing memory-based learning. In the most primitive mammal brains, I theorized back in 1977 that there is an interplay between two levels of learning: (1) a slow-learning but powerful system which generalizes from current experience AND from memory; (2) a fast-learning but poorly-generalizing heteroassociative memory system. (e.g. See my chapter in Roychowdhury et al, Theoretical Advances...). At IJCNN2000, Mike Denham described the "what when where" system of the brain.
I theorize that some (or all) primates extended the heteroassociative memory system to include "who what when where," using mirror neurons to provide an encoding of the experience of other primates. In other words, even monkeys probably have the power to generalize from the (directly observed) experience of OTHER MONKEYS, which they can reconstruct without any higher reasoning faculties. I theorize that human intelligence is basically an extension of this underlying capability, based on a biological system for reconstructing the experience of others, communicated first by dance (as in the Bushman dance), and later by "word movies." Symbolic reasoning a la Aristotle and Plato, and propositional language a la English, are not really biologically based as such; they are learned through modern culture, rooted in the biology which supports dance and word movies. If there can be such a thing as truly biologically rooted symbolic/semiotic intelligence, we aren't there yet; modern humanity is only a kind of missing link, a halfway house between other primates and that next level. (For more detail, see the last chapter of my book "The Roots of Backpropagation," Wiley 1994, which also includes the first published work on true backpropagation, and the chapter in Kunio Yasue et al., eds., ...Consciousness..., forthcoming from John Benjamins.)

--------------------------------------------------------------------------
STEFAN WERMTER

Linking Neuroscience, Connectionism and Symbolic Processing

Does connectionism permit the reading of rules from a network? There are at least two main answers to this question. Researchers from Knowledge Engineering and Representation would argue that it has been done successfully, that it is useful if it helps us to understand the networks, and they might not even care whether the reading is part of connectionism or of external symbolic processes. Connectionist representations can be re-represented as symbolic knowledge at higher abstraction levels.
Symbolic extraction may not be part of connectionism in the strict sense, but symbolic knowledge can emerge from connectionist networks. This may lead to a better understanding of the networks and to the possibility of combining connectionist knowledge with symbolic knowledge sources. Researchers from Cognitive Science and Neuroscience, on the other hand, would argue that real neurons in the brain have no symbolic reading mechanism, that symbolic processing emerges from the dynamics of spreading activation in cortical cell assemblies, and that rule-like behavior may emerge from neural elements. It would be useful in the future to explore constraints and principles from cognitive neuroscience for building more plausible neural network architectures, since there is a great deal of new evidence from fMRI, EEG, and MEG experiments. Furthermore, new computational models of spiking neural networks, pulsed neural networks, and cell assemblies have been designed which promise to link neuroscience with connectionist and even symbolic processing. We are leading the exploration of such efforts in the EmerNet project, www.his.sunderland.ac.uk/emernet/. Computational models can benefit from emerging vertical hybridization and abstraction:

1. The symbolic abstraction level is useful for abstract reasoning but lacks preferences.
2. The connectionist level has preferences but still lacks neuroscientific reality.
3. The neuroscience level is biologically plausible, but its architecture and dynamic processing are computationally extremely complex.

Therefore we argue for an integration of all three levels for building neural and intelligent systems in the future.
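The symbolic extraction discussed in this panel can be made concrete with a toy "pedagogical" approach: treat a trained network purely as a black box and induce a rule from its input/output behaviour. The sketch below is hypothetical and not drawn from any panelist's system; the perceptron, the 2-of-3 majority target concept, and the M-of-N rule search are all illustrative assumptions.

```python
# Hypothetical illustration (not from any panelist's system): "pedagogical"
# rule extraction treats a trained network as a black box and induces a
# symbolic rule from its input/output behaviour.
import itertools

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    """Classic perceptron learning rule on binary inputs."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Target concept: fire iff at least 2 of 3 inputs are active
# (linearly separable, so the perceptron learns it exactly).
inputs = list(itertools.product([0, 1], repeat=3))
labels = [1 if sum(x) >= 2 else 0 for x in inputs]
w, b = train_perceptron(inputs, labels)
net = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Extraction step: query the trained net over the whole input space and
# report an M-of-N rule that reproduces its behaviour exactly, if one exists.
for m in range(4):
    if all(net(x) == (1 if sum(x) >= m else 0) for x in inputs):
        print(f"extracted rule: output 1 iff at least {m} of 3 inputs are on")
        break
```

On problems where no simple rule fits exactly, pedagogical methods typically fall back to inducing a decision tree or rule set from the network's answers; decompositional methods instead read rules off the weights directly.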
**************************************************************************
BIOSKETCHES
--------------------------------------------------------------------------
DAN LEVINE

Web site: www.uta.edu/psychology/faculty/levine

--------------------------------------------------------------------------
LEE GILES

http://www.neci.nj.nec.com/homepages/giles/html/bio.html

--------------------------------------------------------------------------
NOEL SHARKEY

Noel Sharkey is an interdisciplinary researcher. Currently a full Professor in the Department of Computer Science at the University of Sheffield, he holds a Doctorate in Experimental Psychology, is a Fellow of the British Computer Society, a Fellow of the Institution of Electrical Engineers, and a member of the British Experimental Psychology Society. He has worked as a research associate in Computer Science at Yale University, USA, with the AI and Cognitive Science groups, and as a senior research associate in psychology at Stanford University, USA, where he also twice served as a visiting assistant professor. His other posts have included a "new blood" lectureship (the UK equivalent of an assistant professorship) in Language and Linguistics at Essex University, UK, and a Readership in Computer Science at Exeter. His editorial work includes Editor-in-Chief of the journal Connection Science and membership of the editorial boards of Robotics and Autonomous Systems and AI Review. He was Chairman of the IEE professional group A4 (AI) and founding chairman of IEE professional group A9 (Evolutionary and Neural Computing). He has edited special issues on modern developments in autonomous robotics for the journals Robotics and Autonomous Systems, Connection Science, and Autonomous Robots. Noel's intellectual pursuits are in the area of biologically inspired adaptive robotics. In recent years he has been involved with the public understanding of science, engineering, technology and the arts.
He makes regular appearances on TV as a judge and commentator at robot competitions, and is director of the Creative Robotics Unit at Magna (CRUM), with projects on flying swarms of robots and on the evolution of cooperation in collective robots.

--------------------------------------------------------------------------
ALESSANDRO SPERDUTI

Alessandro Sperduti received his education from the University of Pisa, Italy ("laurea" and Doctoral degrees in 1988 and 1993, respectively, both in Computer Science). In 1993 he spent a period at the International Computer Science Institute, Berkeley, supported by a postdoctoral fellowship. In 1994 he moved back to the Computer Science Department of the University of Pisa, where he was Assistant Professor and is presently Associate Professor. His research interests include pattern recognition, image processing, neural networks, and hybrid systems. In the field of hybrid systems his work has focused on the integration of symbolic and connectionist systems. He has contributed to the organization of several workshops on this subject and has also served on the program committees of conferences on neural networks. Alessandro Sperduti is the author or co-author of around 70 refereed papers, mainly in the areas of neural networks, fuzzy systems, pattern recognition, and image processing. Moreover, he has given several tutorials at international schools and conferences, such as IJCAI '97 and IJCAI '99. He acted as Guest Co-Editor of the IEEE Transactions on Knowledge and Data Engineering for a special issue on Connectionist Models for Learning in Structured Domains, and of the journal Cognitive Systems Research for a special issue on Integration of Symbolic and Connectionist Information Processing Systems.

--------------------------------------------------------------------------
RON SUN

Ron Sun is an associate professor of computer engineering and computer science at the University of Missouri-Columbia.
He received his Ph.D. in 1991 from Brandeis University. Dr. Sun's research interests center around the study of intelligence and cognition, especially in the areas of hybrid neural network models, machine learning, and connectionist knowledge representation and reasoning. He is the author of over 100 papers, and has written, edited or contributed to 15 books, including authoring "Integrating Rules and Connectionism for Robust Commonsense Reasoning" and co-editing "Computational Architectures Integrating Neural and Symbolic Processes". For his paper on models of human reasoning, he received the 1991 David Marr Award from the Cognitive Science Society. He organized and chaired the Workshop on Integrating Neural and Symbolic Processes (1992) and the Workshop on Connectionist-Symbolic Integration (1995), and co-chaired the Workshop on Cognitive Modeling (1996) and the Workshop on Hybrid Neural Symbolic Systems (1998). He has also served on the program committees of the National Conference on Artificial Intelligence (AAAI-93, AAAI-97, AAAI-99), the International Joint Conference on Neural Networks (IJCNN-99 and IJCNN-2000), the International Two-Stream Conference on Expert Systems and Neural Networks, and other conferences, and has been an invited/plenary speaker at some of them. Dr. Sun is the editor-in-chief of Cognitive Systems Research (Elsevier). He also serves on the editorial boards of Connection Science, Applied Intelligence, and Neural Computing Surveys. He was a guest editor of a special issue of the journal Connection Science and a special issue of the IEEE Transactions on Neural Networks, both on hybrid intelligent models. He is a senior member of the IEEE.

--------------------------------------------------------------------------
JOHN TAYLOR

Trained as a theoretical physicist at the Universities of London and Cambridge. Has held positions in universities in the UK, USA, and Europe in physics and mathematics.
Created the Centre for Neural Networks at King's College, London, in 1990, and is still its Director. Appointed Professor of Mathematics at King's College London in 1972, and became Emeritus Professor of Mathematics of London University in 1996. Was Guest Scientist at the Research Centre in Juelich, Germany, 1996-98, working on brain imaging and data analysis. Has been a consultant in neural networks to several companies. Is presently Director of Research on Global Bond products and Tactical Asset Allocation for a financial investment company involved in time series prediction, and European Editor-in-Chief of the journal Neural Networks. He was President of the International Neural Network Society (1995) and of the European Neural Network Society (1993/4). He is also editor of the series Perspectives in Neural Computing. Has been on the Advisory Board of the Brain Sciences Institute, RIKEN, Tokyo, since 1997. Has published over 500 scientific papers (in theoretical physics, astronomy, particle physics, pure mathematics, neural networks, higher cognitive processes, brain imaging, and consciousness), authored 12 books, and edited 13 others, including the titles When the Clock Struck Zero (Picador Press, 1994), Artificial Neural Networks (ed., North-Holland, 1992), The Promise of Neural Networks (Springer, 1993), Mathematical Approaches to Neural Networks (ed., Elsevier, 1994), Neural Networks (ed., A Waller, 1995) and The Race for Consciousness (MIT Press, 1999). Started research in neural networks in 1969. Present research interests are: financial and industrial applications; dynamics of learning processes and multi-state synapses; stochastic neural chips and their applications (the pRAM chip); brain imaging and its relation to neural networks; and neural modelling of higher cognitive brain processes, including consciousness.
Has funded research projects from the EC (on building a hybrid symbolic/subsymbolic processor), from British Telecom (on Intelligent Agents), and from EPSRC (on building a neural network language system to learn syntax and semantics).

--------------------------------------------------------------------------
STEFAN WERMTER

Stefan Wermter is Full Professor of Computer Science and Research Chair in Intelligent Systems at the University of Sunderland, UK. He is an Associate Editor of the journal Connection Science and serves on the editorial boards of the journals Cognitive Systems Research and Neural Computing Surveys. He has written or edited three books as well as more than 70 articles. He is also Coordinator of the international EmerNet network for neural architectures based on neuroscience, and head of the intelligent systems group, http://www.his.sunderland.ac.uk/. He holds a Diplom from Dortmund University, an MSc from the University of Massachusetts, and a PhD and Habilitation from Hamburg University. Stefan Wermter's research interests are in neural networks, hybrid systems, cognitive neuroscience, natural language processing, artificial intelligence and bioinformatics. The motivation for this research is twofold: How is it possible to bridge the large gap between real neural networks in the brain and high-level cognitive performance? And how is it possible to build more effective systems which integrate neural and symbolic technologies in hybrid systems? Based on this motivation, Wermter has directed and worked on several projects, e.g. on hybrid neural/symbolic systems for text processing and speech/language integration. He also has research interests in knowledge extraction from neural networks, interactive neural network agents, cognitive neuroscience, fuzzy systems, and the integration of speech/language/image processing.
-------------------------------------------------------------------------- From stiber at u.washington.edu Thu Oct 26 14:49:50 2000 From: stiber at u.washington.edu (Prof. Michael Stiber) Date: Thu, 26 Oct 2000 11:49:50 -0700 (PDT) Subject: Faculty positions at the University of Washington, Bothell Message-ID: <14840.31950.401515.882178@kasei.bothell.washington.edu> UNIVERSITY OF WASHINGTON, BOTHELL Computing and Software Systems http://www.bothell.washington.edu/CSS Assistant, Associate & Full Professor Positions Innovative and growing computer science program has multiple openings for tenure-track faculty. The Computing & Software Systems (CSS) program at the University of Washington, Bothell (UWB) offers opportunities to: * Establish new research programs, participate in the on-going research efforts, and/or collaborate with local high-tech companies; * Teach state-of-the-art, interdisciplinary computing courses; * Expand program options for undergraduates (e.g., embedded systems, graphics design); and * Participate in designing a new Master's degree program in CSS. Well-qualified candidates in all related areas, including but not limited to databases, networking, distributed systems, and web technology, are encouraged to apply. The CSS Program, in its fifth year of existence, emphasizes and rewards interdisciplinary scholarship and a balance between teaching, research and service. Current research interests include: biocomputing, embedded systems, computer graphics, digital multimedia in distributed environments, HCI, and software engineering. Successful candidates will have the options of participating in these areas and/or establishing new research programs. The undergraduate degree program stresses development of competencies associated with life-long learning, critical thinking, problem solving, communication and management -- as well as contemporary programming and technical skills. 
Situated in the midst of the Northwest's technology corridor, the fastest growing information technology region in the country, UWB offers opportunities for collaborative work with industry. The University of Washington, Bothell is part of the three-campus University of Washington -- one of the premier institutions of higher education in the U.S.

QUALIFICATIONS AND DUTIES

Candidates will have a doctorate (required prior to date of appointment) in a relevant field. Successful candidates will be expected to teach a variety of courses, conduct scholarly research, and contribute to program and institution building in this innovative, interdisciplinary environment. Faculty are also expected to advise students during their internship with industry/community partners. A determination of rank and tenure at the assistant, associate or full professor level will be commensurate with the qualifications of the individual.

APPLICATION INFORMATION

Please send a cover letter, statement of teaching philosophy, statement of research interests, and vita. Also, please arrange for three letters of reference to be sent to: Janet McDaniel, Coordinator, CSS Search Committee, University of Washington, Bothell, Campus mail: Box 358534, 18115 Campus Way NE, Bothell, WA 98011-8246. The University of Washington is building a culturally diverse faculty and strongly encourages applications from female and minority candidates. Review of applications will begin November 15, 2000, and positions will remain open until they are filled. The University of Washington is an Equal Opportunity/Affirmative Action Employer.

From nello at dcs.rhbnc.ac.uk Fri Oct 27 08:31:43 2000 From: nello at dcs.rhbnc.ac.uk (Nello Cristianini) Date: Fri, 27 Oct 2000 13:31:43 +0100 (BST) Subject: Support Vector Book: Reprint Now Available Message-ID:

The second reprint of the Support Vector Book is now finally available via Cambridge University Press, Amazon (US, UK, DE), and other online bookshops.
Apologies for the delay in reprinting; we did not anticipate such high demand. Details at: http://www.support-vector.net/

AN INTRODUCTION TO SUPPORT VECTOR MACHINES (and other kernel-based learning methods) N. Cristianini and J. Shawe-Taylor Cambridge University Press 2000 ISBN: 0 521 78019 5

From adr at adrlab.ahc.umn.edu Fri Oct 27 12:29:37 2000 From: adr at adrlab.ahc.umn.edu (A. David Redish) Date: Fri, 27 Oct 2000 11:29:37 -0500 Subject: Postdoctoral position available - Please post Message-ID: <200010271629.LAA03243@adrlab.ahc.umn.edu>

Please post. adr

POSTDOCTORAL POSITIONS AVAILABLE

Postdoctoral positions are immediately available in my laboratory in the Department of Neuroscience at the University of Minnesota. Research interests include behavioral neuroscience, with specific interests in spatial cognition in rats. My laboratory does both experimental and computational work, and postdocs would be expected to participate in both aspects. Experimentally, we record from ensembles of neurons in awake, behaving animals. Computationally, we model at the systems level using models that are tightly constrained by experimental data. Current projects include understanding aspects of hippocampal activity, of the head direction system, and of the basal ganglia. Candidates should have recently completed (or be about to complete) their PhD or MD and should have some experience in either computational or experimental neuroscience. Pay will be commensurate with NIH standards. Those interested should contact me via email at the following address: redish at ahc.umn.edu The University of Minnesota is an equal-opportunity employer. ----------------------------------------------------- A.
David Redish redish at ahc.umn.edu Assistant Professor http://www.cbc.umn.edu/~redish Department of Neuroscience, University of Minnesota 6-145 Jackson Hall 321 Church St SE Minneapolis MN 55455 -----------------------------------------------------

From Emmanuel.Dauce at cert.fr Fri Oct 27 12:44:52 2000 From: Emmanuel.Dauce at cert.fr (Emmanuel Dauce) Date: Fri, 27 Oct 2000 18:44:52 +0200 (MET DST) Subject: DYNN'2000 Program Message-ID: <200010271644.SAA09955@bigorre.cert.fr>

************************************************************ INTERNATIONAL WORKSHOP ON "DYNAMICAL NEURAL NETWORKS AND APPLICATIONS" Bielefeld, November 20-24, 2000 http://www.cert.fr/anglais/deri/dauce/DYNN2000 ************************************************************

Dear colleagues,

The organizing committee of DYNN'2000 is pleased to announce the program of the international workshop to be held in Bielefeld, Germany, from Monday, Nov. 20 to Friday, Nov. 24, 2000. This multidisciplinary workshop is devoted to the study of natural and artificial neural networks viewed as dynamical systems evolving under environmental pressure. For further information, visit our website. After the previous editions in Toulouse (1996) and Stockholm (1998), DYNN'2000 will take place at the ZIF at Bielefeld University within the framework of the Forschungsgruppe "The Sciences of Complexity: From Mathematics to Technology to a Sustainable World." Students are strongly encouraged to attend DYNN'2000. The registration fees are especially low for doctoral students, and some university rooms are reserved for their lodging during the workshop. Details on how students can apply for these rooms are available on the DYNN'2000 web site. We apologize, in advance, for any redundant messages that you may receive. Thank you for your time in reading this message. If you have any further questions, you may contact me at: samuelid at supaero.fr
Manuel Samuelides
DYNN'2000 Workshop Chair

------------------------------------------------------------------------
Program of DYNN'2000
INTERNATIONAL WORKSHOP ON "DYNAMICAL NEURAL NETWORKS AND APPLICATIONS"
Bielefeld, November 20-24, 2000
------------------------------------------------------------------------

Monday November 20, 2000

16:00-16:30 Registration and welcome of registered participants

16:30-19:00 Opening Session
Chair: Manuel Samuelides, ENSAE/UPS Toulouse, France
----------------------------------------------------
16:30 Opening allocution for the millennium edition of the DYNN workshop
      Philippe Blanchard, ZIF, Bielefeld, Germany
17:00 Information Theoretic Approach to Neural Coding
      Jean-Pierre Nadal, Ecole Normale Supérieure, Paris, France
18:00 Modeling the dynamical construction of meaning in language
      Bernard Victorri, Ecole Normale Supérieure, Paris, France

------------------------------------------------------------------------
Tuesday November 21, 2000

9:00-12:00 Dynamics and Self-organization 1
Chair: Philippe Blanchard, Bielefeld, Germany
----------------------------------------------------
09:00 Spike-Time dependent learning rules
      Wulfram Gerstner, EPFL, Lausanne, Switzerland
09:50 An introduction to Self-Organized Criticality
      Bruno Cessac, Institut Non-Linéaire de Nice, France
10:30 Coffee Break
10:50 Avalanches of activity in neural networks: finite size effects
      Christian Eurich, Universitaet Bremen, Germany
11:40 SOC properties of PCNN and application to image processing
      Xavier Clastres, ONERA/UPS, Toulouse, France

12:00-14:00 Lunch at ZIF

14:00-17:00 Control 1
Chair: Jean-Pierre Nadal, ENS, Paris, France
----------------------------------------------------
14:00 A Control approach framework for attentional brain processing
      John Taylor, King's College, London, United Kingdom
15:00 Cognition and Autonomy of Robots from the Dynamical Systems
      Jun Tani, Sony Computer Laboratory, Tokyo, Japan
15:40 Dynamical systems for perception and action in autonomous robots
      Michael Herrmann, MPI fuer Stroemungsforschung, Germany
16:20 Information transmission of arms coordination and movement direction in the motor cortex
      Valeria Del Prete, SISSA, Trieste, Italy
16:40 Information transmission of arms coordination and movement
      Jim Vaccaro, US Air Force Rome Laboratory, USA

17:00-17:30 Coffee break

17:30-19:00 Mathematical modelling of the nervous system
Chair: Jacques Demongeot, IUF, Grenoble, France
----------------------------------------------------
17:30 A Control approach framework for attentional brain processing
      John Taylor, King's College, London, United Kingdom
18:30 Fractal model for neural network organization in the breathing center in mouse
      Arthur Luczak, Jagellonian University, Krakow, Poland
------------------------------------------------------------------------
Wednesday, November 22, 2000

9:00-12:00 Dynamics and Self-organization 2
Chair: Bruno Cessac, INL, Nice, France
----------------------------------------------------
09:00 Positive regulation cycles and dynamics of neural networks
      Jacques Demongeot, Grenoble, Institut Universitaire de France
09:40 Estimation in interacting point processes models: a Gibbsian approximation approach
      Thierry Hervé, IMAG, Grenoble, France
10:20 Coffee Break
10:40 Dynamical memories on large recurrent neural networks
      Sebastiano Stramaglia, Istituto Elaborazione Segnali ed Immagini, Bari, Italy
11:20 Dynamical memories on large recurrent neural networks
      Emmanuel Daucé, ONERA/DTIM, Toulouse, France
11:40 Using small parameter changes to access various limit-cycles in random networks
      Patrick Mc Guire, ZIF Universitaet Bielefeld / University of Arizona, USA

12:00-14:00 Lunch at ZIF

14:00-16:00 Control 2
Chair: John Taylor, King's College, London, UK
----------------------------------------------------
14:00 Importance of the interaction dynamics in imitation processes: from temporal sequence learning to implicit reward communication
      Philippe Gaussier, ETIS/ENSEA, Cergy-Pontoise, France
14:40 Recurrent neural networks for robotic control and planning
      Mathias Quoy, ETIS/ENSEA, Cergy-Pontoise, France
15:20 Dynamic Neural Fields for Robot Control
      Axel Steinhage, Ruhr-Universität, Bochum, Germany

16:00-16:30 Coffee break

16:30-18:30 Spiking Neurons 1
Chair: Jacques Demongeot, IUF, Grenoble, France
----------------------------------------------------
16:30 Recurrent competition, synaptic dynamics and optimal coding in primary visual cortex
      Klaus Obermayer, Technische Universität Berlin, Germany
17:30 Influence of noise on neuronal dynamics: from single units to interacting systems
      Khashayar Pakdaman, INSERM-U244, Paris, France
18:10 Back to spike coding in networks trained with sigmoidal neurons
      Christelle Godin, CEA Grenoble, France

------------------------------------------------------------------------
Thursday, November 23, 2000

9:00-12:20 Dynamics and Self-organization 3
Chair: Philippe Blanchard, Bielefeld, Germany
----------------------------------------------------
09:00 A large deviation approach to random recurrent neural network dynamics
      Manuel Samuelides, ENSAE, ONERA, Toulouse, France
10:00 A mathematical approach to the problem of synchronization in the cortex
      Ruedi Stoop, University/ETH, Zurich
10:40 Coffee Break
11:00 Frustrated dynamics in small Hopfield networks
      Hugues Bersini, IRIDIA/ULB, Brussels, Belgium
11:40 Geometrical analysis of associative memory dynamics and practical application
      Toshiki Kindo, Matsushita Research Institute Tokyo Inc., Japan

12:00-14:00 Lunch at ZIF

14:00-17:00 Spiking Neurons 2
Chair: Jacques Demongeot, IUF, Grenoble, France
----------------------------------------------------
14:00 Reverse Engineering the Visual System with Networks of Asynchronously Spiking Neurons
      Simon Thorpe, CNRS/CERCO, Toulouse, France
15:00 Synfire Chain in Coincidence Detectors
      Ikeda Kazushi, Kyoto University, Japan
15:40 A generative model for temporal Hebbian synaptic plasticity
      Laurent Perrinet, ONERA/DTIM, Toulouse, France
16:00 Rapid dynamic feature binding in feedforward spiking neural vector networks
      Sander Bohte, University of Amsterdam, Netherlands
16:20 Toward a realistic neuron learning rule
      Jianfeng Feng, Sussex University, Brighton, UK

16:40-17:00 Coffee break

17:30-19:00 Vision
Chair: Philippe Gaussier, ETIS/ENSEA, Cergy-Pontoise, France
----------------------------------------------------
17:00 Advantages in Image Processing using Dynamic Neurons
      Jason Kinser, George Mason University, USA
17:40 Optical flow based on the Pulse Coupled Neural Network
      Watanabe Takashi, Saitama University, Japan
18:20 Analysis of Development of Oriented Receptive Fields in Linsker's Model
      Tadashi Yamazaki, Tokyo Institute of Technology, Japan
18:40 A Neural Network Approach for Cancer Diagnosis Based on Dynamical Data Sets
      Andreas Degenhard, Institute of Cancer Research, Great Britain

------------------------------------------------------------------------
Friday, November 24, 2000

9:00-12:20 Mean-field theory
Chair: John Taylor, King's College, London, UK
-----------------------------------------------------
9:50  Mean field theory for stochastic neural networks and graphical models
      Bert Kappen, University of Nijmegen, Netherlands
10:40 Coffee Break
11:00 On the landscape of attractors in Hopfield-like neural networks
      Daniel Gandolfo, CPT-CNRS, Luminy, France
11:40 Dynamics of associative memory networks with synaptic depression
      Dimitri Bibitchkov, Max Planck Institute, Göttingen, Germany

From Frank.Pasemann at rz.uni-jena.de Sat Oct 28 07:43:36 2000 From: Frank.Pasemann at rz.uni-jena.de (Frank Pasemann) Date: Sat, 28 Oct 2000 13:43:36 +0200 Subject: OPEN POSITION PhD/Postdoc Message-ID: <39FABBE8.C1979173@rz.uni-jena.de>

Please post: The TheoLab - Research Unit for Structure Dynamics and the Evolution of Systems - at Friedrich Schiller University Jena, Germany, invites PhD candidates and PhD recipients with pronounced interests in the field of "Evolution and Dynamics of Neural Controllers for Autonomous Systems" to apply for a PhD Studentship (BAT IIa/2-O) / Postdoctoral Fellowship (BAT IIa-O). Positions are open immediately and are limited to May 31, 2002; extensions may be possible. Applicants should have experience in computer simulations (C, C++, graphics, Linux). A background in the fields of Dynamical Systems Theory, Neural Networks, and/or Embodied Cognitive Science is favorable. Experiments can be done with Khepera robots or software agents.
The theoretically oriented activities will concern the relationship between the behavior of agents and the driving network structures/dynamics, the theory of evolutionary processes, the optimization of evolutionary algorithms, and the development of behavior-oriented learning strategies for recurrent networks. For further information see http://www.theorielabor.de. Successful applicants may benefit from TheoLab's association with the Max Planck Institute for Mathematics in the Sciences, Leipzig, and the Institute for Medical Statistics, Informatics and Documentation, University of Jena. Applications (CV, two academic referees) should be sent as soon as possible to: Prof. Dr. Frank Pasemann, TheorieLabor, Friedrich-Schiller-Universität Jena, Ernst-Abbe-Platz 4, D-07740 Jena. Tel: xx49-3641-949530, frank.pasemann at rz.uni-jena.de

From Dave_Touretzky at cs.cmu.edu Mon Oct 30 00:36:35 2000 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Mon, 30 Oct 2000 00:36:35 -0500 Subject: graduate study in the neural basis of cognition Message-ID: <18291.972884195@skinner.boltz.cs.cmu.edu>

The Center for the Neural Basis of Cognition (CNBC) offers interdisciplinary doctoral training in affiliation with seven PhD programs at Carnegie Mellon University and the University of Pittsburgh. See our web site http://www.cnbc.cmu.edu for detailed information. The Center is dedicated to the study of the neural basis of cognitive processes. The Center's goals are: 1) to elucidate cellular and molecular mechanisms of information coding and processing by neural circuits; 2) to understand how neural circuits interact in functional systems that form the substrate for cognitive processes, including perception, thought, behavior and language; 3) to identify mechanisms that enable genes, development and diseases to shape cognition; and 4) to promote the application of research results to problems in medicine, artificial intelligence and robotics.
In order to foster innovative research, the Center encourages and supports cross-disciplinary research between workers in the fields of neuroscience, psychology, mathematics, computer science, and medicine. In this environment, students have access to world-class research facilities for cellular and systems neurophysiology, functional brain imaging, confocal imaging, transgenic methods, gene-chip technology and high performance computing. A variety of different experimental models are in use, ranging from simple in vitro systems, to whole animal work on transgenic animals, primates and neurologically and psychologically-defined patient populations. As part of the Center's commitment to cross-disciplinary research, we have recently instituted an IGERT program with support from the National Science Foundation. IGERT, which stands for Integrative Graduate Education and Research Training, allows students to receive hands-on training in a relevant crossover discipline. For example, a computer scientist specializing in neural modeling could learn to do neurophysiology, or a psychology student working on functional brain imaging could receive training in advanced statistical analysis of fMRI data. The IGERT option is available to all CNBC students. Students are admitted jointly to a home training program and to the CNBC Training Program. Applications are encouraged from students having interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, or robotics. For a brochure describing the program and application materials, contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu Application materials are also available online. 
The affiliated PhD programs are:

Carnegie Mellon University: Computer Science, Psychology, Robotics, Statistics
University of Pittsburgh: Center for Neuroscience, Mathematics, Psychology

The CNBC training faculty includes:

John Anderson (CMU Psychology): ACT-R theory of cognition, problem-solving
German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice
Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex
Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving
Cameron Carter (Pitt Neurosci., Psych): fMRI, attention, memory, schizophrenia
Carson Chow (Pitt Mathematics): spatiotemporal dynamics in neural networks
Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex
Steve DeKosky (Pitt Neurobiology): neurodegenerative human disease
William Eddy (CMU Statistics): analysis of fMRI data
Bard Ermentrout (Pitt Mathematics): oscillations in neural systems
Julie Fiez (Pitt Psychology): fMRI studies of language
Chris Genovese (CMU Statistics): making inferences from scientific data
Lori Holt (CMU Psychology): mechanisms of speech perception, perceptual learning
John Horn (Pitt Neurobiology): synaptic plasticity in autonomic ganglia
Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex
Marcel Just (CMU Psychology): visual thinking, language comprehension
Robert Kass (CMU Statistics): transmission of info. by collections of neurons
Eric Klann (Pitt Neuroscience): hippocampal LTP and LTD
Roberta Klatzky (CMU Psychology): human perception and cognition
Richard Koerber (Pitt Neurobiology): devel. and plasticity of spinal networks
Tai Sing Lee (CMU Comp. Sci.): primate visual cortex; computer vision
Pat Levitt (Pitt Neurobiology): molecular basis of cortical development
Michael Lewicki (CMU Comp. Sci.): learning and representation
David Lewis (Pitt Neuroscience): anatomy of frontal cortex
Brian MacWhinney (CMU Psychology): models of language acquisition
Yoky Matsuoka (CMU Robotics): rehabilitation robotics, motor psychophysics
James McClelland (CMU Psychology): connectionist models of cognition
Paula Monaghan Nichols (Pitt Neurobiology): vertebrate CNS development
Carl Olson (CNBC): spatial representations in primate frontal cortex
David Plaut (CMU Psychology): connectionist models of reading
Michael Pogue-Geile (Pitt Psychology): development of schizophrenia
Lynn Reder (CMU Psychology): organization of memory, cognitive processing
Jonathan Rubin (Pitt Mathematics): oscillations in networks of coupled neurons
Walter Schneider (Pitt Psych.): fMRI, models of attention & skill acquisition
Charles Scudder (Pitt Neurobiology): motor learning in cerebellum
Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system
Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex
William Skaggs (Pitt Neuroscience): representations in rodent hippocampus
Peter Strick (Pitt Neurobiology): basal ganglia & cerebellum, motor system
David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning

See http://www.cnbc.cmu.edu/people for further details. From icann at ai.univie.ac.at Mon Oct 30 11:35:46 2000 From: icann at ai.univie.ac.at (ICANN conference server) Date: Mon, 30 Oct 2000 17:35:46 +0100 Subject: Call for Papers: ICANN 2001 Message-ID: <39FDA362.735EF3C0@ai.univie.ac.at> Call for Papers ============================================================ ICANN 2001 International Conference on Artificial Neural Networks Aug.
21-25, 2001 Vienna, Austria http://www.ai.univie.ac.at/icann the annual conference of the European Neural Network Society ============================================================ We invite submission of full-length papers on work in neural computation covering theory, algorithms, applications or implementation in one of the following areas: - Computational Neuroscience - Data Analysis and Pattern Recognition - Vision and Image Processing - Signal and Time Series Processing - Robotics and Control - Connectionist Cognitive Science - Selforganization and Dynamical Systems Deadline for receipt of papers: !!!!!!!!!!!!!!! Feb 16, 2001 !!!!!!!!!!!!!!! (electronic submission will be possible) Please see our web page for more details. http://www.ai.univie.ac.at/icann ___________________________________________________________ Furthermore, we invite proposals for tutorials, special sessions or workshops on selected topics in neural computation. Deadline for receipt of proposals: !!!!!!!!!!!!! Feb 1, 2001 !!!!!!!!!!!!! Please see our web page for more details http://www.ai.univie.ac.at/icann ___________________________________________________________ ICANN 2001 is organized by the Austrian Research Institute for Artificial Intelligence in cooperation with the Vienna University of Technology's Pattern Recognition and Image Processing Group and Center for Computational Intelligence. 
General Chair: Georg Dorffner, Austria Program Co-Chairs: Horst Bischof, Austria Kurt Hornik, Austria Organizing Committee: Arthur Flexer, Austria Karin Hraby, Austria Friedrich Leisch, Austria Andreas Weingessel, Austria Workshop Chair: Christian Schittenkopf, Austria Program Committee: Shun-Ichi Amari, Japan Peter Auer, Austria Pierre Baldi, USA Peter Bartlett, Australia Shai Ben David, Israel Matthias Budil, Austria Joachim Buhmann, Germany Rodney Cotterill, Denmark Gary Cottrell, USA Kostas Diamantaras, Greece Wlodek Duch, Poland Peter Erdi, Hungary Patrick Gallinari, France Wulfram Gerstner, Switzerland Stan Gielen, The Netherlands Stefanos Kollias, Greece Vera Kurkova, Czech Republic Anders Lansner, Sweden Ales Leonardis, Slovenia Hans-Peter Mallot, Germany Christoph von der Malsburg, Germany Klaus-Robert Müller, Germany Thomas Natschläger, Austria Lars Niklasson, Sweden Stefano Nolfi, Italy Erkki Oja, Finland Gert Pfurtscheller, Austria Stephen Roberts, UK Mariagiovanna Sami, Italy Jürgen Schmidhuber, Switzerland Olli Simula, Finland Peter Sincak, Slovakia John Taylor, UK Volker Tresp, Germany Michel Verleysen, Belgium ___________________________________________________________ For questions contact icann at ai.univie.ac.at From David.Cohn at acm.org Mon Oct 30 21:11:01 2000 From: David.Cohn at acm.org (David 'Pablo' Cohn) Date: Mon, 30 Oct 2000 21:11:01 -0500 Subject: Two new papers in the Journal of Machine Learning Research Message-ID: <4.2.0.58.20001030210422.00af08f0@ux7.sp.cs.cmu.edu> [Apologies for the broad distribution. Future announcements of this type will only be posted to jmlr-announce at ai.mit.edu. See the end of this message for information on subscribing.] The Journal of Machine Learning Research is pleased to announce the availability of two papers in electronic form. ---------------------------------------- Learning with Mixtures of Trees Marina Meila and Michael I. Jordan. Journal of Machine Learning Research 1 (October 2000) pp. 1-48.
Abstract This paper describes the mixtures-of-trees model, a probabilistic model for discrete multidimensional domains. Mixtures-of-trees generalize the probabilistic trees of Chow and Liu (1968) in a different and complementary direction to that of Bayesian networks. We present efficient algorithms for learning mixtures-of-trees models in maximum likelihood and Bayesian frameworks. We also discuss additional efficiencies that can be obtained when data are "sparse," and we present data structures and algorithms that exploit such sparseness. Experimental results demonstrate the performance of the model for both density estimation and classification. We also discuss the sense in which tree-based classifiers perform an implicit form of feature selection, and demonstrate a resulting insensitivity to irrelevant attributes. ---------------------------------------- Dependency Networks for Inference, Collaborative Filtering, and Data Visualization David Heckerman, David Maxwell Chickering, Christopher Meek, Robert Rounthwaite, and Carl Kadie. Journal of Machine Learning Research 1 (October 2000), pp. 49-75. Abstract We describe a graphical model for probabilistic relationships--an alternative to the Bayesian network--called a dependency network. The graph of a dependency network, unlike a Bayesian network, is potentially cyclic. The probability component of a dependency network, like a Bayesian network, is a set of conditional distributions, one for each node given its parents. We identify several basic properties of this representation and describe a computationally efficient procedure for learning the graph and probability components from data. We describe the application of this representation to probabilistic inference, collaborative filtering (the task of predicting preferences), and the visualization of acausal predictive relationships. 
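To make the tree-learning primitive behind the first paper concrete: the Chow-Liu (1968) procedure that mixtures-of-trees generalizes fits the maximum-likelihood tree distribution by taking a maximum-weight spanning tree over pairwise empirical mutual information. The following is a minimal illustrative sketch, not the authors' code; the function names and the tuple-of-discrete-values data format are assumptions.

```python
# Illustrative sketch of the Chow-Liu step: score every pair of variables
# by empirical mutual information, then keep a maximum spanning tree.
import math
from itertools import combinations

def mutual_information(data, i, j):
    """Empirical mutual information between discrete variables i and j."""
    n = len(data)
    pi, pj, pij = {}, {}, {}
    for row in data:
        pi[row[i]] = pi.get(row[i], 0) + 1
        pj[row[j]] = pj.get(row[j], 0) + 1
        pij[(row[i], row[j])] = pij.get((row[i], row[j]), 0) + 1
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

def chow_liu_tree(data, n_vars):
    """Edges of the maximum-MI spanning tree (Kruskal with union-find)."""
    edges = sorted(((mutual_information(data, i, j), i, j)
                    for i, j in combinations(range(n_vars), 2)), reverse=True)
    parent = list(range(n_vars))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:               # keep the edge only if it joins two components
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

A mixture of trees repeats essentially this step inside EM, fitting one tree per mixture component against responsibility-weighted data.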
These first two papers of Volume 1 are available at http://www.jmlr.org in PostScript, PDF and HTML formats; a bound, hardcopy edition of Volume 1 will be available in the next year. -David Cohn, Managing Editor, Journal of Machine Learning Research ------- This message has been sent to the mailing list "jmlr-announce at ai.mit.edu", which is maintained automatically by majordomo. To subscribe to the list, send mail to listserv at ai.mit.edu with the line "subscribe jmlr-announce" in the body; to unsubscribe, send email to listserv at ai.mit.edu with the line "unsubscribe jmlr-announce" in the body. From melnik at cs.brandeis.edu Mon Oct 30 23:16:53 2000 From: melnik at cs.brandeis.edu (Ofer Melnik) Date: Mon, 30 Oct 2000 23:16:53 -0500 (EST) Subject: DIBA source code release Message-ID: The Decision Intersection Boundary Algorithm (DIBA) describes what a threshold feed-forward neural network is doing in the form of polytopic decision regions. It can be used to visualize and analyze the exact computation performed by these types of networks. Examples of the kind of information that can be extracted by this algorithm can be seen on the webpage: http://www.demo.cs.brandeis.edu/pr/DIBA The basic DIBA algorithm is being released as C++ source code under the GNU General Public License, without support. It has been compiled under Linux, but should compile with any standard C++ compiler and libraries. The three-dimensional output formats available are MATLAB, POV-Ray, and MHD. -Ofer Melnik ----------------------------------------------------------------- Ofer Melnik PhD melnik at cs.brandeis.edu Volen Center for Complex Systems Ph: (781)-736-2719 Brandeis University (781)-736-DEMO Waltham MA 02254 From masuoka at flab.fujitsu.co.jp Tue Oct 31 00:01:28 2000 From: masuoka at flab.fujitsu.co.jp (Ryusuke Masuoka) Date: Tue, 31 Oct 2000 00:01:28 -0500 Subject: FW: Ph.D.
Thesis available: Neural networks learning differential data Message-ID: Dear Connectionists, Some people seem to have trouble accessing the URL provided in my attached email. It seems that ~ (tilde) has been replaced with ? (question mark) somewhere. (I know some do not have this problem, and I do not yet know why it happened.) The URL is, again, the following: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/thesis/thesis.html If you see a question mark in front of "masuoka" in the above URL, please replace it with ~ (tilde). Sorry for any trouble I may have caused you, and thanks to those who kindly notified me of the problem. Regards, Ryusuke From masuoka at flab.fujitsu.co.jp Sun Oct 15 18:48:27 2000 From: masuoka at flab.fujitsu.co.jp (Ryusuke Masuoka) Date: Sun, 15 Oct 2000 17:48:27 -0500 Subject: Ph.D. Thesis available: Neural networks learning differential data In-Reply-To: <200010131754.LAA13846@grey.colorado.edu> Message-ID: Dear Connectionists, I am pleased to announce the availability of my Ph.D. thesis for download in electronic format. Comments are welcome. Regards, Ryusuke ------------------------------------------------------------ Thesis: ------- Title: "Neural Networks Learning Differential Data" Advisor: Michio Yamada URL: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/thesis/thesis.html Abstract: -------- Learning systems that learn from previous experiences and/or provided examples of appropriate behaviors allow people to specify {\em what} the systems should do in each case, not {\em how} they should act at each step. That greatly eases the burden on system users.
For efficient and accurate learning, it is essential that supervised learning systems such as neural networks be able to utilize knowledge in forms such as logical expressions, probability distributions, and constraints on differential data, along with the provided input-output pairs. Neural networks that can learn constraints on differential data have already been applied to pattern recognition and differential equations, and further applications such as robotics have been suggested. In this dissertation, we investigate an extended framework that introduces constraints on differential data into neural network learning, along with other topics that form the foundations for applications of neural networks learning differential data. First, a new and very general architecture and algorithm are introduced that allow multilayer perceptrons to learn differential data. The algorithm applies to differential data not only of first order but also of higher orders, and, like the backpropagation algorithm, it is completely localized to each unit of the multilayer perceptron. The architecture and algorithm are then implemented as computer programs; this required high programming skill and a great amount of care. The main module is programmed in C++. The implementation is used to conduct experiments which, among other things, show convergence of neural networks trained with differential data of up to third order. Along with the architecture and the algorithm, we give analyses of neural networks learning differential data, including a comparison with the extra-pattern scheme, how the learning proceeds, sample complexity, the effects of irrelevant features, and noise robustness. A new application of neural networks learning differential data to continuous action generation in reinforcement learning is described, together with experiments using the implementation.
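The flavor of learning differential data can be conveyed with a minimal sketch: train a network to match both target values and target first derivatives. This is only an informal illustration of the general idea, not the thesis's algorithm (which derives exact, unit-local backprop-style updates); here the parameter gradient is taken numerically for brevity, and the tiny architecture and names are assumptions.

```python
# Sketch: a one-input tanh network fitted to both y = sin(x) and dy/dx = cos(x).
import math
import random

H = 8  # hidden units
random.seed(0)
theta = [random.uniform(-0.5, 0.5) for _ in range(3 * H + 1)]  # w, b, v, c

def net(theta, x):
    """Return the network output y(x) and its exact input derivative dy/dx."""
    w, b, v, c = theta[:H], theta[H:2 * H], theta[2 * H:3 * H], theta[3 * H]
    y, dy = c, 0.0
    for k in range(H):
        h = math.tanh(w[k] * x + b[k])
        y += v[k] * h
        dy += v[k] * w[k] * (1.0 - h * h)   # d/dx tanh(u) = (1 - tanh^2 u) * u'
    return y, dy

def loss(theta, samples, lam=1.0):
    """Mean squared error on values plus lam times MSE on first derivatives."""
    total = 0.0
    for x, t, dt in samples:
        y, dy = net(theta, x)
        total += (y - t) ** 2 + lam * (dy - dt) ** 2
    return total / len(samples)

samples = [(i * 0.5 - 1.5, math.sin(i * 0.5 - 1.5), math.cos(i * 0.5 - 1.5))
           for i in range(7)]
loss0 = loss(theta, samples)

eps, lr = 1e-5, 0.05
for step in range(1000):
    grad = []
    for i in range(len(theta)):            # central-difference parameter gradient
        theta[i] += eps
        up = loss(theta, samples)
        theta[i] -= 2 * eps
        down = loss(theta, samples)
        theta[i] += eps
        grad.append((up - down) / (2 * eps))
    theta = [p - lr * g for p, g in zip(theta, grad)]
```

With both value and derivative targets in the loss, the network is pushed to match the slope of the target function as well as its values, which is the essence of learning from differential data.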
The problem is reduced to realizing a random vector generator for a given probability distribution, which corresponds to solving a first-order differential equation. Beyond this application to reinforcement learning, two other possible applications of neural networks learning differential data are proposed: differential equations and the simulation of a human arm. For differential equations, we propose a very general framework that unifies differential equations, boundary conditions, and other constraints. For the arm simulation, we propose a natural neural network implementation of the minimum-torque-change model. Finally, we present results on higher-order extensions to radial basis function (RBF) networks of minimizing solutions with differential error terms, the best approximation property of those solutions, and a proof of the $C^l$ denseness of RBF networks. Through these detailed accounts of the architecture, algorithm, implementation, analyses, and applications, this dissertation lays the foundations for applications of neural networks learning differential data and will help promote their further use. ------------------------------------------------------------ Ryusuke Masuoka, Ph.D. Senior Researcher Intelligent Systems Laboratory Fujitsu Laboratories Ltd.
1-4-3 Nakase, Mihama-ku Chiba, 261-8588, Japan Email: masuoka at flab.fujitsu.co.jp Web: http://lettuce.ms.u-tokyo.ac.jp/~masuoka/ From golden at utdallas.edu Tue Oct 31 13:03:37 2000 From: golden at utdallas.edu (Richard Golden) Date: Tue, 31 Oct 2000 12:03:37 -0600 (CST) Subject: FUNDING OPPORTUNITIES FOR GRADUATE WORK IN MATHEMATICAL AND COMPUTATIONAL MODELING (fwd) Message-ID: Students interested in pursuing state-of-the-art graduate research in Mathematical and Computational Modeling are encouraged to apply to the Cognition and Neuroscience Doctoral Program in the School of Human Development at the University of Texas at Dallas (UTD). Our School consists of 39 active faculty members whose research interests span areas in neural, perceptual, and cognitive processes throughout the lifespan, from young infants to adults and onwards into old age. We sincerely hope that you will inform students about the outstanding opportunities for graduate study at UTD. Our faculty have distinguished themselves at the world-class level in the following areas of mathematical and computational modeling. * MODELS OF FACE PERCEPTION AND COGNITION (Abdi, O'Toole) * SPEECH PERCEPTION MODELS (Assmann) * COCHLEAR IMPLANT MODELS (Tobey) * CABLE EQUATION MODELS OF NEURAL CIRCUITRY USING "GENESIS" (Cauller) * PROBABILISTIC AND CONNECTIONIST LANGUAGE MODELS (Golden) * MATHEMATICAL ANALYSIS AND DESIGN OF ARTIFICIAL NEURAL NETS (Abdi, Golden) * ADVANCED STATISTICAL DATA ANALYSIS METHODS (Abdi, Golden, Rollins) Assistantships and support for conference travel are available to qualified students on a competitive basis. For more information regarding our doctoral program in this area, please see our web site located at: www.utdallas.edu/dept/hd/ or contact me (golden at utdallas.edu) for additional information regarding our program. Application materials are available online (www.utdallas.edu/dept/graddean/forms/gradapp.html).
AREA FACULTY: Herve Abdi, Ph.D., University of Aix-en-Provence 1980 (www.utdallas.edu/~herve) Peter Assmann, Ph.D., University of Alberta 1985 (www.utdallas.edu/~assmann) Larry Cauller, Ph.D., Northeastern Ohio Universities College of Medicine (www.utdallas.edu/~lcauller) Richard M. Golden, Ph.D., Brown University 1987 (www.utdallas.edu/~golden) Alice O'Toole, Ph.D., Brown University 1988 (www.utdallas.edu/~otoole) Pamela Rollins, Ph.D., Harvard Graduate School of Education 1994 (www.callier.utdallas.edu) Emily Tobey, Ph.D., City University of New York 1981 (www.callier.utdallas.edu)